By Craig Cullum.
In the “Will They Blend?” blog series, we experiment with the most interesting blends of data and tools.
Whether it’s mixing traditional sources with modern data lakes, open-source DevOps on the cloud with protected internal legacy tools, SQL with NoSQL, web-wisdom-of-the-crowd with in-house handwritten notes, or IoT sensor data with idle chatting, we’re curious to find out: Will they blend? Want to find out what happens when website texts and Word documents are compared?
Read the previous blog post in the series here.
Staying on top of your social media can be a daunting task. Twitter and Facebook are becoming the primary ways of interacting with your customers, but how do you keep track of every tweet, post, and mention? How do you make sure you’re jumping on the most critical issues and the customers with the biggest problems?
As Twitter has become one of the world’s preferred social media tools for communicating with businesses, companies are desperate to monitor mentions and messages to be able to address those that are negative. One way we can automate this process is through machine learning (ML), performing sentiment analysis on each tweet to help us prioritize the most important ones. However, building and training these models can be time-consuming and difficult.
There’s been an explosion in all of the big players (Microsoft, Google, Amazon) offering Machine Learning as a Service (MLaaS) or ML via an Application Programming Interface (API). This rapidly speeds up deployment, offering the ability to perform image recognition, sentiment analysis, and translation without having to train a single model or choose which machine learning library to use!
As great as these APIs are, they all have one thing in common: they require you to crack open an IDE and write code, building an application in Python, Java, or some other language.
What if you don’t have the time? What if you want to integrate these tools into your current workflows? The REST nodes in our Analytics Platform let us deploy a workflow and integrate with these services in a single node.
In this “Will They Blend?” article, we explore combining Twitter with Microsoft Azure’s Cognitive Services, specifically their Text Analytics API to perform sentiment analysis on recent tweets.
Topic: Use Microsoft Azure’s Cognitive Services with Twitter.
Challenge: Combine Twitter and Azure Cognitive Services to perform sentiment analysis on our recent tweets. Rank the most negative tweets and provide an interactive table for our social media and PR team to interact with.
Access mode/integrated tools: Twitter and Microsoft Azure Cognitive Services.
As we're leveraging external services for this experiment, we will need a Twitter developer account and a Microsoft Azure account. You'll need your Twitter developer account's API key, API secret, access token, and access token secret to use in the Twitter API Connector node. You'll also want your Azure Cognitive Services subscription key.
Creating your Azure Cognitive Services account
Log in to your Azure Portal and navigate to Cognitive Services to create a new service:
1. Click “add” and search for the Text Analytics service.
2. Click “create” to provision your service, giving it a name, location, and resource group. You may need to create a new resource group if this is your first Azure service.
Deploying this workflow is incredibly easy. In fact, it can be done in just 15 nodes.
The workflow contains three parts that take care of these tasks:
1. Extracting the data from Twitter and wrapping it into a JSON format that is compatible with the Cognitive Services API
2. Submitting that request to Cognitive Services
3. Processing the response and presenting the results
Azure expects the following JSON format:
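For the v2 Text Analytics sentiment API, the request body wraps each tweet in a "documents" array; the tweet texts and "id" values below are purely illustrative:

```json
{
  "documents": [
    { "id": "1", "language": "en", "text": "Loving the new release!" },
    { "id": "2", "language": "en", "text": "Still waiting on support to reply" }
  ]
}
```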
Our Analytics Platform includes excellent Twitter nodes, available from our Extensions if you don't already have them installed. These let you quickly and easily connect to Twitter and download tweets based on your search terms.
We can take the output from Twitter, turn it into a JSON request in the format above, and submit it. The Constant Value Column node and the JSON Row Combiner node wrap the Twitter output in the expected "documents" envelope.
The POST Request node makes it incredibly easy to interact with REST API services, providing the ability to easily submit POST requests.
You'll need to grab the endpoint URL for your region, which is shown in the Azure Portal under your Text Analytics resource. Here in Australia, we use the Australia East endpoint.
We can leave the authentication blank as we’ll be adding a couple of Request Headers.
In the Request Headers section, we need to add a header with the key "Content-Type" and the value "application/json". Then add another header with the key "Ocp-Apim-Subscription-Key"; its value is the subscription key provided as part of the Azure Cognitive Services account you created.
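Under the hood, the POST Request node is doing the equivalent of the following Python sketch. The endpoint URL and subscription key here are placeholders you would replace with your own; the "documents" envelope and the "Content-Type" and "Ocp-Apim-Subscription-Key" headers match the v2 Text Analytics sentiment API:

```python
import json
import urllib.request

# Placeholder values -- substitute your own region endpoint and key.
ENDPOINT = "https://australiaeast.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"
SUBSCRIPTION_KEY = "your-subscription-key"

def build_sentiment_request(tweets):
    """Wrap a list of tweet texts in the 'documents' envelope Azure expects
    and attach the headers the service requires."""
    payload = {
        "documents": [
            {"id": str(i), "language": "en", "text": text}
            for i, text in enumerate(tweets, start=1)
        ]
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        },
        method="POST",
    )
```

Sending the request with urllib.request.urlopen would then return the sentiment scores as JSON; in the workflow, the POST Request node handles all of this in its configuration dialog.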
If you’re using our Workflow as a guide, make sure to update the Twitter API Connector node with your Twitter API Key, API Secret, access token, and access token secret.
We can now take the response from Azure, ungroup it, and join it with additional Twitter data, such as the username and number of followers, to understand how influential each author is. The more influential they are, the higher a priority they may become.
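The join and ranking step can be illustrated with a minimal Python sketch. The response shape assumes the v2 sentiment API, where "score" runs from 0.0 (most negative) to 1.0 (most positive); the tweet metadata field names are hypothetical stand-ins for the columns coming out of the Twitter nodes:

```python
def rank_most_negative(response, tweet_meta):
    """Join Azure sentiment scores back onto tweet metadata and sort with
    the most negative (lowest-scoring) tweets first, breaking ties in
    favour of the most-followed authors."""
    rows = []
    for doc in response["documents"]:
        meta = tweet_meta[doc["id"]]
        rows.append({**meta, "sentiment": doc["score"]})
    return sorted(rows, key=lambda r: (r["sentiment"], -r["followers"]))

# A mocked Azure response joined with hypothetical tweet metadata.
response = {
    "documents": [
        {"id": "1", "score": 0.93},
        {"id": "2", "score": 0.04},
    ],
    "errors": [],
}
tweet_meta = {
    "1": {"user": "@happy_customer", "followers": 120},
    "2": {"user": "@unhappy_customer", "followers": 4500},
}
ranked = rank_most_negative(response, tweet_meta)
```

In the workflow itself, the Ungroup and Joiner nodes perform the same join without any code.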
Reporting the Data
Once created, you can use a Table View node to display the information in an interactive table sorted by sentiment. This can be distributed to your PR and social media teams for action, improving customer service.
To really supercharge your deployment and make this service truly accessible, you can use the WebPortal on our Server to create an interactive online sentiment service for your social media team, allowing them to refresh reports, submit their own Twitter queries, or provide alerting so your team can jump on issues.
So were we able to complete the challenge and merge Twitter and Azure in a single workflow? Yes, we were!
Coming Next …
If you enjoyed this, please share this generously and let us know your ideas for future blends.