Last week I published a video on how to analyze news articles and get a sentiment score with Watson NLU without writing any code. The video was a high-level overview of the no-code application. In this blog post I’m going to show you how to build this flow step-by-step.
Tools used
The tutorial is going to use the following tools:
- Parabola: no-code platform
- News API: API for searching news
- Watson Natural Language Understanding (NLU): API for analyzing news sentiment
- Google Sheets: to save the results
All of these tools offer a free plan that you can sign up for and use. I’m assuming you already have a Google account and access to Google Sheets.
Getting a News API account
In this section you are going to sign up for a free News API account. News API provides an API for searching and retrieving the latest news.
- Go to https://newsapi.org/ and create a free account. Once the account is created you will be able to see your API key here: https://newsapi.org/account
- You will be using the Top Headlines API, which is documented here: https://newsapi.org/docs/endpoints/top-headlines. A request looks like this:
https://newsapi.org/v2/top-headlines?country=us&apiKey=your_api_key
Feel free to change the country or set any other parameters.
This page also shows you the results from calling this service (you can also test the API in a tool such as Postman or via a command line).
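Before wiring this into Parabola, you can sanity-check your key from code. Here is a minimal Python sketch (standard library only; the helper names `build_headlines_url` and `fetch_top_headlines` are my own, not part of News API):

```python
import json
import urllib.parse
import urllib.request

def build_headlines_url(api_key, country="us", page_size=10):
    """Assemble a Top Headlines request URL with the key as a query parameter."""
    query = urllib.parse.urlencode({
        "country": country,
        "pageSize": page_size,
        "apiKey": api_key,
    })
    return "https://newsapi.org/v2/top-headlines?" + query

def fetch_top_headlines(api_key, **kwargs):
    """Perform the request and return the parsed JSON response (network call)."""
    url = build_headlines_url(api_key, **kwargs)
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Example usage (requires network access and a real key):
#   data = fetch_top_headlines("your_api_key")
#   print(data["articles"][0]["title"])
```

The JSON you get back is the same payload Parabola will receive in the Pull from an API step below.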
The next step is to create a Watson NLU service on IBM Cloud.
Creating a Watson NLU service on IBM Cloud
In this section you are going to create a Watson NLU service which is available on the IBM Cloud. IBM Cloud offers a completely free (forever) account. It’s called a Lite account.
- Register for a free IBM Cloud account (or click Log in on the same page to use an existing account)
- You should be on the Dashboard page. If you are not, go to https://cloud.ibm.com/
- Click the Create Resource button
- In the search field enter NLU and select Natural Language Understanding from the list
- Open the Select a region list and choose a region that’s closest to you. I’m in California so for me it will be Dallas
- Next, click the Create button to create the service
- Once the service is created, switch to Manage tab. On this tab you will see the service API key and its URL. You will use this information a little bit later
The next step is to start building the flow in Parabola.
Using Parabola to build the flow (application)
First a quick introduction to Parabola.
What is Parabola
Parabola is a no-code platform. It enables you to build flows (small applications and automations) without writing any code. A flow consists of steps. Each step takes data as input, processes it, and outputs data; that output then becomes the input for the next step, and so on. Here is how Parabola describes its platform on its website:
Parabola is a drag-and-drop productivity tool that runs in your browser. We have a library of customizable, prebuilt components designed for ecommerce operations and marketing teams to pull in data, combine and transform it in bulk, and automatically take action.
I look at it as a visual serverless platform. Every step reminds me of a small serverless function that does something specific. The steps are connected together to build a flow or a small program.
Registering for a Parabola account
The first step is to create a Parabola account. Go to https://parabola.io/app/signup and create a free account.
Once the account is created, click Add a new flow button.
You are going to build a flow that looks like this:
The flow steps are:
- API Import: invoke an API to get news (flow input)
- API Enrichment: get sentiment score for each news article
- Select columns and Rename columns: format the data
- Send to Google Sheets: send results to Google Sheets document (flow output)
Pull from an API step
The right side of the flow editor holds all the components (steps) you can use. You can find a step by searching for it. Start typing api:
Select Pull from an API and drag it onto the flow.
Double-click on the step to open its properties. Set the following properties:
- Request type: GET
- API Endpoint URL: https://newsapi.org/v2/top-headlines?country=us&category=general&pageSize=10
- Add a custom header x-api-key and set its value to your News API key
- Top Level Key: articles (this setting unpacks the articles JSON array into a nice table)
Once you have entered all the values, click Show Updated Results to invoke the API. Your actual results will be different but will be structured like this:
If you look at the other columns in the table you will see a url column. This is the data (column) you are going to pass to Watson NLU in the next step to determine an article’s sentiment score.
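To see what the Top Level Key setting does, here is a small Python sketch with a hypothetical two-article response:

```python
# A trimmed-down, hypothetical News API response
response = {
    "status": "ok",
    "totalResults": 2,
    "articles": [
        {"title": "Headline one", "url": "https://example.com/1"},
        {"title": "Headline two", "url": "https://example.com/2"},
    ],
}

# Choosing "articles" as the Top Level Key means: take that array and treat
# each object as a row, with the object's keys becoming the table's columns.
rows = response["articles"]
columns = sorted({key for row in rows for key in row})
```

Parabola performs this unpacking for you; the sketch only illustrates the shape of the resulting table.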
Enrich with API step
The Enrich with API step takes every row and enriches it with the result of an external API call. In this tutorial you pass the url column to the Watson NLU service, which analyzes the page at that URL and returns its sentiment score.
- Find Enrich with API step and add it to the flow
- Then make a connection from Pull from an API to Enrich with API
- Double-click on Enrich with API to open its settings
- For the API endpoint URL, copy the URL value from the Watson NLU service’s Manage tab and then append v1/analyze?version=2019-07-12&url={url}&features=sentiment to the end. The final value should look like this:
https://api.us-south.natural-language-understanding.watson.cloud.ibm.com/instances/ce7cf874-xxxx-xxxx-xxxx-xxxxxxxxxxxx/v1/analyze?version=2019-07-12&url={url}&features=sentiment
- Authentication: Username and Password
- Username: apikey (as text value)
- Password: copy the API Key from Watson NLU Manage tab
- Click Show Updated Results button to test this step
If you look in the Input tab you will see the input passed to this step. The Results tab shows the results after the Watson NLU API is invoked. I scrolled the results table to see the columns added by invoking the Watson service. The data we want is in the api.sentiment.document.score and api.sentiment.document.label columns.
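For reference, the per-article call this step makes can be sketched in Python (standard library only; `analyze_url` and `extract_sentiment` are hypothetical helper names, not part of any Watson SDK):

```python
import base64
import json
import urllib.parse
import urllib.request

def extract_sentiment(nlu_response):
    """Pull the document-level score and label out of a /v1/analyze response."""
    doc = nlu_response["sentiment"]["document"]
    return doc["score"], doc["label"]

def analyze_url(base_url, api_key, article_url):
    """Call Watson NLU's /v1/analyze for one article URL (network call)."""
    query = urllib.parse.urlencode({
        "version": "2019-07-12",
        "url": article_url,
        "features": "sentiment",
    })
    request = urllib.request.Request(base_url + "/v1/analyze?" + query)
    # HTTP basic auth: the username is the literal string "apikey"
    token = base64.b64encode(f"apikey:{api_key}".encode()).decode()
    request.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(request, timeout=30) as resp:
        return extract_sentiment(json.load(resp))
```

Parabola runs the equivalent of `analyze_url` once per row, which is exactly what produces the api.sentiment.document.* columns you see in the Results tab.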
In the next step you are going to format the data so that you don’t export every column to Google Sheets.
Select columns step
After the last step the table has 15 columns, but you don’t need all of them; you are only interested in four. This step lets you continue working with only the columns you specify.
- Add Select columns step to the flow and create a connection
- Using the Columns to Keep option, select the title, url, api.sentiment.document.score and api.sentiment.document.label columns
In the next step you are going to rename columns.
Rename columns step
You will rename two columns in this step.
- Add the Rename columns step to the flow
- Rename api.sentiment.document.score to sentiment score
- Rename api.sentiment.document.label to sentiment score label
This step made column names consistent and more user-friendly.
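Taken together, the Select columns and Rename columns steps amount to a single transformation per row, which can be sketched in Python (the `KEEP` mapping and `select_and_rename` helper are my own illustration, not Parabola internals):

```python
# Old column name -> new column name; only these four columns survive
KEEP = {
    "title": "title",
    "url": "url",
    "api.sentiment.document.score": "sentiment score",
    "api.sentiment.document.label": "sentiment score label",
}

def select_and_rename(row):
    """Apply both steps at once: drop extra columns, rename the kept ones."""
    return {new_name: row[old_name] for old_name, new_name in KEEP.items()}
```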
Now that you have all the data and it’s formatted, the last step is to send it to Google Sheets.
Send to Google Sheets step
In the last step you are going to send data in four columns to Google Sheets.
- Add Send to Google Sheets step to the flow.
- Double-click on the step to open its settings. You will need to allow Parabola to connect to your Google account
- File: choose which Google Sheets file to export to
- Sheet: choose which sheet to use in that file
- Export rule: Overwrite Sheet (this overwrites the sheet on every run; you can also choose to append on each run)
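The difference between the two export rules can be illustrated with a short Python sketch (no real Sheets API involved, just the two update behaviors):

```python
HEADER = ["title", "url", "sentiment score", "sentiment score label"]

def export_rows(existing_sheet, new_rows, rule="overwrite"):
    """Mimic the two export rules: overwrite replaces the sheet, append adds rows."""
    if rule == "overwrite":
        return [HEADER] + new_rows
    if rule == "append":
        sheet = existing_sheet if existing_sheet else [HEADER]
        return sheet + new_rows
    raise ValueError(f"unknown export rule: {rule}")
```

With Overwrite Sheet, each scheduled run replaces yesterday’s headlines; with append, the sheet grows into a running history.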
You are ready to run the flow.
Running the flow
Let’s save all the changes and run the flow.
- Before running the flow you need to save it and update the live version. Click the Update Live Version button
- From the Live tab, click Run Flow Now button
Your page should look like this:
As the flow runs you will see each step highlighted during its execution.
In the Scheduling Rules section you can schedule the flow to run periodically. For example, once every hour or once a day.
If you look in Google Sheets, you should see the following (your actual news will be different):
The four columns in Google Sheets are the same columns (and names) from the Parabola flow. In the last column (sentiment score label) I added conditional formatting rules:
Summary
This tutorial showed you how to use the Watson NLU service without writing any code. The no-code application gets the latest news articles, passes them to Watson NLU for a sentiment score, and saves the results into a Google Sheets file. As mentioned above, this application (or automation) can also be scheduled to run periodically. I hope you found this tutorial valuable. I’d love to hear your feedback in the comments, and if you have other ideas for no-code applications, please let me know there as well.