Bring your OPC-UA tags to the cloud with AWS IoT and Kepware (3/3)

Matthieu Croissant
5 min read · Jul 10, 2020

The last article was about creating some simple analytics on our OPC data. We are now about to complete the solution with a powerful yet simple (and free) dashboarding solution called Kibana, which is provided on top of Elasticsearch.

This is the type of dashboard we are aiming for. The goal of this last part is to go from OPC tags to actual graphics that can be shared and displayed publicly.

But we still have a bit of configuration to complete, starting with the Elasticsearch service itself. First go to Services -> Elasticsearch -> Create a new domain.

To keep things simple I went for a development and testing deployment type, which also helps keep the cost of this experiment down.

Give your domain a name and pick the right instance type; here I went for a t2.small instance, again to keep costs low. You can of course play with larger data sets and set up a larger instance if needed. Keep the rest as is and click Next. You will then have to configure the access parameters yourself. You can set everything to public if you are running a short-lived experiment, but this comes with a high risk, as pointed out by Amazon.

Review your settings, confirm the Elasticsearch domain creation, and go for a coffee while you wait for it to be up and running…
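If you prefer scripting the setup over clicking through the console, the same development domain can be created with boto3. A minimal sketch, assuming a hypothetical domain name of `opc-demo` and the t2.small instance type chosen above:

```python
def domain_config(name="opc-demo"):
    """Development/testing domain: one small instance, no dedicated master."""
    return {
        "DomainName": name,  # illustrative name, pick your own
        "ElasticsearchVersion": "7.4",
        "ElasticsearchClusterConfig": {
            "InstanceType": "t2.small.elasticsearch",  # keeps cost low
            "InstanceCount": 1,
            "DedicatedMasterEnabled": False,
        },
        "EBSOptions": {"EBSEnabled": True, "VolumeType": "gp2", "VolumeSize": 10},
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials to be configured

    es = boto3.client("es")
    es.create_elasticsearch_domain(**domain_config())
```

Remember that, just like in the console, the domain takes a while to become active after the call returns.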

Once it is up and running, click on the Create Firehose Delivery Stream button and then on Create delivery stream. Give your stream a name and leave everything as is until the destination selection.

Select Amazon Elasticsearch Service as the destination. You then need to select your previously created Elasticsearch domain and give an index name and a type name.

You will also have to configure an S3 backup, which is used to store failed records so that you can recover in case of an error or a bug in your application. Click Next, create a new IAM role if needed, then complete the Firehose creation. Again, it might take some time to complete, but stay tuned: one more step and we can finally start the dashboarding work.
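The console steps above map to a single `create_delivery_stream` call in boto3. A sketch under stated assumptions: the ARNs, stream name, and index name below are placeholders you would replace with the resources created earlier:

```python
def delivery_stream_config(domain_arn, role_arn, backup_bucket_arn,
                           name="opc-to-es", index="opc-index"):
    """Firehose stream delivering to Elasticsearch, with failed records in S3."""
    return {
        "DeliveryStreamName": name,  # illustrative name
        "ElasticsearchDestinationConfiguration": {
            "RoleARN": role_arn,
            "DomainARN": domain_arn,
            "IndexName": index,
            "TypeName": "opc",
            # Only records that failed delivery are kept in the S3 backup,
            # matching the recovery scenario described above.
            "S3BackupMode": "FailedDocumentsOnly",
            "S3Configuration": {"RoleARN": role_arn, "BucketARN": backup_bucket_arn},
        },
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials to be configured

    firehose = boto3.client("firehose")
    firehose.create_delivery_stream(**delivery_stream_config(
        domain_arn="arn:aws:es:eu-west-1:123456789012:domain/opc-demo",      # placeholder
        role_arn="arn:aws:iam::123456789012:role/firehose-to-es",            # placeholder
        backup_bucket_arn="arn:aws:s3:::opc-firehose-backup",                # placeholder
    ))
```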

Your Elasticsearch domain and your Kinesis Firehose are now ready, but no data is coming in yet. That is why we will connect the Kinesis Analytics application to the Firehose.

Go back to your kinesis analytics application and click on Connect to a destination.

Set up the right Firehose destination stream and your in-application stream, which is the one created by the SQL query. Leave the rest as is and click Save and continue.
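This wiring step can also be scripted through the `kinesisanalytics` API. A hedged sketch: the application name, in-application stream name, and ARNs below are assumptions standing in for your own resources:

```python
def application_output(in_app_stream, firehose_arn, role_arn):
    """Route the in-application SQL stream to the Firehose delivery stream."""
    return {
        "Name": in_app_stream,  # the stream your SQL query writes to
        "KinesisFirehoseOutput": {"ResourceARN": firehose_arn, "RoleARN": role_arn},
        "DestinationSchema": {"RecordFormatType": "JSON"},
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials to be configured

    ka = boto3.client("kinesisanalytics")
    # The current version id is required so the update applies to the live app.
    app = ka.describe_application(ApplicationName="opc-analytics")  # placeholder name
    ka.add_application_output(
        ApplicationName="opc-analytics",
        CurrentApplicationVersionId=app["ApplicationDetail"]["ApplicationVersionId"],
        Output=application_output(
            "DESTINATION_SQL_STREAM",                       # placeholder stream name
            "arn:aws:firehose:eu-west-1:123456789012:deliverystream/opc-to-es",  # placeholder
            "arn:aws:iam::123456789012:role/analytics-to-firehose",              # placeholder
        ),
    )
```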

After some time, you can open the monitoring tab of your Kinesis Firehose to check that data is flowing.

You can also check your Elasticsearch domain and the index you created; you should see that some data is there and that its structure has been detected.
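A quick way to verify this without the console is to hit the index's `_count` endpoint. A small stdlib-only sketch, assuming the `opc-index` name used when configuring Firehose and that your domain endpoint is reachable from your machine:

```python
import json
from urllib.request import urlopen


def count_url(endpoint, index="opc-index"):
    """Build the Elasticsearch _count URL for the given index."""
    return f"{endpoint.rstrip('/')}/{index}/_count"


def index_doc_count(endpoint, index="opc-index"):
    """Return how many documents Firehose has delivered to the index so far."""
    with urlopen(count_url(endpoint, index)) as resp:
        return json.load(resp)["count"]
```

If the count grows between two calls, records are flowing end to end.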

We are now done with the AWS service setup and can finally reap the fruit of all our work and display the data. If you go into your Elasticsearch service, you will find the Kibana link we need to start building the dashboards.

Once in Kibana, go to Management -> Index Patterns.

Let's create our index pattern and let Kibana detect our data types.

Your result should look somewhat like the above. We still have one small thing to do: setting up a datetime field, since for now we only have a Unix timestamp stored as a number.

Go to the Scripted Fields tab and configure your scripted field as above.
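The scripted field itself is written in Kibana's Painless language, but the conversion it performs is simple enough to sketch in Python. Assuming the tag timestamp is stored as Unix epoch seconds, and noting that Kibana date fields expect epoch milliseconds (hence a multiplication by 1000):

```python
from datetime import datetime, timezone


def to_millis(unix_seconds):
    """Epoch seconds -> epoch milliseconds, the unit Kibana date fields expect."""
    return int(unix_seconds * 1000)


def to_datetime(unix_seconds):
    """The human-readable UTC datetime the scripted field ultimately renders."""
    return datetime.fromtimestamp(unix_seconds, tz=timezone.utc)
```

If your tags already emit milliseconds, skip the multiplication, otherwise your points will land decades apart on the time axis.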

OK, time for some rendering. Go to Visualize and create a new visualization. Let's first create a simple line chart and select the OPC index.

Configure your Y-axis as above and your X-axis as below.

Click the run button and enjoy your first near-real-time OPC value visualization.


Matthieu Croissant

Software engineer with a focus on manufacturing and laboratory automation, and scientific software. https://www.linkedin.com/in/matthieu-croissant-0285a54a/