Bring your OPC-UA tags to the cloud with AWS IoT and Kepware (2/3)

Matthieu Croissant
4 min read · Jul 10, 2020

In the last article, we set up a data flow from OPC tags to a Kinesis data stream. The data is there, waiting for us to process it.

We will now process those OPC tags and compute some metrics over them, to display later on. To do so, we will use AWS Kinesis Analytics. It lets you process your data streams with a SQL-like query language that adds capabilities such as the time windows we will use here.

But let’s first create our Kinesis analytics application.

Go to AWS -> Services -> Kinesis -> Data analytics -> Create application

Your application is created, and you can now set up your Kinesis stream as an input for it.

Select your stream as the input and leave everything else as is.
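The same input configuration can be done through the AWS API rather than the console. Below is a minimal sketch of the `Inputs` structure that `kinesisanalytics.create_application` expects; the stream ARN, role ARN, application name, and the two example columns are placeholders, not values from this setup.

```python
# Sketch of attaching a Kinesis stream as the analytics input via the API.
# All ARNs and names below are hypothetical placeholders.

def build_input_config(stream_arn: str, role_arn: str) -> dict:
    """Build the Input structure for kinesisanalytics.create_application."""
    return {
        "NamePrefix": "SOURCE_SQL_STREAM",  # in-application stream name prefix
        "KinesisStreamsInput": {
            "ResourceARN": stream_arn,
            "RoleARN": role_arn,  # role allowing analytics to read the stream
        },
        "InputSchema": {
            "RecordFormat": {
                "RecordFormatType": "JSON",
                "MappingParameters": {
                    "JSONMappingParameters": {"RecordRowPath": "$"}
                },
            },
            # Columns that schema discovery would normally fill in for you.
            "RecordColumns": [
                {"Name": "id", "SqlType": "VARCHAR(64)", "Mapping": "$.id"},
                {"Name": "v", "SqlType": "DOUBLE", "Mapping": "$.v"},
            ],
        },
    }

# With boto3 installed and credentials configured, this could be passed as:
# boto3.client("kinesisanalytics").create_application(
#     ApplicationName="opc-metrics",
#     Inputs=[build_input_config(stream_arn, role_arn)])
```

Scripting the input this way is mostly useful once you want the whole pipeline reproducible; for a first walkthrough the console flow above is simpler.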

At the bottom of the page you will find a button to discover the schema. Click on it, and if enough data has arrived, AWS will automatically infer a SQL schema from the MQTT messages.

Once done, AWS will show you a success message with a sample of your data in tabular format.
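What schema discovery does here is flatten the nested MQTT payload into rows. As a local sketch, the payload shape below follows the Kepware IoT Gateway default JSON format (a `values` list with `id`/`v`/`q`/`t` fields); if your agent publishes a different shape, the flattening and the discovered schema change accordingly.

```python
# Flatten one MQTT message into one row per tag value, mirroring what
# schema discovery does. The payload shape is an assumed example.

def flatten_payload(payload: dict) -> list[dict]:
    """Turn one MQTT message into a list of tabular rows."""
    return [
        {"id": val["id"], "v": val["v"], "q": val["q"], "t": val["t"]}
        for val in payload.get("values", [])
    ]

message = {
    "timestamp": 1594370000000,
    "values": [
        {"id": "Channel1.Device1.Sine1", "v": 0.42, "q": True,
         "t": 1594370000000},
    ],
}
rows = flatten_payload(message)
# Each row now has the flat columns (id, v, q, t) that the inferred
# SQL schema exposes to your queries.
```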

You can now save and continue. Your data is there and the schema is ready to be acted upon, so we can tackle the real analytics part of this setup.

Click on the go to SQL editor section and start the application. AWS provides a nice set of sample SQL queries there that cover most common use cases. I could go over many features of the Kinesis Analytics service at this point, but as I'm not yet an expert in it, I'll leave that for another article. I have a lot of ideas on how to use this tool; the most straightforward to me would be computing KPIs such as OEE, or alerting on a production line stop and detecting where it stopped based on saturation bits.

For this demo, I will limit myself to a simple average over the last 10 seconds of a tag simulating a sine function. I can now click save and run, and get a glimpse of the generated results.
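To make the computation concrete, here is a local Python sketch of what that query produces: samples of a sine tag grouped into 10-second tumbling windows, each window reduced to its average. The timestamps and one-sample-per-second rate are synthetic; inside Kinesis Analytics the equivalent is an `AVG()` grouped by `STEP(ROWTIME BY INTERVAL '10' SECOND)`.

```python
import math

# Local simulation of a 10-second tumbling-window average over a tag
# that emits a sine wave. Sample rate and timestamps are synthetic.

WINDOW_SECONDS = 10

def tumbling_averages(samples: list[tuple[float, float]]) -> dict[int, float]:
    """Group (timestamp, value) samples into 10 s buckets and average each."""
    buckets: dict[int, list[float]] = {}
    for ts, value in samples:
        buckets.setdefault(int(ts // WINDOW_SECONDS), []).append(value)
    return {b * WINDOW_SECONDS: sum(v) / len(v) for b, v in buckets.items()}

# One sample per second of a sine tag over 20 seconds -> two windows,
# keyed by window start time (0 and 10).
samples = [(t, math.sin(t)) for t in range(20)]
averages = tumbling_averages(samples)
```

Tumbling windows (non-overlapping buckets) keep each sample in exactly one result; Kinesis Analytics also offers sliding windows if you need overlapping averages instead.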

We are now done with the second part of the solution. We have data monitored through OPC, passed in near real time to AWS IoT Core, which redirects it to a Kinesis Data Stream, which in turn is processed by our Kinesis Analytics application.

That makes a lot of pipes to bring the data from one end to the other. However, as your solution grows more complex, you will find benefits in going through all of this: you can redirect your data flow from one part to another and monitor every stage using AWS tooling.

Check the detailed tutorials here:

Matthieu Croissant

Software engineer with a focus on manufacturing and laboratory automation, and scientific software. https://www.linkedin.com/in/matthieu-croissant-0285a54a/