Streaming Data Analytics with Azure IoT and Power BI

Published May 1, 2020

Data Analytics | Microsoft Azure | Power BI

Recently, a client reached out with an interesting challenge – to see a data stream from a device visualized in real time, without refreshing the page for updated results or waiting for a data pipeline to run its next scheduled job. The foundation of our solution was Microsoft Azure, where organizations can quickly deploy scalable, managed services for Internet of Things (IoT) applications. Here is what we built – a combination of Azure components as the base architecture and a Power BI streaming dataset for visual analytics.

At a high level, four key pieces in Azure are required to get messages from a device to live visualizations in Power BI. The IoT device sends a message to Azure IoT Hub. When the message arrives, an Azure Function transforms it into the proper JSON format and outputs it to an Event Hub. When the Event Hub receives the message, an Azure Stream Analytics job pushes it to a custom Power BI dataset. The result is data that moves through to Power BI in real time, with no sync-related delay. Let’s walk through these components in detail.

Create an Azure IoT Hub

To build this solution, the first component needed is an Azure IoT Hub. The hub acts as a manager for all your IoT devices, provides security, and ensures reliable communication between the devices and the cloud. In a production scenario, these IoT devices could be any of the numerous smart devices found in a workplace, but for this example we will simulate a Raspberry Pi sending messages. Setting up an IoT Hub is simple: navigate to the IoT Hub pane within Azure, select “Add”, enter your preferred resource group, and choose a name for the hub. Creating a test device is just as easy. Within your new IoT Hub, select IoT devices and click “New”. The device has several optional settings, but only a device ID is required to get it running. If done successfully, the device should appear in the list.

Figure 1 – Azure IoT Hub Device List
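
For repeatable setups, the same device registration can be scripted instead of done in the portal. Below is a minimal sketch using the azure-iot-hub Python SDK; the “iothubowner” connection string, device ID, and generated keys are assumptions for illustration:

```python
# pip install azure-iot-hub
import base64
import os

from azure.iot.hub import IoTHubRegistryManager

# Assumption: an "iothubowner" connection string copied from the IoT Hub's
# Shared access policies blade.
IOTHUB_CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=<key>"
DEVICE_ID = "test-device-01"  # hypothetical device ID

registry_manager = IoTHubRegistryManager(IOTHUB_CONNECTION_STRING)

# Register the device with SAS (symmetric key) authentication.
primary_key = base64.b64encode(os.urandom(32)).decode()
secondary_key = base64.b64encode(os.urandom(32)).decode()
device = registry_manager.create_device_with_sas(DEVICE_ID, primary_key, secondary_key, "enabled")

print(f"Created device '{device.device_id}' with status '{device.status}'")
```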

To test the device, we used Visual Studio with the Azure IoT Hub extension enabled. Once connected with Azure, the test device should be visible. To test, right-click the device and select “Send D2C Message to IoT Hub” (Fig 2). This allows you to send plain text or a JSON template to the IoT Hub. The successful message count can be seen within the IoT Hub metrics by selecting the “Telemetry Messages Sent” metric.

Figure 2 – Visual Studio – Azure IoT Hub Extension D2C Messages
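
If you prefer to script the test, the same D2C messages can be sent with the azure-iot-device Python SDK. A minimal sketch, assuming the test device’s connection string and a hypothetical temperature payload:

```python
# pip install azure-iot-device
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

# Assumption: the device-level connection string shown on the test device's
# details page (not the hub-level connection string).
DEVICE_CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=test-device-01;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(DEVICE_CONNECTION_STRING)

# Send a handful of telemetry messages, one per second, so they register
# under the hub's "Telemetry Messages Sent" metric.
for i in range(5):
    payload = json.dumps({"temperature": 20.0 + i * 0.5})  # hypothetical payload
    client.send_message(Message(payload))
    time.sleep(1)

client.shutdown()
```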

Configure an Azure Event Hub

Next, the Event Hub needs to be created. Event Hubs is an ingestion service that helps process streaming data. It may not be necessary for every project, depending on the message source and format, but we included it so we could do data transformations with an Azure Function. This Event Hub will take the output from our Azure Function and connect with an Azure Stream Analytics job to send data to Power BI. Creating an Event Hubs namespace is similar to creating an IoT Hub: it requires a name, pricing tier, and resource group. Once the namespace is created, an Event Hub entity can be added under the “Entities” pane.

Figure 3 – Azure Event Hub Namespace and Event Hub Entity
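
Before wiring up the Azure Function, it can be useful to confirm that the Event Hub accepts events on its own. A minimal sketch using the azure-eventhub Python SDK, assuming a “Send”-enabled shared access policy and a hypothetical entity named iot-stream:

```python
# pip install azure-eventhub
import json

from azure.eventhub import EventData, EventHubProducerClient

# Assumptions: a "Send"-enabled connection string from the namespace's
# Shared access policies and an Event Hub entity named "iot-stream".
EVENTHUB_CONNECTION_STRING = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=send;SharedAccessKey=<key>"
EVENTHUB_NAME = "iot-stream"

producer = EventHubProducerClient.from_connection_string(
    EVENTHUB_CONNECTION_STRING, eventhub_name=EVENTHUB_NAME
)

# Publish a single JSON-formatted test event so the Event Hub (and anything
# reading from it) can be verified independently of the IoT Hub and Function.
with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"temperature": 21.5})))
    producer.send_batch(batch)
```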

Develop an Azure Function

In our example, an Azure Function is necessary because the JSON format required by the Stream Analytics job did not match the string format of the messages coming from the device. So, with the Azure Functions extension within Visual Studio, we created a new function from the IoTHubTrigger template. This trigger means that every time the IoT Hub receives a message, the function runs using the message’s data. The function connects to the IoT Hub using the “Event Hub-compatible endpoint” found under Built-in Endpoints in your IoT Hub settings.

Figure 4 – Azure Function

This small function reshapes the message data into a JSON object and sends it to the previously created Event Hub as a new message. The function connects to the Event Hub using a connection string found within the shared access policies for the Event Hubs namespace. Developers should store the connection strings within the local.settings.json file that is created when using the IoTHubTrigger template. After deploying the Azure Function, the connection strings should be entered in the Function App’s configuration settings on Azure.
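
Our function was built from the C# IoTHubTrigger template, but the same idea can be sketched in Python with the Functions v2 programming model – an Event Hub trigger pointed at the IoT Hub’s built-in endpoint plus an Event Hub output binding. The setting names, entity names, and payload shape below are assumptions for illustration:

```python
# function_app.py – requires the azure-functions package (Python v2 model).
import json

import azure.functions as func

app = func.FunctionApp()

# Assumptions: "IoTHubEndpoint" holds the IoT Hub's Event Hub-compatible
# endpoint connection string, "EventHubConnection" holds the Event Hubs
# namespace connection string, and the output entity is named "iot-stream".
# Both settings live in local.settings.json locally and in the Function
# App's configuration once deployed.
@app.function_name(name="TransformIoTMessage")
@app.event_hub_message_trigger(
    arg_name="event",
    event_hub_name="<event-hub-compatible-name>",  # from the Built-in Endpoints blade
    connection="IoTHubEndpoint",
)
@app.event_hub_output(
    arg_name="output",
    event_hub_name="iot-stream",
    connection="EventHubConnection",
)
def transform_iot_message(event: func.EventHubEvent, output: func.Out[str]) -> None:
    # Reshape the raw device message into the JSON object the Stream
    # Analytics job expects (the "temperature" column name is hypothetical).
    raw = event.get_body().decode("utf-8")
    output.set(json.dumps({"temperature": raw}))
```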

Connect an Azure Stream Analytics Job

The next piece required is an Azure Stream Analytics job. This is what relays the data from the Event Hub to Power BI for live visualizations. Creating a new Stream Analytics job also requires a name and resource group. Once created, an input and an output need to be configured to relay messages. Stream Analytics has three input options: Event Hub, IoT Hub, and Blob Storage. When creating the input, select the existing Event Hub and choose JSON as the event serialization format.

There are many options for the stream output, including Event Hubs, SQL Database, and Blob Storage; we will select Power BI as the output for this project. Once authorized with Power BI, all that needs to be entered is the dataset name and the table name. The dataset is then created automatically in Power BI. Please note that the dataset won’t be created until the first row of data is streamed.

The final step for Stream Analytics is writing the query that selects the desired data. Only EventProcessedUtcTime (needed to show time on a Power BI visualization axis) and your data column are required, but there is no limit on how many columns you include. The query must read from the Event Hub input and write to the Power BI output, using the same names specified when they were created.

Figure 5 – Azure Stream Analytics Query
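
For reference, the query can be a simple pass-through of the timestamp and the data column, reading from the input alias and writing into the output alias defined earlier (the temperature column and alias names here are hypothetical):

```
SELECT
    EventProcessedUtcTime,
    temperature
INTO
    [powerbi-output]
FROM
    [eventhub-input]
```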

Stream Data with Power BI

Figure 6 – Power BI Tile Settings

The final piece of the process is to create the dashboard in Power BI. This dashboard needs to be created in the same workspace where the Stream Analytics job created the dataset. When adding a tile, select Custom Streaming Data to choose the dataset and configure the settings. For tiles that require an axis, the EventProcessedUtcTime column must be selected, as this allows the axis to advance as more messages are sent over time. The “time window to display” setting sets the maximum time span shown on the axis (Fig 6). From our testing, shorter time frames respond better to frequent messages, but this setting can be changed for each visual, so choose what best provides the intended effect.

Once all the components have been configured, it is time to test. If using the IoT test device, the easiest way is to send D2C messages from Visual Studio with the Azure IoT extension. Sending many messages allows time for troubleshooting.

Troubleshooting

If data is not showing in Power BI, there are a few things to double-check. First, try refreshing the Power BI page while messages are still being sent; sometimes the original connection does not refresh once messages start arriving. If there are still no messages, edit the dataset and turn off historic data analysis. This setting retains past data and can cause issues if data types change. A third place to troubleshoot is the Stream Analytics job: if data is not in the proper format, Stream Analytics will show a warning on the input or output, and data will not appear in Power BI until it is in the expected format.

Functionality

While Azure and Power BI provide one of the best solutions for live streaming data in terms of ease of use and functionality, this setup is not perfect. Power BI has some limitations when it comes to custom streaming datasets. The most prevalent issue is the lack of customization. There are only five tile choices for the dashboard – card, line chart, clustered bar chart, clustered column chart, and gauge. These are some of the simpler visualization options Power BI offers, which limits the ability to create more complex dashboards for greater insights. There is also no ability to change the visualization colors, which leaves users stuck with the default green color option. This prevents users from matching the theme of their Power BI environment, causing the dashboard to clash with other pages.

The other common issue with this system is the lack of error reporting. When something is not working – data not loading in Power BI for example – there is often no error code presented to the user. The user must go step by step through the process until the issue is found rather than skipping directly to the error. While there are some limitations, these problems are more related to the user experience rather than the functionality of the system. Overall, this system does exactly what it is intended to do and can provide great live insights into your IoT devices.

OneSix is a Microsoft Azure Consultant

We help companies solve their most complex problems with cloud technology and yield incredible results.