South Florida Code Camp – Virtual Reality and IoT

On March 2nd, 2019, I will be presenting "Virtual Reality and IoT – Interacting with the changing world" at the South Florida Code Camp in Davie, FL. You can register here – and it's FREE. Here is the synopsis of the presentation:

Abstract

Whether it’s called mixed reality, augmented reality, or virtual reality; changing the perceived reality of the user is here. This presentation will look at how to create IoT devices, pipe the data created by them through the cloud, and use that data to drive a virtual or mixed reality experience. Creating applications that interact with the data created by the user’s environment should become straightforward after this introduction.


Authoring for Pluralsight

Coming soon, I will be authoring a course for Pluralsight titled "Identify Existing Products, Services and Technologies in Use for Microsoft Azure." This course targets software developers who are looking to get started with Microsoft Azure services to build modern cloud-enabled solutions and want to extend their knowledge of those services by learning how to use the existing products, services, and technologies offered by Microsoft Azure.

Microsoft Azure is a host for almost any application, but determining how to use it within existing workflows is paramount for success. In this course, Identify Existing Products, Services and Technologies in Use, you will learn how to integrate existing workflows, technologies, and processes with Microsoft Azure.

We explore Microsoft Azure with the following technologies:

  • Languages, Frameworks, and IDEs
    • IntelliJ IDEA
    • WebStorm
    • Visual Studio Code
    • .NET Core
    • C#
    • Java
    • JavaScript
    • Spring
    • NodeJS
    • Docker
  • Microsoft Azure Products
    • Azure App Services
    • Azure Kubernetes
    • Azure Functions
    • Azure IoT Hub

Hopefully, this short course will take a developer familiar with the languages, frameworks, and IDEs available and have them up and running on Microsoft Azure.

Update Conference Prague

I have been selected to speak at Update Conference Prague. I will be presenting two sessions:

  • Enable IoT with Edge Computing and Machine Learning
  • Virtual Reality and IoT – Interacting with the changing world

Enable IoT with Edge Computing and Machine Learning

Being able to run compute cycles on local hardware is a practice predating silicon circuits. Mobile and web technology pushed computation away from local hardware and onto remote servers, and as cloud prices decreased, more and more of those remote servers moved to the cloud. This technology cycle is now coming full circle, pushing computation that would be done in the cloud back down to the client. The catalysts for completing the cycle are latency and cost: running computations on local hardware lightens the load in the cloud and reduces overall cost and architectural complexity.

The difference now is how the computational logic is sent to the device. Today, we rely on app stores and browsers to deliver the logic the client will use. Delivery mechanisms are evolving toward writing code once, running that logic in the cloud, and pushing the same logic through your application so that it runs on the device. In this presentation, we will look at how to accomplish this with existing Azure technologies and how to prepare for upcoming technologies to run these workloads.

Virtual Reality and IoT – Interacting with the changing world

Using IoT Devices, powered by Windows 10 IoT and Raspbian, we can collect data from the world surrounding us. That data can be used to create interactive environments for mixed reality, augmented reality, or virtual reality. To move the captured data from the devices to the interactive environment, the data will travel through Microsoft’s Azure. First it will be ingested through the Azure IoT Hub which provides the security, bi-directional communication, and input rates needed for the solution. We will move the data directly from the IoT Hub to an Azure Service Bus Topic. The Topic allows for data to be sent to every Subscription listening for the data that was input. Azure Web Apps subscribe to the Topics and forward the data through a SignalR Hub that forwards the data to a client. For this demo, the client is a Unity Application that creates a Virtual Reality simulation showcasing that data.

Once finished with this introduction, utilizing each component of this technology stack should be approachable. Seen in isolation, the technologies used in this demonstration may not seem useful to a developer; combined, they create a powerful tool to share nearly unlimited amounts of incoming data across multiple channels.

Storing Event Data in Elastic Search

There was a CPU and network issue when the hosts uploaded data directly from the client to Elastic Search. To handle the same data load without the Elastic Search overhead running on the client, the following architecture was used:

  • Hosts use Event Hubs to upload the telemetry data
  • Stream Analytics consumes the Event Hub data
  • The Stream Analytics query outputs to an Azure Function
  • The Azure Function uploads the output to Elastic Search

Event Hub

To start, the hosts needed an Event Hub to upload the data to. For other projects, Azure IoT Hub could be used instead, since Stream Analytics can ingest from both. Event Hubs was chosen here so that each client would not need to be provisioned as a device.

Create an Event Hubs namespace

  1. Log on to the Azure portal, and click Create a resource at the top left of the screen.
  2. Click Internet of Things, and then click Event Hubs.
  3. In Create namespace, enter a namespace name. The system immediately checks to see if the name is available.
  4. After making sure the namespace name is available, choose the pricing tier (Basic or Standard). Also, choose an Azure subscription, resource group, and location in which to create the resource.
  5. Click Create to create the namespace. You may have to wait a few minutes for the system to fully provision the resources.
  6. In the portal list of namespaces, click the newly created namespace.
  7. Click Shared access policies, and then click RootManageSharedAccessKey.
  8. Click the copy button to copy the RootManageSharedAccessKey connection string to the clipboard. Save this connection string in a temporary location, such as Notepad, to use later.
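The connection string copied in step 8 contains the pieces needed to sign requests against the namespace. As an illustration of how that key is used, here is a minimal Python sketch (standard library only) of the SharedAccessSignature scheme Event Hubs expects; the namespace URI, key name, and key below are placeholders, not real credentials.

```python
# Sketch: build an Event Hubs SAS token from a shared access key.
# The URI and key values are placeholders for illustration only.
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri: str, key_name: str, key: str, ttl_seconds: int = 3600) -> str:
    """Create a SharedAccessSignature token for an Event Hubs resource."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    # Signature is HMAC-SHA256 over "<encoded-uri>\n<expiry>" using the key.
    string_to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
    )
    return (
        "SharedAccessSignature sr=" + encoded_uri
        + "&sig=" + urllib.parse.quote_plus(signature)
        + "&se=" + expiry
        + "&skn=" + key_name
    )

token = generate_sas_token(
    "https://mynamespace.servicebus.windows.net/myeventhub",  # placeholder
    "RootManageSharedAccessKey",
    "placeholder-key",
)
print(token)
```

A token like this goes in the Authorization header when posting events to the namespace over HTTPS.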

Create an event hub

  1. In the Event Hubs namespace list, click the newly created namespace.
  2. In the namespace blade, click Event Hubs.
  3. At the top of the blade, click Add Event Hub.
  4. Type a name for your event hub, then click Create.

Your event hub is now created, and you have the connection strings you need to send and receive events.

Stream Analytics

Create a Stream Analytics job

  1. In the Azure portal, click the plus sign and then type STREAM ANALYTICS in the text window to the right. Then select Stream Analytics job in the results list.
  2. Enter a unique job name and verify the subscription is the correct one for your job. Then either create a new resource group or select an existing one on your subscription.
  3. Then select a location for your job. For speed of processing and reduction of cost in data transfer, selecting the same location as the resource group and intended storage account is recommended.

    Note

    You should create this storage account only once per region. This storage will be shared across all Stream Analytics jobs that are created in that region.

  4. Check the box to place your job on your dashboard and then click CREATE.
  5. You should see 'Deployment started…' displayed in the top right of your browser window. Soon it will change to a completed notification.

Create an Azure Stream Analytics query

After your job is created it’s time to open it and build a query. You can easily access your job by clicking the tile for it.


In the Job Topology pane click the QUERY box to go to the Query Editor. The QUERY editor allows you to enter a T-SQL query that performs the transformation over the incoming event data.


Create data stream input from Event Hubs

Azure Event Hubs provides highly scalable publish-subscribe event ingestors. An event hub can collect millions of events per second, so that you can process and analyze the massive amounts of data produced by your connected devices and applications. Event Hubs and Stream Analytics together provide you with an end-to-end solution for real-time analytics—Event Hubs let you feed events into Azure in real time, and Stream Analytics jobs can process those events in real time. For example, you can send web clicks, sensor readings, or online log events to Event Hubs. You can then create Stream Analytics jobs to use Event Hubs as the input data streams for real-time filtering, aggregating, and correlation.

The default timestamp of events coming from Event Hubs in Stream Analytics is the timestamp that the event arrived in the event hub, which is EventEnqueuedUtcTime. To process the data as a stream using a timestamp in the event payload, you must use the TIMESTAMP BY keyword.
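For example, assuming the event payload carries a timestamp field named EventTime (a hypothetical field name), the input would be read on event time like this:

```sql
SELECT
    *
FROM Input TIMESTAMP BY EventTime
```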

Consumer groups

You should configure each Stream Analytics event hub input to have its own consumer group. When a job contains a self-join or when it has multiple inputs, some input might be read by more than one reader downstream. This situation impacts the number of readers in a single consumer group. To avoid exceeding the Event Hubs limit of five readers per consumer group per partition, it’s a best practice to designate a consumer group for each Stream Analytics job. There is also a limit of 20 consumer groups per event hub. For more information, see Event Hubs Programming Guide.

Configure an event hub as a data stream input

The following table explains each property in the New input blade in the Azure portal when you configure an event hub as input.

  • Input alias – A friendly name that you use in the job's query to reference this input.
  • Service bus namespace – An Azure Service Bus namespace, which is a container for a set of messaging entities. When you create a new event hub, you also create a Service Bus namespace.
  • Event hub name – The name of the event hub to use as input.
  • Event hub policy name – The shared access policy that provides access to the event hub. Each shared access policy has a name, permissions that you set, and access keys.
  • Event hub consumer group (optional) – The consumer group to use to ingest data from the event hub. If no consumer group is specified, the Stream Analytics job uses the default consumer group. We recommend that you use a distinct consumer group for each Stream Analytics job.
  • Event serialization format – The serialization format (JSON, CSV, or Avro) of the incoming data stream.
  • Encoding – UTF-8 is currently the only supported encoding format.
  • Compression (optional) – The compression type (None, GZip, or Deflate) of the incoming data stream.

When your data comes from an event hub, you have access to the following metadata fields in your Stream Analytics query:

  • EventProcessedUtcTime – The date and time that the event was processed by Stream Analytics.
  • EventEnqueuedUtcTime – The date and time that the event was received by Event Hubs.
  • PartitionId – The zero-based partition ID for the input adapter.

For example, using these fields, you can write a query like the following example:

SELECT
    EventProcessedUtcTime,
    EventEnqueuedUtcTime,
    PartitionId
FROM Input

Azure Functions (In Preview)

Azure Functions is a serverless compute service that enables you to run code on demand without having to explicitly provision or manage infrastructure. It lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Azure Functions to respond to triggers makes it a natural output for an Azure Stream Analytics job. This output adapter allows users to connect Stream Analytics to Azure Functions and run a script or piece of code in response to a variety of events.

Azure Stream Analytics invokes Azure Functions via HTTP triggers. The new Azure Function Output adapter is available with the following configurable properties:

  • Function App – Name of your Azure Functions app.
  • Function – Name of the function in your Azure Functions app.
  • Max Batch Size – The maximum size for each output batch that is sent to your Azure Function. By default, this value is 256 KB.
  • Max Batch Count – The maximum number of events in each batch that gets sent to Azure Functions. The default value is 100.
  • Key – If you want to use an Azure Function from another subscription, you can do so by providing the key to access your function.

Note that when Azure Stream Analytics receives an HTTP 413 (Request Entity Too Large) response from the Azure function, it reduces the size of the batches it sends to Azure Functions. In your Azure function code, use this status code to make sure that Azure Stream Analytics doesn't send oversized batches. Also, make sure that the max batch count and size values used in the function are consistent with the values entered in the Stream Analytics portal.

Also, when no events land in a time window, no output is generated, and as a result the computeResult function is not called. This behavior is consistent with the built-in windowed aggregate functions.
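To make the two batch limits concrete, here is a small Python sketch of how events could be split into batches that respect both a maximum count and a maximum byte size. This is a hypothetical helper, not the output adapter's actual implementation; it just mirrors why a function that rejects oversized payloads forces smaller batches.

```python
# Sketch: split events into batches bounded by count and serialized size.
# Defaults mirror the adapter's documented defaults (100 events, 256 KB).
import json

def split_into_batches(events, max_batch_count=100, max_batch_bytes=256 * 1024):
    batches, current, current_bytes = [], [], 0
    for event in events:
        size = len(json.dumps(event).encode("utf-8"))
        # Flush the current batch if adding this event would break a limit.
        if current and (len(current) >= max_batch_count or current_bytes + size > max_batch_bytes):
            batches.append(current)
            current, current_bytes = [], 0
        current.append(event)
        current_bytes += size
    if current:
        batches.append(current)
    return batches

events = [{"deviceId": i, "temp": 20 + i} for i in range(250)]  # hypothetical telemetry
batches = split_into_batches(events, max_batch_count=100)
print([len(b) for b in batches])  # → [100, 100, 50]
```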

Query

The query itself is basic for now. There is no need for the advanced query features of Stream Analytics for the host data at the moment; however, they will be used later for creating workflows for spawning and reducing hosts.

Currently, the query batches the data output from the event hub every second. This is simple to accomplish using the windowing functions provided by Stream Analytics. In the Westworld of Warcraft host query, a tumbling window batches the data every second. The query looks as follows:

SELECT
    Collect()
INTO
    ElasticUploadFunction
FROM
    HostIncomingData
GROUP BY TumblingWindow(Duration(second, 1), Offset(millisecond, -1))
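The tumbling-window grouping used by the query above can be sketched in plain Python: each event falls into exactly one non-overlapping, fixed-length window keyed by its timestamp. The event data below is hypothetical, not the actual host schema.

```python
# Sketch: tumbling-window batching, the grouping behind TumblingWindow.
# Events are (timestamp_seconds, payload) pairs; each lands in one window.
from collections import defaultdict

def tumbling_window(events, window_seconds=1):
    windows = defaultdict(list)
    for timestamp, payload in events:
        # Window start is the timestamp truncated to the window length.
        window_start = int(timestamp // window_seconds) * window_seconds
        windows[window_start].append(payload)
    return dict(windows)

events = [(0.1, "a"), (0.7, "b"), (1.2, "c"), (2.9, "d")]  # hypothetical events
print(tumbling_window(events))  # → {0: ['a', 'b'], 1: ['c'], 2: ['d']}
```

Unlike a hopping or sliding window, windows here never overlap, so no event is emitted twice.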


Azure Function

Create a function app

You must have a function app to host the execution of your functions. A function app lets you group functions as a logical unit for easier management, deployment, and sharing of resources.

  1. Click Create a resource in the upper left-hand corner of the Azure portal, then select Compute > Function App.
  2. Use the function app settings as specified in the table below.
    • App name (a globally unique name) – Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
    • Subscription (your subscription) – The subscription under which this new function app is created.
    • Resource Group (myResourceGroup) – Name for the new resource group in which to create your function app.
    • OS (Windows) – Serverless hosting is currently only available when running on Windows. For Linux hosting, see Create your first function running on Linux using the Azure CLI.
    • Hosting plan (Consumption plan) – Hosting plan that defines how resources are allocated to your function app. In the default Consumption Plan, resources are added dynamically as required by your functions. In this serverless hosting, you only pay for the time your functions run.
    • Location (West Europe) – Choose a region near you or near other services your functions access.
    • Storage account (a globally unique name) – Name of the new storage account used by your function app. Storage account names must be between 3 and 24 characters in length and may contain numbers and lowercase letters only. You can also use an existing account.
  3. Click Create to provision and deploy the new function app. You can monitor the status of the deployment by clicking the Notification icon in the upper-right corner of the portal. Clicking Go to resource takes you to your new function app.

Elastic Search

Now the data is in Elastic Search and, if the instructions in the Elastic Search setup post were followed, should be accessible from the Kibana endpoint.

Create Azure Function to process IoT Hub file upload


On the Wren Solutions project, there was a need to sync a large data set from a device and merge data from it into an existing data set in Microsoft Azure. To accomplish this, we decided to use the following workflow:

Upload the file using Azure IoT Hub

If you haven’t already, first create an Azure IoT Hub.

Associate an Azure Storage account to IoT Hub

When you associate an Azure Storage account with an IoT hub, the IoT hub generates a SAS URI. A device can use this SAS URI to securely upload a file to a blob container. The IoT Hub service and the device SDKs coordinate the process that generates the SAS URI and makes it available to a device to use to upload a file.

Make sure that a blob container is associated with your IoT hub and that file notifications are enabled.

To use the file upload functionality in IoT Hub, you must first associate an Azure Storage account with your hub. Select File upload to display a list of file upload properties for the IoT hub that is being modified.


Storage container: Use the Azure portal to select a blob container in an Azure Storage account in your current Azure subscription to associate with your IoT Hub. If necessary, you can create an Azure Storage account on the Storage accounts blade and a blob container on the Containers blade. IoT Hub automatically generates SAS URIs with write permissions to this blob container for devices to use when they upload files.


Use Azure IoT SDK to upload blob

Use the Azure IoT Hub C# SDK to upload the file. Below is a Gist of a code sample showing how to upload using the SDK. The code showcases how to utilize the UploadToBlobAsync method on the DeviceClient. To use the sample, replace the DeviceConnectionString and FilePath variables.

Trigger a function on the Azure Blob creation

A Blob storage trigger starts an Azure Function when a new or updated blob is detected, and the blob contents are provided as input to the function. Set up the blob trigger to use the container we linked to the Azure IoT Hub previously. First, let's configure and manage your function app in the Azure portal.

To begin, go to the Azure portal and sign in to your Azure account. In the search bar at the top of the portal, type the name of your function app and select it from the list. After selecting your function app, you see the following page:


Go to the Platform Features tab by clicking the tab of the same name.


Function apps run in, and are maintained by, the Azure App Service platform. As such, your function apps have access to most of the features of Azure's core web hosting platform. The Platform features tab is where you access the many features of the App Service platform that you can use in your function apps.


Add a connection string from the blob storage account as an app setting. For the sake of this demo, let's name it MyStorageAccountAppSetting. Reference that setting in the JSON for your Blob Trigger. Then use the blob name as a reference to that blob in your function.
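A minimal function.json for such a blob trigger might look like the following sketch. The binding name myBlob and container name iot-uploads are placeholders; MyStorageAccountAppSetting matches the app setting named above.

```json
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "iot-uploads/{name}",
      "connection": "MyStorageAccountAppSetting"
    }
  ]
}
```

The `{name}` token in the path binds the blob's name, so the function can reference which blob fired the trigger.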


Your app is currently in read-only mode because you have published a generated function.json

If you published your Azure function from Visual Studio and are seeing the message:

Your app is currently in read-only mode because you have published a generated function.json

Then do the following steps:

From the functions page, click Platform Features.


After you go to the platform features page, click on App Service Editor.

After that, find your function in the list of functions (in this example, the function name is "IoTUploadProcessingFunction"). Expand the files underneath it, select the function.json file, and delete the line "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.0.0".


After that, your function should be running. If not, go back to the Functions screen in Azure and start it.
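For reference, a generated function.json might look like the following sketch; only the generatedBy line comes from the message above, and every other value here is a hypothetical placeholder.

```json
{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.0.0",
  "disabled": false,
  "scriptFile": "../bin/MyFunctionApp.dll",
  "entryPoint": "MyFunctionApp.IoTUploadProcessingFunction.Run",
  "bindings": []
}
```

Removing the generatedBy line tells the portal the file is no longer SDK-generated, which lifts the read-only mode.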