Music City Code – IoT with Mobile


This year I will be at Music City Code presenting Configure, Control, and Manage IoT with Mobile.

Configure, Control, and Manage IoT with Mobile

Abstract

The internet of things allows for communication with devices through various means (without touch, mouse, keyboard, or a screen). Mobile devices give users a dynamic interactive experience with these devices by communicating over several different wireless protocols or through the cloud. In this presentation, we will see how to use Xamarin to create a cross-platform mobile application to control devices of all shapes and sizes. After this presentation, attendees should be able to create a basic mobile application and have that application communicate with peripherals over Bluetooth and the cloud.

Description

This presentation showcases creating mobile applications with Xamarin and how those applications can interact with both off-the-shelf and custom hardware. First, we will create a Xamarin Forms application for iOS, Android, and Windows that will interact with both Microsoft Azure and Bluetooth Low Energy to create an interactive experience with the hardware and the cloud. Along the way, we will discuss mobile communication with the cloud and with hardware to see how mobile can act as a bridge between the two.

Third Annual Internet of Things Startup Showcase

Join the Greater Atlanta Internet of Things meetup at the Third Annual Internet of Things Startup Showcase on May 15th at Atlanta Tech Park.

We will host tables for IoT startups, innovation teams, companies, and hobbyists to show their IoT products and services.

12:00 pm to 5:00 pm – Atlanta Tech Park will host an open house for IoT members to learn about the ATP facility and the services it offers members.

5:00 pm to 6:00 pm – Tech Connect Hub will host a session on collaboration between startups and corporate teams.

6:00 pm – The showcase will include tables for startups, hobbyists, innovation teams, and companies demonstrating IoT solutions.

6:00 pm – Seminar for startups and sponsors to pitch their IoT solutions (details will be forthcoming).

8:00 pm – Wrap-up

Tables are still available
Sponsorships are available
Thanks to Atlanta Tech Park http://www.atlantatechpark.com

http://ow.ly/xHLU30jHrt0

Mixed3D Scans

About four days ago, Lamar and I went to Mixed3D to get some scans done for his birthday. They have an interesting booth filled with Raspberry Pis that generate a 3D model. They also offer to print those 3D models in full color using sandstone (which we took them up on). From the scans, we picked two that best fit the theme of the day and were most usable in Lamar’s Unity projects.

Also, Adam got one done to use around the office.

IRIS Conference

April 14th is the Integrative Research and Ideas Symposium (IRIS) hosted by the UGA Graduate-Professional Student Association. I will be speaking on three separate topics at the event:

  • Virtual Reality and IoT – Interacting with the Changing World
  • Enable IoT with Edge Computing and Machine Learning
  • Alternative Device Interfaces and Machine Learning

More than that, though, I look forward to hearing about the innovations and research from the graduate students and professionals at UGA. Here is their synopsis of IRIS:

The UGA Graduate-Professional Student Association is proud to announce IRIS 2018, a unique and exciting opportunity for students and other researchers from throughout the UGA community. 

This initiative’s focus on community-building, cross-pollination of ideas, transferrable skills, and service will:

  • Provide an excellent opportunity to enhance research communication skills and present research to an interdisciplinary audience. 
  • Expose students to cutting-edge scholarship, industry professionals, and rich professional development opportunities.
  • Help attendees refine the content and language of their CVs and résumés through career workshops.
  • Encourage shared scholarship, research, and service.
  • Equip attendees with new knowledge and skills which can strengthen teaching, learning, and career outcomes. 
  • Empower attendees to translate skills and research interests into career competencies. 

Azure Global Bootcamp Atlanta

This year marks the 4th annual Azure Global Bootcamp in Atlanta. If you don’t know about Azure Global Bootcamp, here is their snippet:

Welcome to Global Azure Bootcamp! All around the world user groups and communities want to learn about Azure and Cloud Computing!

On April 21, 2018, all communities will come together once again in the sixth great Global Azure Bootcamp event! Each user group will organize their own one day deep dive class on Azure the way they see fit and how it works for their members. The result is that thousands of people get to learn about Azure and join together online under the social hashtag #GlobalAzure!

Join hundreds of other organizers to help out and be part of the experience! Check out the different locations worldwide and if there is no location near you, why not organize one?

I will be leading off the IoT track after the keynote. All around, there is an amazing lineup of speakers, and it looks to be a great experience for both developers and decision makers. There are a limited number of seats, so don’t wait to sign up.

Home Control Flex Major Release

After nearly a year of hard work, the Home Control Flex application has finally reached a new release point. There have been major improvements in the use of Xamarin Forms and of mobile features, along with major changes to framework dependencies and the use of navigation pages.

The major problems in the previous version were poor usage of navigation pages, dependencies on old frameworks, and a lack of shared global resources. Together, these failures resulted in an unstable application that crashed on multiple pages, and fixes for those crashes were a slow rollout of shims and hacks to keep the previous decisions working.

The largest problem was the poor usage of navigation pages. To implement a tabbed page with the tabs at the bottom on Android, the previous developers had decided to use a ContentPage, make the tab pages ContentViews within it, and swap those views out whenever a tab was changed. This caused almost every major problem that could not be resolved in the app moving forward. To fix it, the BottomNavigationBarXF NuGet package was used. The base renderer was overridden to implement some custom functionality, but overall it was a clean integration, or at least as clean as such a big overhaul of the navigation system can be.

Since the pages were being swapped out whenever a tab was changed, the previous developers must have decided that instead of using navigation pages, they would just continue to swap the views out and maintain their own navigation stack. Without a NavigationPage in the app, the page lifecycle was completely off, and ObjectDisposedExceptions were thrown by the Forms framework because the views’ lifecycle was not correctly managed. Xamarin Forms couldn’t track whether a view was going to be reused and would collect disappeared views that were going to come back later. Once NavigationPage was used, this was no longer a problem.
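
As a minimal sketch of the fix (the page names here are illustrative, not the actual Home Control Flex code), wrapping the root page in a NavigationPage lets Xamarin Forms own the stack and manage the page lifecycle:

using Xamarin.Forms;

public class App : Application
{
    public App()
    {
        // Let Xamarin Forms own the navigation stack instead of
        // swapping ContentViews in and out of a single ContentPage.
        MainPage = new NavigationPage(new HomePage());
    }
}

public class HomePage : ContentPage
{
    public HomePage()
    {
        var openDetail = new Button { Text = "Open detail" };
        // PushAsync keeps the page alive on the stack, so Forms
        // no longer collects views that are about to come back.
        openDetail.Clicked += async (s, e) =>
            await Navigation.PushAsync(new DetailPage());
        Content = openDetail;
    }
}

public class DetailPage : ContentPage
{
    public DetailPage() { Content = new Label { Text = "Detail" }; }
}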

When I inherited the app, there were multiple frameworks in use. It had a JavaScript-style approach where a framework might be brought in for partial functionality or even a single method. Xamarin Forms Labs was the biggest offender: the previous developers had referenced it for one control and two converters. Once it was removed, the application was much more lightweight on disk. At the time, there was no noticeable performance gain, but that was most likely due to the lack of utilization within the app and the fact that I had only been with the app for a month.

This app was riddled with copy-and-paste code reuse. Every page declared the same Style with the same name (the style for that page), and every page declared its own converter for inverting a boolean. All of these “shared” resources were moved to App.xaml for reuse by every page within the application.
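
The actual fix lives in App.xaml; the same idea expressed in C# (the key names and the style are illustrative) registers one converter and one style at the application level so every page resolves them by key:

using System;
using System.Globalization;
using Xamarin.Forms;

// One shared converter instead of a copy-pasted one per page.
public class InverseBoolConverter : IValueConverter
{
    public object Convert(object value, Type targetType,
        object parameter, CultureInfo culture) => !(bool)value;

    public object ConvertBack(object value, Type targetType,
        object parameter, CultureInfo culture) => !(bool)value;
}

public class App : Application
{
    public App()
    {
        // Registered once in the application-level dictionary;
        // pages look these up by key instead of redeclaring them.
        Resources = new ResourceDictionary
        {
            { "InverseBool", new InverseBoolConverter() },
            {
                "PageStyle",
                new Style(typeof(ContentPage))
                {
                    Setters =
                    {
                        new Setter
                        {
                            Property = VisualElement.BackgroundColorProperty,
                            Value = Color.White
                        }
                    }
                }
            }
        };
    }
}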

After fixing the above issues, changing a variety of pages within the app, and adding a load of new features, the app should finally deliver the stable, market-ready release that was expected from its first version. I hope to continue improving on the line of applications from Telular, including this app.

CodeStock

I’m proud to be presenting Alternative Device Interfaces and Machine Learning at CodeStock this year. With AI becoming more and more ubiquitous, it is important to note its effect on the user’s experience. This presentation is meant to show how to create modern applications using machine learning provided by third parties and to showcase what some of those third parties provide.

In this presentation, we will look at how users interface with machines without the use of touch. These different types of interaction have their benefits and pitfalls. To showcase the power of these user interactions, we will explore voice commands with mobile applications, speech recognition, and computer vision. After this presentation, attendees will have the knowledge to create applications that can utilize voice, video, and machine learning.

Users already use voice (Alexa, Cortana, Google Now) or video as a mode of interaction with applications. More than a fad, this is a natural interface for users and is becoming more and more common with the ever-decreasing size of hardware.

Different types of interaction have their benefits and pitfalls. To showcase the power of these user interactions, we will explore voice commands with two app types, UWP and Xamarin Forms (iOS and Android); speech recognition with Cognitive Services, verifying the speaker with the Speaker Recognition API; and computer vision with Cognitive Services, verifying a user with the Face API.
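
As a rough sketch of the Cognitive Services side (the subscription key, region, and image URL are placeholders you would supply), detecting a face with the Face API is a single authenticated HTTP POST:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class FaceDetectSample
{
    // Placeholders: substitute your own subscription key and region.
    const string SubscriptionKey = "{your-face-api-key}";
    const string Endpoint =
        "https://{region}.api.cognitive.microsoft.com/face/v1.0/detect";

    public static async Task<string> DetectAsync(string imageUrl)
    {
        using (var client = new HttpClient())
        {
            // Cognitive Services authenticates with this header.
            client.DefaultRequestHeaders.Add(
                "Ocp-Apim-Subscription-Key", SubscriptionKey);

            var body = new StringContent(
                "{\"url\":\"" + imageUrl + "\"}",
                Encoding.UTF8, "application/json");

            var response = await client.PostAsync(Endpoint, body);
            response.EnsureSuccessStatusCode();

            // Returns a JSON array of detected faces (IDs, rectangles, ...).
            return await response.Content.ReadAsStringAsync();
        }
    }
}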

By utilizing UWP, Xamarin, and Cognitive Services, a device with the ultimate in customization for user interactions will be created. Come and see how!

Storing Event Data in Elastic Search

There was a CPU and network issue when the hosts uploaded data directly from the client to Elastic Search. To handle the same data load without the Elastic overhead running on the client, the following architecture was used:

  • Hosts upload the telemetry data to Event Hubs
  • Stream Analytics consumes the Event Hub data
  • The Stream Analytics query outputs to an Azure Function
  • The Azure Function uploads the output to Elastic Search

Event Hub

To start, the hosts needed an Event Hub to upload the data to. For other projects, Azure IoT Hub can be used, since Stream Analytics can ingest from both; Event Hubs was chosen here so that each client would not need to be provisioned as a device.

Create an Event Hubs namespace

  1. Log on to the Azure portal, and click Create a resource at the top left of the screen.
  2. Click Internet of Things, and then click Event Hubs.
  3. In Create namespace, enter a namespace name. The system immediately checks to see if the name is available.
  4. After making sure the namespace name is available, choose the pricing tier (Basic or Standard). Also, choose an Azure subscription, resource group, and location in which to create the resource.
  5. Click Create to create the namespace. You may have to wait a few minutes for the system to fully provision the resources.
  6. In the portal list of namespaces, click the newly created namespace.
  7. Click Shared access policies, and then click RootManageSharedAccessKey.
  8. Click the copy button to copy the RootManageSharedAccessKey connection string to the clipboard. Save this connection string in a temporary location, such as Notepad, to use later.

Create an event hub

  1. In the Event Hubs namespace list, click the newly created namespace.
  2. In the namespace blade, click Event Hubs.
  3. At the top of the blade, click Add Event Hub.
  4. Type a name for your event hub, then click Create.

Your event hub is now created, and you have the connection strings you need to send and receive events.
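
With the connection string in hand, a minimal sender using the Microsoft.Azure.EventHubs NuGet package might look like this (the connection string, hub name, and JSON payload shape are placeholders):

using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

public static class TelemetrySender
{
    // The RootManageSharedAccessKey connection string copied earlier,
    // with the event hub name appended as the EntityPath.
    const string ConnectionString =
        "{Event Hubs connection string};EntityPath={event hub name}";

    public static async Task SendAsync(string json)
    {
        var client = EventHubClient
            .CreateFromConnectionString(ConnectionString);

        // Each event is an opaque byte payload; JSON keeps it readable
        // for the Stream Analytics job downstream.
        await client.SendAsync(new EventData(Encoding.UTF8.GetBytes(json)));
        await client.CloseAsync();
    }
}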

Stream Analytics

Create a Stream Analytics job

  1. In the Azure portal, click the plus sign and then type STREAM ANALYTICS in the text window to the right. Then select Stream Analytics job in the results list.
  2. Enter a unique job name and verify the subscription is the correct one for your job. Then either create a new resource group or select an existing one on your subscription.
  3. Then select a location for your job. For speed of processing and reduction of cost in data transfer, selecting the same location as the resource group and intended storage account is recommended.

    Note

    You should create this storage account only once per region. This storage will be shared across all Stream Analytics jobs that are created in that region.

  4. Check the box to place your job on your dashboard and then click CREATE.
  5. You should see a ‘Deployment started…’ notification displayed in the top right of your browser window. Soon it will change to a completed notification.

Create an Azure Stream Analytics query

After your job is created, it’s time to open it and build a query. You can easily access your job by clicking its tile.

In the Job Topology pane, click the QUERY box to go to the Query Editor. The Query Editor allows you to enter a T-SQL query that performs the transformation over the incoming event data.

Create data stream input from Event Hubs

Azure Event Hubs provides highly scalable publish-subscribe event ingestors. An event hub can collect millions of events per second, so that you can process and analyze the massive amounts of data produced by your connected devices and applications. Event Hubs and Stream Analytics together provide you with an end-to-end solution for real-time analytics—Event Hubs let you feed events into Azure in real time, and Stream Analytics jobs can process those events in real time. For example, you can send web clicks, sensor readings, or online log events to Event Hubs. You can then create Stream Analytics jobs to use Event Hubs as the input data streams for real-time filtering, aggregating, and correlation.

The default timestamp of events coming from Event Hubs in Stream Analytics is the timestamp that the event arrived in the event hub, which is EventEnqueuedUtcTime. To process the data as a stream using a timestamp in the event payload, you must use the TIMESTAMP BY keyword (for example, FROM Input TIMESTAMP BY EventTime, where EventTime is a field in the payload).

Consumer groups

You should configure each Stream Analytics event hub input to have its own consumer group. When a job contains a self-join or when it has multiple inputs, some input might be read by more than one reader downstream. This situation impacts the number of readers in a single consumer group. To avoid exceeding the Event Hubs limit of five readers per consumer group per partition, it’s a best practice to designate a consumer group for each Stream Analytics job. There is also a limit of 20 consumer groups per event hub. For more information, see Event Hubs Programming Guide.

Configure an event hub as a data stream input

The following table explains each property in the New input blade in the Azure portal when you configure an event hub as input.

  • Input alias – A friendly name that you use in the job’s query to reference this input.
  • Service bus namespace – An Azure Service Bus namespace, which is a container for a set of messaging entities. When you create a new event hub, you also create a Service Bus namespace.
  • Event hub name – The name of the event hub to use as input.
  • Event hub policy name – The shared access policy that provides access to the event hub. Each shared access policy has a name, permissions that you set, and access keys.
  • Event hub consumer group (optional) – The consumer group to use to ingest data from the event hub. If no consumer group is specified, the Stream Analytics job uses the default consumer group. We recommend that you use a distinct consumer group for each Stream Analytics job.
  • Event serialization format – The serialization format (JSON, CSV, or Avro) of the incoming data stream.
  • Encoding – UTF-8 is currently the only supported encoding format.
  • Compression (optional) – The compression type (None, GZip, or Deflate) of the incoming data stream.

When your data comes from an event hub, you have access to the following metadata fields in your Stream Analytics query:

  • EventProcessedUtcTime – The date and time that the event was processed by Stream Analytics.
  • EventEnqueuedUtcTime – The date and time that the event was received by Event Hubs.
  • PartitionId – The zero-based partition ID for the input adapter.

For example, using these fields, you can write a query like the following example:

SELECT
    EventProcessedUtcTime,
    EventEnqueuedUtcTime,
    PartitionId
FROM Input

Azure Functions (In Preview)

Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure. It lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Azure Functions to respond to triggers makes it a natural output for an Azure Stream Analytics job. This output adapter allows users to connect Stream Analytics to Azure Functions, and run a script or piece of code in response to a variety of events.

Azure Stream Analytics invokes Azure Functions via HTTP triggers. The new Azure Function Output adapter is available with the following configurable properties:

  • Function App – Name of your Azure Functions app.
  • Function – Name of the function in your Azure Functions app.
  • Max Batch Size – The maximum size for each output batch that is sent to your Azure Function. By default, this value is 256 KB.
  • Max Batch Count – The maximum number of events in each batch that gets sent to Azure Functions. The default max batch count value is 100.
  • Key – If you want to use an Azure Function from another subscription, you can do so by providing the key to access your function.

Note that when Azure Stream Analytics receives an HTTP 413 (Request Entity Too Large) response from the Azure Function, it reduces the size of the batches it sends to Azure Functions. In your Azure Function code, use this status code to make sure that Azure Stream Analytics doesn’t send oversized batches. Also, make sure that the max batch count and size values used in the function are consistent with the values entered in the Stream Analytics portal.

Also, in a situation where no events land in a time window, no output is generated, and as a result the computeResult function is not called. This behavior is consistent with the built-in windowed aggregate functions.

Query

The query itself is basic for now. There is no need for the advanced query features of Stream Analytics for the host data at the moment; however, they will be used later for creating workflows for spawning and reducing hosts.

Currently, the query batches the data output from the event hub every second. This is simple to accomplish using the windowing functions provided by Stream Analytics. In the Westworld of Warcraft host query, a tumbling window batches the data every second. The query looks as follows:

-- Collect() gathers every event in the window into a single array,
-- which is delivered to the Azure Function output as one batch.
SELECT
    Collect()
INTO
    ElasticUploadFunction
FROM
    HostIncomingData
-- A 1-second tumbling window batches the stream once per second.
GROUP BY TumblingWindow(Duration(second, 1), Offset(millisecond, -1))

Azure Function

Create a function app

You must have a function app to host the execution of your functions. A function app lets you group functions as a logical unit for easier management, deployment, and sharing of resources.

  1. Click Create a resource in the upper left-hand corner of the Azure portal, then select Compute > Function App.
  2. Use the function app settings as specified in the list below.
    • App name (suggested value: a globally unique name) – Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
    • Subscription (suggested value: your subscription) – The subscription under which this new function app is created.
    • Resource Group (suggested value: myResourceGroup) – Name for the new resource group in which to create your function app.
    • OS (suggested value: Windows) – Serverless hosting is currently only available when running on Windows. For Linux hosting, see Create your first function running on Linux using the Azure CLI.
    • Hosting plan (suggested value: Consumption plan) – Hosting plan that defines how resources are allocated to your function app. In the default Consumption Plan, resources are added dynamically as required by your functions. In this serverless hosting, you only pay for the time your functions run.
    • Location (suggested value: West Europe) – Choose a region near you or near other services your functions access.
    • Storage account (suggested value: a globally unique name) – Name of the new storage account used by your function app. Storage account names must be between 3 and 24 characters in length and may contain numbers and lowercase letters only. You can also use an existing account.
  3. Click Create to provision and deploy the new function app. You can monitor the status of the deployment by clicking the Notification icon in the upper-right corner of the portal. Clicking Go to resource takes you to your new function app.
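
The portal steps create the hosting app but not the function body. A minimal sketch of what the ElasticUploadFunction might look like on the v1 (.NET) runtime, assuming an HTTP trigger, an illustrative HostEvent shape, and the NEST client described later in this post (the index name, event fields, and 256 KB limit are placeholders matching the batch settings above), including the 413 oversized-batch check described earlier:

using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Nest;
using Newtonsoft.Json;

public class HostEvent
{
    // Illustrative shape; real host telemetry has more fields.
    public string HostId { get; set; }
    public string Payload { get; set; }
}

public static class ElasticUploadFunction
{
    static readonly ElasticClient Client = new ElasticClient(
        new ConnectionSettings(new Uri(
            "http://{username}:{password}@{external loadbalancer ip}:9200"))
        .DefaultIndex("hostevents"));

    [FunctionName("ElasticUploadFunction")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
        TraceWriter log)
    {
        var body = await req.Content.ReadAsStringAsync();

        // Stream Analytics reduces its batch size when it receives a 413,
        // so reject oversized batches instead of failing on them.
        if (body.Length > 256 * 1024)
            return req.CreateResponse(HttpStatusCode.RequestEntityTooLarge);

        // Stream Analytics posts each batch as a JSON array of events.
        var events = JsonConvert.DeserializeObject<List<HostEvent>>(body);
        var result = Client.IndexMany(events);

        log.Info($"Indexed {events.Count} events; valid: {result.IsValid}");
        return req.CreateResponse(result.IsValid
            ? HttpStatusCode.OK
            : HttpStatusCode.InternalServerError);
    }
}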

Elastic Search

Now the data is in Elastic Search, which, if the instructions in the Elastic Search setup post were followed, should be accessible from the Kibana endpoint.

Getting started with Elastic Search in Azure

For the Westworld of Warcraft project, a data store for the host data was required, and since Elastic Search needed to be learned for another client anyway, it was chosen. To get started, an Elastic Search cluster needed to be deployed in the Azure environment. There is a template in the Azure Marketplace that makes setup easy.

Both the Azure docs and Elastic have getting-started guides that should be looked over before setting up an enterprise cluster in Azure. For the Westworld of Warcraft project, there only needed to be a public endpoint for ingest, a public endpoint for consumption, and a public endpoint for a jump box. Using the Azure Marketplace template, Kibana was selected as the jump box and the load balancer was set to external. Setting the load balancer to external was specific to this project because the Westworld of Warcraft clients needed a public endpoint to upload to (this changed after a better solution was found).


Once the cluster is successfully deployed, a public IP address is created for the load balancer. That IP address is used by the Elastic NEST library in the application. To connect to the external load balancer, use a URI in the following format:

http://{username}:{password}@{external loadbalancer ip}:9200

Now test your configuration with the following code (replacing the URI string with your components):

using System;
using Nest;

public class Person
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

// Note: the URI needs an explicit scheme (http://) or the Uri
// constructor will throw.
var settings = new ConnectionSettings(
        new Uri("http://{username}:{password}@{external loadbalancer ip}:9200"))
    .DefaultIndex("people");

var client = new ElasticClient(settings);

var person = new Person
{
    Id = 1,
    FirstName = "Martijn",
    LastName = "Laarman"
};

// Index the document into the default ("people") index.
var indexResponse = client.IndexDocument(person);
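
To sanity-check the round trip, the indexed document can be fetched back by ID with the same client (a quick verification using NEST’s Get API):

// Fetch the document back from the "people" index by its ID.
var getResponse = client.Get<Person>(1);

if (getResponse.Found)
{
    Console.WriteLine(
        $"Indexed: {getResponse.Source.FirstName} {getResponse.Source.LastName}");
}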