Voice from the field

Downloads: Raspbian | Windows IoT

IoT Predictive Maintenance

October 18, 2018

The following hands-on scenario will help you set up and explore a Predictive Maintenance solution implemented for IoT devices in a textile factory.

Scenario


 

In the following hands-on, you will build a .NET Core application to control a textile factory color mixer. The color mixer is responsible for mixing colors for the variety of threads used to produce fabric. The quality of the produced colors depends on the mixing speed and on the distribution of the color pigments in the final paint. Put simply, the ideal mixing process should produce paint of the required color with an even distribution of color across the full palette. For simplicity, only two metrics will be monitored: rotation speed and the output color palette of the paints. Also for simplicity, the color measurement will be collected as a stream rather than as individual measurements of each paint's color. If the produced colors do not correspond to the required ones, it means the mixing quality has decreased and the mixer requires maintenance, repair, or replacement. Similarly, the speed of the mixer rotor can rise above or fall below the baseline measurement, which indicates the consistency of the liquids and the state of the mixer rotor. The output from the color sensor and the rotation speed sensor will be collected by an IoT device and analyzed in Azure ML.

Architecture

The system you are about to build consists of the following components and will work in the following manner:

1.  The IoT Hub will act as the main gateway for ingesting data from connected devices. The IoT device needs to be registered with the IoT Hub before sending any telemetry.

2.  The Raspberry Pi with attached sensors.

3.  The rotation speed sensor will be used to measure the speed of the mixer.

4.  The color sensor will be used to measure the output color palette.

5.  The collected data will be uploaded to the IoT Hub in Azure as JSON messages. The application will be able to monitor the raw data collected from the sensors.

6.  The raw data will be ingested into Azure Stream Analytics.

7.  The metrics will be analyzed by a Machine Learning model. Based on the metrics, the ML model produces maintenance insights.

8.  The final calculation will be presented in Power BI.
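As an illustration of item 5, a single telemetry message might look like the following sketch. Python is used here purely for illustration (the real project is .NET); the helper function is hypothetical, and the field names are assumptions taken from the columns used in the Stream Analytics query later in this article (ID, Cycle, RSpeed, ColorR/G/B/C):

```python
import json

# Hypothetical telemetry builder; field names follow the columns used
# later in the Stream Analytics query (ID, Cycle, RSpeed, ColorR/G/B/C).
def build_telemetry(device_id, cycle, rspeed, r, g, b, c):
    """Serialize one combined sensor reading as a JSON message."""
    return json.dumps({
        "ID": device_id,
        "Cycle": cycle,
        "RSpeed": rspeed,
        "ColorR": r,
        "ColorG": g,
        "ColorB": b,
        "ColorC": c,
    })

message = build_telemetry(2, 1, 300, 65535, 49598, 33520, 65535)
```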

Processing flow

·  the device obtains a device ID and an access key

·  the color measurement will be collected by an Adafruit RGB sensor

·  the rotation speed will be measured by an HC-020K rotation sensor

·  the telemetry data will be sent to the IoT Hub

·  the data is imported from the IoT Hub by Stream Analytics

·  the trained model is requested through a Web API with the previously collected data to provide a maintenance decision

·  the maintenance calculation is monitored in a Power BI dashboard

Basic Hardware Setup

To build the solution described above, you will need the following hardware:

·  Microsoft IoT Pack for Raspberry Pi 3 - w/ Raspberry Pi 3

·  1 x Potentiometer


·  1 x Motor with code disc.

·  1 x 9v battery with holder.


·  1 x Capacitor


·  1 x Adafruit RGB sensor TCS34725


·  1 x Rotation speed sensor HC-020K


 

Basic Circuit

 

For your reference, these are the PINs on the Raspberry Pi. The speed sensor is attached to GPIO Pin 21 - this is referenced later in the code.

 

Pi 3 connection schema

Raspberry Pi 3 PinOut Reference

 

Live PoC Implementation

Prerequisites

·  An Azure subscription. A new trial can be started for free - http://portal.azure.com

·  A working Power BI subscription; a trial can also be started for free - http://www.powerbi.com

·  Visual Studio 2017/2019

Development Machine and IoT device Setup

·  Ensure your local development machine is set up according to these instructions: Azure IoT Development machine setup.

·  Part of the above document describes installing the "Device Explorer" tool - make sure that you follow those instructions, as you'll need that tool later on.

·  Ensure you have installed the Connected Service for Azure IoT Hub Visual Studio Extension

·  Ensure you have followed the instructions to Use the Windows 10 IoT Core Dashboard to setup your Raspberry Pi.

Step 1 - Build an IoT Hub

    1.  Open the Azure Portal. http://portal.azure.com

    2.  Click "+ Create Resource" then type "IoT Hub".

    3.  Enter a unique name for the IoT Hub, choose a Scale tier (note that Free has been chosen here), select or create a Resource Group and datacenter location and Click Create.

    4.  Once the IoTHub has been created, ensure you make a copy of the iothubowner Connection String - this is shown via the Shared Access Policies-->iothubowner blade.

    5.  Finally, you should also make a copy of the Event Hub-compatible name & Event Hub-compatible endpoint values. You'll need these values later, when data is read from the IoT Hub.

Step 2 - Register your device with IoT Hub

For your device to connect to IoT Hub, it must have its own Device Identity (i.e., a set of credentials). The process of obtaining one is known as registering your device. Currently there is no way to do this via the Azure Portal, but there is a remote API available. Rather than writing a custom application to connect and register, you are going to use Device Explorer, which is part of the IoT SDK. You can also register a device via the IoT Dashboard application, or use iothub-explorer, another tool from the IoT SDK written in Node.js.

Step 3 - Create an App for your device

For Windows IoT

Download link at the top of the page.

Open the existing project "SpeedSensor.sln" in Visual Studio 2017 or later. The project targets Windows IoT build 10.0.14393; you can upgrade the project to your current version.

 

For Raspbian

Download link at the top of the page.

Open the existing project "SpeedSensor.sln" in Visual Studio 2017 or later. The project targets .NET Core 2.2; you can upgrade the project to your current .NET Core version.

    1.  Before you start building the project, you need to update the IoT connection string.

    2.  Open "AzureIoTHub.cs" and update "deviceConnectionString" with the value you copied from Device Explorer in the previous step. You should get something like the following:

    const string deviceConnectionString = "HostName=PredictiveMaintenance.azure-devices.net;DeviceId=SpeedSensor;SharedAccessKey=sx2GNrUw3244CX4VKnoAinKoZ0Q+div+Dnaf+LVV5Am8=";

    3.  For the first stage of the preparation, you need the device not to send data to the cloud but to output the data in the Visual Studio output window, where you can later copy and paste the data into a text file. To switch the code into collection mode, you need to uncomment the following flag in StartupTask.cs:

    #define Collect

    4.  Now the code can be compiled and deployed to the device.

Step 4 – Collect test and train data.

    1.  Before you start the SpeedSensor project on your device, make sure you have enabled the flag

    #define Collect

    2.  You need to prepare the color palette. This palette will be used later for collecting training data and running the PoC. You can find one on the Internet or print our "palette color.png".

    3.  Finally, you need to glue the palette to a plate or table and detach the RGB sensor so that it is easy to move across the palette.

    4.  To collect data, you need to start the project in Visual Studio and run the motor at maximum rotation speed. Meanwhile, put the RGB sensor on the white spot in the top left corner of the palette. Make sure you get data like the following.

    "2          1            300        65535 49598 33520 65535"

    The first number is the device ID; you can change it in code. Every data collection should have a unique ID. The second number is the cycle; cycles should run from 1 up to 100 or 200. The third number is the rotation speed, which can differ depending on the motor and the power you provide through the potentiometer. The remaining four numbers are the color sensor's red, green, blue, and clear channel readings.
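As a sketch of the record layout just described (Python is used only for illustration; the real project is .NET), each line splits into device ID, cycle, rotation speed, and the four color channels:

```python
# Parse one collected record:
#   "<device id> <cycle> <rotation speed> <R> <G> <B> <C>"
def parse_record(line):
    fields = line.split()
    return {
        "id": int(fields[0]),
        "cycle": int(fields[1]),
        "speed": int(fields[2]),
        "rgbc": tuple(int(v) for v in fields[3:7]),
    }

record = parse_record("2          1            300        65535 49598 33520 65535")
# record["id"] → 2, record["speed"] → 300
```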

Machine failure data collection

In this part, you need to collect data from 10-15 experiments with 30 records each. This data will represent the failure of the machine.

    1.  In the first stage, you need to make the rotor spin at maximum speed. Then you need to decrease and increase the speed back and forth during the test. You need to prevent a full stop of the motor until the end of the test.

    2.  While you are varying the motor speed, you need to move the RGB sensor across the palette from one corner to another. Ideally, when you reach the "black" spot in the bottom left corner, your motor should stop. Here is an example of the route of the RGB sensor. You may use (1) or (2), with or without a return to the white spot.

    Here is sample data collected from one of the experiments.

    You can see spots where the speed of the motor dropped to 30, spots where the motor stopped, and the spot where the RGB sensor reached the black color.

    3.  In this way, you need to collect 10-15 experiments to make the Machine Learning model more accurate. All experiment data can be pasted into one file with the name "train-data.txt".

Machine non-failure data collection

In this part, you need to collect data from 7-10 experiments without any failure. That means without reaching the black spots on the color palette.

    1.  You need to prepare a file with good experiments. That means the RGB sensor should not reach the dark spots and the motor should not stop.

    2.  Record 7-10 experiments by moving the RGB sensor toward the bottom right corner without reaching the dark spots. Meanwhile, the rotation speed should be decreased. You need to stop each experiment before the motor fully stops and before the RGB sensor reaches a black spot. The new file should be named "test-data.txt".

    3.  You need to estimate the number of cycles remaining before the final stop. For example, if you recorded 40 cycles and the speed of the motor decreased by 50%, it needs about another 40 cycles before the motor fully stops.

    4.  You need to record the number of cycles to the end of each experiment. Create a file "true-cycle.txt", with a new line for each experiment. Here is one example.
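The rule of thumb in step 3 amounts to a linear extrapolation of the speed decay. Here is a minimal sketch of that arithmetic - an illustrative simplification, not the exact labeling procedure used in the article:

```python
def estimate_remaining_cycles(cycles_recorded, speed_drop_fraction):
    """Linearly extrapolate the cycles left before the motor fully stops.

    If speed fell by `speed_drop_fraction` (e.g. 0.5 for 50%) over
    `cycles_recorded` cycles, assume the same decay rate continues.
    """
    if not 0 < speed_drop_fraction < 1:
        raise ValueError("speed_drop_fraction must be strictly between 0 and 1")
    decay_per_cycle = speed_drop_fraction / cycles_recorded
    return round((1.0 - speed_drop_fraction) / decay_per_cycle)

estimate_remaining_cycles(40, 0.5)  # → 40, matching the example above
```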

Step 5 – Train model.

In this step, you will upload the collected data to a storage account in Azure and modify the existing Predictive Maintenance model to accept your data. Finally, you will train and publish the model so it can be consumed from Stream Analytics.

ML Experiment Step #1

    1.  Create or use an existing storage account. Upload the data files collected in the previous step, in text format, to a blob container in the storage account. This container needs to have public access.

    2.  You can use Storage Explorer to upload the files, then copy the URL of each file and paste it into a browser to check that the files are accessible.

    3.  Navigate to Machine Learning Studio and sign in. If you do not have an account, you can create one for free. https://studio.azureml.net

    4.  Navigate to the Predictive Maintenance gallery experiment. https://gallery.cortanaintelligence.com/Collection/Predictive-Maintenance-Template-3

    5.  You need to add to your ML Studio "Step 1 of 3", then "Step 2B of 3", and finally "Step 3B of 3". You can also click "read full description" to get reference information.

    6.  Open the first experiment, then click "Open in Studio".

    7.  You can then add it to a new project so as not to confuse it with existing experiments.

    8.  Finally, you should have all 3 items.

    9.  Open the first experiment and change the source of the training data. It should reference the file in your storage account whose URL you copied from Storage Explorer in the previous step.

    10.   Repeat the same for the rest of the "Import Data" items. These will be the "Test" and "True-cycle" files.

    11.   You also need to change the R-Script items, updating the column count and names.

    12.   All references to columns need to be replaced with the following:

    colnames <- c("id","cycle","setting1","s1","s2","s3","s4","s5")

    13.   Repeat the same changes for the next "R script" item.

    14.   You need to perform some modifications to the existing experiment before running it. Drag and drop a "Convert to CSV" action and connect it to the last "R script" object. Repeat the same with the next "R script".

    15.   Run the first experiment and make sure that it completes successfully.

    16.   Click "Save as Dataset" to save the CSV output from every "Convert to CSV" action.

    17.   The saved dataset should come up in the list of datasets on your right. It will be used later.

    18.   The "transformation" also needs to be saved. The following figure shows how a transformation is saved during Step 1. The saved transformation will be shown in the "Transforms" tab on the left-hand side panel in the Azure ML studio. It can then be applied to the scoring experiments.

    19.   After the training session for the model, the transformation has to be saved in the experiment. In this case, we performed data normalization on the training data using the Normalize Data module. In order to apply the same data normalization to the testing data, we use the Apply Transformation module on the testing data. If the testing data needs to be prepared in a separate experiment, we have to right-click the second output port of the Normalize Data module and save this transformation by selecting the "Save as Transformation" option in the menu.

    step 1 save Transform

ML Experiment Step #2

    1.  You need to replace the current "Import Data" items with the datasets you saved in the previous step. Make sure the test dataset replaces the test import data, and likewise for the training dataset.

    2.  Run the experiment. It should finish successfully.

    3.  When the experiments are completed, as shown in box 1 in the following figure, we train and evaluate four binary classification models: Two-Class Logistic Regression, Two-Class Boosted Decision Tree, Two-Class Decision Forest, and Two-Class Neural Network. Second, box 2 shows how to balance the class distribution by down-sampling the records with the majority class.

    step 2B

    4.  The following figure compares the results from the four models to determine the best model. The algorithm "Two-Class Neural Network" performs best in terms of four metrics: "Accuracy", "Precision", "Recall", and "F-Score".

    5.  Further on, you will see which algorithm provides you better accuracy - "Decision Forest" in the following case.

    6.  You can also see the result of part "2" and find out the accuracy and the false positive/negative decisions.

    7.  Finally, you need to select the appropriate trained model and save it as a candidate for publishing.

ML Experiment Step #3

    1.  You need to update the reference to the test data with the file you uploaded in step #1.

    2.  Then you need to replace the default model with the model you selected and saved in step #2.

    3.  You also need to replace the transformation with the one you saved in step #1.

    4.  Retarget the Web Input to the next item after the R-script, as shown in the following picture.

    5.  Finally, you also need to update the "R script" step. This needs to be done right after the import data, replacing the columns as you did in step #1:

    colnames <- c("id","cycle","setting1","s1","s2","s3","s4","s5")

    6.  Run the experiment, and when it completes without errors, you can publish it as a web service.

    7.  The web service will be set up for you, and you will get an API key and a URL to use in Stream Analytics. E.g.:

    42jQGadWcFaDYgnr6LSVdoQ9vQKvfxlsK4KJ7qB1o4BrPY1RdyyOyCo1oG0udG+TLWx+hMGierWxR6pQKcqXKw==

    https://ussouthcentral.services.azureml.net/workspaces/502e698d8cb64244b3a52eb647545f7b/services/03396598359d468ab16c2292a92d1ab5/execute?api-version=2.0&format=swagger

    8.  In addition, you can test your trained model by clicking the "test" button.

    9.  You can provide a data line from one of the test files you collected before, to make sure it returns a probability greater than 0 - like the 0.5 in the following screen.
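For calling the published service from your own code, the endpoint (with format=swagger) typically expects a JSON body shaped like the sketch below. The input name "input1" is an assumption - check the sample code the service generates after publishing; the column names mirror the colnames used in the R scripts above:

```python
import json

# Hypothetical scoring request body for the published web service
# (swagger format). "input1" is an assumed input name; the columns
# mirror the R-script colnames: id, cycle, setting1, s1..s5.
payload = {
    "Inputs": {
        "input1": [
            {"id": 2, "cycle": 1, "setting1": 1,
             "s1": 300, "s2": 65535, "s3": 49598, "s4": 33520, "s5": 65535}
        ]
    },
    "GlobalParameters": {},
}
body = json.dumps(payload)
# POST `body` to the service URL with the header
#   Authorization: Bearer <API key>
```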

Step 6 – Set up Stream Analytics Job.

In the following step, you will set up a Stream Analytics job to process data from the IoT Hub and export the result to Power BI.

    1.  Log on to the Azure portal: http://portal.azure.com

    2.  Create a new Stream Analytics job like the following:

    3.  Create an input whose source is the IoT Hub you created in Step #1.

    4.  Create an output whose target is your Power BI workspace.

    5.  Create a function with the alias "ml-newdata" and the import option "from different subscription". You need to provide the key and URL from the ML model published in step #5.

    6.  The parameters need to be pulled from the ML web service, and they should come up in the function description:

    7.  Specify the query for the analytics job as follows:

    WITH [ml-newdata] AS (

      SELECT ID,Cycle,RSpeed,ColorR,ColorG,ColorB,ColorC,[ml-newdata](ID,Cycle,SettingV,RSpeed,ColorR,ColorG,ColorB,ColorC) as result from [iotHub]

    )

    SELECT

      ID,Cycle,RSpeed,ColorR,ColorG,ColorB,ColorC,

      CAST ( result.[Scored Labels] AS bigint) as 'Label' ,

      CAST ( result.[Scored Probabilities]  as float)  as 'Probability'

    INTO

      [PowerBI]

    FROM

      [ml-newdata]

    8.  Finally, you can start the job.

Step 7 – Run experiment.

In this step, you will run the code on the IoT device and produce data to be analyzed by the Machine Learning function. You will review the output of the function in Power BI.

    1.  You need to update the code of the project you used to collect the train and test data. Open the project in Visual Studio.

    2.  Make sure you comment out #define Collect

    3.  Update the Device ID to any new number.

    4.  Re-deploy the solution so the cycle file is cleaned up.

    5.  Start the rotor and put the RGB sensor on the white spot.

    6.  Run the project in debug mode and wait until output values come up in the output window.

    7.  For monitoring the device data you can use Device Explorer:

    8.  Check your Power BI portal and find the "Predictive-Maintenance" dataset you provided in the previous step.

    9.  Click the icon and create a new report.

    10.   Add a "Line chart" with Axis: cycle; Values: probability, label.

    11.   Add tables to view the latest data and order the data by cycle. Make sure the Values are set to "Do not summarize".

    12.   Add a "gauge" element to preview the average probability.

    13.   In the same way, you can add line charts for "Speed by cycle" and "Colors by cycle".

    Here is one example of an experiment over 20-120 cycles with a full stop of the engine. You can notice that the "label", which predicts maintenance (yellow line), goes to 1 at the end of the experiment. You can also see that the probability of maintenance (brown line) grows from 0 to 0.6-0.7 at the end of the experiment as well. You can also monitor the colors output, the speed by cycle, and the experiment data.

Appendix A – Troubleshooting.

Monitoring Stream Analytics job

Monitoring can be done from the overview page. You can find "Input Events", "Output Events" and "Runtime Errors". If you observe that "Runtime Errors" is not equal to 0, it means that the ML function does not work correctly or Power BI requires reauthorization.

You can find a more detailed trace on the "Metrics" page.

Prototype sensor activity

The speed sensor code contains simulation code which can be used for mocking real sensor activity. Uncomment "#define Mock" to start the simulation process without connecting the sensors. The code needs to be deployed to and run from the IoT device. Simulation mode can produce data and send it to the IoT Hub.
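The mock mode itself lives in the C# project. As a language-neutral sketch of what such a simulation might produce (the function, value ranges, and decay curve here are all hypothetical), a failing run can ramp the speed down while the color channels fade toward black:

```python
import random

def _channel(fade, rng):
    """One simulated 16-bit color channel, fading toward black (0)."""
    return int(65535 * fade * rng.uniform(0.8, 1.0))

def simulate_failure_run(cycles=30, start_speed=300, seed=42):
    """Generate mock (cycle, speed, r, g, b, c) records for one failing run.

    Speed decays linearly to zero and the color channels drift toward
    black, mimicking the failure experiments described earlier.
    """
    rng = random.Random(seed)
    records = []
    for cycle in range(1, cycles + 1):
        fade = 1.0 - cycle / cycles          # 1.0 -> 0.0 over the run
        speed = int(start_speed * fade)
        records.append((cycle, speed, _channel(fade, rng), _channel(fade, rng),
                        _channel(fade, rng), _channel(fade, rng)))
    return records

run = simulate_failure_run()  # 30 records; speed reaches 0 on the last cycle
```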

Test Machine Learning function

The console utility "MLTestConsole" can be used for testing the Machine Learning output. You need to update the API key and the ML URL accordingly in app.config. The utility will take data from the provided TXT files and send it to Azure ML for analysis.

Here is an example of output.