microsoft / mcw-innovate-and-modernize-apps-with-data-and-ai

Innovate and modernize apps with Data and AI

License: MIT License

C# 62.82% Python 0.73% Dockerfile 3.53% HTML 2.67% JavaScript 5.02% TypeScript 24.76% CSS 0.46%

mcw-innovate-and-modernize-apps-with-data-and-ai's Introduction

Innovate and modernize apps with Data and AI

This workshop is archived and no longer being maintained. Content is read-only.

Wide World Importers (WWI) is a global manufacturing company that handles distribution worldwide. They manufacture more than 9,000 different SKUs for two types of businesses, B2B and B2C. WWI has 5 factories, each with about 10,000 sensors, for a total of 50,000 sensors sending data in real time.

Their sensor data is ingested into a Kafka cluster and consumed by a custom application that aggregates the events and writes the results to an event data store running in PostgreSQL. The status of the factory floor is reported using a web app hosted on-premises that connects to PostgreSQL.

They are running into scalability issues as they add manufacturing capacity, but while addressing this concern they would like to take the opportunity to modernize their infrastructure. In particular, they would like to modernize their solution to use microservices and apply the Event Sourcing and Command Query Responsibility Segregation (CQRS) patterns.
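For readers new to these patterns, the essence of Event Sourcing plus CQRS can be shown in a few lines: commands append immutable events to a log, and queries read from a projection rebuilt from that log. The sketch below is a hypothetical TypeScript illustration with invented names; the workshop itself implements the pattern with Azure Functions and Cosmos DB, not this code.

```typescript
// Minimal event sourcing + CQRS sketch (illustrative only).
// Command side appends immutable events; query side reads a
// projection derived from the event log.

type SensorEvent = { deviceId: string; temperature: number; ts: number };

class EventStore {
  private events: SensorEvent[] = [];
  // Command side: events are only ever appended, never updated.
  append(e: SensorEvent): void {
    this.events.push(e);
  }
  all(): SensorEvent[] {
    return [...this.events];
  }
}

// Query side: a projection holding the latest reading per device,
// rebuilt by replaying the event log.
function latestReadings(store: EventStore): Map<string, SensorEvent> {
  const view = new Map<string, SensorEvent>();
  for (const e of store.all()) view.set(e.deviceId, e);
  return view;
}

const store = new EventStore();
store.append({ deviceId: "press-01", temperature: 71.2, ts: 1 });
store.append({ deviceId: "press-01", temperature: 74.9, ts: 2 });
const view = latestReadings(store);
```

The key design point is that the write model (the append-only log) and the read model (the projection) can scale and evolve independently, which is what WWI wants for its factory dashboards.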

November 2021

Target audience

  • Application developer
  • AI developer
  • Data engineer
  • Data scientist

Abstract

Workshop

In this workshop, you will look at the process of implementing a modern application with Azure services. The workshop will cover event sourcing and the Command and Query Responsibility Segregation (CQRS) pattern, data loading, data preparation, data transformation, data serving, anomaly detection, creation of a predictive maintenance model, and real-time scoring of a predictive maintenance model.

Whiteboard design session

In this whiteboard design session, you will work with a group to design a solution for ingesting and preparing manufacturing device sensor data, as well as detecting anomalies in sensor data and creating, training, and deploying a machine learning model which can predict when device maintenance will become necessary.

At the end of this whiteboard design session, you will have learned how to:

  • Capture Internet of Things (IoT) device data with Azure IoT Hub
  • Process device data with Azure Stream Analytics
  • Apply the Command and Query Responsibility Segregation (CQRS) pattern with Azure Functions
  • Build a predictive maintenance model using Azure Synapse Analytics Spark notebooks
  • Deploy the model to an Azure Machine Learning model registry and to an Azure Container Instance
  • Generate predictions with Azure Functions accessing a Cosmos DB change feed

These skills will help you modernize applications and integrate Artificial Intelligence into them.

Hands-on lab

In this hands-on lab, you will build a cloud processing and machine learning solution for IoT data. You will begin by deploying a factory load simulator that represents sensor data collected from a stamping press machine, which cuts, shapes, and imprints sheet metal. The rest of the lab shows how to implement an event sourcing architecture using Azure technologies ranging from Cosmos DB to Stream Analytics to Azure Functions and more.

Using factory-generated data, you will learn how to use the anomaly detection capability built into Stream Analytics to observe and report on abnormal machine temperature readings. You will also learn how to use historical machine temperature and stamping pressure values to create a machine learning model that identifies potential issues requiring machine adjustment. You will deploy this predictive maintenance model and generate predictions on simulated stamp press data.
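For intuition only: one simple way to flag abnormal temperature readings is to compare each value against the rolling mean and standard deviation of a trailing window. The sketch below is a crude, hypothetical TypeScript stand-in (all names invented), not the built-in anomaly detection functions that Azure Stream Analytics actually provides and that this lab uses.

```typescript
// Flag readings that deviate from the mean of a trailing window
// by more than k standard deviations. A toy stand-in for the
// ML-based anomaly detection built into Azure Stream Analytics.

function detectAnomalies(readings: number[], window = 5, k = 3): number[] {
  const anomalies: number[] = [];
  for (let i = window; i < readings.length; i++) {
    const slice = readings.slice(i - window, i);
    const mean = slice.reduce((a, b) => a + b, 0) / window;
    const variance =
      slice.reduce((a, b) => a + (b - mean) ** 2, 0) / window;
    const std = Math.sqrt(variance);
    // A spike far outside the recent distribution is flagged.
    if (Math.abs(readings[i] - mean) > k * std) anomalies.push(i);
  }
  return anomalies;
}

// Steady readings around 70 degrees, then one spike at index 7.
const temps = [70, 71, 69, 70, 71, 70, 69, 120, 70];
const flagged = detectAnomalies(temps);
```

Real deployments would account for seasonality and drift, which is why the lab relies on the service's built-in functions rather than hand-rolled statistics like these.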

Azure services and related products

  • Azure Container Registry
  • Function App
  • Cosmos DB with Synapse Link
  • Azure Synapse Analytics
  • Azure Data Lake Storage Gen2
  • Azure Machine Learning
  • Azure Kubernetes Service
  • Event Hubs
  • Azure Stream Analytics
  • Power BI

Related references

Help & Support

We welcome feedback and comments from Microsoft SMEs & learning partners who deliver MCWs.

Having trouble?

  • First, verify you have followed all written lab instructions (including the Before the Hands-on lab document).
  • Next, submit an issue with a detailed description of the problem.
  • Do not submit pull requests. Our content authors will make all changes and submit pull requests for approval.

If you are planning to present a workshop, review and test the materials early! We recommend at least two weeks prior.

Please allow 5 - 10 business days for review and resolution of issues.

mcw-innovate-and-modernize-apps-with-data-and-ai's People

Contributors

benwurthcb, dawnmariedesjardins, dependabot[bot], feaselkl, joelhulen, microsoftopensource, saimachi, waltermyersiii, zoinertejada


mcw-innovate-and-modernize-apps-with-data-and-ai's Issues

Some recommendations for instructions updates

Some of the screenshots and instructions are getting a little out of date. These are some recommendations from a recent test run-through (within the last two weeks):

In: https://github.com/microsoft/MCW-Innovate-and-modernize-apps-with-Data-and-AI/blob/main/Hands-on%20lab/Before%20the%20HOL%20-%20Innovate%20and%20modernize%20apps%20with%20Data%20and%20AI.md

'Before the hands-on lab'

  • Task 2, Step 4 - 'Access tier' should be removed from the table; it is now a default and no longer appears on the Basics tab.

  • Task 3, Step 4 - 'Select Size and scale from the menu.' There is no longer a 'Size and scale' tab; it has moved to the 'Management' tab.
    (screenshot: azure-create-iot-hub-2_new)

  • Task 6, Step 3 - The option 'Notebooks' in the table is no longer present.

  • Task 7, Step 3 - 'Runtime stack' is now just '.net' and not '.net core'

  • Task 10, Step 3 - Other options are presented that the instructions do not specify: Storage account, Key vault, Application insights, and Container registry. 'Workspace edition' is no longer an option.

In: https://github.com/microsoft/MCW-Innovate-and-modernize-apps-with-Data-and-AI/blob/main/Hands-on%20lab/HOL%20step-by%20step%20-%20Innovate%20and%20modernize%20apps%20with%20Data%20and%20AI.md

'Hands-on Lab'

  • Exercise 3, Task 2, Step 2 - The button is now 'New Container', not 'Add Collection'.
    (screenshot: azure-cosmos-db-add-collection_new)

  • Exercise 3, Task 4, Step 8 - The instructions could use more clarification about exactly which connection strings should be used where.

  • Exercise 5, Task 1, Step 4 - I had to enable a system-assigned managed identity on the Stream Analytics job to complete this step.

  • Exercise 8, Task 1, Step 3 - 'Select Launch Synapse Studio from the Synapse workspace page.' could be updated to 'Open Synapse Studio'.
    (screenshot: azure-synapse-launch-studio_new)

Issue in Exercise 3, Task 3

Exercise 3, Task 3, Steps 12 & 13: After replacing the content of the file WriteEventsToTelemetryContainer.cs with the content given in the lab guide, as instructed, the build fails with the error below:

C:\Temp\Azure Functions\WriteEventsToTelemetryContainer.cs(58,53): error CS0246: The type or namespace name 'EventData' could not be found (are you missing a using directive or an assembly reference?) [C:\Temp\Azure Functions\Azure Functions.csproj]

(screenshot attached for reference)

Issue in Exercise 5, Task 3

In Exercise 5, Task 3, Step 5, running the dotnet build command produces the error below:

The type or namespace name 'Machine' could not be found (are you missing a using directive or an assembly reference?) [C:\Temp\Azure Functions\Azure Functions.csproj]
ProcessTelemetryEvents.cs(37,16): error CS0246: The type or namespace name 'Ambient' could not be found (are you missing a using directive or an assembly reference?) [C:\Temp\Azure Functions\Azure Functions.csproj]
ProcessTelemetryEvents.cs(81,23): error CS0246: The type or namespace name 'TelemetryOutput' could not be found (are you missing a using directive or an assembly reference?) [C:\Temp\Azure Functions\Azure Functions.csproj]

(screenshot attached for reference)

Issue in Exercise 2, Task 4

In Exercise 2, Task 4, Step 4, while deploying the predictive model, the deployment state does not move from 'Transitioning' to 'Healthy' even after an hour. I receive an 'Endpoint deployment failed' error.
(screenshots attached for reference)

Scoring data format issue in ProcessTelemetryEvents

In "Task 3: Create an Azure Function based on a Cosmos DB trigger", I get an exception when processing scored results from Azure ML (likely all of them, but I'm not sure).

The comment at line 104 of ProcessTelemetryEvents.cs says:

// Azure ML returns an array of integer results, one per input item.

But in the previous "Task 5: Test the predictive maintenance model", the response from Azure ML is an array of floats: [1.0, 0.0, 4.0]

After changing the "int maintenanceRecommendation" declarations to "float maintenanceRecommendation" in ProcessTelemetryEvents.cs, the exceptions did not occur anymore. I don't know if I will run into trouble later on with this change.

Message: Exception while executing function: ProcessTelemetryEvents. Input string '0.0' is not a valid integer. Path '[0]', line 1, position 4.
Exception type: Newtonsoft.Json.JsonReaderException
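The failure mode above — deserializing a response like [1.0, 0.0, 4.0] into integer fields — can be illustrated generically: parse the scores as floating-point numbers first, then round if integer codes are needed. This hypothetical TypeScript sketch only mirrors the idea; the workshop's actual code is C# using Newtonsoft.Json, where the issue author's int-to-float change plays the same role.

```typescript
// Azure ML in this lab returns JSON like "[1.0, 0.0, 4.0]".
// Treating each element as an integer fails on values such as "0.0";
// parsing as floating-point and converting afterwards avoids the error.

const response = "[1.0, 0.0, 4.0]";

// Parse as floating-point values first...
const scores: number[] = JSON.parse(response);

// ...then derive integer maintenance-recommendation codes if needed.
const recommendations = scores.map((s) => Math.round(s));
```

Whether rounding is safe depends on what the model actually emits; if it ever returns genuine fractional scores, they should be kept as floats end to end.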

Issue in Exercise 5 and Exercise 6

  1. In Exercise 5, Task 3, Step 9, I didn't get the expected output in Azure Data Studio.

    Expected output: (screenshot attached)

    Actual output: (screenshot attached)

  2. In Exercise 6, Task 1, Step 12, the input created in the Azure Stream Analytics job didn't work as expected. While running the query, I received an error saying that no data was loaded.

    Actual output: (screenshot attached)

Exercise 2, Task 3, Step 6: 'stamp_press_model' not appearing in 'Models' section (or, is there any means of recovering in place?)

Can anyone who has taken this course recently confirm that it should still be expected to work, despite any Azure changes that may have occurred since original release?

Specifically, I have run into an issue (which may well be a mistake on my part) in Exercise 2, Task 3, Step 6, where I am unable to get a successful 'run cell' result that would then populate 'stamp_press_model' in the 'Models' section as expected in Step 7.

  • After the first failure, I tried to re-run the steps exactly as written, but now consistently receive this error:

(screenshot attached)

  • I have tried deleting any traces of this file specifically and re-running the exercise, but receive the identical error.

  • I have tried deleting all the items created in Exercise 2 and re-running the identical steps using the item names given in the instructions, but receive the same error.

  • Lastly, I have tried re-running Exercise 2 from the start, this time using new and different names (e.g. 'maintenancedata2', 'modernizeappstorage2', 'Stamp Press Maintenance Model2', etc.), but still receive the same error.

  • It appears as if there is a construct somewhere that I do not know to delete, that is preventing progress.

Note that all of the above steps were attempted in the same instance, rather than restarting from scratch, principally because it takes roughly 1.5 hours simply to provision all the resources needed in Azure before even attempting the course. (Because of the ongoing cost of running these resources, it is also not economical to leave them running for days of troubleshooting. So if you suspect a mistake has been made and cannot identify how to recover, you must start from scratch, spending the same 1.5 hours of provisioning time again, plus the time it takes to get back to where you were.)

As an alternative question: could the course authors create ARM templates, or other means, to automate the provisioning of all resources required in Azure, so that one could restart the course without continually spending time provisioning manually?
