azure-samples / digital-twins-explorer

A code sample for visualizing Azure Digital Twins graphs as a web application to create, edit, view, and diagnose digital twins, models, and relationships.

License: MIT License


digital-twins-explorer's Introduction

page_type: sample
languages: javascript, typescript
products: azure-digital-twins
name: Azure Digital Twins explorer
description: A code sample for visualizing and managing an Azure Digital Twins instance
urlFragment: digital-twins-explorer

Azure Digital Twins Explorer

Azure Digital Twins Explorer is a developer tool for the Azure Digital Twins service. It lets you connect to an Azure Digital Twins instance to understand, visualize and modify your digital twin data.

Image of digital-twins-explorer

Azure Digital Twins Explorer is written as a single-page JavaScript application. This repository holds the code for the hosted version of Azure Digital Twins Explorer, which is accessible through the Azure portal and at explorer.digitaltwins.azure.net. You can also run the application locally as a Node.js application.

This README contains information and guidance specific to hosting this codebase locally.

For general documentation on the Azure Digital Twins Explorer features, covering both the hosted version and the local codebase, see the Azure Digital Twins documentation.

Azure Digital Twins Explorer is licensed under the MIT license. Please also see the Microsoft Open Source Code of Conduct.

Requirements

Node.js 10+

Run Azure Digital Twins Explorer locally

  1. Set up an Azure Digital Twins service instance and give yourself permissions (e.g. Azure Digital Twins Owner). For instructions, see the Azure Digital Twins documentation on setting up an instance and authentication.
  2. When running locally, Azure Digital Twins Explorer uses Azure default credentials. To authenticate, you can run az login in any command prompt; when you later run Azure Digital Twins Explorer, it will pick up those credentials. Alternatively, you can sign in to Visual Studio Code.
  3. Select the Download ZIP button to download a .zip file of this sample code to your machine. Unzip the digital-twins-explorer-.zip file and extract its contents. Alternatively, you can clone the repository.
  4. From a command prompt in the client/src folder, run npm install. This will retrieve all dependencies.

    IMPORTANT! Due to a dependency on the npm-force-resolutions package to mitigate an underlying security issue, you will not be able to install under any path that contains a space. For more information, see this GitHub issue.

  5. From the same command prompt, run npm run start.

    By default, the app runs on port 3000. To customize the port, change the run command. For example, to use port 8080:

    • Linux/Mac (Bash): PORT=8080 npm run start
    • Windows (cmd): set PORT=8080 && npm run start
      Note: Your Azure Digital Twins app registration must have a reply URL using the same port you are running on - e.g. localhost:7000 if that is the port you are using.
  6. Your browser should open and the app should appear.

See below for instructions on how to run digital-twins-explorer using Docker.

Run Azure Digital Twins Explorer with Docker

  1. From a command prompt in the root folder, run docker build -t azure-digital-twins-explorer . to build the Docker image for Azure Digital Twins Explorer.
  2. From the same command prompt, run docker run -it -p3000:3000 azure-digital-twins-explorer.

    By default, the app runs on port 3000. To customize the port, change the run command; for example, to use port 8080, run docker run -it -p8080:3000 azure-digital-twins-explorer. A message will appear on the console asking you to log in via the Microsoft device login page in your web browser using the code shown; after doing so, Azure Digital Twins Explorer should start.

    • Note: When run successfully, the application will display a message showing the URL and port that you must open to browse the app. When running the app inside Docker this information might not be accurate, as a different port might have been exposed. Be sure to use the port you chose earlier.
  3. You can now open your web browser and browse to http://localhost:3000 (replace 3000 with the appropriate port if you changed it).

Sign in on first run

Initial authentication is triggered by either of the following actions:

  1. Clicking on the Azure Digital Twins URL button in the top right

    sign-in icon
  2. Clicking on an operation that requires calling the service. When you click the first command, Azure Digital Twins Explorer will open a dialog that prompts you for connection information to your service instance.

To continue, you will need to provide the URL of the Azure Digital Twins instance you want to access, in the form of the instance's host name prefixed with "https://". You can find the instance's host name in the Azure portal overview page for your Azure Digital Twins instance.

sign-in dialog

To change the instance URL to connect to another instance of Azure Digital Twins, click on the sign in button in the top right.

Experimental features

In addition to local operation, you can also run Azure Digital Twins Explorer as a cloud application. In the cloud, you can use push notifications from Azure Digital Twins, sent via the Azure SignalR service, to update your digital-twins-explorer in real time.

Running in the cloud

You might want to run your Azure Digital Twins Explorer app in the cloud to host your custom version of the explorer for your organization, or to access an Azure Digital Twins instance that uses Private Link to disable public access.

  1. Deploy the ARM template called template.json located under the deployment folder into your Azure subscription.
  2. Package the client app using npm run build. You may need to set NODE_OPTIONS=--max_old_space_size=4096 if you receive memory-related errors.
  3. From the new build folder, upload each file to the web container in the new storage account created by the ARM template.
  4. Package the functions app using dotnet publish -c Release -o ./publish.
  5. Zip the contents of the ./publish folder. E.g. from within the publish folder, run zip -r DigitalTwinsExplorerFunctions.zip *.
  6. Publish the functions app using the CLI: az functionapp deployment source config-zip -g <resource_group> -n <app_name> --src <zip_file_path>.
  7. [Optional] For each Azure Digital Twins environment used with the tool where live telemetry through SignalR is required, deploy the template-eventgrid.json template in your Azure subscription.
  8. Set up a system-assigned identity to allow the Functions proxy to access the Azure Digital Twins service.
    1. In Azure, open the Function App resource from your resource group.
    2. Click Identity in the left-hand blade.
    3. Under the System assigned tab, turn the Status toggle to On.
    4. From your resource group, select the Azure Digital Twins resource.
    5. Click Access Control (IAM) from the left blade.
    6. Click + Add then Add role assignment.
    7. Select Azure Digital Twins Data Owner as the Role.
    8. Assign access to System assigned managed identity - Functions App.
    9. Select your Functions App from the list.
    10. Click Save.

Advanced

When running locally, the Event Grid and SignalR services required for telemetry streaming are not available. However, if you have completed the cloud deployment, you can leverage these services locally to enable the full set of capabilities.

This requires setting the REACT_APP_BASE_ADT_URL environment variable to point to your Azure Functions host (e.g. https://adtexplorer-<your suffix>.azurewebsites.net). This can be set in the shell environment before starting npm or by creating a .env file in the client folder with REACT_APP_BASE_ADT_URL=https://....

Also, the local URL needs to be added to the allowed origins for the Azure Function and SignalR service. In the ARM template, the default http://localhost:3000 path is added during deployment; however, if the site is run on a different port locally then both services will need to be updated through the Azure Portal.

Extensibility points

Import

Import plugins are found in the src/services/plugins directory within the client code base. Each plugin should be defined as a class with a single function:

tryLoad(file: File): Promise<ImportModel | boolean>

If the plugin can import the file, it should return an ImportModel. If it cannot import the file, it should return false so the import service can share the file with other plugins.

The ImportModel should be structured as follows:

class DataModel {
  digitalTwinsFileInfo: DataFileInfoModel;
  digitalTwinsGraph: DataGraphModel;
  digitalTwinsModels: DigitalTwinModel[];
}

class DataFileInfoModel {
  fileVersion: string; // should be "1.0.0"
}

class DataGraphModel {
  digitalTwins: DigitalTwin[]; // objects align with structure returned by API
  relationships: DigitalTwinRelationship[]; // objects align with structure returned by API
}

New plugins need to be registered in ImportPlugins collection at the top of the src/services/ImportService.js file.
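
As an illustration, a minimal custom plugin might look like the sketch below. The class name, the CSV format, and the parsing logic are hypothetical; only the tryLoad contract and the returned DataModel shape follow the description above, and the twin field names assume the structure returned by the Azure Digital Twins API.

// Hypothetical sketch of a custom import plugin (not part of the repository).
// It returns a DataModel-shaped object when it can handle the file, or false
// so the import service can offer the file to the next plugin.
export class CsvImportPlugin {

  async tryLoad(file) {
    // Decline files this plugin does not understand.
    if (!file.name.toLowerCase().endsWith(".csv")) {
      return false;
    }

    const text = await file.text();
    const rows = text.split("\n").map(r => r.trim()).filter(r => r.length > 0);

    // Assume each row is "<modelId>,<twinId>" and build minimal twins.
    const digitalTwins = rows.map(row => {
      const [modelId, twinId] = row.split(",");
      return { $dtId: twinId, $metadata: { $model: modelId } };
    });

    return {
      digitalTwinsFileInfo: { fileVersion: "1.0.0" },
      digitalTwinsGraph: { digitalTwins, relationships: [] },
      digitalTwinsModels: []
    };
  }

}

A plugin like this would then be added to the ImportPlugins collection in src/services/ImportService.js, ahead of the built-in plugins if it should take precedence.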

Currently, import plugins for Excel and JSON are provided. To support custom formats of either, the new plugins would need to be placed first in the ImportPlugins collection or they would need to be extended to detect the custom format (and either parse in place or return false to allow another plugin to parse).

The ExcelImportPlugin is designed to support additional Excel-based formats. Currently, all files are parsed through the StandardExcelImportFormat class; however, it would be relatively straightforward to inspect cell content to detect specific structures and call an alternative import class instead.

Export

Graphs can be exported as JSON files (which can then be re-imported). The structure of the files follows the DataModel class described in the previous section.

Export is managed by the ExportService class in src/services/ExportService.js file.

To alter the export format structure, the existing logic within the ExportService to extract the contents of the graph could be reused and then re-formatted as desired.
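
For example, a small post-processing step over the exported data might look like the following sketch. The flattened record shape is hypothetical, and the relationship field names assume the structure returned by the Azure Digital Twins API.

// Hypothetical sketch: reshape an exported DataModel into a flat list of
// relationship records. The output shape is illustrative only.
export function toRelationshipList(dataModel) {
  return dataModel.digitalTwinsGraph.relationships.map(rel => ({
    source: rel.$sourceId,
    target: rel.$targetId,
    name: rel.$relationshipName
  }));
}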

Views

All panels are defined in the src/App.js file. These configuration objects are defined by the requirements of the Golden Layout Component.

For temporary panels within the application (e.g. import preview), two approaches can be considered:

  1. For panels like output & console, the new panel can be added to the optionalComponentsConfig collection. This allows the panel's state (i.e. open or closed) to be managed through the app state, regardless of whether it is closed via the 'X' on the tab or via configuration (as available in the preferences dialog).
  2. For panels like import preview, these can be manually added on demand to the layout. This can be cleanly done via the pub/sub mechanism (see below and the componentDidMount method in App.js).
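
To illustrate the shape of these configuration objects, the sketch below shows roughly what a panel definition for Golden Layout looks like. The component name, title, and props used here are placeholders, not the actual values from src/App.js.

// Illustrative sketch of a Golden Layout panel definition; "importPreview"
// and ImportPreviewComponent are placeholder names.
const importPreviewConfig = {
  type: "react-component",
  component: "importPreview",   // name the React component is registered under
  title: "Import Preview",
  props: { file: null }         // props passed through to the component
};

// The component itself is registered with the layout instance, e.g.:
// layout.registerComponent("importPreview", ImportPreviewComponent);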

View commands

Where a view has commands, it's suggested that a dedicated command bar component is created (based on components like that found in src/components/GraphViewerComponent/GraphViewerCommandBarComponent.js). These leverage the Office Fabric UI CommandBar component and either expose callbacks for functionality via props or manage operations directly.
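
A minimal command bar in that style might look like the sketch below. The component name, the onRefresh/onExpandAll props, and the icon names are illustrative, and the import path assumes the office-ui-fabric-react package.

// Hypothetical command bar component sketch; props and icon names are illustrative.
import React from "react";
import { CommandBar } from "office-ui-fabric-react";

export const ExampleViewCommandBarComponent = ({ onRefresh, onExpandAll }) => {
  const items = [
    { key: "refresh", text: "Refresh", iconProps: { iconName: "Refresh" }, onClick: onRefresh },
    { key: "expandAll", text: "Expand all", iconProps: { iconName: "ExploreData" }, onClick: onExpandAll }
  ];
  return <CommandBar items={items} />;
};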

Pub/Sub

The Golden Layout Component includes a pub/sub message bus for communication between components. This is a key part of Azure Digital Twins Explorer and is used to pass messages between components.

All events - via publish and subscribe methods - are defined in the src/services/EventService.js file. Additional events can be defined by adding to this file.

The pub/sub message bus is not immediately available on application load; however, the event service will buffer any pub or sub requests during this period and then apply them once available.
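
The buffering behaviour can be pictured with the following simplified sketch. This is not the actual EventService code; it only illustrates the pattern of queuing publish/subscribe calls until the Golden Layout event hub becomes available.

// Simplified illustration of the buffering pattern, not the real EventService.
class BufferedEventService {

  constructor() {
    this.eventHub = null;   // Golden Layout event hub, set once the layout is ready
    this.pending = [];      // calls made before the hub was available
  }

  initialize(eventHub) {
    this.eventHub = eventHub;
    // Replay anything that was requested before the hub existed.
    this.pending.forEach(({ method, name, arg }) => this.eventHub[method](name, arg));
    this.pending = [];
  }

  publish(name, payload) {
    if (this.eventHub) {
      this.eventHub.emit(name, payload);
    } else {
      this.pending.push({ method: "emit", name, arg: payload });
    }
  }

  subscribe(name, callback) {
    if (this.eventHub) {
      this.eventHub.on(name, callback);
    } else {
      this.pending.push({ method: "on", name, arg: callback });
    }
  }

}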

Services

Local

When running locally, all requests to the Azure Digital Twins service are proxied through the same local web server used for hosting the client app. This is configured in the client/src/setupProxy.js file.
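
The sketch below shows the general shape of a create-react-app setupProxy.js using http-proxy-middleware (v1 or later). The route, target, and rewrite rule here are placeholders rather than the repository's actual configuration, which also attaches credentials obtained via the Azure default credentials described earlier.

// Illustrative shape of a create-react-app proxy setup; the route and target
// below are placeholders, not the repository's actual configuration.
const { createProxyMiddleware } = require("http-proxy-middleware");

module.exports = function (app) {
  app.use(
    "/api/proxy",
    createProxyMiddleware({
      target: "https://your-instance.api.wcus.digitaltwins.azure.net", // Azure Digital Twins host name (placeholder)
      changeOrigin: true,
      pathRewrite: { "^/api/proxy": "" } // strip the local prefix before forwarding
    })
  );
};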

Cloud

When running in the cloud, Azure Functions hosts three services to support the front end application:

  1. Proxy: this proxies requests to the Azure Digital Twins service (much in the same way as the proxy used when running locally).
  2. SignalR: this allows clients to retrieve credentials to access the SignalR service for live telemetry updates. It also validates that the endpoint and route required to stream information from the Azure Digital Twins service to the Azure Digital Twins Explorer app is in place. If the managed service identity for the Function is configured correctly (i.e. has write permissions on the resource group and can administer the Azure Digital Twins service), then it can create these itself.
  3. EventGrid: this receives messages from Event Grid and broadcasts them to any listening clients using SignalR. The messages are sent from Azure Digital Twins to the function via an Azure Digital Twins endpoint and route.
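
On the client side, consuming these broadcasts follows the standard @microsoft/signalr pattern. A rough sketch is shown below; the URL and the "telemetry" event name are placeholders, not the app's actual values.

// Rough client-side sketch using @microsoft/signalr; URL and event name are placeholders.
import { HubConnectionBuilder } from "@microsoft/signalr";

const connection = new HubConnectionBuilder()
  .withUrl("https://adtexplorer-<your suffix>.azurewebsites.net/api") // negotiate endpoint exposed by the Functions app (placeholder)
  .withAutomaticReconnect()
  .build();

// Handle telemetry messages broadcast by the EventGrid function.
connection.on("telemetry", message => {
  console.log("Telemetry received", message);
});

connection.start().catch(err => console.error("SignalR connection failed", err));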

NOTE: If you are hosting the application somewhere other than Azure Functions, we recommend that you add the Content Security Policy defined in the proxies.json file to your environment.

digital-twins-explorer's People

Contributors

alexkarcher-msft, baanders, bjorkstromm, briancr-ms, ccrowley96, cschormann, darsney, dependabot[bot], hyoshioka0128, jamescarpinter, jamiewilbraham, joseba-altran, kartben, microsoft-github-operations[bot], microsoftopensource, miltonzarmada, montgomp, niusoff, pasanchmsft, pjgpetecodes


digital-twins-explorer's Issues

Documentation/examples bug

Import excel columns are:

  • ModelId: The complete dtmi for the model that should be instantiated.
  • ID: The unique ID for the twin to be created
  • Relationship: A twin id with an outgoing relationship to the new twin
  • Relationship name: The name for the outgoing relationship from the twin in the previous column
  • Init data: A JSON string that contains Property settings for the twins to be created

However, in the power system example, PowerLine feeds Receiver and the import line is:

| ModelId | ID | Relationship | Relationship name |
| -- | -- | -- | -- |
| dtmi:example:grid:transmission:powerLine;1 | pl_main | p_industry_01 | feeds |

There is no relationship from Receiver to PowerLine in the model. So "Relationship name" is incoming and not outgoing.

Add manual reference creation via GUI

Feature request: I haven't found this feature, but it may exist: add a relationship between two entities within the GUI (without importing). This would be a good way to let newbies learn the proper syntax of the import/export files.

Cannot upload an array of models contained in a single .json file

When I try to upload a single .json file containing an array of models, I get this error in the browser:

*** Upload error: RestError: Operation failed as models provided was empty or of a type that is not supported.. Check that your Model Array contains at least one item and it is of a supported type. See link(http://aka.ms/ADTv2Models) for details.

After viewing the request payload to the server, it is clear why: ADT Explorer expects a single model per .json file, since it takes one or more .json files, pushes the content (one model each) into an internal array, and sends that to the server. Because an array of models is not flattened, the result is an invalid payload and the upload fails.

Please add the ability to upload an array of models contained in a single .json file, which would simplify the process of uploading converted RDF/OWL models to ADT.

Thanks!

Error loading example model

I followed the readme installation instructions and encountered no errors.

On start up of adt-explorer, I get:

[HPM] Proxy created: / -> /
i 「wds」: Project is running at http://192.168.1.117/
i 「wds」: webpack output is served from
i 「wds」: Content not from webpack is served from C:\Users\myname\digital-twins-explorer-master\client\public
i 「wds」: 404s will fallback to /
Starting the development server...
Compiled successfully!

You can now view adt-explorer in the browser.

Local: http://localhost:3000
On Your Network: http://192.168.1.117:3000

However, when I load the example power system model or even a single file such as baseplant, I get the following error: "*** UploadError; Error "SyntaxError: unexpected token E in JSON at position 0" occured with parsing the response body - Error occured while trying to proxy to: localhost:3000/models? includeModelDefintion=true&api-version=2020-05-31-preview.

I used http://localhost:3000 for my ADT URL to sign in and the client app is running on localhost:3000. Connect seems to work fine, but I get a spinner in the model view pane as soon as I log in and before running any query. Eventually a very similar error as above pops up. "*** Error fetching models: Error; Error "SyntaxError: unexpected token E in JSON at position 0" occured with parsing the response body - Error occured while trying to proxy to: localhost:3000/models? includeModelDefintion=true&api-version=2020-05-31-preview.

PowerShell shows:
[HPM] Error occurred while trying to proxy request /models?includeModelDefinition=true&api-version=2020-05-31-preview from localhost:3000 to / (ECONNREFUSED) (https://nodejs.org/api/errors.html#errors_common_system_errors)
or
[HPM] Error occurred while trying to proxy request /query?api-version=2020-05-31-preview from localhost:3000 to / (ECONNREFUSED) (https://nodejs.org/api/errors.html#errors_common_system_errors)

This happens on two different client machines.

local computer adt authentication error

I encountered the authentication error on my local computer and fixed it by updating the @azure/identity entry in package.json to a recent version ("@azure/identity": "^1.2.0-beta.2").
Would you check it and update the package.json file for npm install?

AuthenticationError trying to reuse credentials from az-cli

Running in WSL2 with node LTS (15.3.0)

I installed az-cli (2.15.1) and authenticated with az login and selected the same subscription with my ADT instance.

After npm run start I got

AuthenticationError: ManagedIdentityCredential authentication failed.(status code 400).
More details:
request to http://169.254.169.254/metadata/identity/oauth2/token?resource=https%3A%2F%2Fdigitaltwins.azure.net&api-version=2018-02-01 failed, reason: connect ECONNREFUSED 169.254.169.254:80
    at ManagedIdentityCredential.<anonymous> (/home/rido/code/digital-twins-explorer/client/node_modules/@azure/identity/dist/index.js:1077:23)
    at Generator.throw (<anonymous>)
    at rejected (/home/rido/code/digital-twins-explorer/client/node_modules/@azure/identity/node_modules/tslib/tslib.js:112:69)
    at processTicksAndRejections (node:internal/process/task_queues:93:5)

Model Filtering

Feature Request: I realize that you can use SELECT * FROM DIGITALTWINS WHERE IS_OF_MODEL('dtmi:sample:thing;1') to filter based on models. But this is such a common requirement that it would be nice for the GUI to have a drop down list box or tree control that allowed you to pick a model that would filter data in the model and graph view.

fail to install on windows

Following the tutorial on https://docs.microsoft.com/en-us/samples/azure-samples/digital-twins-explorer/digital-twins-explorer/

Fail to install dependencies on a Windows machine.

###\client>npm i

[email protected] preinstall C:\Users\eksko\OneDrive\work\repos\adtxplorer\client
npx npm-force-resolutions

npx: installed 5 in 2.216s

[email protected] install C:\Users\eksko\OneDrive\work\repos\adtxplorer\client\node_modules\node-sass
node scripts/install.js

internal/fs/utils.js:298
throw err;
^

Error: UNKNOWN: unknown error, read
at Object.readSync (fs.js:581:3)
at tryReadSync (fs.js:357:20)
at Object.readFileSync (fs.js:386:19)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1271:22)
at Module.load (internal/modules/cjs/loader.js:1100:32)
at Function.Module._load (internal/modules/cjs/loader.js:962:14)
at Module.require (internal/modules/cjs/loader.js:1140:19)
at require (internal/modules/cjs/helpers.js:75:18)
at Object. (C:\Users\eksko\OneDrive\work\repos\adtxplorer\client\node_modules\glob\glob.js:51:18)
at Module._compile (internal/modules/cjs/loader.js:1251:30) {
errno: -4094,
syscall: 'read',
code: 'UNKNOWN'
}
npm WARN [email protected] requires a peer of @typescript-eslint/[email protected] but none is installed. You must install peer dependencies yourself.
npm WARN [email protected] requires a peer of @typescript-eslint/[email protected] but none is installed. You must install peer dependencies yourself.
npm WARN [email protected] requires a peer of jsoneditor@^7.1.0 but none is installed. You must install peer dependencies yourself.
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\chokidar\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\react-scripts\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})

npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: node scripts/install.js
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:

ADT Explorer can't fetch models

No matter what I do I get this:
(screenshot omitted)

When I access the twin URL I get ERROR 404, not 401 or 400, so curious if DT service should be accessible this way or I should use another region for the deployment (currently EU-west).

SETUP:
Followed the setup instructions from: https://docs.microsoft.com/en-us/azure/digital-twins/how-to-set-up-instance-portal
and https://docs.microsoft.com/en-us/azure/digital-twins/quickstart-adt-explorer
and used the Web app permission with callback url set to http://localhost:3000 as suggested in readme.md for the ADT explorer.

The subscription is VS Professional and AD/user are under the same subscription.

Could you please help us with that? I'm not the only one having this problem.

Error on model upload

Trying ADT Explorer on macOS 10.15.3. When I try to upload one of the example models, I get this error on the browser:

Unhandled Rejection (TypeError): Cannot read property 'on' of undefined
SignalRService.subscribe
src/services/SignalRService.js:41
  38 | this.connection.on(action, callback);
  39 | } else {
  40 | await this.initialize();
> 41 | this.connection.on(action, callback);
     |                 ^
  42 | }
  43 | }
  44 |

Appreciate any help. Thanks!

Failed to install npm packages in node 14

C:\code\digital-twins-explorer\client\src [master ≡ +1 ~0 -0 !]> npm i

> [email protected] preinstall C:\code\digital-twins-explorer\client
> npx npm-force-resolutions

npx: installed 5 in 3.135s

> [email protected] install C:\code\digital-twins-explorer\client\node_modules\keytar
> prebuild-install || node-gyp rebuild

prebuild-install WARN install No prebuilt binaries found (target=14.12.0 runtime=node arch=x64 libc= platform=win32)

C:\code\digital-twins-explorer\client\node_modules\keytar>if not defined npm_config_node_gyp (node "C:\Program Files\nodejs\node_modules\npm\node_modules\npm-lifecycle\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" rebuild )  else (node "C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\bin\node-gyp.js" rebuild )
gyp ERR! find Python
gyp ERR! find Python Python is not set from command line or npm configuration
gyp ERR! find Python Python is not set from environment variable PYTHON
gyp ERR! find Python checking if "python" can be used
gyp ERR! find Python - "python" is not in PATH or produced an error
gyp ERR! find Python checking if "python2" can be used
gyp ERR! find Python - "python2" is not in PATH or produced an error
gyp ERR! find Python checking if "python3" can be used
gyp ERR! find Python - "python3" is not in PATH or produced an error
gyp ERR! find Python checking if the py launcher can be used to find Python 2
gyp ERR! find Python - "py.exe" is not in PATH or produced an error
gyp ERR! find Python checking if Python is C:\Python27\python.exe
gyp ERR! find Python - "C:\Python27\python.exe" could not be run
gyp ERR! find Python checking if Python is C:\Python37\python.exe
gyp ERR! find Python - "C:\Python37\python.exe" could not be run
gyp ERR! find Python
gyp ERR! find Python **********************************************************
gyp ERR! find Python You need to install the latest version of Python.
gyp ERR! find Python Node-gyp should be able to find and use Python. If not,
gyp ERR! find Python you can try one of the following options:
gyp ERR! find Python - Use the switch --python="C:\Path\To\python.exe"
gyp ERR! find Python   (accepted by both node-gyp and npm)
gyp ERR! find Python - Set the environment variable PYTHON
gyp ERR! find Python - Set the npm configuration variable python:
gyp ERR! find Python   npm config set python "C:\Path\To\python.exe"
gyp ERR! find Python For more information consult the documentation at:
gyp ERR! find Python https://github.com/nodejs/node-gyp#installation
gyp ERR! find Python **********************************************************
gyp ERR! find Python
gyp ERR! configure error
gyp ERR! stack Error: Could not find any Python installation to use
gyp ERR! stack     at PythonFinder.fail (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\find-python.js:307:47)
gyp ERR! stack     at PythonFinder.runChecks (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\find-python.js:136:21)
gyp ERR! stack     at PythonFinder.<anonymous> (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\find-python.js:225:16)
gyp ERR! stack     at PythonFinder.execFileCallback (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\find-python.js:271:16)
gyp ERR! stack     at exithandler (child_process.js:315:5)
gyp ERR! stack     at ChildProcess.errorhandler (child_process.js:327:5)
gyp ERR! stack     at ChildProcess.emit (events.js:314:20)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:274:12)
gyp ERR! stack     at onErrorNT (internal/child_process.js:464:16)
gyp ERR! stack     at processTicksAndRejections (internal/process/task_queues.js:80:21)
gyp ERR! System Windows_NT 10.0.19042
gyp ERR! command "C:\\Program Files\\nodejs\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\bin\\node-gyp.js" "rebuild"
gyp ERR! cwd C:\code\digital-twins-explorer\client\node_modules\keytar
gyp ERR! node-gyp -v v5.1.0
gyp ERR! not ok

Enumerations do not work

When an enumeration is defined, only the first value of the enumeration is accepted.

Example 1.
Issue: Could not change from ExtraHighVoltage to HighVoltage via either the UI or the API. In the UI it just falls back to the first enumeration value, and the API gives an error that the twin does not match the schema.
data_model: https://github.com/Azure-Samples/digital-twins-explorer/blob/master/client/examples/PowerLine.json
(screenshot omitted)

Example 2.
Issue: Can't change the phase code from http://iec.ch/TC57/2013/CIM-schema-cim16#PhaseCode.ABCN to anything else.
data_model: https://github.com/Haigutus/USVDM/blob/master/Tools/RDF_PARSER/entsoe_v2.4.15_2014-08-07_DTDL_V2/Terminal.json
(screenshot omitted)

local docker adt authentication error in development branch

I encountered an authentication error when running ADT explorer locally with Docker from the development branch. The error message says

AggregateAuthenticationError: undefined
Error: EnvironmentCredential is unavailable. Environment variables are not fully configured.
Error: ManagedIdentityCredential - No MSI credential available
Error: Azure CLI could not be found.  Please visit https://aka.ms/azure-cli for installation instructions and then, once installed, authenticate to your Azure account using 'az login'.
Error: Visual Studio Code credential requires the optional dependency 'keytar' to work correctly
    at DefaultAzureCredential.<anonymous> (/usr/src/app/client/node_modules/@azure/identity/dist/index.js:285:29)
    at Generator.throw (<anonymous>)
    at rejected (/usr/src/app/client/node_modules/@azure/identity/node_modules/tslib/tslib.js:115:69)
AggregateAuthenticationError: undefined

I had logged in using az login and Visual Studio Code before creating the container but no luck.

Cannot create Twin using UI

Hi! I was trying to complete the Chocolate Factory demo using the MS tutorial. I got an error at the step where I need to create a twin for the models that are uploaded and visible in the Digital Twins instance.

I can hit + and input the name of the twin:
(screenshot omitted)

When I click the Save button I get an error:
(screenshot omitted)

The same error was mentioned on Stack Overflow: https://stackoverflow.com/questions/66453982/adt-explorer-error-cant-put-models-into-graph-view

At the same time, I can create a twin using the RBAC-update branch, while the Oct-Bash branch returns the same error as the master branch.

Any ideas why this error may happen?

Empty contents array

Digital Twins Explorer throws the following error when a model is uploaded containing an empty contents array. Note, the same model passes validation by the DTDL parser and can be successfully uploaded to ADT.

  {
    "@id": "dtmi:com:example:PizzaTopping;2",
    "@type": "Interface",
    "contents": [
    ],
    "@context": "dtmi:dtdl:context;2",
    "displayName": "PizzaTopping",
    "comment": "Class representing the toppings on a Pizza^^http://www.w3.org/2001/XMLSchema#string"
  }

(screenshot omitted)

'console' patchtwin command not working

I'm trying to use the 'patchtwin' command in the adt-explorer console to update a twin property. I have a simple property defined in my model like this:

{
  "@type": "Property",
  "name": "avgHumidity",
  "schema": "double"
},

and a twin instance that leverages that property.

(screenshot omitted)

When I try to update the 'avgHumidity' property via the patchtwin command, I always get the error below:

(screenshot omitted)

I've tried several combinations of referencing the property, setting the value, etc. I have also tried it against multiple twin instances. Same error every time (Property doesn't exist).

Updating the same property in the same twin instance via the AZ CLI works fine, as seen below.

(screenshot omitted)

minor typo in CUSTOM_AUTH_ERROR_MESSAGE

In client/src/services/Constant.js:
We have:
"....on your host machine, or example by running..."
but it should be:
"....on your host machine, for example by running..."

Add support of batch upload if model upload exceeds limit in 250 records

When I try to upload more than 250 models it fails with the error:

*** Upload error: RestError: The number of models you are trying to upload exceeds the supported limit of 250. Try breaking up the upload in smaller batches.

It would be great if you could detect this case and do the upload in batches of 250 models, to simplify the process of uploading model sets that exceed that limit.

Thanks!

Sign In failed with UNABLE_TO_GET_ISSUER_CERT_LOCALLY in ADT explorer

Cx was following the MS doc Quickstart - Explore a sample scenario - Azure Digital Twins | Microsoft Docs
Confirmed that they have set up the ADT instance and the role assignment.
And they got errors when they went through this step:

(screenshot omitted)

and then got:

(screenshot omitted)

(screenshot omitted)

All the office laptops on the Cx side have this issue.
The environment on their side is:
Linux --> Docker container --> Windows 10. The security application on their side is Zscaler; not sure if it is an issue due to using Docker.

Set default branch to main

It looks like the repo has migrated the primary branch from "master" to "main" but the default in GitHub is still set to "master".

There is a "Branches" tab under Settings, and beneath that is an option to set the default branch.
(screenshot omitted)

List of elements of `InterfaceB` in a definition of `InterfaceA` for "elementSchema" of a schema with "@type": "Array" not supported

According to the dtdl v2 specification

  • a complex schema can be used e.g. for elementSchema of a schema with "@type":"Array" but
  • an existing Interface cannot be used as complex schema.

This restriction leads to not being able to create a list of items which are of a defined Interface. Interfaces can be combined and reused using "@type": "Component"; however, it's not possible to create a list.

Is it designed in that manner to force the usage of Relationship in that case?
Is there any best-practice for the necessity to include a list of 0-n items of an Interface in another Interface?
Are there any plans to support it?

Tested it with:

In order to reproduce, the following model can be used:

[
  {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:com:example:TestRoot;1",
    "@type": "Interface",
    "displayName": "TestRoot",
    "contents": [
      {
        "@type": "Property",
        "name": "Prop1",
        "schema": "string"
      },
      {
        "@type": "Telemetry",
        "name": "TelementryArray",
        "schema": {
          "@type": "Array",
          "elementSchema": "dtmi:com:example:TestElement;1"
        }
      },
      {
        "@type": "Relationship",
        "name": "TestElements",
        "minMultiplicity": 0,
        "maxMultiplicity": 1,
        "target": "dtmi:com:example:TestElement;1"
      }
    ]
  },
  {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:com:example:TestElement;1",
    "@type": "Interface",
    "displayName": "TestElement",
    "contents": [
      {
        "@type": "Property",
        "name": "Base64String",
        "schema": "string",
        "displayName": "Value",
        "description": "Base64 encoded value"
      }
    ]
  }
]

The following error occurs:

Upload error: RestError: None of the models in this request could be created due to a problem with one or more models: dtmi:com:example:TestRoot;1 has 'contents' value with name 'TelementryArray' which has 'schema' value whose property 'elementSchema' has value dtmi:com:example:TestElement;1 that does not have @type of Array, Enum, Map, or Object, nor is it a standard value for this property. 
Provide a value for property 'elementSchema' with @type in the set of allowable types, or choose one of the following values for property 'elementSchema': boolean, string, double, float, long, integer, date, dateTime, duration, time.. 
See model documentation(http://aka.ms/ADTv2Models) for supported format.

SignalR HubConnection failed to start with status code 400

Description

POST https://adtexplorer-<uid>.service.signalr.net/client/negotiate?hub=adtexplorer-<uid>&negotiateVersion=1 returns 400 status code

(screenshots omitted)

The error was invoked by the following line:

await this.connection.start();

Reproduce

  1. Create ADT instance.
  2. Deploy both template.json and template-eventgrid.json.
  3. Login into the explorer web app from either cloud or localhost.
  4. Open browser console.
  5. Run DT query.

unable to install with python 3.8.3

When I run npm install the build fails at:
npm ERR! gyp ERR! configure error
npm ERR! gyp ERR! stack Error: Command failed: C:...\Programs\Python\Python38-32\python.EXE -c import sys; print "%s.%s.%s" % sys.version_info[:3];
npm ERR! gyp ERR! stack File "", line 1
npm ERR! gyp ERR! stack import sys; print "%s.%s.%s" % sys.version_info[:3];
npm ERR! gyp ERR! stack ^
npm ERR! gyp ERR! stack SyntaxError: invalid syntax
npm ERR! gyp ERR! stack at ChildProcess.exithandler (node:child_process:333:12)
npm ERR! gyp ERR! stack at ChildProcess.emit (node:events:376:20)
npm ERR! gyp ERR! stack at maybeClose (node:internal/child_process:1063:16)
npm ERR! gyp ERR! stack at Process.ChildProcess._handle.onexit (node:internal/child_process:295:5)
npm ERR! gyp ERR! System Windows_NT 10.0.19042
npm ERR! gyp ERR! command "C:\Program Files\nodejs\node.exe" "C:\ADT\ADT_explorer\client\node_modules\node-gyp\bin\node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
npm ERR! gyp ERR! cwd C:\ADT\ADT_explorer\client\node_modules\node-sass
npm ERR! gyp ERR! node -v v15.5.0 npm ERR! gyp ERR! node-gyp -v v3.8.0
npm ERR! gyp ERR! not ok
npm ERR! Build failed with error code: 1

Seems related to the node-gyp v3.8.0 dependency of node-sass (<5), which supports only Python 2 (EOL):
sass/node-sass#3013
sass/node-sass#2877
Also interesting: LibSass and node-sass are deprecated. Wouldn't it be better to switch to sass as recommended?

Model upload process should be improved

I have a single JSON file with 450 models to be uploaded. Since the API does not allow uploading an array of models exceeding 250 items, I can split the array into batches during upload, but there is a chance that a model references another model that is missing from the current batch; as a result, I cannot complete the upload at all.

There should be a step that builds a model dependency graph before upload and prioritizes models into batches without unmet dependencies, so that each batch upload succeeds based on already-uploaded dependencies.

Thanks!

(dev-branch) Model Explorer not loading without page refresh

I start with an empty ADT instance and then upload a set of model files. If I go straight to the Model View tab, nothing shows up. The model will only show up after I refresh the entire ADT-Explorer web page with the browser refresh button. I would expect viewing a newly uploaded model to work without a page refresh

(screenshot omitted)

Docker config step missing? 400 reason: connect ECONNREFUSED

I was running the docker version of this twin explorer, and getting a 400 error from the Identity.

per the docs I did:

  • Access tokens enabled
  • URI callback http://localhost:3000 added

The application runs fine; however, I get the credential issue below.

AuthenticationError: ManagedIdentityCredential authentication failed.(status code 400).
More details:
request to http://169.254.169.254/metadata/identity/oauth2/token?resource=https%3A%2F%2Fdigitaltwins.azure.net&api-version=2018-02-01 failed, reason: connect ECONNREFUSED 169.254.169.254:80
    at ManagedIdentityCredential.<anonymous> (/usr/src/app/client/node_modules/@azure/identity/dist/index.js:1077:23)
    at Generator.throw (<anonymous>)
    at rejected (/usr/src/app/client/node_modules/@azure/identity/node_modules/tslib/tslib.js:112:69)
    at processTicksAndRejections (node:internal/process/task_queues:93:5)
  • is it required to add 169.254.169.254 to the callback URI in the app, in addition to localhost?
  • is there another place in the system setup where you need to add the DT credentials?

Authentication Issue in Local Machine

Hi,
I am new to ADT. I was exploring the ADT Explorer but ran into an authentication issue that I apparently cannot work around. I followed the steps mentioned in the documentation about creating an instance, authentication, role assignment, etc. Despite that, I get CustomAuthErrorMessage each time I launch the sample client application:

(screenshot omitted)

I have my Azure account logged in on my local machine:
(screenshot omitted)

and in VS Code:
(screenshot omitted)

I am using the same account that I created the ADT instance, and I have the owner role over the instance (inherited).

Despite all this, I still cannot figure out what is wrong. I would be happy if anyone could help. Thanks in advance.

Object view not updating

I am running the explorer locally. If I update the state of a twin externally and then refresh the explorer by re-running the query "SELECT * FROM digitaltwins", then in the Output window I see the updated state of the twin. However, if I have the RHS object view panel open, this is not updated.

Error in importing graph

Hi,

When I try to upload a graph I receive the following error:

"Error in importing graph: ClientAuthError: AcquireToken_In_Progress: Error during login call - login is already in progress"

How can I solve this problem?

adt-explorer - Invalid relationship

I'm trying to build a device twin model and network for water fleet management. I created the required JSON models based on the examples and created a data file in the format specified. When I try to import the data, it says:

Twin relationship does not align with the model. Please ensure that the relationship is in alignment with the model. See section on listing models http://aka.ms/adtv2models.

I am using the Network tab of the developer tools to identify and fix any issues. When I create this model manually it works completely fine; it's only when I import that I get the above error.

This is the result when I do it manually:
(screenshot omitted)

This is the result I'm getting when I upload the data file:
(screenshot omitted)

I'm attaching the model files that I created for my requirement
wsf-models.zip

Data file
WaterSupplyFleet.xlsx

Also attaching the HAR file of the all the requests and responses during the request
adt-explorer-har.zip

Please let me know if you need any other information.

Thanks

Make interface inheritance relationships viewable in the graph

Feature request: Today, interface inheritance relationships are not viewable in the graph (or the Model View window). While querying for instance type is straightforward, Interface Inheritance seems to only be encoded in the extends attribute (and informally (?) in @id). This suggests that analytics need to do string parsing to figure out inheritance. If interface inheritance was displayed as "first class relationship" that was directly queryable, analytics would be simplified. This suggests that instance to type "lines" could also be optionally shown in the graph and not just displayed in the property explorer window. It might be nice to allow the Model View to be shown as a flat list or tree...

Trying Sign In ... Unhandled Rejection (TypeError): Cannot read property 'on' of undefined

I fill in the Client ID, Tenant ID, and ADT URL to connect, but when I press the "Connect" button I receive this error:

Unhandled Rejection (TypeError): Cannot read property 'on' of undefined
SignalRService.subscribe
C:/Projects/Azure_Digital_Twins__ADT__explorer/client/src/services/SignalRService.js:41

Do you know how to fix this problem? I attach a capture.
(screenshot omitted)

Default initialization of properties

Currently, when you create a new instance of a twin, you have to manually set values for all properties or they are not generated. This means that a function populating the twin using "replace" operations will fail. It would be nice to have a mode where created instances have default values set for all properties, to do the equivalent of:

az dt twin create --dt-name IOTechTwins -m "dtmi:com:iotechsys:devicevirtual;1" -t device-virtual --properties '{
"OutputBool": false,
"OutputUint8": 0,
"OutputUint16": 0,
"OutputUint32": 0,
"OutputUint64": 0,
"OutputInt8": 0,
"OutputInt16": 0,
"OutputInt32": 0,
"OutputInt64": 0,
"OutputFloat32": 0.0,
"OutputFloat64": 0,
"InputBool": false
}'

Failed to create instance from a PnP Device model

The Model imports correctly, but creating a twin fails.

The Device Model has multiple components and each Component exists as a separate Model.

SensorSync Model in the catalog

Example:

{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:MeshSystems:tXs:SensorSync;1",
  "@type": "Interface",
  "displayName": {
    "en": "Valentine"
  },
  "contents": [
    {
      "@type": "Component",
      "name": "DeviceInformation",
      "schema": "dtmi:azure:DeviceManagement:DeviceInformation;1"
    },
    {
      "@type": "Component",
      "displayName": {
        "en": "Exterior"
      },
      "name": "Exterior",
      "schema": "dtmi:MeshSystems:txs:blereader;1"
    },

And there are models for the different Schemas:

  • dtmi:azure:DeviceManagement:DeviceInformation;1
  • dtmi:MeshSystems:txs:blereader;1

For completion, I tried to create the twin using await client.CreateOrReplaceDigitalTwinAsync<BasicDigitalTwin>

And this is the response

Create twin error: 400:Service request failed.
Status: 400 (Bad Request)

Content:

{
	"error": {
		"code": "ValidationFailed",
		"message": "Invalid twin specified",
		"details": [
			{
				"code": "ValidationFailed",
				"message": "Twin is missing mandatory component DeviceInformation."
			},
			{
				"code": "ValidationFailed",
				"message": "Twin is missing mandatory component Exterior."
			},
			{
				"code": "ValidationFailed",
				"message": "Twin is missing mandatory component Peripherals."
			},
			{
				"code": "ValidationFailed",
				"message": "Twin is missing mandatory component Location."
			},
			{
				"code": "ValidationFailed",
				"message": "Twin is missing mandatory component Cellular."
			},
			{
				"code": "ValidationFailed",
				"message": "Twin is missing mandatory component Modbus."
			}
		]
	}
}
