project-monai / monai-deploy

License: Apache License 2.0

C# 15.67% Dockerfile 1.09% JavaScript 14.10% Shell 36.01% Python 28.25% Batchfile 0.68% Smarty 4.20%
monai ai medical-imaging inference healthcare guidelines open-standard ai-application-development ai-application-deployment deep-learning

monai-deploy's Introduction

MONAI Deploy Working Group

MONAI Deploy aims to become the de-facto standard for developing, packaging, testing, deploying and running medical AI applications in clinical production.

If you want to know more about its purpose and vision, please review the MONAI Deploy WG wiki.

Focus

MONAI Deploy builds on the foundation set by MONAI.

Where MONAI is focused on training and creating models, MONAI Deploy is focused on defining the journey from research innovation to clinical production environments in hospitals. Our guiding principles are:

  • Implementation mindset. Create tangible assets: tools, applications and demos/prototypes.
  • Radiology first, then other modalities like Pathology.
  • Interoperability with clinical systems. Starting with DICOM, then FHIR.
  • Central repository to facilitate collaboration among institutions.

Status

MONAI Deploy was released at MICCAI 2021 and was part of the MONAI 2021 Bootcamp. Since then we have released several versions of some of the sub-systems, while others are being actively developed. Please check out the next section.

Key assets

Community

To participate, please join the MONAI Deploy WG weekly meetings on the calendar. All the recordings and meeting notes since day zero can be found in the MONAI Deploy WG master doc.

Join our Slack channel or join the conversation on Twitter @ProjectMONAI.

Ask and answer questions over on MONAI Deploy's GitHub Discussions tab or MONAI's GitHub Discussions tab.

Links

monai-deploy's People

Contributors

awsjpleger, bhatt-piyush, danulf, dbericat, ericspod, gigony, greyseawolf, hshuaib90, jackschofield23, jhancox, joebatt1989, kavinkrishnan, mmelqin, mocsharp, pritishnahar, pritishnahar95, rahul-imaging, remakingeden, ristoh, rupeshs, slbryson, whoisj, woodheadio


monai-deploy's Issues

Create MONAI Application SDK

Data scientists and developers should have an easy way to develop, debug and optimize MONAI Applications from PyTorch/MONAI models; an illustrative sketch follows.
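
Purely as an illustrative sketch of the kind of shape such an SDK could take (the class names below are hypothetical, not the actual App SDK API):

class Operator:
    """One step of a deployable app, e.g. loading, inference, or writing."""
    def compute(self, data):
        raise NotImplementedError

class InferenceOperator(Operator):
    """Wraps a trained PyTorch/MONAI model as a deployable step."""
    def __init__(self, model):
        self.model = model
    def compute(self, data):
        # Pre-transforms, the forward pass, and post-transforms would live here.
        return self.model(data)

class Application:
    """Chains operators so an app can be run, debugged, and packaged."""
    def __init__(self, operators):
        self.operators = operators
    def run(self, data):
        # Execute operators sequentially, passing each output to the next.
        for op in self.operators:
            data = op.compute(data)
        return data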

[FEA] Support for IHE AIW-I Task Performer

IHE AIW-I defines the workflow for running AI applications in production using a "ticketing" process, based on an interoperable DICOMweb technology called UPS-RS. See
https://www.ihe.net/uploadedFiles/Documents/Radiology/IHE_RAD_Suppl_AIW-I.pdf

It would be ideal to have a service within MONAI Deploy that functions as the "Task Performer": notified of a pending job (via WebSocket) or discovering it through polling, it would claim the job, process it, send the results, and mark the job as complete.
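
A minimal polling-style sketch of such a Task Performer loop (the endpoint layout is simplified, and run_inference_and_store_results stands in for the app-specific work; both are assumptions, not a tested client):

import time
import requests

UPS_URL = "https://pacs.example.org/ups-rs"  # hypothetical UPS-RS base URL

def run_inference_and_store_results(workitem: dict) -> None:
    ...  # hypothetical: run the AI application and send results back

def poll_and_perform(transaction_uid: str) -> None:
    """Poll for SCHEDULED workitems, claim each one, process it, complete it."""
    while True:
        # UPS-RS search: match on Procedure Step State (0074,1000) = SCHEDULED.
        resp = requests.get(f"{UPS_URL}/workitems", params={"00741000": "SCHEDULED"})
        for workitem in (resp.json() if resp.content else []):
            uid = workitem["00080018"]["Value"][0]  # SOP Instance UID
            # Claim the job: change its state to IN PROGRESS with a Transaction UID.
            claim = {"00741000": {"vr": "CS", "Value": ["IN PROGRESS"]},
                     "00081195": {"vr": "UI", "Value": [transaction_uid]}}
            requests.put(f"{UPS_URL}/workitems/{uid}/state", json=claim)
            run_inference_and_store_results(workitem)
            # Mark the job as complete.
            done = {"00741000": {"vr": "CS", "Value": ["COMPLETED"]},
                    "00081195": {"vr": "UI", "Value": [transaction_uid]}}
            requests.put(f"{UPS_URL}/workitems/{uid}/state", json=done)
        time.sleep(30)  # a WebSocket subscription would replace this polling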

I have created and tested a module that could potentially be integrated into a MONAI Deploy service, happy to share if it's helpful!

MD Express error when starting from within a docker

I can start and run MD Express on a local host. However, I get an error from docker compose about files not being found when I try to run MD Express through Docker-in-Docker after exposing the socket with
-v /var/run/docker.sock:/var/run/docker.sock

It seems MD Express launches containers of its own: I get errors about config files that exist on the host but are mounted at a different path inside the container.

Workaround:
I map the host directory to exactly the same path inside the container:
-v /home/harouni/demos/MDExpress:/home/harouni/demos/MDExpress

Proposed solution:
We can add a HOST_PATH variable in the .env file, defaulting to $PWD, which can then be overridden from an environment variable:
-e HOST_PATH=/home/harouni/demos/MDExpress
That way the sub-containers can be launched using this HOST_PATH variable; see the sketch below.
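
For illustration, a sketch of how this could look (HOST_PATH is the variable proposed above; the service name and volume paths are assumptions). docker compose supports default values in substitutions, so docker-compose.yml could resolve every host-side volume against it:

services:
  task-manager:
    volumes:
      - ${HOST_PATH:-$PWD}/.md/mdtm:/var/lib/monai

HOST_PATH can then be exported in the shell (or passed with -e in the Docker-in-Docker case) to point at the true host path.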

[FEA] Support for IHE AIW-I Task Manager

IHE AIW-I defines the workflow for running AI applications in production using a "ticketing" process, based in an interoperable DICOMweb technology called UPS-RS. See
https://www.ihe.net/uploadedFiles/Documents/Radiology/IHE_RAD_Suppl_AIW-I.pdf

It would be ideal to have a service within MONAI Deploy to function as the "Task Manager", recording requests for jobs, responding to requests and updates, and notifying AI applications of new jobs to be performed.

I have created and tested a module that could potentially be integrated into a MONAI Deploy service, happy to share if it's helpful!

e2e test scenarios: KeyError: 'nifti_affine_transform' in liver_seg

Hi,

I set up a MONAI cluster following the instructions in Project-MONAI/monai-deploy-workflow-manager#666. To test it, I am using this clinical workflow from the e2e test scenarios: https://github.com/Project-MONAI/monai-deploy/blob/main/e2e-testing/test-scenarios/Clinical_Workflows/liver_seg.json.

I downloaded the images from the Medical Decathlon, as explained in https://github.com/Project-MONAI/monai-deploy/tree/main/deploy/monai-deploy-express#running-a-monai-deploy-workflow.

I renamed the image work/monai-deploy-workflow-manager/deploy/examples/monai-deploy-lite-liver-ct/000587.dcm to input-dicom and uploaded it to Orthanc.

Then I sent the study to the DICOM modality and got this error in the liver segmentation container:

Going to initiate execution of operator DICOMDataLoaderOperator
Executing operator DICOMDataLoaderOperator (Process ID: 25, Operator ID: e9628084-9846-44aa-b89a-efb844fe1855)
[2023-02-08 13:36:52,828] [WARNING] (root) - Ignored /var/monai/input/1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020/1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567/1.2.826.0.1.3680043.2.1125.1.6173393152365463986791105716239686.dcm.json, reason being: File is missing DICOM File Meta Information header or the 'DICM' prefix is missing from the header. Use force=True to force reading.
Done performing execution of operator DICOMDataLoaderOperator

Going to initiate execution of operator DICOMSeriesSelectorOperator
Executing operator DICOMSeriesSelectorOperator (Process ID: 25, Operator ID: f6def06a-5cc6-4e0c-9eaf-2c44340db6d9)
[2023-02-08 13:36:52,831] [INFO] (root) - Finding series for Selection named: CT Series
[2023-02-08 13:36:52,831] [INFO] (root) - Searching study, : 1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020
  # of series: 1
[2023-02-08 13:36:52,831] [INFO] (root) - Working on series, instance UID: 1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567
[2023-02-08 13:36:52,831] [INFO] (root) - On attribute: 'Modality' to match value: '(?i)CT'
[2023-02-08 13:36:52,831] [INFO] (root) -     Series attribute Modality value: CT
[2023-02-08 13:36:52,831] [INFO] (root) - Series attribute string value did not match. Try regEx.
[2023-02-08 13:36:52,831] [INFO] (root) - On attribute: 'ImageType' to match value: '['PRIMARY', 'ORIGINAL']'
[2023-02-08 13:36:52,831] [INFO] (root) -     Series attribute ImageType value: None
[2023-02-08 13:36:52,831] [INFO] (root) - On attribute: 'PhotometricInterpretation' to match value: 'MONOCHROME2'
[2023-02-08 13:36:52,831] [INFO] (root) -     Series attribute PhotometricInterpretation value: None
[2023-02-08 13:36:52,831] [INFO] (root) - Selected Series, UID: 1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567
Done performing execution of operator DICOMSeriesSelectorOperator

Going to initiate execution of operator DICOMSeriesToVolumeOperator
Executing operator DICOMSeriesToVolumeOperator (Process ID: 25, Operator ID: 5036c2da-fcc6-4f9a-abf8-a59b70c28dd6)
Done performing execution of operator DICOMSeriesToVolumeOperator

Going to initiate execution of operator LiverTumorSegOperator
Executing operator LiverTumorSegOperator (Process ID: 25, Operator ID: 0b577306-331d-48f7-a974-9b462b54a155)
Operator output folder path: /var/monai/operators/0b577306-331d-48f7-a974-9b462b54a155/0/output/saved_images_folder
Converted Image object metadata:
SeriesInstanceUID: 1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567, type <class 'str'>
SeriesDate: 20201014, type <class 'str'>
SeriesTime: 162720, type <class 'str'>
Modality: CT, type <class 'str'>
SeriesDescription: CT series for liver tumor from nii 014, type <class 'str'>
PatientPosition: HFS, type <class 'str'>
SeriesNumber: 1, type <class 'int'>
row_pixel_spacing: 0.685546875, type <class 'float'>
col_pixel_spacing: 0.685546875, type <class 'float'>
depth_pixel_spacing: 1.0, type <class 'float'>
row_direction_cosine: [1.0, 0.0, 0.0], type <class 'list'>
col_direction_cosine: [0.0, 1.0, 0.0], type <class 'list'>
StudyInstanceUID: 1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020, type <class 'str'>
StudyID: SLICER10001, type <class 'str'>
StudyDate: 20201014, type <class 'str'>
StudyTime: 162720, type <class 'str'>
StudyDescription: CT Study for liver_14, type <class 'str'>
AccessionNumber: 1, type <class 'str'>
selection_name: CT Series, type <class 'str'>
2023-02-08 13:36:52,872 INFO image_writer.py:193 - writing: /var/monai/operators/0b577306-331d-48f7-a974-9b462b54a155/0/output/saved_images_folder/1.2.826.0.1.3680043.2.1125.1/1.2.826.0.1.3680043.2.1125.1.nii.gz
2023-02-08 13:36:58,139 INFO image_writer.py:193 - writing: /var/monai/operators/0b577306-331d-48f7-a974-9b462b54a155/0/output/saved_images_folder/1.2.826.0.1.3680043.2.1125.1/1.2.826.0.1.3680043.2.1125.1_seg.nii.gz
Output Seg image numpy array shaped: (1, 512, 512)
Output Seg image pixel max value: 0
Done performing execution of operator LiverTumorSegOperator

Going to initiate execution of operator DICOMSegmentationWriterOperator
Executing operator DICOMSegmentationWriterOperator (Process ID: 25, Operator ID: e616ac55-6c84-4e00-a535-f26243e0b670)
[2023-02-08 13:36:58,481] [WARNING] (highdicom.seg.sop) - Encoding an empty segmentation with "omit_empty_frames" set to True. Reverting to encoding all frames since omitting all frames is not possible.
[2023-02-08 13:36:58,482] [INFO] (highdicom.seg.sop) - add plane #0 for segment #1
[2023-02-08 13:36:58,483] [INFO] (highdicom.seg.sop) - add plane #0 for segment #2
[2023-02-08 13:36:58,484] [INFO] (highdicom.base) - copy Image-related attributes from dataset "1.2.826.0.1.3680043.2.1125.1.6173393152365463986791105716239686"
[2023-02-08 13:36:58,484] [INFO] (highdicom.base) - copy attributes of module "Specimen"
[2023-02-08 13:36:58,484] [INFO] (highdicom.base) - copy Patient-related attributes from dataset "1.2.826.0.1.3680043.2.1125.1.6173393152365463986791105716239686"
[2023-02-08 13:36:58,484] [INFO] (highdicom.base) - copy attributes of module "Patient"
[2023-02-08 13:36:58,485] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Subject"
[2023-02-08 13:36:58,485] [INFO] (highdicom.base) - copy Study-related attributes from dataset "1.2.826.0.1.3680043.2.1125.1.6173393152365463986791105716239686"
[2023-02-08 13:36:58,485] [INFO] (highdicom.base) - copy attributes of module "General Study"
[2023-02-08 13:36:58,485] [INFO] (highdicom.base) - copy attributes of module "Patient Study"
[2023-02-08 13:36:58,485] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Study"
Done performing execution of operator DICOMSegmentationWriterOperator

Going to initiate execution of operator STLConversionOperator
Executing operator STLConversionOperator (Process ID: 25, Operator ID: 42c57850-90f7-4eef-8b75-5a00f43a97b9)
[2023-02-08 13:36:58,495] [INFO] (monai.deploy.operators.stl_conversion_operator.STLConversionOperator) - Output will be saved in file /var/monai/output/stl/liver_seg.stl.
Traceback (most recent call last):
  File "/opt/monai/app/app.py", line 142, in <module>
    app_instance.run()
  File "/opt/monai/app/app.py", line 62, in run
    super().run(*args, **kwargs)
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/core/application.py", line 429, in run
    executor_obj.run()
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/core/executors/single_process_executor.py", line 125, in run
    op.compute(op_exec_context.input_context, op_exec_context.output_context, op_exec_context)
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 95, in compute
    stl_bytes = self._convert(input_image, _output_file)
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 121, in _convert
    return self._converter.convert(
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 165, in convert
    s_image = self.SpatialImage(image)
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 297, in __init__
    self._read_from_in_mem_image(self._image)
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 394, in _read_from_in_mem_image
    img_array, affine, original_affine, shape, spacing, itk_image = self._load_data(image)
  File "/root/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 366, in _load_data
    original_affine = img_meta_dict["nifti_affine_transform"]
KeyError: 'nifti_affine_transform'

What am I doing wrong?

Thanks

MD Express dicom seg is flipped when displayed on OHIF viewer

I have added the OHIF viewer 2.x used by MONAI Label and pointed it at the Orthanc instance used by MD Express.
Things worked great, but unfortunately the returned DICOM SEG appears to be flipped, as shown below:
[screenshot: flipped DICOM SEG overlay in OHIF]

I tried some other TCIA data, but the SEG is always flipped, or sometimes OHIF can't even view the DICOM SEG, giving the error:

DICOM Segmentation Loader
Source Image Sequence information missing: individual SEG frames are out of plane. This is not yet supported. Aborting segmentation loading.

It seems like the DICOM SEG is somehow out of the ROI or plane.
@MMelQin pointed out it might be related to this issue.
Tagging @dbericat

Move to Public Repository

We need to move to a public repo, which requires:

  • renaming repo
  • complete #5
  • complete #8
  • agree how to use Wiki
  • publish roadmap/milestones

Considerations For Using MAPs In Research

As discussed recently, there are a few specific details and points of discussion relating to how MAPs are used in research, specifically as a means of distributing research networks not meant for front-line use to users with a range of environments and skill levels. I would suggest that a few of the points here should be integrated into our guideline documents in some way, but they also stand as a set of use cases that I would encounter as a researcher; I imagine much of this is already present in our current design and documentation.

Considering a researcher who has developed a MAP, there are different audience categories for that software:

  • Students studying a biomedical degree which may or may not include research activities
  • Fellow collaborator researcher doing similar activities
  • Other researcher doing similar activities
  • Competition/challenge/workshop manager receiving software submissions from participants
  • Reviewer participating in a peer review process

What's important to consider with these audiences is what they would want to get from the MAP, their technical capability, how they would host the MAP, and what requirements they might have that the MAP must satisfy. Students would probably use MAPs just to get results using a dataset they've been given as an exercise, or as part of research where the network is not the subject of study directly but just a tool. Other collaborators would have a deeper understanding of the network and would likely have contributed to its development. Other researchers not involved in developing the MAP would want to use it for testing, for comparison against their own software, or as part of their own research. Competition managers would want to run the MAP as part of a testing phase using hidden data that participants can't access. Reviewers would want to run the MAP to verify that the results claimed in submissions are true.

For technical capability, knowledge of Docker is essential, but otherwise no knowledge of deep learning or other expertise should be required; a student, for example, may need to run a MAP to get results for data they have been given, so they need only know how to run it. Other collaborators or external researchers would use the MAP for professional work rather than educational exercises; their expertise would vary, but they could reasonably be expected to understand what the packaged network is and how it functions in general terms. Reviewers similarly may not be familiar with MONAI Deploy but would be gathering a deep understanding of the network itself in the process of reviewing submitted work. Many challenges and competitions require participants to submit their software as Docker or Singularity images because inference must be run on hidden test data; these will specify how the image is meant to behave so that testing can be automated. How MAPs actually function would most likely require adjustment to meet such requirements, but the expectation is that the managers themselves would have a significant understanding of Docker and the networks the images host.

How a MAP is hosted or run would for the most part involve running the image on the command line. Larger infrastructure like Kubernetes likely wouldn't be used and wouldn't be needed, so consideration must be given to how a user would invoke the MAP directly, with or without the Deploy App SDK. Users would expect to provide input files on the command line and get results in an output folder, all of which requires mount points through Docker; a sketch of such an invocation follows. For students especially, this interface has to be quite straightforward, with input and output done through common file types (e.g. DICOM or NIfTI for image data). Extending MAPs to include built-in modes for applying inference to batches of files, or for running as a web service, would provide very useful options as well.
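
For illustration, a direct command-line invocation might look like this (the image name is hypothetical; the container paths follow the App SDK v0.5-era convention seen elsewhere in these issues and vary by packaging):

docker run --rm \
  -v $PWD/input:/var/monai/input \
  -v $PWD/output:/var/monai/output \
  my-map-image:latest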

Actually distributing a MAP would be through a Docker registry, but it could be easiest as a saved export file despite its size. Within an institutional network this wouldn't be a problem, but a compelling alternative is running the MAP as a web service on one's own computer, made accessible across the network. This is much more lightweight than setting up a Kubernetes cluster or other infrastructure, since it can be done using workstation-level computers as described in the guidelines. It's important for scientific reproducibility to make one's code and data available whenever possible; MAPs can help with this task most of all by being straightforward to use with a minimal understanding of MONAI Deploy itself.

Automated functional e2e test

Create an automated functional end-to-end test which will verify that MONAI Deploy has been deployed successfully; a sketch of the health-check stage follows the lists below.

OneTimeSetup

  • Hit healthcheck endpoints (MIG, MWM, MTM, Argo, Mongo, Minio, Rabbit)
  • Delete any workflows by AET

Before Test

  • Creation of a bucket in Minio (monaideploy)
  • Creation of a Workflow in Mongo
  • Creation of an Argo workflow
  • Creation of AET in MIG

Act

  • Send a DICOM slice to MIG

Assert

  • WorkflowInstance created
  • Argo Task ran successfully
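
A minimal sketch of the OneTimeSetup health checks (endpoint URLs and ports are illustrative, not the actual deployment's):

import requests

HEALTH_ENDPOINTS = {
    "MIG": "http://localhost:5000/health",
    "MWM": "http://localhost:5001/health",
    "Minio": "http://localhost:9000/minio/health/live",
    "Argo": "http://localhost:2746/api/v1/version",
}

def test_services_are_healthy():
    """OneTimeSetup check: every service must answer its health endpoint."""
    for name, url in HEALTH_ENDPOINTS.items():
        response = requests.get(url, timeout=10)
        assert response.ok, f"{name} health check failed: HTTP {response.status_code}"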

Test applications for validating systems that use MONAI Deploy components

Hi all, as discussed in the MONAI deploy WG meeting 7-Apr-22 I think there would be value in creating a suite of MONAI application packages for testing the capabilities of a system that uses MONAI deploy components. These applications would serve the same purpose for MONAI deploy platforms as phantoms do for QA/acceptance testing in imaging.

To give a bit of background: at GSTT we will be deploying the AIDE platform, which is built using the MONAI Informatics Gateway and MONAI Workflow Manager components; we will also use the MONAI application runner plugin for the WFM. When AIDE is deployed at GSTT, we will need to perform some kind of acceptance testing to ensure that the entire system is fit and safe to use in a clinical setting. When I refer to the system here, I mean the specific combination of software, hardware, network and other local factors that make up the local environment. The testing would aim to verify that this system as a whole can operate reliably under pressure. The testing we perform will aim to explore (non-exhaustive):

  • System loading/stress tests: max number of applications that the system (local software + hardware combination) can support. This would probably use many instances of several different test MAPs (small, medium, large resource use)
  • Performance metrics of applications when the system is running at low to high loads. Metrics include things like startup time, processing time delta, etc.
  • understand modes of failure

I'm not sure exactly what operations the MAPs themselves would perform, but we would need to test the CPU and GPU resources in isolation and in combination; a trivial CPU-load sketch follows.
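
Purely as a sketch, a synthetic CPU-load test MAP could be as simple as the following (a GPU variant could do the same with a framework such as PyTorch; all parameter names are illustrative):

import argparse
import multiprocessing
import time

def burn_cpu(seconds: float) -> None:
    """Busy-loop on one core for the requested duration."""
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Synthetic CPU-load test MAP")
    parser.add_argument("--workers", type=int, default=2)
    parser.add_argument("--seconds", type=float, default=60.0)
    args = parser.parse_args()
    workers = [multiprocessing.Process(target=burn_cpu, args=(args.seconds,))
               for _ in range(args.workers)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()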

I'd like to start a discussion here to gauge whether the community feels these applications would be valuable as part of MONAI and whether there is anything missing from the concept. Please leave your thoughts below!

MD Express: Issues with .env and docker-compose file

There are two minor issues with the MD Express docker-compose.yml and .env files that prevent launching the services without modifications:

  1. In the .env file, TASK_MANAGER_DATA has an additional $PWD that is not present in the other DATA variables and throws an error on launch. The error arises because docker-compose treats the value as a named volume.

Named volume "$PWD/.md/mdtm:/var/lib/monai:rw" is used in service "task-manager" but no declaration was found in the volumes section.

Edit: I saw this issue in the FAQ section. Why does only TASK_MANAGER_DATA need $PWD, though? Isn't .md/ already assuming that docker-compose up is run from the MDE working directory?

  2. If the idea is to use the .env file to set the desired data directories, the data volume for the elasticsearch service should also be defined there, instead of being assigned directly to .md/esdata in the docker-compose.yml file; see the sketch below.
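
For illustration, a consistent .env could define every data directory relative to the working directory (TASK_MANAGER_DATA is the variable from the issue; the other names are assumptions about the MDE .env layout):

TASK_MANAGER_DATA=.md/mdtm
INFORMATICS_GATEWAY_DATA=.md/mdig
ELASTICSEARCH_DATA=.md/esdata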

MAP built with App SDK v0.6 fails when running on MONAI Deploy Express

Observed:
Configured a workflow to run an example MAP built with MONAI Deploy App SDK v0.6, e.g. ghcr.io/mmelqin/monai_ai_livertumor_seg_app_no_stl-x64-workstation-dgpu-linux-amd64:2.0, but the workflow instance failed with "no output".

Expected:
MD Express should support MAPs built with App SDK v0.6 and later. These MAPs are built differently from those built with App SDK v0.5.x and have additional requirements on the permissions of the host folders (which are to be mapped as MAP output).

Preliminary Analysis:
The root cause is the write permission, or the lack thereof, on the output folder in the host file system.

Workaround attempted, which did not work due to TM/Docker behavior:

The MD Express TaskManager container itself maps host volumes to its internal folders and also uses them for MAP input and output, so I tried first creating the host folder with the logged-on user's security context, e.g. mkdir -p .md/mdtm. However, when the TaskManager creates further sub-folders for a triggered task, they are created as owned by root, without sufficient permission for the MAP to write its output (see the directory listings below, and a possible mitigation sketched after them).

(.test) mqin@mingq-dt:~/md-express/deploy/monai-deploy-express$ ll .md
total 32
drwxrwxr-x   8 mqin             mqin 4096 Oct 24 19:17 ./
drwxr-xr-x   8 mqin             mqin 4096 Oct 24 19:16 ../
drwxr-xr-x   4 root             root 4096 Oct 24 19:17 mdig/
drwxrwxr-x   2 mqin             mqin 4096 Oct 24 19:16 mdtm/
drwxr-xr-x   4 root             root 4096 Oct 24 19:16 minio/
drwxr-xr-x   4 systemd-coredump root 4096 Oct 24 19:17 mongodb/
drwxr-xr-x 231 root             root 4096 Oct 24 19:18 orthanc/
drwxr-xr-x   4 systemd-coredump root 4096 Oct 24 19:17 rabbitmq/
(.test) mqin@mingq-dt:~/md-express/deploy/monai-deploy-express$ ll .md/mdtm
total 12
drwxrwxr-x 3 mqin mqin 4096 Oct 24 19:19 ./
drwxrwxr-x 8 mqin mqin 4096 Oct 24 19:17 ../
drwxr-xr-x 4 root root 4096 Oct 24 19:19 960636a5-d7be-415c-9fe3-1d93b0e783c6/
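
One possible mitigation, offered only as a sketch and untested here: if the TaskManager image can run as a non-root user, running the service with the host user's UID/GID should make the sub-folders it creates writable by the MAP:

services:
  task-manager:
    user: "${UID}:${GID}"   # export UID and GID in the shell before docker compose up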

Steps to reproduce:

  1. Create a workflow definition as below (note: the attributes starting with env_ do not matter, as the actual values are all correct and the MAP has default env vars; however, a different error happened, see Additional Info).
  2. Download MD Express and run docker compose up.
  3. Register the workflow, per the instructions in the MD Express Readme.
  4. Upload a CT abdomen series, per the MD Express Readme.
  5. Send the series to the DICOM device destination, monai-deploy.
  6. Examine the console output of docker compose.

{
        "name": "ai-liver-seg-2",
        "version": "1.0.0",
        "description": "AI Liver Segmentation - 2",
        "informatics_gateway": {
                "ae_title": "MONAI-DEPLOY",
                "data_origins": [
                        "ORTHANC"
                ],
                "export_destinations": [
                        "ORTHANC"
                ]
        },
        "tasks": [
                {
                        "id": "router",
                        "description": "Ensure series description contains liver",
                        "type": "router",
                        "task_destinations": [
                                {
                                        "name": "liver",
                                        "conditions": ["{{ context.dicom.series.any('0008','103E')}} == 'CT series for liver tumor from nii 014'"]
                                }
                        ]
                },
                {
                        "id": "liver",
                        "description": "Execute Liver Segmentation MAP",
                        "type": "docker",
                        "args": {
                                "container_image": "ghcr.io/mmelqin/monai_ai_livertumor_seg_app_no_stl-x64-workstation-dgpu-linux-amd64:2.0",
                                "server_url": "unix:///var/run/docker.sock",
                                "entrypoint": "/bin/bash,-c",
                                "command": "python3 -u /opt/holoscan/app/app.py",
                                "task_timeout_minutes": "5",
                                "temp_storage_container_path": "/var/lib/monai/",
                                "env_MONAI_INPUTPATH": "/var/holoscan/input/",
                                "env_MONAI_OUTPUTPATH": "/var/holoscan/output/",
                                "env_MONAI_MODELPATH": "/opt/holoscan/models/",
                                "env_MONAI_WORKDIR": "/var/holoscan/"
                        },
                        "artifacts": {
                                "input": [
                                        {
                                                "name": "env_MONAI_INPUTPATH",
                                                "value": "{{ context.input.dicom }}"
                                        }
                                ],
                                "output": [
                                        {
                                                "name": "env_MONAI_OUTPUTPATH",
                                                "mandatory": true
                                        }
                                ]
                        },
                        "task_destinations": [
                                {
                                        "name": "export-liver-seg"
                                }
                        ]
                },
                {
                        "id": "export-liver-seg",
                        "description": "Export Segmentation Storage Object",
                        "type": "export",
                        "export_destinations": [
                                {
                                        "Name": "ORTHANC"
                                }
                        ],
                        "artifacts": {
                                "input": [
                                        {
                                                "name": "export-dicom",
                                                "value": "{{ context.executions.liver.artifacts.env_MONAI_OUTPUTPATH }}",
                                                "mandatory": true
                                        }
                                ],
                                "output": []
                        }
                }
        ]
}

Additional Info:
I also used a config that replaces all monai paths with holoscan ones, but encountered a different error. This is because, in the docker-compose.yml file, the TaskManager container itself has its volume mapping set to its own /var/lib/monai/, so the workflow definition cannot use "temp_storage_container_path": "/var/lib/holoscan/".

So, stick with "temp_storage_container_path": "/var/lib/monai/" (i.e., change it back in the following config) to avoid this other binding error.

{
        "name": "ai-liver-seg-HS",
        "version": "1.0.0",
        "description": "AI Liver Segmentation - HS",
        "informatics_gateway": {
                "ae_title": "MONAI-DEPLOY",
                "data_origins": [
                        "ORTHANC"
                ],
                "export_destinations": [
                        "ORTHANC"
                ]
        },
        "tasks": [
                {
                        "id": "router",
                        "description": "Ensure series description contains liver",
                        "type": "router",
                        "task_destinations": [
                                {
                                        "name": "liver",
                                        "conditions": ["{{ context.dicom.series.any('0008','103E')}} == 'CT series for liver tumor from nii 014'"]
                                }
                        ]
                },
                {
                        "id": "liver",
                        "description": "Execute Liver Segmentation MAP",
                        "type": "docker",
                        "args": {
                                "container_image": "ghcr.io/mmelqin/monai_ai_livertumor_seg_app_no_stl-x64-workstation-dgpu-linux-amd64:2.0",
                                "server_url": "unix:///var/run/docker.sock",
                                "entrypoint": "/bin/bash,-c",
                                "command": "python3 -u /opt/holoscan/app/app.py",
                                "task_timeout_minutes": "5",
                                "temp_storage_container_path": "/var/lib/holoscan/",
                                "env_HOLOSCAN_INPUT_PATH": "/var/holoscan/input/",
                                "env_HOLOSCAN_OUTPUT_PATH": "/var/holoscan/output/",
                                "env_HOLOSCAN_MODEL_PATH": "/opt/holoscan/models/",
                                "env_HOLOSCAN_WORKDIR": "/var/holoscan/"
                        },
                        "artifacts": {
                                "input": [
                                        {
                                                "name": "env_HOLOSCAN_INPUT_PATH",
                                                "value": "{{ context.input.dicom }}"
                                        }
                                ],
                                "output": [
                                        {
                                                "name": "env_HOLOSCAN_OUTPUT_PATH",
                                                "mandatory": true
                                        }
                                ]
                        },
                        "task_destinations": [
                                {
                                        "name": "export-liver-seg"
                                }
                        ]
                },
                {
                        "id": "export-liver-seg",
                        "description": "Export Segmentation Storage Object",
                        "type": "export",
                        "export_destinations": [
                                {
                                        "Name": "ORTHANC"
                                }
                        ],
                        "artifacts": {
                                "input": [
                                        {
                                                "name": "export-dicom",
                                                "value": "{{ context.executions.liver.artifacts.env_HOLOSCAN_OUTPUT_PATH }}",
                                                "mandatory": true
                                        }
                                ],
                                "output": []
                        }
                }
        ]
}

But I got a different error from the Task Manager on binding the volumes:

mdl-tm        | 2023-10-25 00: 52: 09.7745|1010|DEBUG|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|source=1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm, target=/var/lib/holoscan/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/inputs/env_HOLOSCAN_INPUT_PATH/1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020/1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567/1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm, EventId=1010, EventName=DownloadingArtifactFromStorageService, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Downloading artifact 1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm to /var/lib/holoscan/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/inputs/env_HOLOSCAN_INPUT_PATH/1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020/1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567/1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm. 
mdl-tm        | 2023-10-25 00: 52: 09.7890|1010|DEBUG|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|source=1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm.json, target=/var/lib/holoscan/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/inputs/env_HOLOSCAN_INPUT_PATH/1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020/1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567/1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm.json, EventId=1010, EventName=DownloadingArtifactFromStorageService, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Downloading artifact 1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm.json to /var/lib/holoscan/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/inputs/env_HOLOSCAN_INPUT_PATH/1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020/1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567/1.2.826.0.1.3680043.2.1125.1.99874075217122757899127966289752952.dcm.json. 
mdl-tm        | 2023-10-25 00: 52: 10.0870|1006|INFO|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|hostPath=/home/mqin/md-express/deploy/monai-deploy-express/sample-workflows/.md/mdtm/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/inputs/env_HOLOSCAN_INPUT_PATH, containerPath=/var/holoscan/input/, EventId=1006, EventName=DockerInputMapped, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Input volume mapping host==/home/mqin/md-express/deploy/monai-deploy-express/sample-workflows/.md/mdtm/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/inputs/env_HOLOSCAN_INPUT_PATH, container=/var/holoscan/input/. 
mdl-tm        | 2023-10-25 00: 52: 10.0872|1007|INFO|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|hostPath=/home/mqin/md-express/deploy/monai-deploy-express/sample-workflows/.md/mdtm/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/outputs/env_HOLOSCAN_OUTPUT_PATH, containerPath=/var/holoscan/output/, EventId=1007, EventName=DockerOutputMapped, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Output volume mapping host==/home/mqin/md-express/deploy/monai-deploy-express/sample-workflows/.md/mdtm/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/outputs/env_HOLOSCAN_OUTPUT_PATH, container=/var/holoscan/output/. 
mdl-tm        | 2023-10-25 00: 52: 10.0873|1008|INFO|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|key=HOLOSCAN_INPUT_PATH, value=/var/holoscan/input/, EventId=1008, EventName=DockerEnvironmentVariableAdded, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Environment variabled added HOLOSCAN_INPUT_PATH=/var/holoscan/input/. 
mdl-tm        | 2023-10-25 00: 52: 10.0874|1008|INFO|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|key=HOLOSCAN_OUTPUT_PATH, value=/var/holoscan/output/, EventId=1008, EventName=DockerEnvironmentVariableAdded, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Environment variabled added HOLOSCAN_OUTPUT_PATH=/var/holoscan/output/. 
mdl-tm        | 2023-10-25 00: 52: 10.0874|1008|INFO|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|key=HOLOSCAN_MODEL_PATH, value=/opt/holoscan/models/, EventId=1008, EventName=DockerEnvironmentVariableAdded, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Environment variabled added HOLOSCAN_MODEL_PATH=/opt/holoscan/models/. 
mdl-tm        | 2023-10-25 00: 52: 10.0875|1008|INFO|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|key=HOLOSCAN_WORKDIR, value=/var/holoscan/, EventId=1008, EventName=DockerEnvironmentVariableAdded, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Environment variabled added HOLOSCAN_WORKDIR=/var/holoscan/. 
mdl-tm        | 2023-10-25 00: 52: 10.0891|1002|ERROR|Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin|EventId=1002, EventName=ErrorDeployingContainer, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent, workflowInstanceId=12cb526e-6fe3-4518-a941-ca558be54d4f, taskId=liver, executionId=7e8816df-3fd4-4cae-b083-2fb2a074b8a5|Error deploying Container. Docker.DotNet.DockerApiException: Docker API responded with status code=BadRequest, response={
    "message": "invalid mount config for type \"bind\": bind source path does not exist: /home/mqin/md-express/deploy/monai-deploy-express/sample-workflows/.md/mdtm/7e8816df-3fd4-4cae-b083-2fb2a074b8a5/inputs/env_HOLOSCAN_INPUT_PATH"
}
mdl-tm        | 
mdl-tm        |    at Docker.DotNet.DockerClient.HandleIfErrorResponseAsync(HttpStatusCode statusCode, HttpResponseMessage response, IEnumerable`1 handlers)
mdl-tm        |    at Docker.DotNet.DockerClient.MakeRequestAsync(IEnumerable`1 errorHandlers, HttpMethod method, String path, IQueryString queryString, IRequestContent body, IDictionary`2 headers, TimeSpan timeout, CancellationToken token)
mdl-tm        |    at Docker.DotNet.ContainerOperations.CreateContainerAsync(CreateContainerParameters parameters, CancellationToken cancellationToken)
mdl-tm        |    at Monai.Deploy.WorkflowManager.TaskManager.Docker.DockerPlugin.ExecuteTask(CancellationToken cancellationToken) in /app/src/TaskManager/Plug-ins/Docker/DockerPlugin.cs:line 205
mdl-tm        | 2023-10-25 00: 52: 10.0906|106|DEBUG|Monai.Deploy.WorkflowManager.TaskManager.TaskManager|eventType=md.tasks.update, reason=PluginError, EventId=106, EventName=SendingTaskUpdateMessage, @messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, @applicationId=16988a78-87b5-4168-a5c3-2cfc2bab8e54, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent|Sending md.tasks.update, Status=PluginError . 
mdl-tm        | 2023-10-25 00: 52: 10.0911|10000|INFO|Monai.Deploy.Messaging.RabbitMQ.RabbitMQMessagePublisherService|endpoint=rabbitmq, virtualHost=monaideploy, exchange=monaideploy, topic=md.tasks.update, EventId=10000, EventName=PublshingRabbitMQ, @messageId=6da618d6-ada7-41b5-9427-46a775c7f8ce, @applicationId=4c9072a1-35f5-4d85-847d-dafca22244a8, @correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, @recievedTime=10/25/2023 00: 52: 00, correlationId=edcb1deb-c6e2-4280-a096-2e8b29701f80, messageId=7c54a688-569e-47a5-a362-f2cdc4a5c2d8, messageType=TaskDispatchEvent|Publishing message to rabbitmq/monaideploy. Exchange: monaideploy, Topic: md.tasks.update. 

Rebuild of Monai Deploy Express

The working group has decided that a new build of MD Express is needed, which will include builds 0.1.12 of the Workflow Manager, 0.1.10 of the Task Manager and 0.3.17 of the MONAI Informatics Gateway.

Workflow won't trigger/complete on Linux server(with no GPU)

I'm trying to remotely deploy monai-deploy-express on a Linux development server that doesn't currently have an NVIDIA GPU. Since the sample workflows (Lung Seg) provided in the Deploy Express documentation seemingly completed on my own machine on CPU power alone (no GPU usage spike), I thought deploying to a Linux server without a dedicated GPU would be fine.

Creating and running the containers on the remote server went fine; I defined the workflows, got back Workflow IDs, and uploaded the CT data to Orthanc. But when I sent to the DICOM modality, the segmentation task would never finish and return the segmented images. I checked docker container list -a but couldn't find the MONAI Lung Seg container which would be responsible for the task.

I tried adding the NVIDIA runtime to the Docker daemon and installing CUDA and the NVIDIA toolkit, to see if the workflow would at least trigger then (even though there is no NVIDIA GPU). It did trigger and I could see the MONAI Lung Seg container running, but nothing would ever complete.

How can I make MONAI Deploy Express execute and complete at least the sample workflows on my GPU-less remote Linux server? Also, when I run docker logs with the container ID of the Lung Seg container, it doesn't display anything (maybe @mocsharp can swoop in and save the day again). Cheers

Create a Methods Paper based on the MONAI Workloads Document

The MONAI Deploy Working Group believes there is benefit in using the information collected in the MONAI Workloads document to help evangelize the working group's motivating factors around clinical workflows and how they map to AI deployment workloads.

Workflow Request Message change request

This issue proposes a change to the Workflow Request message to support multiple data sources/modalities: DIMSE, STOW, FHIR, HL7, and more. This change provides additional information for the Workflow Manager to handle, filter and process incoming requests.

The change removes the existing calling_aetitle and called_aetitle properties and replaces them with trigger and data_origins:

Example: an incoming FHIR message that fetches an additional DICOM study:

{
  "trigger": {
    "type": "fhir",
    "source": "server-xyz",
    "destination": "1.2.3.4"
  },
  "data_origins": [
    {
      "type": "dimse",
      "source": "calling-aet",
      "destination": "called-aet"
    }
  ]
}

Example: an incoming DICOM study via DIMSE:

{
  "trigger": {
    "type": "dimse",
    "source": "calling-aet",
    "destination": "called-aet"
  }
}

Example: an incoming DICOM study via DICOMWeb STOW:

{
  "trigger": {
    "type": "dimse",
    "source": "source ip",
    "destination": "named-endpoint"
  }
}

Example: an incoming HL7 message that triggers retrieval of data from a FHIR server and a PACS:

{
  "trigger": {
    "type": "hl7",
    "source": "server-xyz",
    "destination": "1.2.3.4"
  },
  "data_origins": [
    {
      "type": "fhir",
      "source": "server-xyz",
      "destination": "1.2.3.4"
    },
    {
      "type": "dicom",
      "source": "calling-aet",
      "destination": "called-aet"
    }
  ]
}
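
For clarity, the proposed message shape can be captured with simple types (a sketch; field names follow the examples above):

from dataclasses import dataclass, field
from typing import List

@dataclass
class DataSource:
    type: str         # "dimse", "stow", "fhir", "hl7", ...
    source: str       # e.g. calling AE title, server name, or source IP
    destination: str  # e.g. called AE title or named endpoint

@dataclass
class WorkflowRequest:
    trigger: DataSource
    data_origins: List[DataSource] = field(default_factory=list)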

MDExpress: Orthanc Configurations

I am trying one of the demos. When I send the images from the terminal using storescu, I can upload the images to Orthanc. However, when I try to send the images from Orthanc to MONAI-DEPLOY, I get the following error:
[screenshot: Orthanc error when sending to the MONAI-DEPLOY modality]

I checked the Orthanc configuration files, and RemoteAccessAllowed is set to true and AuthenticationEnabled is set to false.
Any suggestions?

Minio Storage Service error

MONAI Deploy Express works successfully if only one DCM file is sent from ORTHANC, and a segmentation is generated. But if a complete series is sent, it gives the following error and no final segmentation is generated. I tried both the liver and lung apps.

2023-08-01 11:54:31.7721|20009|ERROR|Monai.Deploy.Storage.MinIO.MinIoStorageService||Storage service error. System.Threading.Tasks.TaskCanceledException: The operation was canceled.
mdl-ig |    at System.Net.Sockets.Socket.AwaitableSocketAsyncEventArgs.ThrowException(SocketError error, CancellationToken cancellationToken)
mdl-ig |    at Minio.MinioClient.ExecuteTaskCoreAsync(IEnumerable`1 errorHandlers, HttpRequestMessageBuilder requestMessageBuilder, CancellationToken cancellationToken, Boolean isSts) in /root/.q/sources/minio-dotnet/Minio/MinioClient.cs:line 602

Write Container Submission Guidance

Writing OSS is fun, but when you publish and distribute it, there are often guidelines imposed on you by the OSS projects that you use and depend upon.

Write a document that provides the guidelines we need to follow when publishing a container.

HL7 support

Summary of problem

AI applications want the ability to send HL7 messages back to clinical systems (e.g. RIS, CRIS) to support worklist prioritisation.

Summary of proposed change

To facilitate the integration and transmission of HL7 messages back to MONAI for onward delivery to clinical systems, modifications are needed in both MIG and MWM, covering:

  • The creation of a HL7 server which receives and transmits HL7 messages
  • The re-identification of PII data in the HL7 message
  • The introduction of a new event to notify Workflow Manager when results are returned
  • Logic which allows Workflow Manager to dispatch tasks even when artifacts may not have been returned
  • Management of RIS connections

Addition of a modality type to artifacts object

In order to manage multiple returned results of different (or the same) modality, the MWM and MIG must be able to support a new modality type field. This will allow the platform to support multiple DICOM results of different types (SEG, RT, SC, etc.) and HL7 (and can be extended further to support FHIR).

            "artifacts": {
                "input": [
                    {
                        "name": "input-dicom",
                        "value": "{{ context.input.dicom }}",
                        "mandatory": true,
                    }
                ],
                "output": [
                    {
                        "name": "encapsulated-pdf",
		        "type": "DOC" 
                    },
                    {
                        "name": "segmentation",
		        "type": "SEG" 
                    },
                    {
                        "name": "structured-report",
		        "type": "SR" 
                    },
                    {
                        "name": "hl7-message",
		        "type": "HL7" 
                    }
                ]
            },

By knowing the types of files that we are expecting back, the Workflow Manager can decide whether to continue the task or wait for further results before dispatching the next task. Currently, if any DICOM results are returned, the Workflow Manager will infer that the task has passed and carry on, when in fact the AI application may be sending more results.

Artifacts Returned Event

A new event will be created to replace the WorkflowRequestEvent logic that was implemented for remote app execution. The WorkflowRequestEvent was updated to support workflow continuation by the addition of a WorkflowId and a TaskId; however, this will not work when multiple results are returned. The proposal is to replace it with a new event named ArtifactsReturnedEvent.

{
    "correlationId": "",
    "WorkflowInstanceId": "",
    "TaskId": "",
    "Artifacts": [
        {
            "Name": "",
            "Type": "",
            "Location": ""
        }
    ]
}

Logic to manage artifacts and task dispatches independently

Upon receiving an ArtifactsReturnedEvent, the Workflow Manager must add the artifacts to the WorkflowInstance.Task.ArtefactDictionary and dispatch the next tasks accordingly, by checking the input artifacts of the tasks in the TaskDestinations array; a sketch of this handler follows below.

The reason for managing these artifacts separately is that results are returned asynchronously: there may be a case where meaningful DICOM results are returned but the HL7 message never arrives. In that case you would still want to dispatch the task that exports the DICOMs back to PACS, rather than wait for an HL7 message that may never arrive (or vice versa).
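
A minimal sketch of that dispatch decision (the function shape is illustrative, not the actual MWM implementation):

from typing import Dict, List

def handle_artifacts_returned(
    artifact_dict: Dict[str, str],   # the task's accumulated name -> location map
    returned: Dict[str, str],        # artifacts carried by this ArtifactsReturnedEvent
    destinations: List[dict],        # task_destinations with their input artifact specs
) -> List[dict]:
    """Record the returned artifacts, then report every destination task whose
    mandatory inputs are now all available and can be dispatched."""
    artifact_dict.update(returned)
    ready = []
    for task in destinations:
        inputs = task.get("artifacts", {}).get("input", [])
        if all(spec["name"] in artifact_dict
               for spec in inputs if spec.get("mandatory")):
            ready.append(task)
    return ready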

Workflow Example

{
    "tasks": [
        {
            "id": "remote",
            "task_destinations": [
                {
                    "name": "export-dicoms"
                },
                {
                    "name": "export-hl7"
                }
            ],
            "artifacts": {
                "input": [
                    {
                        "name": "input-dicom",
                    }
                ],
                "output": [
                    {
                        "name": "segmentation",
                        "type": "SEG"
                    },
                    {
                        "name": "encapsulated-pdf",
                        "type": "DOC"
                    },
                    {
                        "name": "hl7",
                        "type": "HL7"
                    }
                ]
            },
            "input_parameters": null
        },
        {
            "id": "export-dicoms",
            "description": "export dicoms",
            "type": "export-dicom",
            "export_destinations": [
                {
                    "name": "PACS"
                }
            ],
            "artifacts": {
                "input": [
                    {
                        "name": "segmentation",
                        "value": "{{ context.executions.remote.artifacts.segmentation }}",
                        "mandatory": true
                    },
                    {
                        "name": "encapsulated-pdf",
                        "value": "{{ context.executions.remote.artifacts.encapsulated-pdf }}",
                        "mandatory": true
                    }
                ]
            }
        },
        {
            "id": "export-hl7",
            "description": "export hl7 message",
            "type": "export-hl7",
            "export_destinations": [
                {
                    "name": "RIS"
                }
            ],
            "artifacts": {
                "input": [
                    {
                        "name": "hl7",
                        "value": "{{ context.executions.remote.artifacts.hl7 }}",
                        "mandatory": true
                    }
                ]
            }
        }
    ]
}

segmentation result returned

  • Workflow Manager will add the artifact to the task dict in the Workflow Instance
  • Workflow Manager will update the Task to “Partial Pass”
  • Workflow Manager will not dispatch any tasks, as not all input artifacts have been returned

encapsulated pdf returned

  • Workflow Manager will add the artifact to the task dict in the Workflow Instance
  • Workflow Manager will check if all outputs have been returned, if not then status will remain as “Partial Pass”
  • Workflow Manager to dispatch the export-dicoms task, as all of its input artifacts have been returned

HL7 message returned

  • Workflow Manager will add the artifact to the task dict in the Workflow Instance
  • Workflow Manager will check whether all outputs have been returned; all artifacts are now returned, so the task is marked as “Passed”
  • Workflow Manager to dispatch the export-hl7 task, as all of its input artifacts have been returned

If an artifact is not returned within the configured timeout period but the task has already been marked as “Partial Pass”, the Workflow Manager will mark the task as “Partial Fail” and the issue will be shown in the issues table.
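
A minimal sketch of that timeout sweep, again with illustrative types; the status strings follow the behaviour described above, and the timeout source is an assumption:

using System;
using System.Collections.Generic;

public class TaskRecord
{
    public string Status { get; set; } = "Partial Pass";
    public DateTime LastArtifactReceivedUtc { get; set; }
    public List<string> Issues { get; } = new();
}

public static class ArtifactTimeoutSweeper
{
    // Any task still "Partial Pass" once its artifact timeout has elapsed is
    // downgraded to "Partial Fail", with an entry recorded for the issues table.
    public static void Sweep(IEnumerable<TaskRecord> tasks, TimeSpan timeout, DateTime utcNow)
    {
        foreach (var task in tasks)
        {
            if (task.Status == "Partial Pass" &&
                utcNow - task.LastArtifactReceivedUtc > timeout)
            {
                task.Status = "Partial Fail";
                task.Issues.Add($"Expected artifact(s) not returned within {timeout}.");
            }
        }
    }
}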

RIS Integration

Option 1

MIG changes

A new DB and endpoints are to be created in MIG to support the integration of RIS. The endpoints must support create, update, and delete (a copy of DestinationApplicationEntity).

{
    "Name": "",
    "HostIp": "",
    "CreatedBy": null,
    "UpdatedBy": null,
    "DateTimeUpdated": null,
    "Port": 80
}

MWM changes

The Workflow Manager must check that the name of the export destination exists in either DestinationApplicationEntity or DestinationInformationSystems. A sketch of this check is below.
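
A minimal sketch, assuming the destination names from each table have already been loaded (the repository wiring is illustrative):

using System.Collections.Generic;
using System.Linq;

public class ExportDestinationValidator
{
    private readonly IReadOnlyCollection<string> _dicomDestinations; // names from DestinationApplicationEntity
    private readonly IReadOnlyCollection<string> _hl7Destinations;   // names from DestinationInformationSystems

    public ExportDestinationValidator(
        IReadOnlyCollection<string> dicomDestinations,
        IReadOnlyCollection<string> hl7Destinations)
    {
        _dicomDestinations = dicomDestinations;
        _hl7Destinations = hl7Destinations;
    }

    // The export destination is valid if its name exists in either table.
    public bool IsValid(string exportDestinationName) =>
        _dicomDestinations.Contains(exportDestinationName) ||
        _hl7Destinations.Contains(exportDestinationName);
}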

Option 2

MIG changes

Add a new Type field to the DestinationApplicationEntity table and make the DB name more generic. The type will be an enum that differentiates DICOM and HL7 destinations; the default will be DICOM. A sketch of the enum and entity follows the JSON below.

{
    "Name": "",
    "AeTitle": "",
    "HostIp": "",
    "Type": 1,
    "CreatedBy": null,
    "UpdatedBy": null,
    "DateTimeUpdated": null,
    "Port": 80
}
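
A sketch of the corresponding enum and entity in C#; the class and enum names, and the numeric enum values, are illustrative (the JSON above suggests an integer-backed enum, but the actual mapping is TBC):

using System;

public enum DestinationType
{
    Dicom = 0, // default
    Hl7 = 1
}

public class DestinationEntity
{
    public string Name { get; set; } = "";
    public string AeTitle { get; set; } = ""; // only meaningful for DICOM destinations
    public string HostIp { get; set; } = "";
    public int Port { get; set; } = 80;
    public DestinationType Type { get; set; } = DestinationType.Dicom;
    public string? CreatedBy { get; set; }
    public string? UpdatedBy { get; set; }
    public DateTime? DateTimeUpdated { get; set; }
}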

MWM changes

No change

HL7 server to receive results

An HL7 server will be needed in MIG to take results from remote applications. This server will take HL7 messages and do the following:

  • Re-identify PII data based on the DICOM metadata of the study sent for inference
  • Save the HL7 message to MinIO (.hl7)
  • HL7 .NET libraries will be leveraged for this (e.g. https://github.com/nHapiNET/nHapi); a sketch of the receive path is below
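
A minimal sketch of the receive path using the nHapi and MinIO .NET SDKs; the bucket name, object path, and the re-identification placeholder are assumptions, not the actual MIG implementation:

using System.IO;
using System.Text;
using System.Threading.Tasks;
using Minio; // newer SDK versions may also need: using Minio.DataModel.Args;
using NHapi.Base.Parser;
using NHapi.Base.Util;

public class Hl7ResultReceiver
{
    private readonly MinioClient _minio;

    public Hl7ResultReceiver(MinioClient minio) => _minio = minio;

    public async Task HandleAsync(string rawHl7, string workflowInstanceId)
    {
        // Parse to validate the message and pull out routing fields.
        var parser = new PipeParser();
        var message = parser.Parse(rawHl7);
        var terser = new Terser(message);
        var messageType = terser.Get("MSH-9-1"); // e.g. "ORU"

        // TODO: re-identify PII fields (e.g. PID) using the DICOM metadata of
        // the study originally sent for inference, before persisting.

        // Save the raw message to MinIO as a .hl7 object.
        using var stream = new MemoryStream(Encoding.UTF8.GetBytes(rawHl7));
        await _minio.PutObjectAsync(new PutObjectArgs()
            .WithBucket("monaideploy")
            .WithObject($"{workflowInstanceId}/results/{messageType}.hl7")
            .WithStreamData(stream)
            .WithObjectSize(stream.Length));
    }
}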

Export Request Event

Two new task “types” will use the existing queues but will provide information on the type of artifacts that are being exported.

        {
            "id": "export-dicoms",
            "description": "export dicoms",
            "type": "export-dicom",
            "export_destinations": [
                {
                    "name": "PACS"
                }
            ],
            "artifacts": {
                "input": [
                    {
                        "name": "segmentation",
                        "value": "{{ context.executions.remote.artifacts.segmentation }}",
                        "mandatory": true
                    },
                    {
                        "name": "encapsulated-pdf",
                        "value": "{{ context.executions.remote.artifacts.encapsulated-pdf }}",
                        "mandatory": true
                    }
                ]
            }
        },
        {
            "id": "export-hl7",
            "description": "export hl7 message",
            "type": "export-hl7",
            "export_destinations": [
                {
                    "name": "RIS"
                }
            ],
            "artifacts": {
                "input": [
                    {
                        "name": "hl7",
                        "value": "{{ context.executions.remote.artifacts.hl7 }}",
                        "mandatory": true
                    }
                ]
            }
        }

MIG Changes

Based on the “type” attached to the event, MIG will invoke either the DICOM SCU or the HL7 sender to send the attached artifacts.

Once done, MIG will generate an export.request.complete event. Validation may need to be looked at (TBC). A sketch of the type-based routing is below.
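
A minimal sketch of the routing; the service interfaces are illustrative stand-ins for MIG's actual DICOM SCU and HL7 sender:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public interface IDicomScu { Task SendAsync(string destination, IReadOnlyList<string> artifactKeys); }
public interface IHl7Sender { Task SendAsync(string destination, IReadOnlyList<string> artifactKeys); }
public interface IEventPublisher { Task PublishAsync(string topic); }

public class ExportRequestHandler
{
    private readonly IDicomScu _dicomScu;
    private readonly IHl7Sender _hl7Sender;
    private readonly IEventPublisher _events;

    public ExportRequestHandler(IDicomScu dicomScu, IHl7Sender hl7Sender, IEventPublisher events)
        => (_dicomScu, _hl7Sender, _events) = (dicomScu, hl7Sender, events);

    public async Task HandleAsync(string taskType, string destination, IReadOnlyList<string> artifactKeys)
    {
        switch (taskType)
        {
            case "export-dicom":
                await _dicomScu.SendAsync(destination, artifactKeys);  // DICOM path
                break;
            case "export-hl7":
                await _hl7Sender.SendAsync(destination, artifactKeys); // HL7 path
                break;
            default:
                throw new NotSupportedException($"Unknown export task type: {taskType}");
        }
        await _events.PublishAsync("export.request.complete");
    }
}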

MWM Changes

Based on the “type” specified in the export task, only relevant artifacts must be specified. The Workflow Manager will know this by checking the output artifact types of a previous task.

Based on the “type” specified in the export task, only relevant export destinations must be specified. The Workflow Manager must check that these exist and are correct for the export task.

HL7 server to send HL7 artifact

An HL7 server to send the message to an intended destination will be needed. It will be invoked when an export request of type “export-hl7” is received. A sketch of a minimal MLLP sender is below.
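
A minimal sketch of an MLLP sender over raw TCP (HL7 v2's standard framing: <VT> message <FS><CR>); host, port, and buffer size are illustrative, and production code would parse the ACK (MSA-1) and add retries/TLS:

using System;
using System.Net.Sockets;
using System.Text;

public static class MllpSender
{
    private const byte StartBlock = 0x0B;     // <VT>
    private const byte EndBlock = 0x1C;       // <FS>
    private const byte CarriageReturn = 0x0D; // <CR>

    // Sends one HL7 message over MLLP and returns the raw ACK payload.
    public static string Send(string host, int port, string hl7Message)
    {
        using var client = new TcpClient(host, port);
        using var stream = client.GetStream();

        var payload = Encoding.ASCII.GetBytes(hl7Message);
        stream.WriteByte(StartBlock);
        stream.Write(payload, 0, payload.Length);
        stream.WriteByte(EndBlock);
        stream.WriteByte(CarriageReturn);

        // Read the (also MLLP-framed) ACK and strip the framing bytes.
        var buffer = new byte[8192];
        var read = stream.Read(buffer, 0, buffer.Length);
        return Encoding.ASCII.GetString(buffer, 1, Math.Max(0, read - 3));
    }
}

nHapi handles parsing and building the messages themselves; the framing above is all MLLP adds on the wire.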

Other considerations

On premise execution

Applications that are executed on premise will be responsible for saving their HL7 messages in .hl7 format to MinIO (the same way DICOM results are saved). There is no requirement for the platform to pseudonymise and re-identify PII data, as an app executing on premise already has that context.

Documentation will be uplifted to provide guidance to app developers.

Folders

Some applications will generate folders as part of their output. These folders may contain .dcm and .hl7 files. “Folder” will be a valid type for output and input artifacts. For export requests, depending on the “type” of export, the Workflow Manager will traverse folders in MinIO to find all .dcm or .hl7 files respectively.

This is currently done for .dcm files and would need to be extended for .hl7 files. A sketch of the traversal is below.
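
A minimal sketch of the traversal, assuming the MinIO .NET SDK's observable-based listing API; the bucket and prefix are illustrative:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Minio; // newer SDK versions may also need: using Minio.DataModel.Args;

public static class ArtifactLocator
{
    // Lists every object under the folder prefix and keeps those with the
    // requested extension: ".dcm" for export-dicom tasks, ".hl7" for export-hl7.
    public static Task<List<string>> FindArtifactsAsync(
        MinioClient minio, string bucket, string folderPrefix, string extension)
    {
        var keys = new List<string>();
        var tcs = new TaskCompletionSource<List<string>>();

        var args = new ListObjectsArgs()
            .WithBucket(bucket)
            .WithPrefix(folderPrefix)
            .WithRecursive(true); // walk into nested "folders"

        minio.ListObjectsAsync(args).Subscribe(
            item =>
            {
                if (item.Key.EndsWith(extension, StringComparison.OrdinalIgnoreCase))
                    keys.Add(item.Key);
            },
            ex => tcs.TrySetException(ex),
            () => tcs.TrySetResult(keys));

        return tcs.Task;
    }
}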

DICOM data upload

I get the following error when trying to upload the example DICOM files.

E: 0006:0317 Peer aborted Association (or never connected)
I: Peer Aborted Association

Add CITATION.cff

I think we need a CITATION.cff file like the core MONAI repo so that people can reference the repository appropriately in written publications. Eventually I hope there will be a journal paper that people can reference instead.

Update contributing guidelines

Update to v2 as discussed in WG meeting from Haris' slides.

Let's also make sure they're aligned with the broader MONAI contributing guidelines as much as it makes sense.

MD Express - Hello World - No dicom list printed with curl - MAPs not getting launched

This issue was previously raised in #90, but the suggested solution does not apply/work for me.

In the hello world example, after sending DICOMs to MONAI-Deploy, there is this step in the Readme.md:

> docker container list -a | grep alpine
# locate the container ID and run the following command
> docker logs {CONTAINER ID}
# expect a list of files to be printed
/var/monai/input/1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020/1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567/1.2.826.0.1.3680043.2.1125.1.60545822758941849948931508930806372.dcm.json
/var/monai/input/1.2.826.0.1.3680043.2.1125.1.19616861412188316212577695277886020/1.2.826.0.1.3680043.2.1125.1.34918616334750294149839565085991567/1.2.826.0.1.3680043.2.1125.1.60545822758941849948931508930806372.dcm

But this is the output I am getting:

===Configuring Informatics Gateway===
Informatics Gateway IP Address = 172.29.0.50
Informatics Gateway Port       = 5000
Informatics Gateway AE TItle   = MONAI-DEPLOY
Orthanc IP Address             = 172.29.0.100
Orthanc SCP Port               = 4242



Deleting existing MONAI Deploy AE Title
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   233    0   233    0     0    749      0 --:--:-- --:--:-- --:--:--   749
{
  "name": "MONAI-DEPLOY",
  "aeTitle": "MONAI-DEPLOY",
  "grouping": "0020,000D",
  "workflows": [],
  "ignoredSopClasses": [],
  "allowedSopClasses": [],
  "timeout": 5,
  "id": "0d450300-5010-4711-9277-2d126fb2b7ca",
  "dateTimeCreated": "2023-09-29T18:20:57.431Z"
}

Deleting existing DICOM Source
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   151    0   151    0     0   2841      0 --:--:-- --:--:-- --:--:--  2903
{
  "name": "ORTHANC",
  "aeTitle": "ORTHANC",
  "hostIp": "172.29.0.100",
  "id": "aa451300-ee23-4c60-bde3-45afae65c4b5",
  "dateTimeCreated": "2023-09-29T18:20:57.713Z"
}

Deleting existing DICOM Destination
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   163    0   163    0     0   3664      0 --:--:-- --:--:-- --:--:--  3704
{
  "port": 4242,
  "name": "ORTHANC",
  "aeTitle": "ORTHANC",
  "hostIp": "172.29.0.100",
  "id": "5e2a2299-0cec-4fd4-b415-c892595647de",
  "dateTimeCreated": "2023-09-29T18:20:57.771Z"
}

Adding MONAI Deploy AE Title
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   287    0   237  100    50    789    166 --:--:-- --:--:-- --:--:--   956
{
  "name": "MONAI-DEPLOY",
  "aeTitle": "MONAI-DEPLOY",
  "grouping": "0020,000D",
  "workflows": [],
  "ignoredSopClasses": [],
  "allowedSopClasses": [],
  "timeout": 5,
  "id": "264fc41b-bf31-4da5-b007-ff97ebe05120",
  "dateTimeCreated": "2023-09-29T19:46:06.5557054Z"
}

Adding DICOM Source
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   220    0   155  100    65   2893   1213 --:--:-- --:--:-- --:--:--  4150
{
  "name": "ORTHANC",
  "aeTitle": "ORTHANC",
  "hostIp": "172.29.0.100",
  "id": "5a996e53-f217-4539-8733-293aeb947197",
  "dateTimeCreated": "2023-09-29T19:46:06.8520296Z"
}

Adding DICOM Destination
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   245    0   167  100    78   3318   1549 --:--:-- --:--:-- --:--:--  5000
{
  "port": 4242,
  "name": "ORTHANC",
  "aeTitle": "ORTHANC",
  "hostIp": "172.29.0.100",
  "id": "9eb29248-5cd5-4762-af08-2ca420ba8f82",
  "dateTimeCreated": "2023-09-29T19:46:06.9279751Z"
}

Listing DICOM Sources
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   153    0   153    0     0   5283      0 --:--:-- --:--:-- --:--:--  5464
[
  {
    "name": "ORTHANC",
    "aeTitle": "ORTHANC",
    "hostIp": "172.29.0.100",
    "id": "5a996e53-f217-4539-8733-293aeb947197",
    "dateTimeCreated": "2023-09-29T19:46:06.852Z"
  }
]

Listing DICOM Destinations
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   165    0   165    0     0   6224      0 --:--:-- --:--:-- --:--:--  6346
[
  {
    "port": 4242,
    "name": "ORTHANC",
    "aeTitle": "ORTHANC",
    "hostIp": "172.29.0.100",
    "id": "9eb29248-5cd5-4762-af08-2ca420ba8f82",
    "dateTimeCreated": "2023-09-29T19:46:06.927Z"
  }
]

As discussed previously, this is the IG config container, which is not active anymore. I am running this on a GPU server, so there is no NVIDIA container runtime issue. All my other containers show healthy status and the logs do not seem to show any obvious errors:

[screenshot]

But I think none of the MAP containers in the workflow definition are getting launched. I tried the Hello World and the Liver Tumor Segmentation MAP; no segmentation output was received by ORTHANC. Can someone help debug, please!

MD Express - Request a way to debug/track jobs

MD Express works great out of the box in a very simple way. Requesting a similar way to simply track submitted jobs and the Docker containers launched, for easy debugging of jobs.

Listing a wish list below:

  1. Need a way to list active workflows.
    a. May need a way to verify the workflows, especially the rules.
  2. Need a way to visually see the containers that ran / are running.
    a. Which ones finished successfully.
    b. Which ones failed.
    c. A UI way to get logs for these containers, or maybe a simpler way to get docker logs.

Create a Survey of Institutional Workflows and Use Cases for MONAI Deploy

The MONAI Deploy working group should produce a generalized survey, which existing institutional members and new members contribute to, that collects information related to MONAI workloads, including (non-exhaustive):

  • Use Cases with priority of deployment
  • Types of imaging studies and modalities
  • Statistics on data characteristics
  • other key AI data characteristics

The goal of the effort is to begin building a useful profile and catalog of best practices and state-of-the-art descriptions for MONAI Deploy-related workflows. This information can then be used in regular community reports on MONAI Deploy use cases, workflows, and their characteristics.

MD Express: Request a way to clean up data from incoming requests

DICOM images can be sent to MD Express, where they are stored and a pipeline is then initiated.
Requesting a way to delete this DICOM data X hours after the task has completed/failed.
Maybe have a storage disk limit in GB or something; otherwise, if we leave the system up long enough, we will run out of disk space.

One workaround may be to use shared volumes in the docker compose, as:

volumes: { incoming-dicom-data: {} }

Then instruct the user to delete that volume afterwards:

docker compose down
docker volume rm incoming-dicom-data

Create MONAI Application Runner

Develop a lightweight tool to run MONAI apps locally as part of the app development workflow. Single app, single job at a time.

MD Express – Hello World example – no file list output

Hi all,

Further to conversation in Slack, I've deployed MDE on a Windows 10 machine running WSL2. The containers are up, running and healthy.

I am testing the Hello World example here. I can access Orthanc and upload data (I'm using the LN00014 liver CT test dataset provided in the instructions).

I am returned a workflow_id when I POST a workflow definition.

I can run a job, which appears to be successful:

[screenshot]

However, when I view the alpine Docker logs, I'm not seeing the output file list. Instead I get this:

[screenshot]

Any help much appreciated. Let me know if you need any more info.

PS: I've also tried running the Liver Seg MAP example, which again says Success in the Orthanc jobs page, but I'm not getting any output.

Deploying MONAI in Azure

Hey! I've been trying to deploy MONAI Deploy Express with Azure Container Instances by modifying the available docker-compose.yml file, but because of inherent issues with mounting file shares, and not fully knowing how to enable GPU for the containers when deploying with the docker-compose.yml on ACI, I'm looking for suggestions.

What is the "best" or easiest way to deploy MONAI deploy (?express?) on Azure(of course, with GPU enabled)? Should I be using the HELM charts instead (with Azure Kubernetes) or should I still try the Docker method? Help appreciated
