Project MONAI continues to expand its end-to-end workflow with new releases and a new subproject called MONAI Deploy Inference Service.
Project MONAI is releasing three new updates to existing frameworks, MONAI v0.8, MONAI Label v0.3, and MONAI Deploy App SDK v0.2. It’s also expanding its MONAI Deploy subsystem with the MONAI Deploy Inference Service (MIS), a server that runs MONAI Application Packages (MAPs) in a Kubernetes Cluster as cloud-native microservices.
MIS extends MONAI's end-to-end capabilities by integrating with a container orchestration system such as Kubernetes. Using Kubernetes, developers can quickly begin testing their models and move execution from local development to staging environments.
MONAI Core v0.8
MONAI Core v0.8 focuses on expanding its learning capabilities by adding both self-supervised and multi-instance learning support.
Also included is DiNTS, a new state-of-the-art differentiable architecture search framework that helps accelerate Neural Architecture Search (NAS) for large-scale 3D image sets like those found in medical imaging.
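DiNTS itself is part of MONAI Core, but the core idea behind differentiable architecture search can be illustrated without it. In this minimal NumPy sketch (illustrative stand-in ops, not MONAI's implementation), each candidate operation is blended by softmax-weighted architecture logits, so the search can be optimized with gradients; after search, the highest-weighted op is kept:

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Illustrative stand-ins for candidate operations on a feature map.
candidate_ops = [
    lambda x: x,                 # identity / skip connection
    lambda x: np.maximum(x, 0),  # ReLU-like op
    lambda x: 0.5 * x,           # a cheap "conv-like" stand-in
]

# Learnable architecture logits, one per candidate op.
arch_logits = np.array([0.2, 1.5, -0.3])

def mixed_op(x, logits):
    """Continuous relaxation: the output is the softmax-weighted sum of
    all candidate ops, so the architecture weights receive gradients
    during the search phase."""
    w = softmax(logits)
    return sum(wi * op(x) for wi, op in zip(w, candidate_ops))

x = np.array([-1.0, 2.0])
y = mixed_op(x, arch_logits)

# After search converges, discretize: keep the op with the largest weight.
best = int(np.argmax(softmax(arch_logits)))
```

The real framework additionally searches the network topology and accounts for 3D memory and latency budgets, which is what makes it practical for volumetric medical images.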
Get started with the new features using the included Jupyter notebooks:
- Multi-instance learning, with examples on the MSD dataset.
- Visualization of transforms, with a notebook covering approaches for 3D image transform augmentation.
- Self-supervised learning, with tutorials for a pretraining pipeline that leverages Vision Transformers, highlighting training with unlabeled data and adaptation to downstream tasks.
- DiNTS AutoML, with examples using MSD tasks.
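To make the self-supervised item above concrete: a common pretext task is reconstructing an image from a corrupted view, so no labels are needed. The NumPy sketch below (a simplified illustration, not MONAI's transform API) masks random patches of an image; a model trained to undo this corruption learns representations reusable for downstream tasks:

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_patches(image, patch=4, drop_frac=0.3):
    """Zero out a random fraction of non-overlapping patches.
    The pretext task is to reconstruct the original image from
    this corrupted view -- no annotations required."""
    corrupted = image.copy()
    h, w = image.shape
    coords = [(i, j) for i in range(0, h, patch) for j in range(0, w, patch)]
    n_drop = int(len(coords) * drop_frac)
    for idx in rng.permutation(len(coords))[:n_drop]:
        i, j = coords[idx]
        corrupted[i:i + patch, j:j + patch] = 0.0
    return corrupted

image = rng.random((16, 16))
view = mask_patches(image)

# A reconstruction loss between the model's output and the clean image
# would drive pretraining; here we just measure the corruption itself.
corruption = np.mean((view - image) ** 2)
```

The tutorials apply the same principle to 3D volumes with a Vision Transformer backbone before fine-tuning on labeled downstream tasks.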
MONAI Label v0.3
MONAI Label v0.3 adds multi-label segmentation support, with DynUNet and UNETR networks as the base architecture options. It also brings multi-GPU training support for better performance and scalability, along with usability improvements that make active learning easier to use.
- Multi-Label Segmentation Support
- Multi-GPU Training
- Active Learning UX Changes
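The active learning loop at the heart of MONAI Label ranks unlabeled samples by model uncertainty so annotators label the most informative cases first. A minimal sketch of that selection step, using predictive entropy (an illustration of the general technique, not MONAI Label's specific strategy):

```python
import numpy as np

def select_for_annotation(probs, k=2):
    """Rank unlabeled samples by predictive entropy (higher = more
    uncertain) and return the indices of the top-k samples an
    annotator should label next."""
    eps = 1e-12  # avoid log(0)
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    return np.argsort(entropy)[::-1][:k]

# Softmax outputs for 4 unlabeled samples over 3 classes.
probs = np.array([
    [0.98, 0.01, 0.01],  # confident
    [0.34, 0.33, 0.33],  # very uncertain
    [0.70, 0.20, 0.10],
    [0.40, 0.35, 0.25],  # uncertain
])
picks = select_for_annotation(probs, k=2)  # most uncertain first
```

Each annotation round retrains the model on the newly labeled cases, which is where the new multi-GPU support helps keep turnaround short.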
MONAI Deploy App SDK v0.2
MONAI Deploy App SDK v0.2 continues to expand its base operators, including support for additional DICOM operations.
- Operator for DICOM Series Selection.
- Operator for exporting DICOM Structured Report (SR) SOP instances for classification results.
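Conceptually, series selection means filtering a study's DICOM series by attribute rules before inference. The operator's real selection logic lives in the App SDK; this hypothetical stand-in (invented helper and attribute dicts, shown only to illustrate the idea) matches series against simple attribute rules:

```python
def select_series(series_list, rules):
    """Return the series whose attributes match every rule,
    e.g. {"Modality": "CT"} keeps only CT series."""
    return [
        s for s in series_list
        if all(s.get(attr) == value for attr, value in rules.items())
    ]

# A study represented as a list of series-level attribute dicts.
study = [
    {"SeriesInstanceUID": "1.2.3", "Modality": "CT", "BodyPartExamined": "CHEST"},
    {"SeriesInstanceUID": "1.2.4", "Modality": "MR", "BodyPartExamined": "CHEST"},
    {"SeriesInstanceUID": "1.2.5", "Modality": "CT", "BodyPartExamined": "HEAD"},
]

chest_ct = select_series(study, {"Modality": "CT", "BodyPartExamined": "CHEST"})
```

Routing only the matching series into the pipeline keeps a MAP from accidentally running on the wrong modality or body part.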
MONAI Deploy Inference Service v0.1
MONAI Deploy Inference Service v0.1 is the first component of the MONAI Deploy Application Server and further extends MONAI's end-to-end workflow. It enables deploying MONAI Application Packages (MAPs) created with the MONAI Deploy App SDK into a Kubernetes cluster.
- Register a MAP in the Helm Charts of MIS.
- Upload inputs through a REST API request and make them available to the MAP container.
- Provision resources for the MAP container.
- Provide outputs of the MAP container to the client who made the request.
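From a client's perspective, the upload step above is an ordinary REST call. The stdlib sketch below only constructs the request without sending it; the endpoint path and payload fields are hypothetical placeholders, so consult the MIS documentation for the actual API contract:

```python
import json
import urllib.request

# Hypothetical MIS endpoint and payload shape (placeholders, not the
# documented API) -- shown only to illustrate the REST workflow.
MIS_URL = "http://localhost:8000/upload"

payload = json.dumps({"map": "my-map:0.1", "input": "study.dcm"}).encode()
req = urllib.request.Request(
    MIS_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# urllib.request.urlopen(req) would send the request; MIS would then
# stage the inputs, run the MAP container, and return its outputs.
```

Because MIS fronts the MAP with a plain HTTP interface, any client that can issue a POST request can drive inference, which is what makes the microservice deployment model convenient.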
Check out the new MONAI Deploy tutorials that walk you through creating a MAP using the App SDK, deploying MIS, and pushing your MAP to MIS to run as a cloud-native microservice.
You can find more in-depth information about each release under their respective projects in the Project MONAI GitHub.