Commit 6dde274

677 Add hybrid programming example for bundle (Project-MONAI#678)
* [DLMED] add hybrid placeholders
* [DLMED] update doc
* [DLMED] add scripts
* [DLMED] split to 2 examples
* [DLMED] rename to bundle
* [DLMED] update hybrid programming
* [DLMED] update bundles to bundle
* [DLMED] add README
* [DLMED] update according to comments
* [DLMED] fix format

Signed-off-by: Nic Ma <[email protected]>
1 parent e573874 commit 6dde274

File tree

21 files changed: +262 / -9 lines


README.md

Lines changed: 1 addition & 1 deletion
@@ -178,7 +178,7 @@ Demonstrates the use of the `ThreadBuffer` class used to generate data batches d
 Illustrate reading NIfTI files and test speed of different transforms on different devices.

 **modules**
-#### [engines](./modules/bundles)
+#### [bundle](./modules/bundle)
 Get started tutorial and concrete training / inference examples for MONAI bundle features.
 #### [engines](./modules/engines)
 Training and evaluation examples of 3D segmentation based on UNet3D and synthetic dataset with MONAI workflows, which contains engines, event-handlers, and post-transforms. And GAN training and evaluation example for a medical image generative adversarial network. Easy run training script uses `GanTrainer` to train a 2D CT scan reconstruction network. Evaluation script generates random samples from a trained network.

modules/bundle/README.md

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
+# MONAI bundle
+This folder contains the get_started tutorial and concrete training / inference examples for MONAI bundle features.
+
+### [spleen segmentation](./spleen_segmentation)
+A bundle example for volumetric (3D) segmentation of the spleen from CT images.
+### [customize component](./custom_component)
+This example shows how to bring customized python components, such as a transform, network, or metric, into a configuration-based workflow.
+### [hybrid programming](./hybrid_programming)
+This example shows how to parse the config files in your own python program, instantiate the necessary components, and execute inference.
Lines changed: 16 additions & 0 deletions
@@ -0,0 +1,16 @@
+# Description
+This example shows a typical use case that brings customized python components (such as a transform, network, or metric) into a configuration-based workflow.
+
+Please note that this example depends on the `spleen_segmentation` bundle example and runs by overriding its config file.
+
+## commands example
+To run the workflow with customized components, `PYTHONPATH` should be revised to include the path to the customized component:
+```
+export PYTHONPATH=$PYTHONPATH:"<path to 'custom_component/scripts'>"
+```
+And please make sure the folder `custom_component/scripts` is a valid python package (it has an `__init__.py` file in the folder).
+
+Override the `train` config with the customized `transform` and execute training:
+```
+python -m monai.bundle run training --meta_file <spleen_configs_path>/metadata.json --config_file "['<spleen_configs_path>/train.json','configs/custom_train.json']" --logging_file <spleen_configs_path>/logging.conf
+```
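When multiple files are passed to `--config_file`, later files take precedence, which is how `custom_train.json` swaps in the custom transform. A rough plain-Python sketch of that last-wins merge (a simplified illustration, with a hypothetical `merge_configs` helper, not MONAI's actual implementation):

```python
import json


def merge_configs(*configs: dict) -> dict:
    """Combine config dicts in order; keys from later configs win."""
    merged: dict = {}
    for cfg in configs:
        merged.update(cfg)
    return merged


# the base train config defines transform #6; the override file replaces it
base = {"train#preprocessing#transforms#6": {"_target_": "EnsureTyped", "keys": ["image", "label"]}}
override = json.loads("""
{
    "train#preprocessing#transforms#6": {
        "_target_": "scripts.custom_transforms.PrintEnsureTyped",
        "keys": ["image", "label"]
    }
}
""")
merged = merge_configs(base, override)
print(merged["train#preprocessing#transforms#6"]["_target_"])
# scripts.custom_transforms.PrintEnsureTyped
```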
Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+{
+    "train#preprocessing#transforms#6":
+    {
+        "_target_": "scripts.custom_transforms.PrintEnsureTyped",
+        "keys": ["image", "label"]
+    }
+}
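The id `train#preprocessing#transforms#6` addresses one element inside the nested spleen `train` config: `#` separates nesting levels and integer parts index into lists. A minimal sketch of this addressing scheme (plain Python with a hypothetical `resolve_id` helper, simplified relative to MONAI's config parser):

```python
def resolve_id(config, item_id: str):
    """Follow a '#'-separated id path through nested dicts and lists."""
    node = config
    for part in item_id.split("#"):
        if isinstance(node, list):
            node = node[int(part)]  # numeric parts index into lists
        else:
            node = node[part]
    return node


cfg = {"train": {"preprocessing": {"transforms": [
    {"_target_": "LoadImaged"},
    {"_target_": "EnsureChannelFirstd"},
    {"_target_": "EnsureTyped"},
]}}}
print(resolve_id(cfg, "train#preprocessing#transforms#2")["_target_"])
# EnsureTyped
```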
Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+# Copyright (c) MONAI Consortium
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+# http://www.apache.org/licenses/LICENSE-2.0
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
+# Copyright (c) MONAI Consortium
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+# http://www.apache.org/licenses/LICENSE-2.0
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from monai.config import KeysCollection
+from monai.transforms import EnsureTyped
+
+
+class PrintEnsureTyped(EnsureTyped):
+    """
+    Extend the `EnsureTyped` transform to print the image shape.
+
+    Args:
+        keys: keys of the corresponding items to be transformed.
+
+    """
+
+    def __init__(self, keys: KeysCollection, data_type: str = "tensor") -> None:
+        super().__init__(keys, data_type=data_type)
+
+    def __call__(self, data):
+        d = dict(super().__call__(data=data))
+        for key in self.key_iterator(d):
+            print(f"data shape of {key}: {d[key].shape}")
+        return d

modules/bundles/get_started.ipynb renamed to modules/bundle/get_started.ipynb

Lines changed: 5 additions & 5 deletions
@@ -8,7 +8,7 @@
     "\n",
     "A MONAI bundle usually includes the stored weights of a model, TorchScript model, JSON files which include configs and metadata about the model, information for constructing training, inference, and post-processing transform sequences, plain-text description, legal information, and other data the model creator wishes to include.\n",
     "\n",
-    "For more information about MONAI bundles read the description: https://docs.monai.io/en/latest/bundle_intro.html.\n",
+    "For more information about MONAI bundle, read the description: https://docs.monai.io/en/latest/bundle_intro.html.\n",
     "\n",
     "This notebook is a step-by-step tutorial to help get started to develop a bundle package, which contains a config file to construct the training pipeline and also has a `metadata.json` file to define the metadata information.\n",
     "\n",
@@ -26,7 +26,7 @@
     "- Override config content at runtime.\n",
     "- Hybrid programming with config and python code.\n",
     "\n",
-    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/master/modules/bundles/get_started.ipynb)"
+    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/master/modules/bundle/get_started.ipynb)"
     ]
    },
    {
@@ -143,7 +143,7 @@
    "source": [
     "## Define train config - Set imports and input / output environments\n",
     "\n",
-    "Now let's start to define the config file for a regular training task. MONAI bundles support both `JSON` and `YAML` format, here we use `JSON` as the example.\n",
+    "Now let's start to define the config file for a regular training task. MONAI bundle supports both `JSON` and `YAML` formats; here we use `JSON` as the example.\n",
     "\n",
     "According to the predefined syntax of MONAI bundle, `$` indicates an expression to evaluate, `@` refers to another object in the config content. For more details about the syntax in bundle config, please check: https://docs.monai.io/en/latest/config_syntax.html.\n",
     "\n",
@@ -463,7 +463,7 @@
     "Usually we need to execute validation for every N epochs during training to verify the model and save the best model.\n",
     "\n",
     "Here we don't define the `validate` section step by step as it's similar to the `train` section. The full config is available: \n",
-    "https://github.com/Project-MONAI/tutorials/blob/master/modules/bundles/spleen_segmentation/configs/train.json\n",
+    "https://github.com/Project-MONAI/tutorials/blob/master/modules/bundle/spleen_segmentation/configs/train.json\n",
     "\n",
     "Just show an example of `macro text replacement` to simplify the config content and avoid duplicated text. Please note that it's just token text replacement of the config content, not refer to the instantiated python objects."
     ]
@@ -498,7 +498,7 @@
     "We can define a `metadata` file in the bundle, which contains the metadata information relating to the model, including what the shape and format of inputs and outputs are, what the meaning of the outputs are, what type of model is present, and other information. The structure is a dictionary containing a defined set of keys with additional user-specified keys.\n",
     "\n",
     "A typical `metadata` example is available: \n",
-    "https://github.com/Project-MONAI/tutorials/blob/master/modules/bundles/spleen_segmentation/configs/metadata.json"
+    "https://github.com/Project-MONAI/tutorials/blob/master/modules/bundle/spleen_segmentation/configs/metadata.json"
     ]
    },
    {
Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+# Description
+This example shows a typical use case: parse the config files in your own python program, instantiate the necessary components, and execute inference.
+
+## commands example
+
+Parse the config files and execute inference from the python program:
+
+```
+python -m scripts.inference run --config_file "['configs/data_loading.json','configs/net_inferer.json','configs/post_processing.json']" --ckpt_path <path_to_checkpoint>
+```
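The inference script in this example relies on `ConfigParser.get_parsed_content` to turn `_target_` entries into live objects. The core idea, importing the dotted path and calling it with the remaining keys as kwargs, can be sketched with the stdlib (a hypothetical `instantiate` helper, not MONAI's actual resolver; `datetime.date` stands in for a MONAI component):

```python
import importlib


def instantiate(spec: dict):
    """Build an object from {"_target_": "pkg.mod.Cls", **kwargs}."""
    module_path, _, attr = spec["_target_"].rpartition(".")
    cls = getattr(importlib.import_module(module_path), attr)
    kwargs = {k: v for k, v in spec.items() if k != "_target_"}
    return cls(**kwargs)


# demo with a stdlib target standing in for a MONAI component
obj = instantiate({"_target_": "datetime.date", "year": 2022, "month": 5, "day": 1})
print(obj.isoformat())  # 2022-05-01
```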
Lines changed: 52 additions & 0 deletions
@@ -0,0 +1,52 @@
+{
+    "image_key": "image",
+    "preprocessing": {
+        "_target_": "Compose",
+        "transforms": [
+            {
+                "_target_": "LoadImaged",
+                "keys": "@image_key"
+            },
+            {
+                "_target_": "EnsureChannelFirstd",
+                "keys": "@image_key"
+            },
+            {
+                "_target_": "Orientationd",
+                "keys": "@image_key",
+                "axcodes": "RAS"
+            },
+            {
+                "_target_": "Spacingd",
+                "keys": "@image_key",
+                "pixdim": [1.5, 1.5, 2.0],
+                "mode": "bilinear"
+            },
+            {
+                "_target_": "ScaleIntensityRanged",
+                "keys": "@image_key",
+                "a_min": -57,
+                "a_max": 164,
+                "b_min": 0,
+                "b_max": 1,
+                "clip": true
+            },
+            {
+                "_target_": "EnsureTyped",
+                "keys": "@image_key"
+            }
+        ]
+    },
+    "dataset": {
+        "_target_": "Dataset",
+        "data": "@input_data",
+        "transform": "@preprocessing"
+    },
+    "dataloader": {
+        "_target_": "DataLoader",
+        "dataset": "@dataset",
+        "batch_size": 1,
+        "shuffle": false,
+        "num_workers": 4
+    }
+}
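Within these configs, strings starting with `@` (such as `@image_key` or `@dataset`) are references to other config items, resolved before instantiation; `@input_data` is intentionally left undefined here and is filled in at runtime by the python program. A small plain-Python sketch of the reference substitution (a hypothetical `resolve_refs` helper, simplified relative to MONAI's `ConfigParser`):

```python
def resolve_refs(node, root):
    """Recursively replace "@name" strings with root[name] (flat ids only)."""
    if isinstance(node, str) and node.startswith("@"):
        return resolve_refs(root[node[1:]], root)
    if isinstance(node, dict):
        return {k: resolve_refs(v, root) for k, v in node.items()}
    if isinstance(node, list):
        return [resolve_refs(v, root) for v in node]
    return node


config = {
    "image_key": "image",
    "load": {"_target_": "LoadImaged", "keys": "@image_key"},
}
resolved = resolve_refs(config["load"], config)
print(resolved)  # {'_target_': 'LoadImaged', 'keys': 'image'}
```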
Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
+{
+    "network": {
+        "_target_": "UNet",
+        "spatial_dims": 3,
+        "in_channels": 1,
+        "out_channels": 2,
+        "channels": [16, 32, 64, 128, 256],
+        "strides": [2, 2, 2, 2],
+        "num_res_units": 2,
+        "norm": "batch"
+    },
+    "inferer": {
+        "_target_": "SlidingWindowInferer",
+        "roi_size": [96, 96, 96],
+        "sw_batch_size": 4,
+        "overlap": 0.5
+    }
+}
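With `roi_size` 96 and `overlap` 0.5, the sliding-window inferer tiles each axis with 96-voxel windows whose starts advance by roughly half a window. The start offsets along one axis can be sketched as plain arithmetic (an approximation of the scan positions, not MONAI's exact implementation):

```python
def window_starts(length: int, roi: int, overlap: float) -> list:
    """Start offsets of sliding windows along one dimension."""
    step = max(1, int(roi * (1 - overlap)))
    starts = list(range(0, max(length - roi, 0) + 1, step))
    # make sure the final window reaches the end of the volume
    if starts[-1] + roi < length:
        starts.append(length - roi)
    return starts


# a 200-voxel axis covered by 96-voxel windows at 50% overlap
print(window_starts(200, 96, 0.5))  # [0, 48, 96, 104]
```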
Lines changed: 33 additions & 0 deletions
@@ -0,0 +1,33 @@
+{
+    "pred_key": "pred",
+    "postprocessing": {
+        "_target_": "Compose",
+        "transforms": [
+            {
+                "_target_": "Activationsd",
+                "keys": "@pred_key",
+                "softmax": true
+            },
+            {
+                "_target_": "Invertd",
+                "keys": "@pred_key",
+                "transform": "@preprocessing",
+                "orig_keys": "@image_key",
+                "meta_key_postfix": "meta_dict",
+                "nearest_interp": false,
+                "to_tensor": true
+            },
+            {
+                "_target_": "AsDiscreted",
+                "keys": "@pred_key",
+                "argmax": true
+            },
+            {
+                "_target_": "SaveImaged",
+                "keys": "@pred_key",
+                "meta_keys": "pred_meta_dict",
+                "output_dir": "@output_dir"
+            }
+        ]
+    }
+}
Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+# Copyright (c) MONAI Consortium
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+# http://www.apache.org/licenses/LICENSE-2.0
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
Lines changed: 56 additions & 0 deletions
@@ -0,0 +1,56 @@
+# Copyright (c) MONAI Consortium
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+# http://www.apache.org/licenses/LICENSE-2.0
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from typing import Sequence, Union
+import glob
+
+import torch
+from monai.bundle import ConfigParser
+from monai.data import decollate_batch
+
+
+def run(config_file: Union[str, Sequence[str]], ckpt_path: str):
+    parser = ConfigParser()
+    parser.read_config(config_file)
+    # edit the config content at runtime for input / output information and lazy instantiation
+    datalist = list(sorted(glob.glob("/workspace/data/Task09_Spleen/imagesTs/*.nii.gz")))
+    input_data = [{f"{parser['image_key']}": i} for i in datalist]
+    output_dir = "/workspace/data/tutorials/modules/bundle/hybrid_programming/eval"
+    parser["input_data"] = input_data
+    parser["output_dir"] = output_dir
+    parser["inferer"]["roi_size"] = [160, 160, 160]
+
+    device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
+    # instantiate the components
+    model = parser.get_parsed_content("network").to(device)
+    model.load_state_dict(torch.load(ckpt_path))
+
+    dataloader = parser.get_parsed_content("dataloader")
+    if len(dataloader) == 0:
+        raise ValueError("no data found in the dataloader, please ensure the input image paths are accessible.")
+    inferer = parser.get_parsed_content("inferer")
+    postprocessing = parser.get_parsed_content("postprocessing")
+
+    model.eval()
+    with torch.no_grad():
+        for d in dataloader:
+            images = d[parser["image_key"]].to(device)
+            # define sliding window size and batch size for windows inference
+            d[parser["pred_key"]] = inferer(inputs=images, network=model)
+            # decollate the batch data into a list of dictionaries, then execute postprocessing transforms
+            [postprocessing(i) for i in decollate_batch(d)]
+
+
+if __name__ == "__main__":
+    from monai.utils import optional_import
+
+    fire, _ = optional_import("fire")
+    fire.Fire()

modules/bundles/spleen_segmentation/configs/inference.json renamed to modules/bundle/spleen_segmentation/configs/inference.json

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@
     "$import glob",
     "$import os"
 ],
-"bundle_root": "/workspace/data/tutorials/modules/bundles/spleen_segmentation",
+"bundle_root": "/workspace/data/tutorials/modules/bundle/spleen_segmentation",
 "output_dir": "$@bundle_root + '/eval'",
 "dataset_dir": "/workspace/data/Task09_Spleen",
 "datalist": "$list(sorted(glob.glob(@dataset_dir + '/imagesTs/*.nii.gz')))",

modules/bundles/spleen_segmentation/configs/metadata.json renamed to modules/bundle/spleen_segmentation/configs/metadata.json

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@
     "0.1.0": "complete the model package",
     "0.0.1": "initialize the model package structure"
 },
-"monai_version": "0.8.0",
+"monai_version": "0.9.0",
 "pytorch_version": "1.10.0",
 "numpy_version": "1.21.2",
 "optional_packages_version": {

modules/bundles/spleen_segmentation/configs/train.json renamed to modules/bundle/spleen_segmentation/configs/train.json

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
     "$import os",
     "$import ignite"
 ],
-"bundle_root": "/workspace/data/tutorials/modules/bundles/spleen_segmentation",
+"bundle_root": "/workspace/data/tutorials/modules/bundle/spleen_segmentation",
 "ckpt_dir": "$@bundle_root + '/models'",
 "output_dir": "$@bundle_root + '/eval'",
 "dataset_dir": "/workspace/data/Task09_Spleen",
