
Commit 430cc81

ROI Inference pipeline for HoVerNet (#1055)
Fixes Project-MONAI/MONAI#5539

### Description

A few sentences describing the changes proposed in this pull request.

### Checks

- [ ] Notebook runs automatically `./runner [-p <regex_pattern>]`

Signed-off-by: Behrooz <[email protected]>
1 parent a583b9a commit 430cc81

File tree

5 files changed (+371, −118 lines)

pathology/hovernet/README.MD

Lines changed: 63 additions & 37 deletions
@@ -3,89 +3,115 @@
 This folder contains ignite version examples to train and validate a HoVerNet model.
 It also has torch version notebooks to run training and evaluation.
 <p align="center">
-  <img src="https://ars.els-cdn.com/content/image/1-s2.0-S1361841519301045-fx1_lrg.jpg" alt="hovernet scheme")
+  <img src="https://ars.els-cdn.com/content/image/1-s2.0-S1361841519301045-fx1_lrg.jpg" alt="HoVerNet scheme">
 </p>
 Implementation based on:
 
-Simon Graham et al., HoVer-Net: Simultaneous Segmentation and Classification of Nuclei in Multi-Tissue Histology Images.' Medical Image Analysis, (2019). https://arxiv.org/abs/1812.06499
+Simon Graham et al., 'HoVer-Net: Simultaneous Segmentation and Classification of Nuclei in Multi-Tissue Histology Images.' Medical Image Analysis (2019). <https://arxiv.org/abs/1812.06499>
 
 ### 1. Data
 
-CoNSeP datasets which are used in the examples can be downloaded from https://warwick.ac.uk/fac/cross_fac/tia/data/hovernet/.
-- First download CoNSeP dataset to `data_root`.
-- Run prepare_patches.py to prepare patches from images.
+The CoNSeP dataset used in the examples can be downloaded from <https://warwick.ac.uk/fac/cross_fac/tia/data/HoVerNet/>.
+
+- First download the CoNSeP dataset to `DATA_ROOT` (default is `"/workspace/Data/Pathology/CoNSeP"`).
+- Run `python prepare_patches.py` to prepare patches from images.
 
 ### 2. Questions and bugs
 
 - For questions relating to the use of MONAI, please use our [Discussions tab](https://github.com/Project-MONAI/MONAI/discussions) on the main repository of MONAI.
 - For bugs relating to MONAI functionality, please create an issue on the [main repository](https://github.com/Project-MONAI/MONAI/issues).
 - For bugs relating to the running of a tutorial, please create an issue in [this repository](https://github.com/Project-MONAI/Tutorials/issues).
 
-
 ### 3. List of notebooks and examples
+
 #### [Prepare Your Data](./prepare_patches.py)
-This example is used to prepare patches from tiles referring to the implementation from https://github.com/vqdang/hover_net/blob/master/extract_patches.py. Prepared patches will be saved in `data_root`/Prepared.
+
+This example prepares patches from tiles, following the implementation in <https://github.com/vqdang/hover_net/blob/master/extract_patches.py>. Prepared patches will be saved in `DATA_ROOT`/Prepared.
 
 ```bash
-# Run to know all possible options
+# Run to get all possible arguments
 python ./prepare_patches.py -h
 
-# Prepare patches from images
+# Prepare patches from images using default arguments
+python ./prepare_patches.py
+
+# Prepare patches using custom arguments
 python ./prepare_patches.py \
-    --root `data_root`
+    --root `DATA_ROOT` \
+    --ps 540 540 \
+    --ss 164 164
 ```
 
 #### [HoVerNet Training](./training.py)
+
 This example uses a MONAI workflow to train a HoVerNet model on the prepared CoNSeP dataset.
-Since HoVerNet is training via a two-stage approach. First initialised the model with pre-trained weights on the [ImageNet dataset](https://ieeexplore.ieee.org/document/5206848), trained only the decoders for the first 50 epochs, and then fine-tuned all layers for another 50 epochs. We need to specify `--stage` during training.
+HoVerNet is trained via a two-stage approach: first the model is initialized with pre-trained weights on the [ImageNet dataset](https://ieeexplore.ieee.org/document/5206848) and only the decoders are trained for the first 50 epochs; then all layers are fine-tuned for another 50 epochs. The stage must be specified with `--stage` during training.
 
 Each user is responsible for checking the content of models/datasets and the applicable licenses and determining whether they are suitable for the intended use.
 The license for the pre-trained model used in the examples differs from the MONAI license. Please check the source from which these weights are obtained:
-https://github.com/vqdang/hover_net#data-format
+<https://github.com/vqdang/hover_net#data-format>
 
+If you didn't use the default root in data preparation, pass ``--root `DATA_ROOT`/Prepared`` to each of the training commands.
 
 ```bash
-# Run to know all possible options
+# Run to get all possible arguments
 python ./training.py -h
 
-# Train a hovernet model on single-gpu(replace with your own ckpt path)
+# Train a HoVerNet model on single-GPU or CPU-only (replace with your own ckpt path)
 export CUDA_VISIBLE_DEVICES=0; python training.py \
-    --ep 50 \
     --stage 0 \
+    --ep 50 \
    --bs 16 \
-    --root `save_root`
+    --log-dir ./logs
 export CUDA_VISIBLE_DEVICES=0; python training.py \
-    --ep 50 \
     --stage 1 \
-    --bs 4 \
-    --root `save_root` \
-    --ckpt logs/stage0/checkpoint_epoch=50.pt
-
-# Train a hovernet model on multi-gpu (NVIDIA)(replace with your own ckpt path)
-torchrun --nnodes=1 --nproc_per_node=2 training.py \
-    --ep 50 \
-    --bs 8 \
-    --root `save_root` \
-    --stage 0
-torchrun --nnodes=1 --nproc_per_node=2 training.py \
-    --ep 50 \
-    --bs 2 \
-    --root `save_root` \
-    --stage 1 \
-    --ckpt logs/stage0/checkpoint_epoch=50.pt
+    --ep 50 \
+    --bs 16 \
+    --log-dir ./logs \
+    --ckpt logs/stage0/model.pt
+
+# Train a HoVerNet model on multi-GPU with default arguments
+torchrun --nnodes=1 --nproc_per_node=2 training.py
+torchrun --nnodes=1 --nproc_per_node=2 training.py --stage 1
 ```
 
 #### [HoVerNet Validation](./evaluation.py)
+
 This example uses a MONAI workflow to evaluate a trained HoVerNet model on prepared test data from the CoNSeP dataset.
 Using their metrics in original mode, we reproduce the results with Dice: 0.82762; PQ: 0.48976; F1d: 0.73592.
+
 ```bash
-# Run to know all possible options
+# Run to get all possible arguments
 python ./evaluation.py -h
 
-# Evaluate a HoVerNet model
-python ./evaluation.py
+# Evaluate a HoVerNet model on single-GPU or CPU-only
+python ./evaluation.py \
     --root `save_root` \
-    --ckpt logs/stage0/checkpoint_epoch=50.pt
+    --ckpt logs/stage0/model.pt
+
+# Evaluate a HoVerNet model on multi-GPU with default arguments
+torchrun --nnodes=1 --nproc_per_node=2 evaluation.py
+```
+
+#### [HoVerNet Inference](./inference.py)
+
+This example uses a MONAI workflow to run inference with a HoVerNet model on an arbitrarily sized region of interest.
+Under the hood, it uses a sliding-window approach to run inference on overlapping patches and then stitches the
+results together into an output image of the same size as the input. It then runs post-processing on this output
+image to create the final results. This example saves the instance map and type map as PNG files, but it can be
+modified to save any output of interest.
+
+```bash
+# Run to get all possible arguments
+python ./inference.py -h
+
+# Run HoVerNet inference on single-GPU or CPU-only
+python ./inference.py \
+    --root `save_root` \
+    --ckpt logs/stage0/model.pt
+
+# Run HoVerNet inference on multi-GPU with default arguments
+torchrun --nnodes=1 --nproc_per_node=2 ./inference.py
 ```
 
 ## Disclaimer
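
The two-stage schedule described in the README's training section amounts to toggling which parameters receive gradients. Below is a minimal sketch of that idea, assuming the `monai.networks.nets.HoVerNet` constructor whose `freeze_encoder` argument is visible in the evaluation.py diff further down; the remaining arguments are illustrative rather than the tutorial's exact configuration.

```python
# Sketch of the two-stage HoVerNet training schedule; constructor arguments
# other than freeze_encoder are illustrative, not the tutorial's exact setup.
import torch
from monai.networks.nets import HoVerNet
from monai.utils.enums import HoVerNetMode

# Stage 0: the encoder starts from pre-trained weights and stays frozen,
# so only the decoder branches are updated for the first 50 epochs.
model = HoVerNet(mode=HoVerNetMode.ORIGINAL, in_channels=3, out_classes=5, freeze_encoder=True)

# Stage 1: reload the stage-0 checkpoint and fine-tune all layers.
# state = torch.load("logs/stage0/model.pt")  # checkpoint layout assumed from CheckpointLoader below
# model.load_state_dict(state["net"])
for param in model.parameters():
    param.requires_grad_(True)
```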

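The inference example added above stitches overlapping patch predictions back into a full-size output. The following is a minimal, self-contained sketch of that sliding-window idea using MONAI's generic `SlidingWindowInferer`; the stand-in network and sizes are made up, and the tutorial's inference.py may use a HoVerNet-specific inferer.

```python
# Sketch of sliding-window inference over an arbitrarily sized ROI.
import torch
from monai.inferers import SlidingWindowInferer

net = torch.nn.Conv2d(3, 2, kernel_size=3, padding=1)  # stand-in for a real network
inferer = SlidingWindowInferer(
    roi_size=(256, 256),  # patch size fed through the network
    sw_batch_size=4,      # number of patches per forward pass
    overlap=0.25,         # overlapping patches are blended when stitched together
)

image = torch.rand(1, 3, 1000, 1200)  # arbitrarily sized ROI (NCHW)
with torch.no_grad():
    pred = inferer(inputs=image, network=net)
print(pred.shape)  # (1, 2, 1000, 1200): same spatial size as the input
```

Post-processing, such as the nuclear-pixel thresholding shown in evaluation.py below, then runs once on the stitched output rather than on each patch.
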
pathology/hovernet/evaluation.py

Lines changed: 30 additions & 24 deletions
@@ -28,21 +28,18 @@
 
 
 def prepare_data(data_dir, phase):
-    data_dir = os.path.join(data_dir, phase)
+    """prepare data list"""
 
-    images = list(sorted(
-        glob.glob(os.path.join(data_dir, "*/*image.npy"))))
-    inst_maps = list(sorted(
-        glob.glob(os.path.join(data_dir, "*/*inst_map.npy"))))
-    type_maps = list(sorted(
-        glob.glob(os.path.join(data_dir, "*/*type_map.npy"))))
+    data_dir = os.path.join(data_dir, phase)
+    images = sorted(glob.glob(os.path.join(data_dir, "*image.npy")))
+    inst_maps = sorted(glob.glob(os.path.join(data_dir, "*inst_map.npy")))
+    type_maps = sorted(glob.glob(os.path.join(data_dir, "*type_map.npy")))
 
-    data_dicts = [
+    data_list = [
         {"image": _image, "label_inst": _inst_map, "label_type": _type_map}
         for _image, _inst_map, _type_map in zip(images, inst_maps, type_maps)
     ]
-
-    return data_dicts
+    return data_list
 
 
 def run(cfg):
@@ -75,13 +72,10 @@ def run(cfg):
     )
 
     # Create MONAI DataLoaders
-    valid_data = prepare_data(cfg["root"], "valid")
+    valid_data = prepare_data(cfg["root"], "Test")
     valid_ds = CacheDataset(data=valid_data, transform=val_transforms, cache_rate=1.0, num_workers=4)
     val_loader = DataLoader(
-        valid_ds,
-        batch_size=cfg["batch_size"],
-        num_workers=cfg["num_workers"],
-        pin_memory=torch.cuda.is_available()
+        valid_ds, batch_size=cfg["batch_size"], num_workers=cfg["num_workers"], pin_memory=torch.cuda.is_available()
     )
 
     # initialize model
@@ -95,23 +89,31 @@ def run(cfg):
         freeze_encoder=False,
     ).to(device)
 
-    post_process_np = Compose([
-        Activationsd(keys=HoVerNetBranch.NP.value, softmax=True),
-        Lambdad(keys=HoVerNetBranch.NP.value, func=lambda x: x[1: 2, ...] > 0.5)])
+    post_process_np = Compose(
+        [
+            Activationsd(keys=HoVerNetBranch.NP.value, softmax=True),
+            Lambdad(keys=HoVerNetBranch.NP.value, func=lambda x: x[1:2, ...] > 0.5),
+        ]
+    )
     post_process = Lambdad(keys="pred", func=post_process_np)
 
     # Evaluator
     val_handlers = [
-        CheckpointLoader(load_path=cfg["ckpt_path"], load_dict={"net": model}),
+        CheckpointLoader(load_path=cfg["ckpt"], load_dict={"net": model}),
         StatsHandler(output_transform=lambda x: None),
     ]
     evaluator = SupervisedEvaluator(
         device=device,
         val_data_loader=val_loader,
-        prepare_batch=PrepareBatchHoVerNet(extra_keys=['label_type', 'hover_label_inst']),
+        prepare_batch=PrepareBatchHoVerNet(extra_keys=["label_type", "hover_label_inst"]),
         network=model,
         postprocessing=post_process,
-        key_val_metric={"val_dice": MeanDice(include_background=False, output_transform=from_engine_hovernet(keys=["pred", "label"], nested_key=HoVerNetBranch.NP.value))},
+        key_val_metric={
+            "val_dice": MeanDice(
+                include_background=False,
+                output_transform=from_engine_hovernet(keys=["pred", "label"], nested_key=HoVerNetBranch.NP.value),
+            )
+        },
         val_handlers=val_handlers,
         amp=cfg["amp"],
     )
@@ -125,18 +127,22 @@ def main():
     parser.add_argument(
         "--root",
         type=str,
-        default="/workspace/Data/CoNSeP/Prepared/consep",
+        default="/workspace/Data/Pathology/CoNSeP/Prepared",
         help="root data dir",
     )
-
+    parser.add_argument(
+        "--ckpt",
+        type=str,
+        default="./logs/model.pt",
+        help="Path to the pytorch checkpoint",
+    )
     parser.add_argument("--bs", type=int, default=16, dest="batch_size", help="batch size")
     parser.add_argument("--no-amp", action="store_false", dest="amp", help="deactivate amp")
     parser.add_argument("--classes", type=int, default=5, dest="out_classes", help="output classes")
     parser.add_argument("--mode", type=str, default="original", help="choose either `original` or `fast`")
 
     parser.add_argument("--cpu", type=int, default=8, dest="num_workers", help="number of workers")
     parser.add_argument("--use_gpu", type=bool, default=True, dest="use_gpu", help="whether to use gpu")
-    parser.add_argument("--ckpt", type=str, dest="ckpt_path", help="checkpoint path")
 
     args = parser.parse_args()
     cfg = vars(args)
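
For reference, the reformatted `post_process_np` above does nothing more than a softmax over the two-channel nuclear-prediction (NP) output followed by a 0.5 threshold on the foreground channel. A standalone demo of that tensor logic, with made-up values:

```python
# Standalone demo of the NP-branch post-processing in evaluation.py.
import torch

np_logits = torch.randn(2, 4, 4)         # (channel, H, W): background, foreground
probs = torch.softmax(np_logits, dim=0)  # what Activationsd(..., softmax=True) computes
mask = probs[1:2, ...] > 0.5             # what the Lambdad threshold computes
print(mask.shape, mask.dtype)            # torch.Size([1, 4, 4]) torch.bool
```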
