Fix typos in UNet_input_size_constrains.ipynb #1481

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Merged
merged 3 commits, Aug 8, 2023
2 changes: 1 addition & 1 deletion README.md
@@ -275,7 +275,7 @@ This notebook shows how to quickly set up training workflow based on `MedNISTDat
This notebook shows how to load the TCIA data with CSVDataset from CSV file and extract information for TCIA data to fetch DICOM images based on REST API.
##### [transforms_demo_2d](./modules/transforms_demo_2d.ipynb)
This notebook demonstrates the image transformations on histology images using
- ##### [UNet_input_size_constrains](./modules/UNet_input_size_constrains.ipynb)
+ ##### [UNet_input_size_constraints](./modules/UNet_input_size_constraints.ipynb)
This tutorial shows how to determine a reasonable spatial size of the input data for MONAI UNet, which not only supports residual units, but also can use more hyperparameters (like `strides`, `kernel_size` and `up_kernel_size`) than the basic UNet implementation.
##### [TorchIO, MONAI, PyTorch Lightning](./modules/TorchIO_MONAI_PyTorch_Lightning.ipynb)
This notebook demonstrates how the three libraries from the official PyTorch Ecosystem can be used together to segment the hippocampus on brain MRIs from the Medical Segmentation Decathlon.
modules/UNet_input_size_constrains.ipynb → modules/UNet_input_size_constraints.ipynb
@@ -1,7 +1,6 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "0aed74fd",
"metadata": {},
@@ -17,13 +16,13 @@
"See the License for the specific language governing permissions and \n",
"limitations under the License.\n",
"\n",
"# UNet input size constrains\n",
"# UNet input size constraints\n",
"\n",
"MONAI provides an enhanced version of UNet (``monai.networks.nets.UNet ``), which not only supports residual units, but also can use more hyperparameters (like ``strides``, ``kernel_size`` and ``up_kernel_size``) than ``monai.networks.nets.BasicUNet``. However, ``UNet`` has some constrains for both network hyperparameters and sizes of input.\n",
"MONAI provides an enhanced version of UNet (``monai.networks.nets.UNet ``), which not only supports residual units, but also can use more hyperparameters (like ``strides``, ``kernel_size`` and ``up_kernel_size``) than ``monai.networks.nets.BasicUNet``. However, ``UNet`` has some constraints for both network hyperparameters and sizes of input.\n",
"\n",
"The constrains of hyperparameters can be found in the docstring of the network, and this tutorial is focused on how to determine a reasonable input size.\n",
"The constraints of hyperparameters can be found in the docstring of the network, and this tutorial is focused on how to determine a reasonable input size.\n",
"\n",
"The last section: **Constrains of UNet** shows the conclusions."
"The last section: **Constraints of UNet** shows the conclusions."
]
},
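
For reference, here is a minimal sketch of constructing the enhanced UNet with the hyperparameters named above (the values are illustrative, not taken from the notebook):

```python
# Minimal sketch (illustrative values): unlike BasicUNet, this UNet
# exposes strides, kernel_size, up_kernel_size and residual units.
import torch
from monai.networks.nets import UNet

net = UNet(
    spatial_dims=3,
    in_channels=1,
    out_channels=2,
    channels=(16, 32, 64),  # one entry per resolution level
    strides=(2, 2),         # len(strides) == len(channels) - 1
    kernel_size=3,
    up_kernel_size=3,
    num_res_units=2,        # enables residual units
)
x = torch.rand(2, 1, 64, 64, 64)  # a size satisfying the constraints below
print(net(x).shape)               # torch.Size([2, 2, 64, 64, 64])
```
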
{
Expand Down Expand Up @@ -197,18 +196,18 @@
"3. Normalization layers (`InstanceNorm3d`).\n",
"4. Convolution layers (`Conv` and `ConvTranspose`).\n",
"\n",
"As for the layers, convolution layers may change the size of the input, and normalization layers may have extra constrains of the input size.\n",
"As for the modules, the `SkipConnection` module also has some constrains.\n",
"As for the layers, convolution layers may change the size of the input, and normalization layers may have extra constraints of the input size.\n",
"As for the modules, the `SkipConnection` module also has some constraints.\n",
"\n",
"Consequently, This tutorial shows the constrains of convolution layers, normalization layers and the `SkipConnection` module respectively."
"Consequently, This tutorial shows the constraints of convolution layers, normalization layers and the `SkipConnection` module respectively."
]
},
{
"cell_type": "markdown",
"id": "bded0633",
"metadata": {},
"source": [
"## Constrains of convolution layers"
"## Constraints of convolution layers"
]
},
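
As a quick numeric check of the size formula this section derives, assuming MONAI's default "same"-style padding of `(kernel_size - 1) // 2` (plain PyTorch, illustrative values):

```python
# A stride-s convolution with padding (k - 1) // 2 maps spatial size v
# to math.floor((v + s - 1) / s), i.e. ceil(v / s).
import math
import torch
import torch.nn as nn

v, k, s = 65, 3, 2
conv = nn.Conv3d(1, 1, kernel_size=k, stride=s, padding=(k - 1) // 2)
out = conv(torch.rand(1, 1, v, v, v))
print(out.shape[2], math.floor((v + s - 1) / s))  # 33 33
```
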
{
@@ -378,7 +377,7 @@
"id": "391b93e6",
"metadata": {},
"source": [
"## Constrains of normalization layers"
"## Constraints of normalization layers"
]
},
{
@@ -407,7 +406,7 @@
"id": "9e47a8ef",
"metadata": {},
"source": [
"In MONAI's norm factories, There are six normalization layers can be used. The official docs can be found in [here](https://pytorch.org/docs/stable/nn.html#normalization-layers), and their constrains is shown in [torch.nn.functional](https://pytorch.org/docs/stable/_modules/torch/nn/functional.html).\n",
"In MONAI's norm factories, There are six normalization layers can be used. The official docs can be found in [here](https://pytorch.org/docs/stable/nn.html#normalization-layers), and their constraints is shown in [torch.nn.functional](https://pytorch.org/docs/stable/_modules/torch/nn/functional.html).\n",
"\n",
"However, the following normalization layers will not be discussed:\n",
"1. SyncBatchNorm, since it only supports `DistributedDataParallel`, please check the official docs for more details.\n",
@@ -447,7 +446,7 @@
"id": "07347476",
"metadata": {},
"source": [
"In reality, when batch size is 1, it's not practical to use batch normalizaton. Therefore, the constrain can be converted to **the batch size should be larger than 1**."
"In reality, when batch size is 1, it's not practical to use batch normalizaton. Therefore, the constraints can be converted to **the batch size should be larger than 1**."
]
},
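
A small sketch of this constraint in isolation (plain PyTorch, illustrative shapes):

```python
# Batch norm in training mode needs more than one value per channel,
# so with a 1x1x1 spatial size the batch size must be larger than 1.
import torch
import torch.nn as nn

bn = nn.BatchNorm3d(num_features=8)
print(bn(torch.rand(2, 8, 1, 1, 1)).shape)  # OK: two values per channel
try:
    bn(torch.rand(1, 8, 1, 1, 1))           # only one value per channel
except ValueError as err:
    print(err)                              # raised in training mode
```
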
{
@@ -482,7 +481,7 @@
"source": [
"### local response normalization\n",
"\n",
"**No constrain**. For example:"
"**No constraint**. For example:"
]
},
{
@@ -511,15 +510,15 @@
"id": "bd830ec6",
"metadata": {},
"source": [
"## Constrains of SkipConnection"
"## Constraints of SkipConnection"
]
},
{
"cell_type": "markdown",
"id": "37f2aff4",
"metadata": {},
"source": [
"In this section, we will check if the module [SkipConnection](https://github.com/Project-MONAI/MONAI/blob/dev/monai/networks/layers/simplelayers.py) itself has more constrains for the input size.\n",
"In this section, we will check if the module [SkipConnection](https://github.com/Project-MONAI/MONAI/blob/dev/monai/networks/layers/simplelayers.py) itself has more constraints for the input size.\n",
"\n",
"In `UNet`, the `SkipConnection` is called via:\n",
"\n",
@@ -545,7 +544,7 @@
"id": "cdd7033e",
"metadata": {},
"source": [
"If `len(channels) = 2`, there will only have one `SkipConnection` module in the network, and the module is built by a single down layer with `stride = 1`. From the formulas we achieved in the previous section, we know that this layer will not change the size, thus we only need to meet the constrains from the inside normalization layer:\n",
"If `len(channels) = 2`, there will only have one `SkipConnection` module in the network, and the module is built by a single down layer with `stride = 1`. From the formulas we achieved in the previous section, we know that this layer will not change the size, thus we only need to meet the constraints from the inside normalization layer:\n",
"\n",
"1. When using batch normalization, the batch size should larger than 1.\n",
"\n",
@@ -565,7 +564,7 @@
"id": "2efce3e2",
"metadata": {},
"source": [
"If `len(channels) > 2`, more `SkipConnection` module will be built and each of the module is consisted with one down layer and one up layer. Consequently, **the output of the up layer should has the same spatial sizes as the input before entering into the down layer**. The corresponding stride values for these modules are coming from `strides[1:]`, hence for each stride value `s` from `strides[1:]`, for each spatial size value `v` of the input, the constrain of the corresponding `SkipConnection` module is:\n",
"If `len(channels) > 2`, more `SkipConnection` module will be built and each of the module is consisted with one down layer and one up layer. Consequently, **the output of the up layer should has the same spatial sizes as the input before entering into the down layer**. The corresponding stride values for these modules are coming from `strides[1:]`, hence for each stride value `s` from `strides[1:]`, for each spatial size value `v` of the input, the constraint of the corresponding `SkipConnection` module is:\n",
"\n",
"```\n",
"math.floor((v + s - 1) / s) = v / s\n",
@@ -590,7 +589,7 @@
"\n",
"**`np.remainder(v, np.prod(strides[1:])) == 0`**\n",
"\n",
"In addition, there may have more constrains from normalization layers:\n",
"In addition, there may have more constraints from normalization layers:\n",
"\n",
"1. When using batch normalization, the batch size of the input should be larger than 1.\n",
"\n",
Expand All @@ -602,15 +601,15 @@
"id": "8e2d99ef",
"metadata": {},
"source": [
"## Constrains of UNet"
"## Constraints of UNet"
]
},
{
"cell_type": "markdown",
"id": "554744bc",
"metadata": {},
"source": [
"As the first section discussed, UNet is consisted with 1) a down layer, 2) one or mode skip connection module(s) and 3) an up layer. Based on the analyses for each single layer/module, the constrains of the network can be summarized as follow."
"As the first section discussed, UNet is consisted with 1) a down layer, 2) one or mode skip connection module(s) and 3) an up layer. Based on the analyses for each single layer/module, the constraints of the network can be summarized as follow."
]
},
{
@@ -626,10 +625,10 @@
"id": "8cd1d3b5",
"metadata": {},
"source": [
"If `len(channels) == 2`, `strides` must be a single value, thus assume `s = strides`, and the input size is `[B, C, H, W, D]`. The constrains are:\n",
"If `len(channels) == 2`, `strides` must be a single value, thus assume `s = strides`, and the input size is `[B, C, H, W, D]`. The constraints are:\n",
"\n",
"1. If using batch normalization: **`B > 1`.**\n",
"2. If using local response normalization: no constrain.\n",
"2. If using local response normalization: no constraint.\n",
"3. If using instance normalization, assume `d = max(H, W, D)`, then `math.floor((d + s - 1) / s) >= 2`, which means **`d >= s + 1`.**\n",
"\n",
"The following are the corresponding examples:"
@@ -749,7 +748,7 @@
"id": "c804fa49",
"metadata": {},
"source": [
"Assume the input size is `[B, C, H, W, D]`, and `s = strides`. The common constrains are:\n",
"Assume the input size is `[B, C, H, W, D]`, and `s = strides`. The common constraints are:\n",
"\n",
"```\n",
"For v in [H, W, D]:\n",
@@ -758,7 +757,7 @@
"```\n",
"In addition,\n",
"1. If using batch normalization: **`B > 1`.**\n",
"2. If using local response normalization: no more constrain.\n",
"2. If using local response normalization: no more constraint.\n",
"3. If using instance normalization, then:\n",
"```\n",
"d = max(H, W, D)\n",
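
Putting the pieces together, a hypothetical helper that applies the summarized checks; the function name and the instance-norm bound are assumptions (the exact bound is truncated in this view), based on the `d >= s + 1` rule quoted above:

```python
# Hypothetical helper (not from the notebook) applying the summarized
# constraints; the instance-norm bound below is an assumption.
import numpy as np

def check_unet_input(spatial, strides, batch_size=2, norm="instance"):
    inner = int(np.prod(strides[1:])) if len(strides) > 1 else 1
    if any(np.remainder(v, inner) != 0 for v in spatial):
        return False                     # SkipConnection size mismatch
    if norm == "batch" and batch_size <= 1:
        return False                     # batch norm needs B > 1
    if norm == "instance":
        d = max(spatial)
        if d // inner < strides[0] + 1:  # assumed bound: d' >= s + 1
            return False
    return True

print(check_unet_input((64, 64, 64), (2, 2, 2)))  # True
print(check_unet_input((63, 64, 64), (2, 2, 2)))  # False
```
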
10 changes: 5 additions & 5 deletions modules/network_contraints/unet_plusplus.ipynb
@@ -15,13 +15,13 @@
"See the License for the specific language governing permissions and \n",
"limitations under the License.\n",
"\n",
"# UNet++ input size constrains\n",
"# UNet++ input size constraints\n",
"\n",
"MONAI provides an enhanced version of UNet (``monai.networks.nets.UNet ``), which not only supports residual units, but also can use more hyperparameters (like ``strides``, ``kernel_size`` and ``up_kernel_size``) than ``monai.networks.nets.BasicUNet``. However, ``UNet`` has some constrains for both network hyperparameters and sizes of input.\n",
"MONAI provides an enhanced version of UNet (``monai.networks.nets.UNet ``), which not only supports residual units, but also can use more hyperparameters (like ``strides``, ``kernel_size`` and ``up_kernel_size``) than ``monai.networks.nets.BasicUNet``. However, ``UNet`` has some constraints for both network hyperparameters and sizes of input.\n",
"\n",
"MONAI provides a version of UNET++ (`` monai.networks.nets.BasicUnetPlusPlus ``), with fixed num. of down-scale layer, strides of 2. The configurations you can change are: the number input and output channels, number of hidden channels (6 different layers), norm and activation, bias of convolution, dropout rate, and up-sampling model. As `UNET`, different model configurations can affect the input shape.\n",
"\n",
"The constrains of hyper-parameters can be found in the docstring of the network, and this tutorial is focused on how to determine a reasonable input size."
"The constraints of hyper-parameters can be found in the docstring of the network, and this tutorial is focused on how to determine a reasonable input size."
]
},
{
@@ -130,12 +130,12 @@
"source": [
"## Normalization\n",
"\n",
"UNET++ use the same `TwoConv`, `Down`, and `UpCat` as UNet. Therefore, you can referred to the `modules/UNet_input_size_constrains.ipynb` for break down analysis. For summary, the constraints for these types of normalization are:\n",
"UNET++ use the same `TwoConv`, `Down`, and `UpCat` as UNet. Therefore, you can referred to the `modules/UNet_input_size_constraints.ipynb` for break down analysis. For summary, the constraints for these types of normalization are:\n",
"\n",
"- Instance Norm: the product of the spatial dimensions must be > 1 (not including channel and batch)\n",
"- Batch Norm: the product of the spatial dimensions and batch must be > 1 (not including channels). For best training results, `batch_size` should be larger than 1\n",
"- Local Response Norm: No constraint.\n",
"- Other Normalization: please referred to `modules/UNet_input_size_constrains.ipynb`\n",
"- Other Normalization: please referred to `modules/UNet_input_size_constraints.ipynb`\n",
"\n",
"As for UNET++ have 4 down-sampling blocks with 2x kernel size, with no argument to change this behavior, the smallest edge we can have is `2**4 = 16`, and after the last down-sampling block, the `vector.shape = [..., ..., 1, 1]` or (`[..., ..., 1, 1, 1]` for 3D), which will cause error for the Normalization layer.\n",
"\n",
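
A sketch of the edge-size limit described above (2D for brevity; constructor defaults assumed from the surrounding text):

```python
# With four fixed stride-2 down-samplings, a 32-pixel edge leaves a 2x2
# bottleneck; a 16-pixel edge leaves 1x1, which breaks instance norm
# in training mode.
import torch
from monai.networks.nets import BasicUnetPlusPlus

net = BasicUnetPlusPlus(spatial_dims=2, in_channels=1, out_channels=2)
outs = net(torch.rand(1, 1, 32, 32))  # bottleneck is 2x2: fine
print([o.shape for o in outs])        # the network returns a list of outputs
try:
    net(torch.rand(1, 1, 16, 16))     # bottleneck is 1x1: normalization error
except ValueError as err:
    print(err)
```
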
2 changes: 1 addition & 1 deletion runner.sh
@@ -33,7 +33,7 @@ doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" TorchIO_MONAI_PyTor
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" image_dataset.ipynb)
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" decollate_batch.ipynb)
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" csv_datasets.ipynb)
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" UNet_input_size_constrains.ipynb)
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" UNet_input_size_constraints.ipynb)
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" unet_plusplus.ipynb)
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" network_api.ipynb)
doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" tcia_csv_processing.ipynb)