Commit 1ad4540

Minor format enhance (#1847)
### Checks

- [ ] Avoid including large-size files in the PR.
- [ ] Clean up long text outputs from code cells in the notebook.
- [ ] For security purposes, please check the contents and remove any sensitive info such as user names and private key.
- [ ] Ensure (1) hyperlinks and markdown anchors are working (2) use relative paths for tutorial repo files (3) put figure and graphs in the `./figure` folder
- [ ] Notebook runs automatically `./runner.sh -t <path to .ipynb file>`

Signed-off-by: YunLiu <[email protected]>
1 parent c29337a commit 1ad4540

File tree

6 files changed: +95 −63 lines changed


active_learning/liver_tumor_al/active_learning.py

Lines changed: 1 addition & 1 deletion

@@ -54,7 +54,7 @@
 parser = argparse.ArgumentParser(description="Active Learning Setting")

 # Directory & Json & Seed
-parser.add_argument("--base_dir", default="/home/vishwesh/experiments/al_sanity_test_apr27_2023", type=str)
+parser.add_argument("--base_dir", default="./experiments/al_sanity_test_apr27_2023", type=str)
 parser.add_argument("--data_root", default="/scratch_2/data_2021/68111", type=str)
 parser.add_argument("--json_path", default="/scratch_2/data_2021/68111/dataset_val_test_0_debug.json", type=str)
 parser.add_argument("--seed", default=102, type=int)

active_learning/tool_tracking_al/active_learning.py

Lines changed: 1 addition & 1 deletion

@@ -47,7 +47,7 @@
 parser = argparse.ArgumentParser(description="Active Learning Settings")

 # Directory & Json & Seed
-parser.add_argument("--base_dir", default="/home/vishwesh/experiments/robo_tool_experiments/variance_sanity", type=str)
+parser.add_argument("--base_dir", default="./experiments/robo_tool_experiments/variance_sanity", type=str)
 parser.add_argument("--data_root", default="/scratch_2/robo_tool_dataset_2023", type=str)
 parser.add_argument("--json_path", default="/scratch_2/robo_tool_dataset_2023/data_list.json", type=str)
 parser.add_argument("--seed", default=120, type=int)
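Both files above swap a hard-coded home-directory default for a relative path, so the scripts no longer leak a developer's environment and run unchanged on other machines. A minimal sketch of the pattern (only `--base_dir` and `--seed` are shown; the empty argument list stands in for "run with defaults"):

```python
import argparse

parser = argparse.ArgumentParser(description="Active Learning Setting")
# A relative default keeps the script portable across machines;
# users can still override it on the command line.
parser.add_argument("--base_dir", default="./experiments/al_sanity_test_apr27_2023", type=str)
parser.add_argument("--seed", default=102, type=int)

args = parser.parse_args([])  # empty list: use the defaults
print(args.base_dir)
```

Overriding remains a one-flag change, e.g. `python active_learning.py --base_dir /data/run1`.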

generation/2d_vqvae/2d_vqvae_tutorial.ipynb

Lines changed: 42 additions & 35 deletions

@@ -25,10 +25,15 @@
 "\n",
 "The VQVAE can also be used as a generative model if an autoregressor model (e.g., PixelCNN, Decoder Transformer) is trained on the discrete latent representations of the VQVAE bottleneck. This falls outside of the scope of this tutorial.\n",
 "\n",
-"[1] - Oord et al. \"Neural Discrete Representation Learning\" https://arxiv.org/abs/1711.00937\n",
-"\n",
-"\n",
-"### Setup environment"
+"[1] - Oord et al. \"Neural Discrete Representation Learning\" https://arxiv.org/abs/1711.00937"
+]
+},
+{
+"cell_type": "markdown",
+"id": "d167a850",
+"metadata": {},
+"source": [
+"## Setup environment"
 ]
 },
 {
@@ -50,7 +55,7 @@
 "id": "6b8ae5e8",
 "metadata": {},
 "source": [
-"### Setup imports"
+"## Setup imports"
 ]
 },
 {
@@ -118,32 +123,16 @@
 "print_config()"
 ]
 },
-{
-"cell_type": "code",
-"execution_count": 2,
-"id": "f7f7056e",
-"metadata": {},
-"outputs": [],
-"source": [
-"# for reproducibility purposes set a seed\n",
-"set_determinism(42)"
-]
-},
-{
-"cell_type": "markdown",
-"id": "51a9a628",
-"metadata": {},
-"source": [
-"### Setup a data directory and download dataset"
-]
-},
 {
 "cell_type": "markdown",
 "id": "9b9b6e14",
 "metadata": {},
 "source": [
-"Specify a `MONAI_DATA_DIRECTORY` variable, where the data will be downloaded. If not\n",
-"specified a temporary directory will be used."
+"## Setup data directory\n",
+"\n",
+"You can specify a directory with the `MONAI_DATA_DIRECTORY` environment variable. \n",
+"This allows you to save results and reuse downloads. \n",
+"If not specified a temporary directory will be used."
 ]
 },
 {
@@ -166,12 +155,30 @@
 "print(root_dir)"
 ]
 },
+{
+"cell_type": "markdown",
+"id": "d49ee071",
+"metadata": {},
+"source": [
+"## Set deterministic"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"id": "3b010865",
+"metadata": {},
+"outputs": [],
+"source": [
+"set_determinism(42)"
+]
+},
 {
 "cell_type": "markdown",
 "id": "049661aa",
 "metadata": {},
 "source": [
-"### Download the training set"
+"## Download the training set"
 ]
 },
 {
@@ -248,7 +255,7 @@
 "id": "d437adbd",
 "metadata": {},
 "source": [
-"### Visualise examples from the training set"
+"## Visualise examples from the training set"
 ]
 },
 {
@@ -282,7 +289,7 @@
 "id": "8c6ca19a",
 "metadata": {},
 "source": [
-"### Download the validation set"
+"## Download the validation set"
 ]
 },
 {
@@ -327,7 +334,7 @@
 "id": "1cfa9906",
 "metadata": {},
 "source": [
-"### Define network, optimizer and losses"
+"## Define network, optimizer and losses"
 ]
 },
 {
@@ -377,7 +384,7 @@
 "id": "331aa4fc",
 "metadata": {},
 "source": [
-"### Model training\n",
+"## Model training\n",
 "Here, we are training our model for 100 epochs (training time: ~60 minutes)."
 ]
 },
@@ -474,7 +481,7 @@
 "id": "ab3f5e08",
 "metadata": {},
 "source": [
-"### Learning curves"
+"## Learning curves"
 ]
 },
 {
@@ -518,7 +525,7 @@
 "id": "e7c7b3b4",
 "metadata": {},
 "source": [
-"### Plotting evolution of reconstructed images"
+"## Plotting evolution of reconstructed images"
 ]
 },
 {
@@ -559,7 +566,7 @@
 "id": "517f51ea",
 "metadata": {},
 "source": [
-"### Plotting the reconstructions from final trained model"
+"## Plotting the reconstructions from final trained model"
 ]
 },
 {
@@ -595,7 +602,7 @@
 "id": "222c56d3",
 "metadata": {},
 "source": [
-"### Cleanup data directory\n",
+"## Cleanup data directory\n",
 "\n",
 "Remove directory if a temporary was used."
 ]
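The reworded "Setup data directory" cell describes MONAI's usual data-directory convention. The exact cell body is not part of this diff (only its `print(root_dir)` tail appears as context), but the pattern it describes can be sketched as follows, assuming the standard env-var-or-tempdir fallback:

```python
import os
import tempfile

# Use MONAI_DATA_DIRECTORY if set, so downloads and results are reused
# across runs; otherwise fall back to a throwaway temporary directory.
directory = os.environ.get("MONAI_DATA_DIRECTORY")
root_dir = tempfile.mkdtemp() if directory is None else directory
print(root_dir)
```

With `MONAI_DATA_DIRECTORY` exported in the shell, repeated runs of the notebook skip re-downloading the dataset; without it, the tempdir is disposable, which is why the tutorial's final "Cleanup data directory" cell removes it.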

generation/2d_vqvae_transformer/2d_vqvae_transformer_tutorial.ipynb

Lines changed: 35 additions & 22 deletions

@@ -28,10 +28,15 @@
 "\n",
 "[1] - Oord et al. \"Neural Discrete Representation Learning\" https://arxiv.org/abs/1711.00937\n",
 "\n",
-"[2] - Tudosiu et al. \"Morphology-Preserving Autoregressive 3D Generative Modelling of the Brain\" https://arxiv.org/abs/2209.03177\n",
-"\n",
-"\n",
-"### Setup environment"
+"[2] - Tudosiu et al. \"Morphology-Preserving Autoregressive 3D Generative Modelling of the Brain\" https://arxiv.org/abs/2209.03177"
+]
+},
+{
+"cell_type": "markdown",
+"id": "3a0642b8",
+"metadata": {},
+"source": [
+"## Setup environment"
 ]
 },
 {
@@ -51,7 +56,7 @@
 "id": "e3440cd3",
 "metadata": {},
 "source": [
-"### Setup imports"
+"## Setup imports"
 ]
 },
 {
@@ -129,26 +134,16 @@
 "print_config()"
 ]
 },
-{
-"cell_type": "code",
-"execution_count": 2,
-"id": "e11e1e9c",
-"metadata": {},
-"outputs": [],
-"source": [
-"# for reproducibility purposes set a seed\n",
-"set_determinism(42)"
-]
-},
 {
 "cell_type": "markdown",
 "id": "4f71d660",
 "metadata": {},
 "source": [
-"### Setup a data directory and download dataset\n",
+"## Setup data directory\n",
 "\n",
-"Specify a `MONAI_DATA_DIRECTORY` variable, where the data will be downloaded. If not\n",
-"specified a temporary directory will be used."
+"You can specify a directory with the `MONAI_DATA_DIRECTORY` environment variable. \n",
+"This allows you to save results and reuse downloads. \n",
+"If not specified a temporary directory will be used."
 ]
 },
 {
@@ -171,12 +166,30 @@
 "print(root_dir)"
 ]
 },
+{
+"cell_type": "markdown",
+"id": "0bdd379a",
+"metadata": {},
+"source": [
+"## Set deterministic training for reproducibility"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"id": "8a5c290d",
+"metadata": {},
+"outputs": [],
+"source": [
+"set_determinism(42)"
+]
+},
 {
 "cell_type": "markdown",
 "id": "c6975501",
 "metadata": {},
 "source": [
-"### Download training data"
+"## Download training data"
 ]
 },
 {
@@ -252,7 +265,7 @@
 "id": "9eb87583",
 "metadata": {},
 "source": [
-"### Visualse some examples from the dataset"
+"## Visualse some examples from the dataset"
 ]
 },
 {
@@ -286,7 +299,7 @@
 "id": "a9f6b281",
 "metadata": {},
 "source": [
-"### Download Validation Data"
+"## Download Validation Data"
 ]
 },
 {
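Both VQVAE notebooks move the `set_determinism(42)` call out of the imports section into its own cell after the data-directory setup. `set_determinism` is MONAI's seeding helper; its effect can be illustrated with a simplified stand-in built only on the standard library (this is not MONAI's implementation, which also seeds NumPy and PyTorch and configures cuDNN):

```python
import random


def set_determinism_sketch(seed: int) -> None:
    # Seed the RNG so that repeated runs draw identical random sequences.
    # MONAI's real helper seeds random, numpy, and torch in one call.
    random.seed(seed)


set_determinism_sketch(42)
first = [random.random() for _ in range(3)]
set_determinism_sketch(42)
second = [random.random() for _ in range(3)]
assert first == second  # re-seeding reproduces the same draws
```

Calling it once near the top of a notebook, as these tutorials now do, makes training runs repeatable without touching the rest of the code.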

generation/maisi/maisi_inference_tutorial.ipynb

Lines changed: 8 additions & 2 deletions

@@ -18,8 +18,14 @@
 "\n",
 "# MAISI Inference Tutorial\n",
 "\n",
-"This tutorial illustrates how to use trained MAISI model and codebase to generate synthetic 3D images and paired masks.\n",
-"\n",
+"This tutorial illustrates how to use trained MAISI model and codebase to generate synthetic 3D images and paired masks."
+]
+},
+{
+"cell_type": "markdown",
+"id": "301dab0b",
+"metadata": {},
+"source": [
 "## Setup environment"
 ]
 },

generation/maisi/maisi_train_vae_tutorial.ipynb

Lines changed: 8 additions & 2 deletions

@@ -18,8 +18,14 @@
 "\n",
 "# MAISI VAE Training Tutorial\n",
 "\n",
-"This tutorial illustrates how to train the VAE model in MAISI on CT and MRI datasets. The VAE model is used for latent feature compression, which significantly reduce the memory usage of the diffusion model. The released VAE model weights can work on both CT and MRI images.\n",
-"\n",
+"This tutorial illustrates how to train the VAE model in MAISI on CT and MRI datasets. The VAE model is used for latent feature compression, which significantly reduce the memory usage of the diffusion model. The released VAE model weights can work on both CT and MRI images."
+]
+},
+{
+"cell_type": "markdown",
+"id": "12ff48d3",
+"metadata": {},
+"source": [
 "## Setup environment"
 ]
 },
