integrations/nuance_pin/README.md
10 additions & 12 deletions
@@ -8,7 +8,7 @@ with minimal code changes.

## Prerequisites

-Before setting up and running the example MONAI spleen segmentation app to run as a Nuance PIN App, the user will need to install/download the following libraries.
+Before setting up and running the example MONAI lung nodule detection app as a Nuance PIN App, the user will need to install/download the following libraries.
Using a GPU for the example app is optional; however, a GPU is recommended for inference, which is very computationally intensive.

Minimum software requirements:
@@ -30,9 +30,8 @@ cd integrations/nuance_pin

In this folder you will see the following directory structure

```bash
nuance_pin
-├── app          # directory with MONAI app code
-├── lib          # directory where we will place Nuance PIN wheels
-├── model        # directory where we will place the model used by our MONAI app
+├── app/         # directory with MONAI app code
+├── lib/         # you should create this directory; we will place the Nuance PIN wheels here
+├── model/       # directory where we will place the model used by our MONAI app
@@ -48,7 +47,7 @@ To download the test data you may follow the instructions in the [Lung Nodule De

### Download Nuance PIN SDK

-Place the Nuance PIN `ai_service` wheel in the `nuance_pin/lib` folder. This can be obtained in the link provided in step 3 of of the [prerequisites](#prerequisites).
+Place the Nuance PIN `ai_service` wheel in the `nuance_pin/lib` folder. It can be obtained from the link provided in step 4 of the [prerequisites](#prerequisites).
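A minimal sketch of this setup step from the shell, run from the `integrations/nuance_pin` folder (the wheel's download location below is an assumption — point it at wherever you saved the wheel):

```shell
# Create the lib/ folder if it does not exist yet (it is not checked in).
mkdir -p lib
# Copy in the ai_service wheel obtained from Nuance. The source path is an
# assumption -- adjust it to the wheel you actually downloaded.
# cp ~/Downloads/ai_service-*.whl lib/
ls lib
```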
### Running the Example App in the Container
@@ -57,7 +56,7 @@ Now we are ready to build and start the container that runs our MONAI app as a N
docker-compose up --build
```

-If the build is successful the a service will start on `localhost:5000`. We can verify the service is running
+If the build is successful, the service will start on `localhost:5000`. We can verify the service is running
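One quick way to probe the port from the shell (a sketch — this only confirms something is accepting connections on `localhost:5000`; the root path `/` is an assumption, so consult the Nuance PIN SDK documentation for the actual health-check endpoint):

```shell
# Probe the app's port. Exit code 0 from curl means some HTTP response came
# back; a connection failure means the container is not up yet.
if curl -s --max-time 2 -o /dev/null http://localhost:5000/; then
  echo "service is up"
else
  echo "service not reachable (yet)"
fi
```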
-This example integration may be modified to fit any existing MONAI app, however, there may be caveats.
+This example integration may be modified to fit any existing MONAI app by tailoring the files within the `app/` directory; however, there may be caveats.

Nuance PIN requires all artifacts present in the output folder to also be added to the `resultManifest.json` output file
to consider the run successful. To see what this means in practical terms, check the `resultManifest.json` output from the
example app we ran in the previous sections. You will notice an entry in `resultManifest.json` that corresponds to the DICOM
@@ -153,17 +152,16 @@ SEG output generated by the underlying MONAI app
      {
        "documentType": "application/dicom",
        "groupCode": "default",
-       "name": "dicom_seg-DICOMSEG.dcm",
+       "name": "gsps.dcm",
        "trackingUids": []
      }
    ]
  }
]
},
```
-This entry is generated by `app_wrapper.py`, which takes care of adding any DICOM present in the output folder in the `resultManifest.json`
-to ensure that existing MONAI apps complete successfully when deployed in Nuance. In general, however, the developer may need to tailor some
-of the code in `app_wrapper.py` to provide more insight to Nuance's network, such as adding findings, conclusions, etc. and generating more insight
+This entry is generated automatically by Nuance's `ai_service` library as a result of uploading the DICOM GSPS object in `app/post_inference_ops.py`.
+In general, however, the developer may need to tailor some of the code in `app_wrapper.py` to provide more insight for Nuance's network, such as adding findings, conclusions, etc., and generating further insight
using SNOMED codes. All of this is handled within the Nuance PIN SDK libraries - for more information please consult the Nuance PIN [documentation](https://www.nuance.com/healthcare/diagnostics-solutions/precision-imaging-network.html).
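Because every artifact in the output folder must be listed in the manifest for the run to count as successful, a quick shell spot-check can catch missing entries during development. This is only a sketch: the `output` directory name is an assumption, and it simply greps the manifest for each DICOM filename rather than parsing the JSON.

```shell
# Spot-check: every .dcm file in the output folder should be named somewhere
# in resultManifest.json, or Nuance PIN will treat the run as failed.
check_manifest() {
  out_dir=$1
  manifest="$out_dir/resultManifest.json"
  for f in "$out_dir"/*.dcm; do
    [ -e "$f" ] || continue            # glob matched nothing; skip
    name=$(basename "$f")
    if grep -qF "\"$name\"" "$manifest"; then
      echo "OK: $name is listed in $manifest"
    else
      echo "MISSING: $name is not listed in $manifest"
    fi
  done
}

# "output" is an assumed path -- point this at your app's output directory.
check_manifest "output"
```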
In simpler cases, the developer will need to place their code and model under `nuance_pin`. Placing the model under `model` is optional as the model may be placed