This project is no longer under active maintenance. It is read-only, but you can still clone or fork the repo. Check here for further info. Please contact innereye_info@service.microsoft.com if you run into trouble with the "Archived" state of the repo.
This document contains instructions specifically for InnerEye-Gateway https://github.com/microsoft/InnerEye-Gateway.
For the InnerEye-Inference repo, please see its own documentation.
The InnerEye-Gateway comprises Windows services that act as a DICOM Service Class Provider. After an Association Request to C-STORE a set of DICOM image files, these will be anonymised by removing a user-defined set of identifiers, and passed to a web service running InnerEye-Inference. Inference will then pass them to an instance of InnerEye-Deeplearning running on Azure to execute InnerEye-DeepLearning models. The result is downloaded, deanonymised and passed to a configurable DICOM destination. All DICOM image files, and the model output, are automatically deleted immediately after use.
The gateway should be installed on a machine within your DICOM network that is able to access a running instance of InnerEye-Inference.
- InnerEye-Gateway
- Overview
- Contents
- Getting Started
- License Keys
- To Build The Gateway
- To Run The Tests
- To Run The Gateway In Development
- To Manually Test The Gateway
- To Run The Gateway In Production
- Architecture
- Anonymisation
- Configuration
- OWASP
- Licensing
- Contributing
- Microsoft Open Source Code of Conduct
## Getting Started

To get started with setting up this project you will need the following prerequisites:

- A machine running Windows 10, because the Gateway runs as Windows services. It requires a network card to act as a DICOM SCP and to access the InnerEye-Inference web service. A large hard disk is not required because the DICOM image files are kept only briefly.
- Visual Studio 2019 Community Edition (Download Visual Studio Community Edition)
- .NET Framework 4.6.2 (Download .NET Framework)
- To clone the repository, a git client with large file support, e.g. git for windows
- WiX Toolset for building the installer (Download WixToolset)
- Ensure you have run the `download_dcmtk.ps1` PowerShell script to download the two DICOM tools (DICOM Toolkit and Dicom3tools) required for testing the gateway. Note that these tools are only required for testing, but the test projects, and therefore the gateway, will not build without them. To enable and run the script:
  - Start PowerShell with the "Run as administrator" option.
  - Run the PowerShell command: `Set-ExecutionPolicy -ExecutionPolicy Unrestricted`
  - Agree to the change.
  - From the root directory, run: `.\Source\Microsoft.Gateway\download_dcmtk.ps1`
  - If you get an error similar to `Invoke-WebRequest : The remote server returned an error: (404) Not Found.` then it is highly likely that the Dicom3tools link is outdated. Navigate to dclunie.com, locate the latest work-in-progress copy of the windows dicom3tools (this link should be correct), copy the download link of the latest .zip file displayed, and replace the URL in line 5 of the `download_dcmtk.ps1` script before running again (you may need to remove files downloaded on the failed run first).
- InnerEye-Inference service from https://github.com/microsoft/InnerEye-Inference running as a web service, using, for example, Azure Web Services. Note the URI that the service has been deployed to and the license key stored in the environment variable `CUSTOMCONNSTR_API_AUTH_SECRET` on the InnerEye-Inference deployment; they are needed as explained below.
## License Keys

For security, in InnerEye-Gateway and InnerEye-Inference the license keys are stored in environment variables and never stored in JSON or other configuration files. InnerEye-Inference uses the environment variable `CUSTOMCONNSTR_API_AUTH_SECRET`, whereas InnerEye-Gateway allows the name of the environment variable to be configured in the JSON file via the `LicenseKeyEnvVar` property. This is so that the tests may be configured to run against a different InnerEye-Inference web service. Note also that because the applications run as Windows services, the environment variables should be system variables, not user variables, so that the services can access them.

Alongside `LicenseKeyEnvVar`, the property `InferenceUri` holds the URI of a running instance of InnerEye-Inference, and the environment variable identified by `LicenseKeyEnvVar` should hold the license key for that instance. The license key is a user-created string that is defined when you configure your inference service.

For example, if `InferenceUri` is "https://myinnereyeinference.azurewebsites.net", `LicenseKeyEnvVar` is "MY_GATEWAY_API_AUTH_SECRET", and the contents of the environment variable `CUSTOMCONNSTR_API_AUTH_SECRET` used for "https://myinnereyeinference.azurewebsites.net" is MYINFERENCELICENSEKEY, then set this environment variable with the following PowerShell command, running as administrator:

```powershell
setx MY_GATEWAY_API_AUTH_SECRET MYINFERENCELICENSEKEY /M
```
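The lookup this arrangement implies can be pictured with a short Python sketch (illustrative only; the Gateway itself is a .NET application, and `resolve_license_key` is a hypothetical helper name, not part of its code):

```python
import os

def resolve_license_key(license_key_env_var: str) -> str:
    """Return the license key held in the environment variable whose name is
    given by the LicenseKeyEnvVar configuration property.
    Hypothetical helper for illustration; not the Gateway's actual code."""
    return os.environ.get(license_key_env_var, "")

# Simulate the system variable set by `setx MY_GATEWAY_API_AUTH_SECRET ... /M`
os.environ["MY_GATEWAY_API_AUTH_SECRET"] = "MYINFERENCELICENSEKEY"
print(resolve_license_key("MY_GATEWAY_API_AUTH_SECRET"))  # MYINFERENCELICENSEKEY
```

The indirection through `LicenseKeyEnvVar` is what lets the tests point at a different InnerEye-Inference deployment simply by naming a different system variable.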
## To Build The Gateway

- Clone the repository.
- Download the DICOM tools using the PowerShell script in Getting Started.
- Open the project solution file ./Source/Microsoft.Gateway/Microsoft.Gateway.sln in Visual Studio 2019.
- Set the project platform to x64.
- Build the Solution.
## To Run The Tests

- Run Visual Studio as Administrator, because one of the tests needs to access system environment variables.
- For end-to-end tests:
  - Check the configuration settings for `Microsoft.InnerEye.Listener.Processor` in ./Source/Microsoft.Gateway/Microsoft.InnerEye.Listener.Tests/TestConfigurations/GatewayProcessorConfig.json. Check the following settings in `ProcessorSettings`:
    - `InferenceUri` is the URI for the InnerEye-Inference web service from Getting Started. See License Keys for more details.
    - `LicenseKeyEnvVar` is the name of the environment variable which contains the license key for the InnerEye-Inference web service at `InferenceUri`. See License Keys for more details.
  - Check the configuration settings for the test application entity models in ./Source/Microsoft.Gateway/Microsoft.InnerEye.Listener.Tests/TestConfigurations/GatewayModelRulesConfig/GatewayModelRulesConfig.json. The tests will use the first application entity model found, so check that `ModelId` in the first `ModelsConfig` is a valid PassThrough model id for the configured InnerEye-Inference service. Note that the PassThrough model is a special model intended for testing that simply returns a hard-coded list of structures for any input DICOM image series; it will always return the structures `["SpinalCord", "Lung_R", "Lung_L", "Heart", "Esophagus"]`. If this is not present in the instance of InnerEye-DeepLearning, build the model by running the command: `python InnerEye/ML/runner.py --azureml=True --model=PassThroughModel`
- Make sure the testing Default Processor Architecture is set to x64. This can be checked by navigating to Test > Test Settings > Default Processor Architecture.
- The tests can be executed from Visual Studio 2019 by navigating to Test > Windows > Test Explorer.
- All the available tests will be visible in the Test Explorer.
- A test can be executed by right-clicking on the test and selecting the Run Selected Tests option.
- To execute all the available tests, click on the Run All link in the Test Explorer.
- The log for the test execution can be found in the Output window.
## To Run The Gateway In Development

- The gateway uses multiple startup projects: `Microsoft.InnerEye.Listener.Processor` and `Microsoft.InnerEye.Listener.Receiver`. Configure this in Visual Studio by right-clicking on the Solution in Solution Explorer and selecting "Set Startup Projects...". Click the radio button "Multiple startup projects" and for the projects above select "Start" in the "Action" combo box.
- Check the configuration settings for `Microsoft.InnerEye.Listener.Processor` in ./Source/Microsoft.Gateway/SampleConfigurations/GatewayProcessorConfig.json. More details are in Processor Configuration, but the most important parameters to check are in `ProcessorSettings`:
  - `InferenceUri` is the URI for the InnerEye-Inference web service from Getting Started. See License Keys for more details.
  - `LicenseKeyEnvVar` is the name of the environment variable which contains the license key for the InnerEye-Inference web service at `InferenceUri`. See License Keys for more details.
- Check the configuration settings for `Microsoft.InnerEye.Listener.Receiver` in ./Source/Microsoft.Gateway/SampleConfigurations/GatewayReceiveConfig.json. More details are in Receiver Configuration, but the most important parameter to check is `Port` in `GatewayDicomEndPoint` in `ReceiveServiceConfig`, which holds the IP port the receiver service listens on.
- Check the configuration settings for the application entity models in the folder ./Source/Microsoft.Gateway/SampleConfigurations/GatewayModelRulesConfig. The configuration may be split across multiple JSON files, if desired, and the configuration will be merged. Each file must contain an array of the form:

  ```
  [
    {
      "CallingAET": << Calling application entity title >>,
      "CalledAET": << Called application entity title >>,
      "AETConfig": {
        "Config": {
          "AETConfigType": << One of "Model", "ModelDryRun", or "ModelWithResultDryRun" >>,
          "ModelsConfig": [
            {
              "ModelId": << Model id for model in inference service, e.g. "PassThroughModel:3" >>,
              "ChannelConstraints": [ << Array of channel constraints >> ],
              "TagReplacements": [ << Array of tag replacements >> ]
            },
            ...
          ]
        },
        "Destination": << Destination >>,
        "ShouldReturnImage": << Return image flag >>
      }
    }
  ]
  ```

All JSON files in this folder are loaded and parsed. If the same `CallingAET` and `CalledAET` are found in more than one instance, then the `ModelsConfig` arrays are concatenated to create one instance sharing all the other properties (which are taken from the first instance found). More details are in Model Configuration.
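The merge rule just described can be sketched in Python (illustrative only; the Gateway implements this in C#):

```python
def merge_aet_configs(entries):
    """Merge application entity model configs loaded from several JSON files.
    Entries sharing the same (CallingAET, CalledAET) pair are combined by
    concatenating their ModelsConfig arrays; all other properties are taken
    from the first entry found. Sketch of the rule, not the Gateway's code."""
    merged = {}
    for entry in entries:
        key = (entry["CallingAET"], entry["CalledAET"])
        if key in merged:
            merged[key]["AETConfig"]["Config"]["ModelsConfig"] += \
                entry["AETConfig"]["Config"]["ModelsConfig"]
        else:
            merged[key] = entry
    return list(merged.values())
```

For instance, two files each declaring the pair ("RADIOMICS_APP", "PassThroughModel") would collapse into a single entry whose `ModelsConfig` contains the models from both files, in file order.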
## To Manually Test The Gateway

The DCMTK DICOM Toolkit can be used to push DICOM images into the gateway and to receive the segmented DICOM-RT file that is returned.

As described later in Model Configuration, the gateway can be configured to support multiple application entity models. For a manual test, pick one of these, which will be called the *target AE model* below.

The tool storescp can be used to store the segmented images. A sample PowerShell script is provided here which will start storescp and wait for the segmented images to be returned from the gateway; it is copied below:
```powershell
$ReceiveFolder = "TestReceived"
$AETitle = "PACS"
$Port = 104

if (-not(Test-Path $ReceiveFolder))
{
    New-Item $ReceiveFolder -ItemType Directory
}

& ".\dcmtk-3.6.5-win64-dynamic\bin\storescp.exe" `
    --log-level trace <# log level #> `
    --aetitle $AETitle <# set my AE title #> `
    --output-directory $ReceiveFolder <# write received objects to existing directory TestReceived #> `
    $Port <# port #>
```
Here:

- `$ReceiveFolder` is a folder that will be used to store the returned segmented DICOM images (the sample script creates it if it does not already exist).
- `$AETitle` is the Application Entity Title for this application. In principle it should match the `Title` set in the `Destination` part of the target AE model, but in practice this is not validated.
- `$Port` is the port that this application will listen on. It must match the `Port` set in the `Destination` part of the target AE model.
The tool storescu can be used to send a set of DICOM images to the gateway for segmentation. A sample PowerShell script is provided here which will send a folder of DICOM files to the gateway; it is copied below:
```powershell
$AETitle = "RADIOMICS_APP"
$Call = "PassThroughModel"
$Port = 111
$SendFolder = "..\..\Images\HN\"

& ".\dcmtk-3.6.5-win64-dynamic\bin\storescu.exe" `
    --log-level trace <# log level #> `
    --scan-directories <# scan directories for input files #> `
    --scan-pattern "*.dcm" <# pattern for filename matching (wildcards) #> `
    --aetitle $AETitle <# set my calling AE title #> `
    --call $Call <# set called AE title of peer #> `
    127.0.0.1 <# peer #> `
    $Port <# port #> `
    $SendFolder <# dcmfile-in #>
```
Here:

- `$AETitle` is the calling Application Entity Title for this application. It should match the `CallingAET` set in the target AE model.
- `$Call` is the called Application Entity Title. It should match the `CalledAET` set in the target AE model.
- `$Port` is the port that the gateway is listening on, as configured in `GatewayReceiveConfig.json` in the `ReceiveServiceConfig.GatewayDicomEndPoint.Port` parameter.
- `$SendFolder` is the path to the folder of DICOM images to send to the gateway.
- Check the PowerShell script test_recv.ps1 and start it in one shell. Leave it running; it will print "T: Timeout while waiting for incoming network data".
- Start Visual Studio debugging, making sure that both projects `Microsoft.InnerEye.Listener.Processor` and `Microsoft.InnerEye.Listener.Receiver` are set to start.
- Check the PowerShell script test_push.ps1 and start it in another shell. This will send the folder of images in ./Images/HN/ to the gateway.
- Check that the `ReceiveService` of the `Receiver` application received the images.
- Check that the `UploadService` of the `Processor` application uploaded the images to the Inference service.
- Check that the `DownloadService` of the `Processor` application polled the Inference service for the segmentation result (this event will have "DownloadProgress": 50 until the segmentation is available).
- Check that the `DeleteService` deleted the received DICOM image files.
- Check that the `DownloadService` finally downloaded the segmentation.
- Check that the `PushService` pushed the segmentation to storescp.
- Check that the `DeleteService` deleted the segmentation.
## To Run The Gateway In Production

The WiX Toolset will build an installer when the gateway is built. For a release build it will be:

Note that the installer will include the configuration files in the folder ./Source/Microsoft.Gateway/SampleConfigurations, as well as the receiver and processor applications. Therefore the workflow should be to evolve the configuration files while running the gateway as above, and when the configuration files are correct, rebuild the gateway to include the new configuration files in the installer.

By default the installer will put the files in the folder C:\Program Files\Microsoft InnerEye Gateway, with the following folder structure (with the default GatewayModelRulesConfig):
```
├── Config
│   ├── GatewayModelRulesConfig
│   │   ├── GatewayModelRulesConfigPassThrough1.json
│   │   ├── GatewayModelRulesConfigPassThrough2.json
│   │   ├── GatewayModelRulesConfigPelvis.json
│   ├── GatewayProcessorConfig.json
│   ├── GatewayReceiveConfig.json
├── Microsoft InnerEye Gateway Processor
│   ├── Microsoft.InnerEye.Listener.Processor.exe
│   ├── log4net.config
│   ├── ... (all other dependencies)
├── Microsoft InnerEye Gateway Receiver
│   ├── Microsoft.InnerEye.Listener.Receiver.exe
│   ├── log4net.config
│   ├── ... (all other dependencies)
```
Both processor and receiver are installed as Windows services that start automatically. The `log4net.config` files control the logging for their respective services. They are copied from the source folders, e.g. ./Source/Microsoft.Gateway/Microsoft.InnerEye.Listener.Processor/log4net.config. By default they are both set to create a new log file each day and keep a total of 5 log files.
## Architecture

The gateway runs as two applications or services configured with JSON files. The two applications communicate, and operate internally, using message queues built on SQLite.

To edit this, see: Rebuild Diagram.

The first application is `Microsoft.InnerEye.Listener.Receiver`. This creates a DICOM server that listens on an IP port for incoming DICOM communications and stores accepted requests on a message queue. It is configured in Receiver Configuration.

In detail, this:

- Starts a DICOM server that listens for incoming DICOM messages.
- On a DICOM Association Request, checks against the configured acceptable SOP classes and their transfer syntaxes.
- On a DICOM C-STORE Request, stores the incoming DICOM image file to a subfolder of the RootDicomFolder. There will be one of these for each image slice.
- On a DICOM Association Release Request, puts a new message on the Upload message queue with details of the DICOM Association and the location of the files.

The second application is `Microsoft.InnerEye.Listener.Processor`. This waits for messages on the Upload message queue from the Receiver application. When a new message is received it copies and de-identifies the received DICOM images, sends them to the InnerEye-Inference web service, waits, and then downloads the resulting DICOM-RT file. This is then de-anonymised and sent on to the configured destination. It is configured in Processor Configuration.

In detail, this starts 4 worker tasks: the Delete Service, Download Service, Push Service, and Upload Service, which communicate using the message queues Delete, Download, Push, and Upload.
### Upload Service

This service watches the Upload message queue for messages from the Receiver application that indicate a DICOM request has completed. When a new message is received it:

- Tries to find an application entity model configured with the matching `CalledAET` and `CallingAET` (Model Configuration).
- If there is a corresponding model and it is set to `Model` or `ModelWithResultDryRun`:
  a. Reads the received DICOM image files from the subfolder of the RootDicomFolder.
  b. Groups the DICOM image files by Study Instance UID, and then by Series Instance UID.
  c. Compares each group with each channel config until a set of filtered DICOM image files matches the constraints in the channel config.
  d. Copies the image data and only the required DICOM tags to a set of new images, removing the remaining DICOM tags.
  e. Zips the images and POSTs them to the InnerEye-Inference web service.
  f. If the model's `ShouldReturnImage` is set, copies all sent data to either the Results or DryRunModelWithResultFolder subfolder of the RootDicomFolder.
  g. Creates a new message on the Download queue for this web service request.
  h. Creates a new message on the Delete queue to delete the received DICOM image files.
- If there is a corresponding model and it is set to `ModelDryRun`:
  a. Reads the received DICOM image files from the subfolder of the RootDicomFolder.
  b. Removes the user-defined DICOM tags specified for de-identification.
  c. Saves the anonymised image files to the DryRunModelAnonymizedImage subfolder of the RootDicomFolder.
  d. Creates a new message on the Delete queue to delete the received image files.
- If there is no corresponding model then:
  a. Creates a new message on the Delete queue to delete the received image files.
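The grouping step (by Study Instance UID, then Series Instance UID) can be sketched as follows in Python; the dicts here are stand-ins for parsed DICOM datasets, and this is illustrative only, not the Gateway's C# implementation:

```python
from collections import defaultdict

def group_by_study_then_series(files):
    """Group received DICOM files first by StudyInstanceUID, then by
    SeriesInstanceUID, as the Upload Service does before matching each
    series against the channel constraints. Illustrative sketch only."""
    studies = defaultdict(lambda: defaultdict(list))
    for f in files:
        studies[f["StudyInstanceUID"]][f["SeriesInstanceUID"]].append(f)
    return {study: dict(series) for study, series in studies.items()}

files = [
    {"StudyInstanceUID": "1.2.3", "SeriesInstanceUID": "1.2.3.1", "file": "ct_000.dcm"},
    {"StudyInstanceUID": "1.2.3", "SeriesInstanceUID": "1.2.3.1", "file": "ct_001.dcm"},
    {"StudyInstanceUID": "1.2.3", "SeriesInstanceUID": "1.2.3.2", "file": "mr_000.dcm"},
]
grouped = group_by_study_then_series(files)
print(len(grouped["1.2.3"]["1.2.3.1"]))  # 2
```

Each resulting series group is then compared against the channel constraints until one matches.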
### Download Service

This service watches the Download message queue for messages from the Upload Service that indicate a set of DICOM files has been sent to the InnerEye-Inference web service. When a new message is received it:

- Waits for the InnerEye-Inference web service to perform segmentation, then downloads the resulting DICOM-RT file.
- De-anonymises the DICOM-RT file using data from the original received DICOM image files.
- Saves the de-anonymised file to either the Results or DryRunModelWithResultFolder subfolder of the RootDicomFolder.
- If this is not `ModelWithResultDryRun`, creates a new message on the Push message queue.
### Push Service

This service watches the Push message queue for messages from the Download Service that indicate a DICOM-RT file has been downloaded from the InnerEye-Inference web service. When a new message is received it:

- Tries to find an application entity model configured with the matching `CalledAET` and `CallingAET` (Model Configuration).
- Reads the de-anonymised DICOM-RT file from the Results subfolder of the RootDicomFolder.
- Sends it to the DICOM Destination as set in the application entity model.
- Creates a new message on the Delete queue to delete the received DICOM-RT file.
### Delete Service

This service watches the Delete message queue for messages from the other services. When it receives a message, it deletes the specified files or folders.
## Anonymisation

The InnerEye Gateway allows users to define a set of identifiers that will be removed before the images are sent to the InnerEye-Inference web service. The set of identifier tags for removal is user-defined in GatewayProcessorConfig.json. Users should ensure that the file represents the appropriate level of de-identification, including against any applicable organizational or local requirements.

The Gateway service processes and de-identifies the DICOM files using the procedure below.
- The Receiver application saves the incoming DICOM files directly to a subfolder of RootDicomFolder and passes a message to the Upload Service.
- The Upload Service loads the DICOM files and makes a copy of each file in memory (discarding the image data) to use as reference images for de-anonymisation later. The following tags are always kept, as well as any additional tags specified in `GatewayProcessorConfig.json`:

  ```
  // Patient module
  PatientID, PatientName, PatientBirthDate, PatientSex,
  // Study module
  StudyDate, StudyTime, ReferringPhysicianName, StudyID, AccessionNumber, StudyDescription
  ```

- The Upload Service loads the DICOM files again and makes a second copy of each file in memory, performing the following transformations to the DICOM tags:
  - Tags specified in `GatewayProcessorConfig.json` are kept and transformed according to their specified anonymisation method (i.e. "Keep", "Hash" or "Random").
  - All other tags are discarded.
  - The transformed, anonymised DICOM files are then zipped before sending to the InnerEye-Inference web service.
- Upon completion of the inference run, the Download Service downloads the DICOM-RT file and de-anonymises the segmented images by replacing their hashed, randomised and discarded tags with those saved in the reference image.
Anonymisation is tested extensively across multiple sets of tests. These can be found in the config tests file, the DICOM anonymisation tests file and the end-to-end tests file.
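The keep/hash/discard behaviour described above can be sketched in Python. This is illustrative only: the tag names are examples, and SHA-256 is an assumption for the sketch, not necessarily the Gateway's actual hashing scheme ("Random" is omitted for brevity):

```python
import hashlib

def anonymise_tags(tags, config):
    """Apply the anonymisation methods from DicomTagsAnonymisationConfig:
    "Keep" passes the value through, "Hash" replaces it with a digest, and
    any tag in neither list is discarded. Sketch only; the Gateway's
    actual hash function may differ."""
    anonymised = {}
    for tag, value in tags.items():
        if tag in config.get("Keep", []):
            anonymised[tag] = value
        elif tag in config.get("Hash", []):
            anonymised[tag] = hashlib.sha256(str(value).encode()).hexdigest()
        # tags in neither list are discarded
    return anonymised
```

Note how a tag such as PatientName, absent from both lists, simply never reaches the inference service; the reference copy kept in memory is what restores such values during de-anonymisation.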
## Configuration

Both applications share some common configuration.

The InnerEye-Gateway repeatedly checks whether its configuration files have been modified, so that it can be updated without interrupting the DICOM processing. This is configured with `ConfigurationServiceConfig`. Every `ConfigurationRefreshDelaySeconds` the application checks connectivity to the InnerEye-Inference service (for the Processor service) and checks for changes to the configuration files. If `ConfigCreationDateTime` differs from that of the last loaded configuration file and `ApplyConfigDateTime` has passed, then the application will effectively restart with the new configuration.

The common structure is:
```
{
  "ServiceSettings": {
    "RunAsConsole": boolean
  },
  << service specific configuration >>,
  "ConfigurationServiceConfig": {
    "ConfigCreationDateTime": date/time,
    "ApplyConfigDateTime": date/time,
    "ConfigurationRefreshDelaySeconds": number
  }
}
```
For example:

```
{
  "ServiceSettings": {
    "RunAsConsole": true
  },
  << service specific configuration >>,
  "ConfigurationServiceConfig": {
    "ConfigCreationDateTime": "2020-05-31T20:14:51",
    "ApplyConfigDateTime": "2020-05-31T20:14:51",
    "ConfigurationRefreshDelaySeconds": 60
  }
}
```
Where:

- `RunAsConsole`, if true, means that this application runs as a normal console application; otherwise it runs as a Windows service. This can be used for debugging or testing.
- `ConfigCreationDateTime` is the date and time that this configuration was created. It does not need to be exact, because it is only used to identify when the file has been edited.
- `ApplyConfigDateTime` is the date and time that this configuration should be applied.
- `ConfigurationRefreshDelaySeconds` is the time in seconds the application will wait between checks for a new configuration file.
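The reload decision can be sketched in Python (illustrative only; the Gateway implements this in C#):

```python
from datetime import datetime

def should_apply_new_config(last_creation, new_creation, apply_at, now):
    """A new configuration takes effect only when its ConfigCreationDateTime
    differs from that of the currently loaded file and its
    ApplyConfigDateTime is in the past. Sketch of the rule described above."""
    return new_creation != last_creation and apply_at <= now

now = datetime(2020, 6, 1, 12, 0, 0)
# Unchanged creation time: nothing to reload.
print(should_apply_new_config(datetime(2020, 5, 31, 20, 14, 51),
                              datetime(2020, 5, 31, 20, 14, 51),
                              datetime(2020, 5, 31, 20, 14, 51), now))  # False
```

Setting `ApplyConfigDateTime` in the future therefore lets you stage an edited configuration file and have it picked up at a planned time.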
### Processor Configuration

The processor application is configured by ./Source/Microsoft.Gateway/SampleConfigurations/GatewayProcessorConfig.json. The structure of this configuration file is:
```
{
  "ServiceSettings": {
    "RunAsConsole": boolean
  },
  "ProcessorSettings": {
    "LicenseKeyEnvVar": string,
    "InferenceUri": string
  },
  "DequeueServiceConfig": {
    "MaximumQueueMessageAgeSeconds": number,
    "DeadLetterMoveFrequencySeconds": number
  },
  "DownloadServiceConfig": {
    "DownloadRetryTimespanInSeconds": number,
    "DownloadWaitTimeoutInSeconds": number
  },
  "ConfigurationServiceConfig": {
    "ConfigCreationDateTime": date/time,
    "ApplyConfigDateTime": date/time,
    "ConfigurationRefreshDelaySeconds": number
  },
  "AnonymisationSettings": {
    "DicomTagsAnonymisationConfig": {
      "string": [string],
      ...
    }
  }
}
```
For example:

```json
{
  "ServiceSettings": {
    "RunAsConsole": true
  },
  "ProcessorSettings": {
    "LicenseKeyEnvVar": "MY_GATEWAY_API_AUTH_SECRET",
    "InferenceUri": "https://myinnereyeinference.azurewebsites.net"
  },
  "DequeueServiceConfig": {
    "MaximumQueueMessageAgeSeconds": 100,
    "DeadLetterMoveFrequencySeconds": 1
  },
  "DownloadServiceConfig": {
    "DownloadRetryTimespanInSeconds": 5,
    "DownloadWaitTimeoutInSeconds": 3600
  },
  "ConfigurationServiceConfig": {
    "ConfigCreationDateTime": "2020-05-31T20:14:51",
    "ApplyConfigDateTime": "2020-05-31T20:14:51",
    "ConfigurationRefreshDelaySeconds": 60
  },
  "AnonymisationSettings": {
    "DicomTagsAnonymisationConfig": {
      "Keep": [
        "PatientPosition",
        "Columns",
        "Rows",
        "PixelSpacing",
        "ImagePositionPatient",
        "ImageOrientationPatient",
        "SliceLocation",
        "Modality",
        "ModalityLUTSequence",
        "BodyPartExamined",
        "HighBit",
        "BitsStored",
        "BitsAllocated",
        "SamplesPerPixel",
        "PixelData",
        "PhotometricInterpretation",
        "PixelRepresentation",
        "RescaleIntercept",
        "RescaleSlope",
        "ImageType",
        "SOPClassUID",
        "RTReferencedStudySequence",
        "ReferencedROINumber",
        "ROIDisplayColor",
        "ContourSequence",
        "ROIContourSequence",
        "ReferencedSOPClassUID",
        "NumberOfContourPoints",
        "ContourData",
        "ContourGeometricType",
        "ContourImageSequence",
        "ObservationNumber",
        "RTReferencedSeriesSequence",
        "ReferencedFrameOfReferenceSequence",
        "ROINumber",
        "ROIName"
      ],
      "Hash": [
        "SeriesInstanceUID",
        "StudyInstanceUID",
        "SOPInstanceUID",
        "FrameOfReferenceUID",
        "ReferencedSOPInstanceUID",
        "RTROIObservationsSequence",
        "StructureSetLabel",
        "StructureSetName",
        "ReferencedFrameOfReferenceUID",
        "ROIGenerationAlgorithm",
        "StructureSetROISequence"
      ]
    }
  }
}
```
Where:

- `ServiceSettings` and `ConfigurationServiceConfig` are as above.
- `InferenceUri` is the URI for the InnerEye-Inference web service from Getting Started. See License Keys for more details.
- `LicenseKeyEnvVar` is the name of the environment variable which contains the license key for the InnerEye-Inference web service at `InferenceUri`. See License Keys for more details.
- `MaximumQueueMessageAgeSeconds` is an internal message queue setting: the maximum time in seconds a message may spend in a queue before being automatically deleted.
- `DeadLetterMoveFrequencySeconds` is another internal message queue setting: if there is an error processing a message, it is moved to a dead-letter queue, and this is the time in seconds before messages are moved back from the dead-letter queue to the normal queue.
- `DownloadRetryTimespanInSeconds` is the delay in seconds between attempts to download the completed segmentation from the InnerEye-Inference service.
- `DownloadWaitTimeoutInSeconds` is the maximum time in seconds to wait whilst attempting to download the completed segmentation.
- `AnonymisationSettings.DicomTagsAnonymisationConfig` is a dictionary where the keys correspond to the anonymisation method ("Keep", "Hash" or "Random") and the entries are lists of DICOM tags that will be sent using that method. "Keep" means that the tag is sent unaltered to the inference service, "Hash" means that the value is hashed first, and "Random" replaces the value with a random date/time (we do not recommend using this option currently). When altering this setting, please note the following:
  - Certain tags are needed by the inference service in order to complete the forward pass on the trained model successfully. We do not recommend editing the default settings unless you are sure what you are doing. If in doubt, please open a discussion topic in this repo to ask any questions you may have.
  - Any tags not included in this list will not be sent to the inference service.
### Receiver Configuration

The receiver application is configured by ./Source/Microsoft.Gateway/SampleConfigurations/GatewayReceiveConfig.json and also by model configuration files in the folder ./Source/Microsoft.Gateway/SampleConfigurations/GatewayModelRulesConfig. The model configuration is explained below in Model Configuration.

The structure of the GatewayReceiveConfig.json configuration file is:
```
{
  "ServiceSettings": {
    "RunAsConsole": boolean
  },
  "ReceiveServiceConfig": {
    "GatewayDicomEndPoint": {
      "Title": string,
      "Port": number,
      "Ip": string
    },
    "RootDicomFolder": string,
    "AcceptedSopClassesAndTransferSyntaxesUIDs": object of the form {
      string: array of strings,
      string: array of strings,
      string: array of strings,
      ...
    }
  },
  "ConfigurationServiceConfig": {
    "ConfigCreationDateTime": date/time,
    "ApplyConfigDateTime": date/time,
    "ConfigurationRefreshDelaySeconds": number
  }
}
```
For example:

```json
{
  "ServiceSettings": {
    "RunAsConsole": true
  },
  "ReceiveServiceConfig": {
    "GatewayDicomEndPoint": {
      "Title": "GATEWAY",
      "Port": 111,
      "Ip": "localhost"
    },
    "RootDicomFolder": "C:\\InnerEyeGateway\\",
    "AcceptedSopClassesAndTransferSyntaxesUIDs": {
      "1.2.840.10008.1.1": [ "1.2.840.10008.1.2.1", "1.2.840.10008.1.2" ],
      "1.2.840.10008.5.1.4.1.1.481.3": [ "1.2.840.10008.1.2", "1.2.840.10008.1.2.1" ],
      "1.2.840.10008.5.1.4.1.1.2": [ "1.2.840.10008.1.2", "1.2.840.10008.1.2.1", "1.2.840.10008.1.2.4.57", "1.2.840.10008.1.2.4.70", "1.2.840.10008.1.2.4.80", "1.2.840.10008.1.2.5" ],
      "1.2.840.10008.5.1.4.1.1.4": [ "1.2.840.10008.1.2", "1.2.840.10008.1.2.1", "1.2.840.10008.1.2.4.57", "1.2.840.10008.1.2.4.70", "1.2.840.10008.1.2.4.80", "1.2.840.10008.1.2.5" ]
    }
  },
  "ConfigurationServiceConfig": {
    "ConfigCreationDateTime": "2018-07-25T20:14:51.539351Z",
    "ApplyConfigDateTime": "2018-07-25T20:14:51.539351Z",
    "ConfigurationRefreshDelaySeconds": 60
  }
}
```
Where:

- `ServiceSettings` and `ConfigurationServiceConfig` are as above.
- `ReceiveServiceConfig.GatewayDicomEndPoint.Title` is only used for testing, but must be supplied.
- `ReceiveServiceConfig.GatewayDicomEndPoint.Port` is the IP port the DICOM server will listen for DICOM messages on.
- `ReceiveServiceConfig.GatewayDicomEndPoint.Ip` is also only used for testing, but must be supplied.
- `RootDicomFolder` is the folder to be used for temporarily storing DICOM files.
- `AcceptedSopClassesAndTransferSyntaxesUIDs` is a dictionary of the DICOM Service-Object Pair (SOP) Classes and Transfer Syntax UIDs that the application supports. Here:
  - "1.2.840.10008.1.1" is the Verification SOP Class.
  - "1.2.840.10008.5.1.4.1.1.481.3" is Radiation Therapy Structure Set Storage.
  - "1.2.840.10008.5.1.4.1.1.2" is CT Image Storage.
  - "1.2.840.10008.5.1.4.1.1.4" is MR Image Storage.
- For each of these SOP Classes there is a list of supported Transfer Syntax UIDs. For example:
  - "1.2.840.10008.1.2" is Implicit VR Little Endian: Default Transfer Syntax for DICOM.
  - "1.2.840.10008.1.2.4.57" is a type of JPEG.
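The acceptance check the Receiver performs on an Association Request can be sketched in Python (illustrative only; the Gateway implements this in C#):

```python
def accept_presentation_context(accepted, sop_class_uid, transfer_syntax_uid):
    """Accept a proposed presentation context only when the SOP Class UID is
    a key of AcceptedSopClassesAndTransferSyntaxesUIDs and the proposed
    Transfer Syntax UID appears in its list. Illustrative sketch of the
    check described above."""
    return transfer_syntax_uid in accepted.get(sop_class_uid, [])

accepted = {
    # CT Image Storage, accepting Implicit and Explicit VR Little Endian
    "1.2.840.10008.5.1.4.1.1.2": ["1.2.840.10008.1.2", "1.2.840.10008.1.2.1"],
}
print(accept_presentation_context(accepted, "1.2.840.10008.5.1.4.1.1.2", "1.2.840.10008.1.2"))  # True
```

A proposed context whose SOP Class is unknown, or whose Transfer Syntax is not listed for that class, is rejected.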
### Model Configuration

The application entity models are configured in the folder ./Source/Microsoft.Gateway/SampleConfigurations/GatewayModelRulesConfig. The configuration may be split across multiple JSON files, if desired, and the configuration will be merged. Each file must contain an array of the form:
```
[ array of application entity model config objects of the form:
  {
    "CallingAET": string,
    "CalledAET": string,
    "AETConfig": {
      "Config": {
        "AETConfigType": string, one of "Model", "ModelDryRun", or "ModelWithResultDryRun",
        "ModelsConfig": [ array of models config objects of the form:
          {
            "ModelId": string,
            "ChannelConstraints": [ array of channel constraints objects ],
            "TagReplacements": [ array of tag replacements objects ]
          }
        ]
      },
      "Destination": {
        "Title": string,
        "Port": number,
        "Ip": string
      },
      "ShouldReturnImage": boolean
    }
  }
]
```
For example:

```json
[
  {
    "CallingAET": "RADIOMICS_APP",
    "CalledAET": "PassThroughModel",
    "AETConfig": {
      "Config": {
        "AETConfigType": "Model",
        "ModelsConfig": [
          {
            "ModelId": "PassThroughModel:3",
            "ChannelConstraints": [
              {
                "ChannelID": "ct",
                "ImageFilter": {
                  "Constraints": [
                    {
                      "RequirementLevel": "PresentNotEmpty",
                      "Constraint": {
                        "Function": {
                          "Order": "Equal",
                          "Value": {
                            "Value": "1.2.840.10008.5.1.4.1.1.2",
                            "ComparisonType": 0
                          },
                          "Ordinal": 0
                        },
                        "Index": {
                          "Group": 8,
                          "Element": 22
                        },
                        "discriminator": "UIDStringOrderConstraint"
                      },
                      "discriminator": "RequiredTagConstraint"
                    }
                  ],
                  "Op": "And",
                  "discriminator": "GroupConstraint"
                },
                "ChannelConstraints": {
                  "Constraints": [
                    {
                      "RequirementLevel": "PresentNotEmpty",
                      "Constraint": {
                        "Function": {
                          "Order": "Equal",
                          "Value": {
                            "Value": "1.2.840.10008.5.1.4.1.1.2",
                            "ComparisonType": 0
                          },
                          "Ordinal": 0
                        },
                        "Index": {
                          "Group": 8,
                          "Element": 22
                        },
                        "discriminator": "UIDStringOrderConstraint"
                      },
                      "discriminator": "RequiredTagConstraint"
                    }
                  ],
                  "Op": "And",
                  "discriminator": "GroupConstraint"
                },
                "MinChannelImages": 50,
                "MaxChannelImages": 1000
              }
            ],
            "TagReplacements": [
              {
                "Operation": "UpdateIfExists",
                "DicomTagIndex": {
                  "Group": 12294,
                  "Element": 2
                },
                "Value": "InnerEye"
              },
              {
                "Operation": "AppendIfExists",
                "DicomTagIndex": {
                  "Group": 12294,
                  "Element": 38
                },
                "Value": " NOT FOR CLINICAL USE"
              }
            ]
          }
        ]
      },
      "Destination": {
        "Title": "RADIOMICS_APP",
        "Port": 104,
        "Ip": "127.0.0.1"
      },
      "ShouldReturnImage": false
    }
  }
]
```
Where:
- CallingAET is the calling application entity title to be matched.
- CalledAET is the called application entity title to be matched.
- AETConfig consists of three parts:
  - Config consists of the pair:
    - AETConfigType is one of "Model", "ModelDryRun", or "ModelWithResultDryRun". "Model" is the normal case; the other two are for debugging. "ModelDryRun" means that the received DICOM image files will be de-identified by removing a user-defined set of identifiers and saved to the DryRunModelAnonymizedImage subfolder of RootDicomFolder. "ModelWithResultDryRun" is almost the same as "Model", except that the DICOM-RT file is downloaded to the DryRunRTResultDeAnonymized subfolder of RootDicomFolder and is not pushed to a DICOM destination.
    - ModelsConfig is an array of model config objects, described below.
  - Destination is where to send the resulting DICOM-RT file, consisting of:
    - Title is the destination application entity title,
    - Port is the destination application entity port,
    - Ip is the destination IP address.
  - ShouldReturnImage is true if the original received DICOM image files should be returned when the InnerEye-Inference service completes, false otherwise.
Each model config has the following structure:
{
"ModelId": string,
"ChannelConstraints": [ array of objects of the form
{
"ChannelID": string,
"ImageFilter": {
"Constraints": [ array of constraints ],
"Op": string, one of "And" or "Or",
"discriminator": "GroupConstraint"
},
"ChannelConstraints": {
"Constraints": [ array of constraints ],
"Op": string, one of "And" or "Or",
"discriminator": "GroupConstraint"
},
"MinChannelImages": number,
"MaxChannelImages": number
}
],
"TagReplacements": [ array of objects of the form:
{
"Operation": string, one of "UpdateIfExists" or "AppendIfExists",
"DicomTagIndex": {
"Group": number,
"Element": number
},
"Value": string
}
]
}
For example:
{
"ModelId": "PassThroughModel:3",
"ChannelConstraints": [
{
"ChannelID": "ct",
"ImageFilter": {
"Constraints": [
{
"RequirementLevel": "PresentNotEmpty",
"Constraint": {
"Function": {
"Order": "Equal",
"Value": {
"Value": "1.2.840.10008.5.1.4.1.1.2",
"ComparisonType": 0
},
"Ordinal": 0
},
"Index": {
"Group": 8,
"Element": 22
},
"discriminator": "UIDStringOrderConstraint"
},
"discriminator": "RequiredTagConstraint"
}
],
"Op": "And",
"discriminator": "GroupConstraint"
},
"ChannelConstraints": {
"Constraints": [
{
"RequirementLevel": "PresentNotEmpty",
"Constraint": {
"Function": {
"Order": "Equal",
"Value": {
"Value": "1.2.840.10008.5.1.4.1.1.2",
"ComparisonType": 0
},
"Ordinal": 0
},
"Index": {
"Group": 8,
"Element": 22
},
"discriminator": "UIDStringOrderConstraint"
},
"discriminator": "RequiredTagConstraint"
}
],
"Op": "And",
"discriminator": "GroupConstraint"
},
"MinChannelImages": 50,
"MaxChannelImages": 1000
}
],
"TagReplacements": [
{
"Operation": "UpdateIfExists",
"DicomTagIndex": {
"Group": 12294,
"Element": 2
},
"Value": "InnerEye"
},
{
"Operation": "AppendIfExists",
"DicomTagIndex": {
"Group": 12294,
"Element": 38
},
"Value": " NOT FOR CLINICAL USE"
}
]
}
Where:
- ModelId is the model identifier that is passed to the InnerEye-Inference service.
- ChannelConstraints is an array of ChannelConstraint objects that are applied to the received DICOM image files. The algorithm here is:
  - The image files are first grouped by Study Instance UID (which must exist), and then grouped by Series Instance UID (which must also exist).
  - For each group of shared Study and Series Instance UIDs, the channel constraints for each ModelsConfig are applied in order to the image group. If there is a match then the matching ModelsConfig and image group are used. The channel constraints are explained below.
- TagReplacements is a list of DICOM tag replacements that are performed during DICOM-RT file de-anonymisation. The algorithm is to work through all the TagReplacements during de-anonymisation:
  - If the file contains the tag specified in DicomTagIndex:
    - If the Operation is "UpdateIfExists", then replace the existing tag value with Value;
    - If the Operation is "AppendIfExists", then append Value to the existing tag value.
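The tag replacement steps can be sketched as below. This is an illustrative Python sketch in which a DICOM dataset is modelled as a plain dictionary keyed by (group, element); the real Gateway operates on DICOM files via .NET.

```python
def apply_tag_replacements(dataset: dict, replacements: list) -> None:
    """Apply TagReplacements to a dataset during de-anonymisation.

    dataset is modelled as {(group, element): string_value}. Both
    operations act only on tags that already exist in the dataset.
    """
    for rep in replacements:
        tag = (rep["DicomTagIndex"]["Group"], rep["DicomTagIndex"]["Element"])
        if tag not in dataset:
            continue  # "...IfExists": skip tags that are absent
        if rep["Operation"] == "UpdateIfExists":
            dataset[tag] = rep["Value"]          # replace the value
        elif rep["Operation"] == "AppendIfExists":
            dataset[tag] += rep["Value"]         # append to the value
```

Using the TagReplacements from the example above, tag (12294, 2) would be overwritten with "InnerEye" and " NOT FOR CLINICAL USE" would be appended to tag (12294, 38).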
Each channel constraint has the form:
{
"ChannelID": string,
"ImageFilter": {
"Constraints": [ array of constraints ],
"Op": string, one of "And" or "Or",
"discriminator": "GroupConstraint"
},
"ChannelConstraints": {
"Constraints": [ array of constraints ],
"Op": string, one of "And" or "Or",
"discriminator": "GroupConstraint"
},
"MinChannelImages": number,
"MaxChannelImages": number
}
Where:
- ChannelID is the id of the channel. It is used in the zip file sent to the InnerEye-Inference service.
- ImageFilter is a GroupConstraint used to filter the DICOM image files before applying the constraints. This selects a subset of images from the same series by filtering out unwanted data, e.g. extraneous SOP classes.
- ChannelConstraints is a GroupConstraint applied to the images that have passed through ImageFilter.
- MinChannelImages is the minimum number of files required for this channel: the inclusive lower bound on the number of filtered images. Use 0 or less to impose no constraint.
- MaxChannelImages is the maximum number of files allowed for this channel: the inclusive upper bound on the number of filtered images. Use 0 or less to impose no maximum constraint.
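Putting those pieces together, a channel match can be sketched as below. This is an illustrative Python sketch: the filter and constraints are represented as simple predicates, and the assumption that ChannelConstraints must hold for every filtered image is ours, not stated by the source.

```python
def channel_matches(images, image_filter, channel_constraints,
                    min_images, max_images):
    """Sketch of ChannelConstraint evaluation for one series of images.

    1. Keep only images passing ImageFilter.
    2. Require ChannelConstraints to pass on the remaining images
       (assumed here: every image must pass).
    3. Enforce the inclusive MinChannelImages/MaxChannelImages bounds,
       where 0 or less means "no bound".
    """
    filtered = [im for im in images if image_filter(im)]
    if not all(channel_constraints(im) for im in filtered):
        return False
    if min_images > 0 and len(filtered) < min_images:
        return False
    if max_images > 0 and len(filtered) > max_images:
        return False
    return True
```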
There are many types of DicomConstraint. So that the correct type can be identified when the JSON is loaded, they all have the form:
{
<< constraint specific data >>,
"discriminator": string
}
Where:
- discriminator identifies the DicomConstraint type.
The constraints are split into two groups. The first group contains GroupConstraint and RequiredTagConstraint, which act as containers for other constraints. The second group are all instances of DicomTagConstraint, which specify a DICOM tag and a constraint to be applied to it.
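Loading a constraint from JSON therefore means dispatching on the discriminator field. The sketch below illustrates the idea in Python; the parser bodies are minimal placeholders, not the Gateway's actual deserialisation code.

```python
def parse_constraint(obj: dict):
    """Sketch: polymorphic loading of a DicomConstraint from parsed JSON,
    dispatching on the 'discriminator' field. Each parser here returns a
    simple tuple rather than a real constraint object."""
    parsers = {
        "GroupConstraint": lambda o: (
            "group", o["Op"], [parse_constraint(c) for c in o["Constraints"]]),
        "RequiredTagConstraint": lambda o: (
            "required", o["RequirementLevel"], parse_constraint(o["Constraint"])),
        "UIDStringOrderConstraint": lambda o: (
            "uid", o["Index"], o["Function"]),
    }
    return parsers[obj["discriminator"]](obj)
```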
A GroupConstraint acts as a container for a set of other constraints, of which either all must pass, or at least one must pass.
{
"Constraints": [ array of constraints ],
"Op": string, one of "And" or "Or",
"discriminator": "GroupConstraint"
}
Where:
- Constraints is an array of DicomConstraint.
- Op controls whether all constraints must be met ("And"), or at least one of them ("Or").
- discriminator is as described under DicomConstraint.
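The And/Or semantics reduce to a one-line evaluation, sketched here in Python; the child evaluator is passed in as a function for illustration.

```python
def evaluate_group_constraint(group, dataset, evaluate):
    """Sketch of GroupConstraint evaluation: 'And' requires every child
    constraint to pass against the dataset, 'Or' requires at least one.
    `evaluate(constraint, dataset)` evaluates one child constraint."""
    results = (evaluate(c, dataset) for c in group["Constraints"])
    return all(results) if group["Op"] == "And" else any(results)
```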
A RequiredTagConstraint acts as a container for a tag constraint and a requirement level for the tag.
{
"RequirementLevel": string, one of "PresentNotEmpty", "PresentCanBeEmpty", or "Optional",
"Constraint": a dicom tag constraint object,
"discriminator": "RequiredTagConstraint"
}
For example:
{
"RequirementLevel": "PresentNotEmpty",
"Constraint": {
"Function": {
"Order": "Equal",
"Value": {
"Value": "1.2.840.10008.5.1.4.1.1.2",
"ComparisonType": 0
},
"Ordinal": 0
},
"Index": {
"Group": 8,
"Element": 22
},
"discriminator": "UIDStringOrderConstraint"
},
"discriminator": "RequiredTagConstraint"
}
Where:
- RequirementLevel means:
  - "PresentNotEmpty" - the tag must be present and the conditions on the tag must pass.
  - "PresentCanBeEmpty" - the tag must be present and the conditions must pass when the tag is non-empty.
  - "Optional" - the tag does not need to be present, but the conditions must pass if the tag is present and non-empty.
- Constraint is any DicomTagConstraint.
- discriminator is as described under DicomConstraint.
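The three requirement levels can be captured in a small decision function. This is an illustrative Python sketch in which "empty" is modelled as the empty string and the tag condition is a predicate.

```python
def required_tag_passes(level, tag_present, tag_value, condition):
    """Sketch of RequiredTagConstraint semantics.

    tag_present says whether the tag exists; tag_value is "" when empty
    (a modelling assumption); condition tests the tag value.
    """
    if level == "PresentNotEmpty":
        return tag_present and tag_value != "" and condition(tag_value)
    if level == "PresentCanBeEmpty":
        return tag_present and (tag_value == "" or condition(tag_value))
    if level == "Optional":
        return (not tag_present) or tag_value == "" or condition(tag_value)
    raise ValueError(f"unknown RequirementLevel: {level}")
```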
The second group, the DicomTagConstraints, all operate on a DICOM tag. They have the form:
{
<< tag constraint specific data >>,
"Index": {
"Group": number,
"Element": number
},
"discriminator": string
}
Where:
- Index identifies the target DICOM tag.
- discriminator is as described under DicomConstraint.
A GroupTagConstraint acts as a container for a set of constraints and the DICOM tag that the constraints apply to.
{
"Group": group constraint object,
"Index": {
"Group": number,
"Element": number
},
"discriminator": "GroupTagConstraint"
}
Where:
- Group is a GroupConstraint.
- Index is as described under DicomTagConstraint.
- discriminator is as described under DicomConstraint.
A StringContainsConstraint tests that a DICOM tag contains a given value.
{
"Match": string,
"Ordinal": number,
"Index": {
"Group": number,
"Element": number
},
"discriminator": "StringContainsConstraint"
}
For example:
{
"Match": "AXIAL",
"Ordinal": -1,
"Index": {
"Group": 8,
"Element": 8
},
"discriminator": "StringContainsConstraint"
}
Where:
- Match is the string that the DICOM tag should contain.
- Ordinal is the ordinal to extract from the tag, or -1 to extract all.
- Index is as described under DicomTagConstraint.
- discriminator is as described under DicomConstraint.
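The Ordinal field reflects that DICOM string tags may be multi-valued, with values separated by a backslash. The sketch below illustrates the idea in Python; treating Ordinal -1 as "any value may match" is our assumption, as the source does not specify whether all extracted values must match.

```python
def string_contains(tag_value: str, match: str, ordinal: int) -> bool:
    """Sketch of StringContainsConstraint on a multi-valued DICOM string.

    DICOM separates multiple values with a backslash. Ordinal selects one
    value by position; -1 extracts all (assumed here: any value matching
    is sufficient).
    """
    values = tag_value.split("\\")
    if ordinal == -1:
        return any(match in v for v in values)
    return 0 <= ordinal < len(values) and match in values[ordinal]
```

For example, the Image Type tag (0008,0008) with value "ORIGINAL\PRIMARY\AXIAL" contains "AXIAL" both at ordinal 2 and when all values are searched.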
A RegexConstraint contains a regular expression and a tag to test it against.
{
"Expression": string, a regular expression, suitable for the Regex constructor from .NET System.Text.RegularExpressions,
"Options": number, one of the RegexOptions from .NET System.Text.RegularExpressions,
"Ordinal": number,
"Index": {
"Group": number,
"Element": number
},
"discriminator": "RegexConstraint"
}
Where:
- Expression is a regular expression compatible with the .NET System.Text.RegularExpressions Regex constructor.
- Options is the value of the .NET RegexOptions enum.
- Ordinal is the ordinal to extract from the tag, or -1 to extract all.
- Index is as described under DicomTagConstraint.
- discriminator is as described under DicomConstraint.
The ordered constraints (OrderedDateTimeConstraint, OrderedDoubleConstraint, OrderedIntConstraint, and OrderedStringConstraint) contain an ordering function and a DICOM tag that the ordering function applies to.
{
"Function": {
    "Order": string, one of "Never", "LessThan", "Equal", "LessThanOrEqual", "GreaterThan", "NotEqual", "GreaterThanOrEqual", "Always",
"Value": object, one of date/time, number, or for strings {
"Value": string,
"ComparisonType": number, 0 for case sensitive string comparisons, 1 for case insensitive.
},
"Ordinal": 0
},
"Index": {
"Group": number,
"Element": number
},
"discriminator": string, one of "OrderedDateTimeConstraint", "OrderedDoubleConstraint", "OrderedIntConstraint", "OrderedStringConstraint"
}
For example:
{
"Function": {
"Order": "Equal",
"Value": {
"Value": "HEAD",
"ComparisonType": 0
},
"Ordinal": 0
},
"Index": {
"Group": 24,
"Element": 21
},
"discriminator": "OrderedStringConstraint"
}
Where:
- Function is the ordering function to apply to the DICOM tag:
  - Order is the required relation between the supplied Value and the value of the DICOM tag.
  - Value is the value to test the DICOM tag against.
  - Ordinal is the ordinal to extract from the tag, or -1 to extract all.
- Index is as described under DicomTagConstraint.
- discriminator is as described under DicomConstraint.
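The Order names map naturally onto comparison operators. The Python sketch below shows that mapping; the direction of the comparison (tag value on the left) is our assumption for illustration.

```python
import operator

# One predicate per Order name; "Never" and "Always" ignore both values.
ORDER_FUNCTIONS = {
    "Never":              lambda a, b: False,
    "LessThan":           operator.lt,
    "Equal":              operator.eq,
    "LessThanOrEqual":    operator.le,
    "GreaterThan":        operator.gt,
    "NotEqual":           operator.ne,
    "GreaterThanOrEqual": operator.ge,
    "Always":             lambda a, b: True,
}

def ordered_constraint_passes(order, tag_value, constraint_value):
    """Sketch: true if the DICOM tag value stands in the required relation
    to the supplied constraint value (comparison direction assumed)."""
    return ORDER_FUNCTIONS[order](tag_value, constraint_value)
```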
A UIDStringOrderConstraint is a variant of the ordered constraints above, containing a constraint for a DICOM UID tag.
{
"Function": {
"Order": string, order as above,
"Value": {
"Value": string,
"ComparisonType": number, as above
},
"Ordinal": number
},
"Index": {
"Group": number,
"Element": number
},
"discriminator": "UIDStringOrderConstraint"
}
For example:
{
"Function": {
"Order": "Equal",
"Value": {
"Value": "1.2.840.10008.5.1.4.1.1.2",
"ComparisonType": 0
},
"Ordinal": 0
},
"Index": {
"Group": 8,
"Element": 22
},
"discriminator": "UIDStringOrderConstraint"
}
Where the Function and Index are as above, except that the constraint test is applied to a DICOM UID tag.
A TimeOrderConstraint is a variant of the ordered constraints above, for a tag with the TM (time) value representation.
{
"Function": {
"Order": string, order as above,
    "Value": string, a TimeSpan in .NET TimeSpan serialization format,
"Ordinal": number
},
"Index": {
"Group": number,
"Element": number
},
"discriminator": "TimeOrderConstraint"
}
For example:
{
"Function": {
"Order": "GreaterThanOrEqual",
"Value": "16:05:42.7380000",
"Ordinal": 0
},
"Index": {
"Group": 25447,
"Element": 60116
},
"discriminator": "TimeOrderConstraint"
}
Where the Function and Index are as above, except that the constraint test is applied to the TimeOfDay property of the DICOM tag.
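The comparison above can be sketched as follows. This illustrative Python sketch makes simplifying assumptions: the DICOM TM value is in the full HHMMSS.FFFFFF form, the serialized TimeSpan is "HH:MM:SS.FFFFFFF" as in the example, and only "GreaterThanOrEqual" versus "LessThan" are handled.

```python
from datetime import datetime

def tm_to_time(tm_value: str):
    """Parse a DICOM TM value like '160542.738000' into a time of day.
    (Simplified: assumes the full HHMMSS.FFFFFF form.)"""
    return datetime.strptime(tm_value, "%H%M%S.%f").time()

def time_constraint_passes(order: str, tag_tm: str, threshold: str) -> bool:
    """Sketch of TimeOrderConstraint: compare the tag's time of day with
    the configured TimeSpan value. The 7-digit .NET fraction is trimmed
    to the 6 digits Python's %f accepts."""
    tag = tm_to_time(tag_tm)
    limit = datetime.strptime(threshold[:15], "%H:%M:%S.%f").time()
    return tag >= limit if order == "GreaterThanOrEqual" else tag < limit
```

With the example configuration, a tag time of 17:00:00 would pass the "GreaterThanOrEqual" test against "16:05:42.7380000", and 12:00:00 would not.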
All communications between a deployed Inference app service and the InnerEye Gateway are compliant with OWASP 3.0.
You are responsible for the performance, the necessary testing, and if needed any regulatory clearance for any of the models produced by this toolbox.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct . For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.