The DIBBs Orchestration service offers a REST API for processing messages through a series of microservices.
You can run the Orchestration service using Docker, any other OCI container runtime (e.g., Podman), or directly from the Python source code.
Before running the orchestration unit tests, make sure you have all the services running. For the integration tests, the services will be spun up automatically.
> [!IMPORTANT]
> Before you spin up the services using the `docker-compose.yaml`, make sure you have a `.env` file. If you do not have a `.env` file, copy `.env.sample` to `.env` in order to stand up the services.
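A minimal sketch of creating the `.env` file from the sample (assuming the defaults in `.env.sample` are acceptable for local development):

```bash
# From the orchestration directory, copy the sample environment file
cp .env.sample .env
```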
Run the unit tests:

```bash
cd containers/orchestration
eval "$(pyenv init -)"
source .venv/bin/activate
pip install -r requirements.txt -r dev-requirements.txt
python -m pytest --cov-report xml --cov=. -m "not integration" tests/
```
Run the integration tests:

```bash
cd containers/orchestration
eval "$(pyenv init -)"
source .venv/bin/activate
pip install -r requirements.txt -r dev-requirements.txt
python -m pytest -m "integration"
```
To run the Orchestration service with Docker, follow these steps.
1. Confirm that Docker is installed by running `docker -v`. If you don't see a response similar to what's shown below, follow these instructions to install Docker.

   ```
   ❯ docker -v
   Docker version 20.10.21, build baeda1f
   ```

2. Download a copy of the Docker image by running `docker pull ghcr.io/cdcgov/dibbs-ecr-viewer/orchestration:latest`.
3. Run the service with `docker run -p 8080:8080 ghcr.io/cdcgov/dibbs-ecr-viewer/orchestration:latest`.

Congratulations, the Orchestration service should now be running on `localhost:8080`!
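As a quick smoke test (a sketch, assuming the container is running locally with the default port mapping above), confirm the service responds on the `/redoc` documentation endpoint described later in these docs:

```bash
# Expect an HTTP 200 response from the running container
curl -i http://localhost:8080/redoc
```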
We recommend running the Orchestration service from a container, but if that isn’t feasible for a given use case, you can also run the service directly from Python using the steps below.
1. Clone the repository with `git clone https://github.com/CDCgov/dibbs-ecr-viewer`.
2. Navigate to `/dibbs-ecr-viewer/containers/orchestration/`.
3. Create a virtual environment with `python -m venv .venv`.
4. Activate the virtual environment with `source .venv/bin/activate` (MacOS and Linux), `venv\Scripts\activate` (Windows Command Prompt), or `.venv\Scripts\Activate.ps1` (Windows PowerShell).
5. Install the Python dependencies into your virtual environment with `pip install -r requirements.txt`.
6. Run the Orchestration service on `localhost:8080` with `python -m uvicorn app.main:app --host 0.0.0.0 --port 8080`.
To build the Docker image for the Orchestration service from source instead of downloading it from the PHDI repository, follow these steps.

1. Clone the repository with `git clone https://github.com/CDCgov/dibbs-ecr-viewer`.
2. Navigate to `/dibbs-ecr-viewer/containers/orchestration/`.
3. Run `docker build -t orchestration .`.
When viewing these docs from the `/redoc` endpoint on a running instance of the Orchestration service or the DIBBs website, detailed documentation on the API is available below.
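If the service is running locally, the raw OpenAPI specification can also be downloaded directly (a sketch, assuming the service exposes FastAPI's default `/openapi.json` path):

```bash
# Save the OpenAPI specification from a locally running instance
curl -s http://localhost:8080/openapi.json -o openapi.json
```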
When processing an eCR .zip file through the Orchestration service, call the `/process-zip` endpoint to process the file. The request is a POST to `/process-zip` with the following parameters in the form body:

- `message_type`: The type of stream of the uploaded file's underlying data (e.g. ecr, elr, etc.). If the data is in FHIR format, set to FHIR.
- `include_error_types`: The types of errors to return (e.g. warnings, errors, fatal).
- `data_type`: The type of data held in the uploaded file. Eligible values include `ecr`, `zip`, `fhir`, and `hl7`. In most cases it will be `zip`.
- `config_file_name`: The name of the configuration file to load on the service's back end, specifying the workflow to apply. These are uploaded by the organization hosting the application. There are samples of these files in the `/assets` folder of this application.
- `upload_file`: A file containing clinical health care information.

An example of calling this endpoint looks like this:
```bash
curl --location 'https://your_url_here/orchestration/process-zip' \
--form 'message_type="ecr"' \
--form 'include_error_types="[errors]"' \
--form 'config_file_name="sample-orchestration-s3-config.json"' \
--form 'upload_file=@"/path_to_your_zip/sample.zip";type=application/zip' \
--form 'data_type="zip"'
```
The output will vary depending on the configuration chosen; however, a successful run returns status `200`, indicating the Orchestration service did not encounter errors. For more information on the endpoint, see the documentation here.
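To confirm the `200` status from the command line, a variant of the call above writes the response body to a file and prints only the HTTP status code (the URL, config name, and file path are placeholders, as before):

```bash
curl --silent --output response.json --write-out "%{http_code}\n" \
--location 'https://your_url_here/orchestration/process-zip' \
--form 'message_type="ecr"' \
--form 'include_error_types="[errors]"' \
--form 'config_file_name="sample-orchestration-s3-config.json"' \
--form 'upload_file=@"/path_to_your_zip/sample.zip";type=application/zip' \
--form 'data_type="zip"'
```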
The diagram below shows the services the Orchestration service coordinates, along with its observability stack:

```mermaid
graph TD
  subgraph MS[Main Services]
    A[Orchestration Service]
    A --> B[Validation Service]
    A --> C[FHIR Converter Service]
    A --> D[Ingestion Service]
    A --> E[Trigger Code Reference Service]
    A --> F[Message Parser Service]
    A --> G[ECR Viewer]
    G --> H[ECR Viewer DB]
  end

  subgraph OBS[Observability]
    direction TB
    I[Jaeger] --> J[Prometheus]
    K[OpenTelemetry Collector] --> J
    K --> I
    L[Grafana] --> J
  end

  A --> I

  M[Python]
  N[Uvicorn]
  O[FastAPI]

  A -.-> M
  A -.-> N
  A -.-> O

  style A fill:#f9f,stroke:#333,stroke-width:4px,color:#000
  style MS fill:#bbf,stroke:#333,stroke-width:2px
  style OBS fill:#bbf,stroke:#333,stroke-width:2px,color:#000
```
The diagram below shows the API endpoints the Orchestration service exposes:

```mermaid
graph TD
  A[Orchestration Service]

  subgraph API[API Endpoints]
    direction TB
    M[GET /configs]
    N[GET /configs{processing_config_name}]
    O[PUT /configs{processing_config_name}]
    P[POST /process-zip]
    Q[POST /process-message]
    R[WebSocket /process-ws]
  end

  A --> M
  A --> N
  A --> O
  A --> P
  A --> Q
  A --> R

  style A fill:#f9f,stroke:#333,stroke-width:4px,color:#000
  style API fill:#bfb,stroke:#333,stroke-width:2px
```
This endpoint provides a wrapper function for unpacking an uploaded zip file, determining appropriate parameter and application settings, and applying a config-driven workflow to the data in that file. This is one of two endpoints that can actually invoke and apply a config workflow to data and is meant to be used to process files.
The request body is submitted as form data with the following fields:

| Field | Type | Notes |
| --- | --- | --- |
| message_type | string (Message Type) | |
| data_type | string (Data Type) | Eligible values include `ecr`, `zip`, `fhir`, and `hl7`. |
| config_file_name | string (Config File Name) | |
| upload_file | string `<binary>` (Upload File) | |
{- "message": "Processing succeeded!",
- "content": {
- "foo": "bar"
}
}
This endpoint provides a wrapper function for unpacking a message processing input and using those settings to apply a config-driven workflow to a raw string of data. This endpoint is the second of two workflow-driven endpoints and is meant to be used with raw string data (meaning if the data is JSON, it must be string serialized with json.dumps).
| Field | Type | Description |
| --- | --- | --- |
| message_type (required) | string (Message Type); Enum: "ecr", "elr", "vxu", "fhir" | The type of message to be validated. |
| data_type (required) | string (Data Type); Enum: "ecr", "zip", "fhir", "hl7" | The type of data of the passed-in message. Must be one of 'ecr', 'fhir', or 'zip'. |
| config_file_name (required) | string (Config File Name) | The name of the config file specifying the workflow to apply. |
| message (required) | Message (object) or Message (string) | The message to be validated. |
| rr_data | Rr Data (string) or Rr Data (null) | If an eICR message, the accompanying Reportability Response data. |
{- "message_type": "ecr",
- "data_type": "ecr",
- "config_file_name": "string",
- "message": { },
- "rr_data": "string"
}
{- "message": "Processing succeeded!",
- "content": {
- "foo": "bar"
}
}
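A sketch of calling this endpoint with curl (the host and config file name are placeholders, and the FHIR bundle is a trivial example); note that `message` is passed as a string-serialized JSON document, as described above:

```bash
curl --location 'https://your_url_here/orchestration/process-message' \
--header 'Content-Type: application/json' \
--data '{
  "message_type": "fhir",
  "data_type": "fhir",
  "config_file_name": "sample-orchestration-config.json",
  "message": "{\"resourceType\": \"Bundle\", \"entry\": []}"
}'
```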
This endpoint gets the config specified by 'processing_config_name'.
| Parameter | Type |
| --- | --- |
| processing_config_name (required) | string (Processing Config Name) |
{- "message": "Config found!",
- "workflow": {
- "workflow": [
- {
- "service": "ingestion",
- "url": "some-url-for-an-ingestion-service",
- "endpoint": "/fhir/harmonization/standardization/standardize_names"
}
]
}
}
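For example (a sketch; the host and config name are placeholders, and the path is assumed to follow the `/configs/{processing_config_name}` pattern shown above):

```bash
# Fetch a named processing config from a running instance
curl --location 'https://your_url_here/orchestration/configs/sample-orchestration-config.json'
```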
This endpoint uploads a new processing config to the service or updates an existing config.
| Field | Type | Description |
| --- | --- | --- |
| processing_config_name (required) | string (Processing Config Name) | The name of the config to create or update. |
| workflow (required) | object (Workflow) | A JSON-formatted config dict containing a single key. |
| overwrite | Overwrite (boolean) or Overwrite (null); Default: false | Controls whether an existing config with the same name may be overwritten. |
{- "workflow": {
- "property1": [
- {
- "service": "string",
- "endpoint": "string",
- "params": { }
}
], - "property2": [
- {
- "service": "string",
- "endpoint": "string",
- "params": { }
}
]
}, - "overwrite": false
}
{- "message": "Config uploaded successfully!"
}
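A sketch of uploading a config with curl (the host, config name, and workflow content are placeholders; the nested `workflow` key mirrors the GET response sample above):

```bash
curl --request PUT \
--location 'https://your_url_here/orchestration/configs/my-custom-config.json' \
--header 'Content-Type: application/json' \
--data '{
  "workflow": {
    "workflow": [
      {
        "service": "ingestion",
        "endpoint": "/fhir/harmonization/standardization/standardize_names",
        "params": { }
      }
    ]
  },
  "overwrite": true
}'
```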