PHDI Orchestration (3.0.0)


Getting Started with the DIBBs Orchestration Service

Introduction

The DIBBs Orchestration service offers a REST API for processing messages through a series of microservices.

Running Orchestration

You can run the Orchestration service using Docker, any other OCI container runtime (e.g., Podman), or directly from the Python source code.

Running Orchestration tests

Before running the orchestration unit tests, make sure all of the services are running. The integration tests spin up the services automatically.

[!IMPORTANT] Before you spin up the services with docker-compose.yaml, make sure you have a .env file. If you don't, copy .env.sample to .env in order to stand up the services.

Running Unit Tests

  1. cd containers/orchestration
  2. eval "$(pyenv init -)"
  3. source .venv/bin/activate
  4. pip install -r requirements.txt -r dev-requirements.txt
  5. python -m pytest --cov-report xml --cov=. -m "not integration" tests/

Running Integration Tests

  1. cd containers/orchestration
  2. eval "$(pyenv init -)"
  3. source .venv/bin/activate
  4. pip install -r requirements.txt -r dev-requirements.txt
  5. python -m pytest -m "integration"

To run the Orchestration service with Docker, follow these steps.

  1. Confirm that you have Docker installed by running docker -v. If you don't see a response similar to the one shown below, follow these instructions to install Docker.
❯ docker -v
Docker version 20.10.21, build baeda1f
  2. Download a copy of the Docker image from the PHDI repository by running docker pull ghcr.io/cdcgov/dibbs-ecr-viewer/orchestration:latest.
  3. Run the service with docker run -p 8080:8080 ghcr.io/cdcgov/dibbs-ecr-viewer/orchestration:latest.

Congratulations, the Orchestration service should now be running on localhost:8080!

Running from Python Source Code

We recommend running the Orchestration service from a container, but if that isn’t feasible for a given use case, you can also run the service directly from Python using the steps below.

  1. Ensure that both Git and Python 3.13 or higher are installed.
  2. Clone the PHDI repository with git clone https://github.com/CDCgov/dibbs-ecr-viewer.
  3. Navigate to /dibbs-ecr-viewer/containers/orchestration/.
  4. Make a fresh virtual environment with python -m venv .venv.
  5. Activate the virtual environment with source .venv/bin/activate (macOS and Linux), .venv\Scripts\activate (Windows Command Prompt), or .venv\Scripts\Activate.ps1 (Windows PowerShell).
  6. Install the Python dependencies for the Orchestration service into your virtual environment with pip install -r requirements.txt.
  7. Run the Orchestration service on localhost:8080 with python -m uvicorn app.main:app --host 0.0.0.0 --port 8080.

Building the Docker Image

To build the Docker image for the Orchestration service from source instead of downloading it from the PHDI repository, follow these steps.

  1. Ensure that both Git and Docker are installed.
  2. Clone the PHDI repository with git clone https://github.com/CDCgov/dibbs-ecr-viewer.
  3. Navigate to /dibbs-ecr-viewer/containers/orchestration/.
  4. Run docker build -t orchestration ..

The API

When viewing these docs from the /redoc endpoint on a running instance of the Orchestration service, or on the DIBBs website, detailed documentation on the API is available below.

Running the /process-zip endpoint

To process an eCR .zip file through the Orchestration service, call the /process-zip endpoint.

Make a POST request to /process-zip with the following parameters in the form body:

  • message_type: The type of stream of the uploaded file's underlying data (e.g. ecr, elr, etc.). If the data is in FHIR format, set to FHIR.
  • include_error_types: The type of errors to return (e.g. warnings, errors, fatal).
  • data_type: The type of data held in the uploaded file. Eligible values include ecr, zip, fhir, and hl7. In most cases, this will be zip.
  • config_file_name: The name of the configuration file to load on the service's back-end, specifying the workflow to apply. These files are uploaded by the organization hosting the application; samples are available in the /assets folder of this application.
  • upload_file: A file containing clinical health care information.

An example of calling this endpoint looks like this:

curl --location 'https://your_url_here/orchestration/process-zip' \
--form 'message_type="ecr"' \
--form 'include_error_types="[errors]"' \
--form 'config_file_name="sample-orchestration-s3-config.json"' \
--form 'upload_file=@"/path_to_your_zip/sample.zip";type=application/zip' \
--form 'data_type="zip"'

The output will vary depending on the configuration chosen; however, a successful run returns an HTTP 200 status, indicating that the Orchestration service encountered no errors.
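The same call can be sketched in Python using only the standard library. This is a minimal sketch, not a definitive client: the URL is the same placeholder as in the curl example, and the field values mirror the form parameters described above.

```python
import uuid
from urllib import request

# Placeholder URL -- substitute your actual deployment, as in the curl example.
ORCHESTRATION_URL = "https://your_url_here/orchestration/process-zip"


def build_process_zip_request(zip_bytes: bytes) -> request.Request:
    """Build (but do not send) a multipart/form-data POST to /process-zip."""
    boundary = uuid.uuid4().hex
    fields = {
        "message_type": "ecr",
        "data_type": "zip",
        "include_error_types": "[errors]",
        "config_file_name": "sample-orchestration-s3-config.json",
    }
    parts = []
    for name, value in fields.items():
        parts.append(
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
            f"{value}\r\n".encode()
        )
    # The zip file itself is sent as a binary part named upload_file.
    parts.append(
        (
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="upload_file"; filename="sample.zip"\r\n'
            f"Content-Type: application/zip\r\n\r\n"
        ).encode()
        + zip_bytes
        + b"\r\n"
    )
    parts.append(f"--{boundary}--\r\n".encode())
    return request.Request(
        ORCHESTRATION_URL,
        data=b"".join(parts),
        method="POST",
        headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
    )


# To send it against a live deployment:
# with request.urlopen(build_process_zip_request(open("sample.zip", "rb").read())) as resp:
#     assert resp.status == 200
```

A library such as requests would shorten this considerably; the stdlib version is shown so the multipart structure of the request is explicit.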

For more information on the endpoint, see the documentation below.

Architecture Diagram

Application Stack

graph TD
    subgraph MainServices[Main Services]
        A[Orchestration Service]
        A --> B[Validation Service]
        A --> C[FHIR Converter Service]
        A --> D[Ingestion Service]
        A --> E[Trigger Code Reference Service]
        A --> F[Message Parser Service]
        A --> G[ECR Viewer]
        G --> H[ECR Viewer DB]
    end

    subgraph Observability
        direction TB
        I[Jaeger] --> J[Prometheus]
        K[OpenTelemetry Collector] --> J
        K --> I
        L[Grafana] --> J
    end

    A --> I

    M[Python]
    N[Uvicorn]
    O[FastAPI]

    A -.-> M
    A -.-> N
    A -.-> O

    style A fill:#f9f,stroke:#333,stroke-width:4px,color:#000
    style MainServices fill:#bbf,stroke:#333,stroke-width:2px
    style Observability fill:#bbf,stroke:#333,stroke-width:2px,color:#000

Application API

graph TD
    A[Orchestration Service]

    subgraph APIEndpoints[API Endpoints]
        direction TB
        M[GET /configs]
        N["GET /configs/{processing_config_name}"]
        O["PUT /configs/{processing_config_name}"]
        P[POST /process-zip]
        Q[POST /process-message]
        R[WebSocket /process-ws]
    end

    A --> M
    A --> N
    A --> O
    A --> P
    A --> Q
    A --> R

    style A fill:#f9f,stroke:#333,stroke-width:4px,color:#000
    style APIEndpoints fill:#bfb,stroke:#333,stroke-width:2px

Health Check

This endpoint checks the service status. If an HTTP 200 status code is returned along with '{"status": "OK"}', the service is available and running properly.
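A deployment health probe can encode this contract directly. The sketch below is illustrative (the helper name is not part of the service); it only checks the documented success condition of HTTP 200 plus a JSON body of {"status": "OK"}.

```python
import json


def is_healthy(status_code: int, body: str) -> bool:
    """Return True only for the documented healthy response:
    HTTP 200 with a JSON body of {"status": "OK"}."""
    if status_code != 200:
        return False
    try:
        return json.loads(body).get("status") == "OK"
    except json.JSONDecodeError:
        return False
```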

Responses

Response samples

Content type
application/json
{
  • "status": "OK"
}

Process Zip Endpoint

This endpoint provides a wrapper function for unpacking an uploaded zip file, determining the appropriate parameter and application settings, and applying a config-driven workflow to the data in that file. It is one of two endpoints that can invoke and apply a config workflow to data, and it is meant for processing files.

Inputs and Outputs

  • :param message_type: The type of stream of the uploaded file's underlying data (e.g. ecr, elr, etc.). If the data is in FHIR format, set to FHIR.
  • :param data_type: The type of data held in the uploaded file. Eligible values include ecr, zip, fhir, and hl7.
  • :param config_file_name: The name of the configuration file to load on the service's back-end, specifying the workflow to apply.
  • :param upload_file: A file containing clinical health care information.
  • :return: A response holding whether the workflow application was successful as well as the results of the workflow.
Request Body schema: multipart/form-data
message_type
string (Message Type)
data_type
string (Data Type)
config_file_name
string (Config File Name)
upload_file
string <binary> (Upload File)

Responses

Response samples

Content type
application/json
Example
{
  • "message": "Processing succeeded!",
  • "content": {
    }
}

Process Message Endpoint

This endpoint provides a wrapper function for unpacking a message processing input and using those settings to apply a config-driven workflow to a raw string of data. This endpoint is the second of two workflow-driven endpoints and is meant to be used with raw string data (meaning if the data is JSON, it must be string serialized with json.dumps).
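The string-serialization requirement above can be sketched as follows. The bundle content and config file name are illustrative stand-ins, not real inputs; the point is that a JSON message must pass through json.dumps before it is placed in the request body.

```python
import json

# A minimal FHIR bundle stand-in -- a real call would use actual eCR content.
fhir_message = {"resourceType": "Bundle", "type": "batch", "entry": []}

payload = {
    "message_type": "fhir",
    "data_type": "fhir",
    "config_file_name": "sample-orchestration-s3-config.json",  # illustrative
    # JSON data must be string-serialized before being placed in the body:
    "message": json.dumps(fhir_message),
}

# The application/json request body sent to /process-message.
body = json.dumps(payload)
```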

Inputs

  • :param request: The request body specifying the message, its type, and the workflow configuration to apply.
  • :return: A response holding whether the workflow application was successful as well as the results of the workflow.
Request Body schema: application/json
message_type
required
string (Message Type)
Enum: "ecr" "elr" "vxu" "fhir"

The type of message to be validated.

data_type
required
string (Data Type)
Enum: "ecr" "zip" "fhir" "hl7"

The type of data of the passed-in message. Must be one of 'ecr', 'fhir', or 'zip'. If data_type is set to 'zip', the underlying unzipped data is assumed to be ecr.

config_file_name
required
string (Config File Name)

The name of a config file in either the default/ or custom/ schemas directory that will define the workflow applied to the passed data.

required
Message (object) or Message (string) (Message)

The message to be validated.

Rr Data (string) or Rr Data (null) (Rr Data)

If an eICR message, the accompanying Reportability Response data.

Responses

Request samples

Content type
application/json
{
  • "message_type": "ecr",
  • "data_type": "ecr",
  • "config_file_name": "string",
  • "message": { },
  • "rr_data": "string"
}

Response samples

Content type
application/json
Example
{
  • "message": "Processing succeeded!",
  • "content": {
    }
}

List Configs

This endpoint gets a list of all the process configs currently available.

Responses

Response samples

Content type
application/json
{
  • "default_configs": [
    ],
  • "custom_configs": [ ]
}

Get Config

This endpoint gets the config specified by 'processing_config_name'.

Inputs and Outputs

  • :param processing_config_name: The name of the processing configuration to retrieve.
  • :param response: The response object used to modify the response status and body.
path Parameters
processing_config_name
required
string (Processing Config Name)

Responses

Response samples

Content type
application/json
{
  • "message": "Config found!",
  • "workflow": {
    }
}

Upload Config

This endpoint uploads a new processing config to the service or updates an existing one.

Inputs and Outputs

  • :param processing_config_name: The name of the processing configuration to be uploaded or updated.
  • :param input: A Pydantic model representing the processing configuration data.
  • :param response: The response object used to modify the response status and body.
path Parameters
processing_config_name
required
string (Processing Config Name)
Request Body schema: application/json
required
object (Workflow)

A JSON-formatted config dict containing a single key workflow that maps to a list of WorkflowServiceStep objects, each defining one step in the orchestration configuration to upload.

Overwrite (boolean) or Overwrite (null) (Overwrite)
Default: false

When true, if a config already exists for the provided name, it will be replaced. When false, no action is taken and the response indicates that a config with the given name already exists; to proceed, submit a new request with a different config name or set this field to true.
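A sketch of the request body shape described above follows. The step fields ("service" and "endpoint") are placeholder names, since the exact WorkflowServiceStep schema is defined by the service; consult the sample configs in the /assets folder for the real field layout.

```python
import json

# Illustrative only: the real WorkflowServiceStep fields are defined by the
# service. "service" and "endpoint" below are placeholder field names.
upload_body = {
    # Per the schema, the workflow object is a config dict with a single
    # key "workflow" mapping to a list of WorkflowServiceStep objects.
    "workflow": {
        "workflow": [
            {"service": "validation", "endpoint": "/validate"},
            {"service": "fhir_converter", "endpoint": "/convert-to-fhir"},
        ]
    },
    "overwrite": False,  # set True to replace an existing config of this name
}

# The application/json body sent to PUT /configs/{processing_config_name}.
request_json = json.dumps(upload_body)
```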

Responses

Request samples

Content type
application/json
{
  • "workflow": {
    },
  • "overwrite": false
}

Response samples

Content type
application/json
{
  • "message": "Config uploaded successfully!"
}