
TECH_DEBT: Moving documentation from Confluence to GitHub (#550)
### 🛠 Description of Changes

Please see the root README on my branch for docs.
Added a "Documentation" section that links to docs/README.md.
Exported each of the service specs from Confluence and moved them to the
docs/service_specs folder.
Moved the service-specific docs from the root README.md into their
equivalent service specs.
Created new documentation for Census Management under "functionality"
and moved the "submission folder structure" page from Confluence into
"functionality".

### 🧪 Testing Performed

N/A

However, after approval and merging, I plan to archive each of the
equivalent pages in Confluence.

## Summary by CodeRabbit

## Release Notes

- **New Features**
  - Introduced a comprehensive documentation structure for various services, including Census Management, Submission, and Reporting.
  - Added detailed guides for each service, enhancing user understanding of functionalities and configurations.

- **Documentation**
  - Removed outdated service descriptions and replaced them with a centralized "Documentation" section.
  - Added new documentation files for services such as Account, Admin UI, Audit, Notification, and others, improving navigability and clarity.
  - Enhanced existing documentation with specific details on service responsibilities, environment variables, and event handling.
seanmcilvenna authored Nov 21, 2024
2 parents a25db16 + 16ad7a4 commit 6532597
Showing 22 changed files with 723 additions and 60 deletions.
67 changes: 7 additions & 60 deletions README.md

NHSNLink is an open-source reference implementation for CDC’s National Healthcare Safety Network (NHSN) reporting. It is an application that aggregates, transforms, evaluates, validates, and submits patient-level clinical data for patients matching NHSN surveillance requirements. It is based on an event-driven microservice architecture using C#, Java, Kafka, and other technologies. NHSNLink is designed to handle large-scale data processing efficiently. It leverages streaming technologies that can be configured to continuously query and evaluate patient data throughout the reporting cycle, rather than waiting until the end to initiate this process.

## Link Cloud Services

### Tenant

The Tenant service is the entry point for configuring a tenant in Link Cloud. The service is responsible for maintaining and generating events for the scheduled measure reporting periods that the tenant is configured for. These events contain the initial information Link Cloud needs to query resources and perform measure evaluations for a specific reporting period.

### Census

The Census service is primarily responsible for maintaining a tenant's admit and discharge patient information, which is needed to determine when a patient is ready for reporting. To accomplish this, the Census service has functionality in place to request an updated FHIR List of recently admitted patients. The frequency of these requests is based on a tenant configuration in the Census service.

### Query Dispatch

The Query Dispatch service is primarily responsible for applying a lag period prior to making FHIR resource query requests against a facility endpoint. The current implementation of the Query Dispatch service handles how long Link Cloud should wait to query for a patient's FHIR resources after the patient is discharged. To ensure that the encounter-related data for the patient has settled (medications have been closed, lab results have been finalized, etc.), tenants can customize how long the lag from discharge to querying should be.

### Data Acquisition

The Data Acquisition service is responsible for connecting to and querying a tenant's endpoint for the FHIR resources needed to evaluate patients for a measure. For Epic installations, Link Cloud utilizes the STU3 Patient List ([Link Here](https://fhir.epic.com/Specifications?api=879)) resource to determine which patients are currently admitted to the facility. While this is the current solution for acquiring the patient census, other means of patient acquisition (ADT V2, Bulk FHIR) are being investigated to provide universal support across multiple EHR vendors.

### Normalization

FHIR resources queried from EHR endpoints can vary from location to location. There will be occasions where data for specific resources needs to be adjusted to ensure that Link Cloud properly evaluates a patient against dQMs. The Normalization service is the Link Cloud component that makes those adjustments in an automated way. The service operates between the resource acquisition and evaluation steps to ensure that tenant data is ready for measure evaluation.

### Measure Eval

The Measure Eval service is a Java-based application that is primarily responsible for evaluating bundles of acquired patient resources against the measures that Link Cloud tenants are configured to evaluate. The service utilizes the CQF framework ([Link Here](https://github.com/cqframework/cqf-ruler)) to perform the measure evaluations.

### Report

The Report service is responsible for persisting the MeasureReports and FHIR resources that the Measure Eval service generates after evaluating a patient against a measure. When a tenant's reporting period end date is reached, the Report service performs various workflows to determine whether all of the patient MeasureReports for that period are accounted for prior to initiating the submission process.

### Submission
The Submission service is responsible for packaging a tenant's reporting content and submitting it to a configured destination. Currently, the service only writes the submission content to its local file store. The submission package for a reporting period includes the following files:

| File | Description | Multiple Files? |
| ---- | ---- | ---- |
| Aggregate | A [MeasureReport](https://hl7.org/fhir/R4/measurereport.html) resource that contains references to each patient evaluation for a specific measure | Yes, one per measure |
| Patient List | A [List](https://hl7.org/fhir/R4/list.html) resource of all patients that were admitted into the facility during the reporting period | No |
| Device | A [Device](https://hl7.org/fhir/R4/device.html) resource that details the version of Link Cloud that was used | No |
| Organization | An [Organization](https://hl7.org/fhir/R4/organization.html) resource for the submitting facility | No |
| Other Resources | A [Bundle](https://hl7.org/fhir/R4/bundle.html) resource that contains all of the shared resources (Location, Medication, etc) that are referenced in the patient Measure Reports | No |
| Patient | A [Bundle](https://hl7.org/fhir/R4/bundle.html) resource that contains the MeasureReports and related resources for a patient | Yes, one per evaluated patient |

An example of the submission package can be found at `\link-cloud\Submission Example`.

### Account

The Account service is responsible for maintaining roles and permissions for Link Cloud users.

### Audit

The Audit service is responsible for persisting auditable events that are generated by the Link Cloud services.

### Notification

The Notification service is responsible for emailing configured users when a notifiable event occurs while the Link Cloud services perform their work.
## Documentation

Documentation on Link's implementation and the functionality it supports can be found [here](docs/README.md).
## Helpful Tools

- SQL Server Management Studio: [Link Here](https://learn.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver16)
The initial building of the services will take a few minutes to run.

5. Open a web browser and access Kafka UI. By default, the page can be accessed at `http://localhost:9095`. Click the `Topics` tab and ensure that Kafka topics exist:

<img src="Documentation/Images/readme_kafkaui.png" width="75%" height="75%">
<img src="docs/images/readme_kafkaui.png" width="75%" height="75%" alt="Kafka UI showing Link topics" />

If no topics are populated (the expected topics are shown in the image above), rerun the following command: `docker compose up kafka_init -d`


At the end of the reporting period, the Report service makes additional requests to query and evaluate patients that are still admitted to the facility prior to submitting. After each of those admitted patients is evaluated, the Report service produces a `SubmitReport` event to inform the Submission service that a report is complete. To access the submission package, open Docker Desktop and click the `link-submission` container. Select the `files` tab and navigate to the `app\submissions` folder. There, you'll be able to download the submission results for the reporting period:

<img src="Documentation/Images/docker_submission.png" width="75%" height="75%">
<img src="docs/images/docker_submission.png" width="75%" height="75%" alt="Docker Desktop UI showing submissions folder" />

## Reporting Event Workflow
> [!NOTE]
Detailed below are the steps Link Cloud takes to generate a report for a tenant.

### Report Scheduling

<img src="Documentation/Images/readme_event_report_scheduling.png" width="75%" height="75%">
<img src="docs/images/readme_event_report_scheduling.png" width="75%" height="75%" alt="UML diagram showing events for report scheduling" />

At the beginning of a new reporting period, the Tenant service produces a `ReportScheduled` event. The Query Dispatch and Report services consume the event and persist its reporting information into their databases. The Report service sets an internal cron job (based on the EndDate of the consumed event) to execute the work needed to complete the report.
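
As a minimal sketch of this consume-and-persist pattern — the topic, group id, and payload handling here are illustrative assumptions, not the actual Link Cloud code:

```csharp
using System;
using Confluent.Kafka; // NuGet: Confluent.Kafka

class ReportScheduledConsumer
{
    static void Main()
    {
        var config = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092", // placeholder broker address
            GroupId = "report-service",
            AutoOffsetReset = AutoOffsetReset.Earliest
        };

        using var consumer = new ConsumerBuilder<string, string>(config).Build();
        consumer.Subscribe("ReportScheduled");

        while (true)
        {
            var result = consumer.Consume();
            // A real consumer would deserialize the reporting period from the
            // event value, persist it, and register a cron job keyed off EndDate.
            Console.WriteLine($"Tenant {result.Message.Key}: {result.Message.Value}");
        }
    }
}
```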

### Census Acquisition and Discharge

<img src="Documentation/Images/readme_event_census_discharge.png" width="75%" height="75%">
<img src="docs/images/readme_event_census_discharge.png" width="75%" height="75%" alt="UML diagram showing events for census of patients" />

During the reporting period, the Census service is configured to continually request a new list of patients admitted to a facility by producing the `CensusAcquisitionScheduled` event. The Data Acquisition service consumes this event and queries the facility's List endpoint. After receiving a response from the EHR endpoints, the Data Acquisition service produces a `PatientIDsAcquired` event that contains a list of all patients currently admitted to the facility.


### Resource Acquisition and Evaluation

<img src="Documentation/Images/readme_event_acquisition_evaluation.png" width="75%" height="75%">
<img src="docs/images/readme_event_acquisition_evaluation.png" width="75%" height="75%" alt="UML diagram showing events for data acquisition and evaluation" />

A `DataAcquisitionRequested` event is generated for patients that have either been discharged or are still admitted when the reporting period end date is met. This event triggers the resource acquisition, normalization, and evaluation phases for a patient.

26 changes: 26 additions & 0 deletions docs/README.md

## Overview

This page and its references include documentation for Link's services and the functionality those services support.

## Functionality

* [Census Management](functionality/census_management.md)
* Submission
  * [Folder Structure](functionality/submission_folder.md)

## Service Specifications

* [Account](service_specs/account.md)
* [Admin UI](service_specs/admin_ui.md)
* [Audit](service_specs/audit.md)
* [Backend For Frontend (BFF)](service_specs/bff.md)
* [Census](service_specs/census.md)
* [Data Acquisition](service_specs/data_acquisition.md)
* [Measure Evaluation](service_specs/measure_eval.md)
* [Normalization](service_specs/normalization.md)
* [Notification](service_specs/notification.md)
* [Query Dispatch](service_specs/query_dispatch.md)
* [Report](service_specs/report.md)
* [Submission](service_specs/submission.md)
* [Tenant](service_specs/tenant.md)
45 changes: 45 additions & 0 deletions docs/functionality/census_management.md
[← Back Home](../README.md)

## Overview

Link includes a **Census Management** module designed to acquire, evaluate, and maintain a real-time census of patients actively or recently treated within a hospital system. This module supports the submission of required patient data to governing bodies (such as NHSN) per established reporting criteria.

The system routinely gathers census data by leveraging multiple methods compatible with hospital EHRs. This document provides an overview of each acquisition method, the data elements tracked, and future considerations for expanding data persistence.

## Census Data Acquisition Methods

### 1. FHIR Standard - List Resource

The **FHIR List Resource** method is one of the primary approaches for acquiring patient lists from hospital EHR systems. In systems like Epic, patient lists are generated through proprietary queries within the EHR and associated with a FHIR List resource for access through FHIR integrations.

- **Endpoint**: `GET /List/<listId>`
- **Functionality**: Queries the EHR for patient lists identified by a `listId`.
- **Tenant Configurability**: The `listId` is configurable per tenant within Link, allowing each institution to define the patient population that constitutes their census.
- **Applicability**: This method supports census management for any EHR that utilizes FHIR Lists representing relevant patient groups.
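
For illustration only, a census query against this endpoint might look like the following sketch; the base URL, `listId` value, and bearer token are placeholders, not Link configuration:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class FhirListCensus
{
    static async Task Main()
    {
        using var http = new HttpClient { BaseAddress = new Uri("https://fhir.example.org/api/FHIR/R4/") };
        http.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/fhir+json"));
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "access-token-placeholder");

        // The listId is tenant-configured; "12345" is a placeholder.
        var response = await http.GetAsync("List/12345");
        response.EnsureSuccessStatusCode();

        // Each List.entry.item references a currently admitted patient.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```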

### 2. FHIR Standard - Bulk FHIR Implementation Guide

The **Bulk FHIR** method allows Link to acquire patient data via batch processing, useful for large patient groups in systems that support flexible querying.

- **Endpoint**: Bulk FHIR `$export` request with `groupId`
- **Process**:
  - Link initiates a `$export` request for patient data by `groupId`.
  - The export process is monitored and polled routinely until completion.
  - Upon completion, patient resources are retrieved, and the FHIR ID of each patient is extracted and stored.
- **Tenant Configurability**: Each tenant configures a unique `groupId` corresponding to their desired patient group.
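
The kickoff-and-poll flow described above can be sketched as follows; the endpoint, `groupId`, and polling interval are illustrative assumptions, not values from Link:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class BulkFhirExport
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // 1. Kick off the export for the tenant's configured group (placeholder ids).
        var kickoff = new HttpRequestMessage(HttpMethod.Get,
            "https://fhir.example.org/Group/12345/$export?_type=Patient");
        kickoff.Headers.Add("Accept", "application/fhir+json");
        kickoff.Headers.Add("Prefer", "respond-async");
        var kickoffResponse = await http.SendAsync(kickoff);

        // 2. The server answers 202 Accepted with a status URL to poll.
        Uri statusUrl = kickoffResponse.Content.Headers.ContentLocation;

        // 3. Poll routinely until the export completes.
        HttpResponseMessage status;
        do
        {
            await Task.Delay(TimeSpan.FromSeconds(30));
            status = await http.GetAsync(statusUrl);
        } while (status.StatusCode == HttpStatusCode.Accepted);

        // 4. The completion manifest lists NDJSON file URLs; each Patient line
        //    yields a FHIR ID to extract and store.
        Console.WriteLine(await status.Content.ReadAsStringAsync());
    }
}
```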

### 3. ADT Feeds (Under Exploration)

Link is evaluating the feasibility of using **ADT feeds**—specifically for admission, discharge, and cancellation events—to dynamically manage census data. This approach would offer real-time census updates and potentially enhance the accuracy and responsiveness of the census.

## Scheduling and Frequency

Both the FHIR List and Bulk FHIR methods utilize a configurable scheduling system, allowing each tenant to define query intervals. The scheduling system is based on CRON patterns, ensuring Link can automatically query the EHR at specified times to maintain an updated census.
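
As an illustration of CRON-driven scheduling — using the NCrontab library purely as an example, not necessarily what Link uses internally, and with a sample pattern rather than a Link default:

```csharp
using System;
using NCrontab; // NuGet: NCrontab

class CensusSchedule
{
    static void Main()
    {
        // "0 */4 * * *" = every four hours, on the hour (sample pattern).
        var schedule = CrontabSchedule.Parse("0 */4 * * *");
        DateTime next = schedule.GetNextOccurrence(DateTime.Now);
        Console.WriteLine($"Next census query at {next:u}");
    }
}
```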

## Data Persistence and Tracking

Currently, Link persists only the **FHIR ID** of each patient. This identifier allows Link to accurately track patients across census updates without storing additional demographic information.

### Future Considerations

In the interest of enhancing the user interface, Link is considering storing additional data elements, such as the **patient name** associated with each FHIR ID. This would provide users with meaningful patient identifiers, facilitating easier navigation and record management within the Link UI.
27 changes: 27 additions & 0 deletions docs/functionality/submission_folder.md
[← Back Home](../README.md)

## Submission Folder Structure - Link Documentation

### Naming Convention

Format: `<nhsn-org-id>-<plus-separated-list-of-measure-ids>-<period-start>-<period-end?>-<timestamp>`

Example: `1234-NHSNdQMAcuteCareHospitalInitialPopulation+NHSNGlycemicControlHypoglycemicInitialPopulation-20240401-20240430-20240503T084523`
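
A small sketch of composing a folder name per this convention, using sample values:

```csharp
using System;

class SubmissionFolderName
{
    static void Main()
    {
        // Sample values; a real submission uses the tenant's NHSN org id and measures.
        string orgId = "1234";
        string[] measureIds =
        {
            "NHSNdQMAcuteCareHospitalInitialPopulation",
            "NHSNGlycemicControlHypoglycemicInitialPopulation"
        };
        string periodStart = "20240401";
        string periodEnd = "20240430";
        string timestamp = DateTime.UtcNow.ToString("yyyyMMdd'T'HHmmss");

        // Measure ids are joined with '+' per the naming convention above.
        Console.WriteLine(
            $"{orgId}-{string.Join("+", measureIds)}-{periodStart}-{periodEnd}-{timestamp}");
    }
}
```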

### Files

- `_manifest.json`
- `submitting-device.json` - The submitting Device (always “NHSNLink” for reports generated by NHSNLink)
- `submitting-org.json` - The submitting Organization
- `census.json`
- `query-plan.yml`
- `aggregate-NHSNdQMAcuteCareHospitalInitialPopulation.json` - a MeasureReport
- `aggregate-NHSNGlycemicControlHypoglycemicInitialPopulation.json` - a MeasureReport
- `patient-X.json` - X is the patient’s Patient.id and is a Bundle of:
  - individual MeasureReport for ACH
  - individual MeasureReport for HYPO
  - Patient/X
  - Condition/X, etc.
- `shared-resources.json` - a Bundle of shared resources, such as Medication and Location
- `validation-results.json` - an OperationOutcome of raw validation results for all data in the submission
- `validation-report.html` - categorized validation results for all data in the submission in a renderable HTML
53 changes: 53 additions & 0 deletions docs/service_specs/account.md
[← Back Home](../README.md)

## Account Overview

The Account service is responsible for maintaining roles and permissions for Link Cloud users.

- **Technology**: .NET Core
- **Image Name**: link-account
- **Port**: 8080
- **Database**: MSSQL (previously Postgres)

## Environment Variables

| Name | Value | Secret? |
|------------------------------------------|---------------------------------|---------|
| ExternalConfigurationSource | AzureAppConfiguration | No |
| ConnectionStrings__AzureAppConfiguration | `<AzureAppConfigEndpoint>` | Yes |
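
The double-underscore names above follow the standard .NET configuration convention: `__` in an environment variable name maps to the `:` section separator. A minimal sketch of reading such a value (illustrative, not the service's actual startup code):

```csharp
using System;
using Microsoft.Extensions.Configuration; // NuGet: Microsoft.Extensions.Configuration.EnvironmentVariables

class ConfigDemo
{
    static void Main()
    {
        // ConnectionStrings__AzureAppConfiguration surfaces as
        // ConnectionStrings:AzureAppConfiguration in the configuration tree.
        IConfiguration config = new ConfigurationBuilder()
            .AddEnvironmentVariables()
            .Build();

        string endpoint = config.GetConnectionString("AzureAppConfiguration");
        Console.WriteLine(endpoint ?? "(not set)");
    }
}
```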

## App Config

### Kafka Connection

| Name | Value | Secret? |
|--------------------------------------|---------------------------|----------|
| KafkaConnection__BootstrapServers__0 | `<KafkaBootstrapServer>` | No |
| KafkaConnection__GroupId | Account | No |

### Redis

| Name | Value | Secret? |
|--------------------------|---------------------------|--------|
| ConnectionStrings__Redis | `<RedisConnectionString>` | Yes |
| Cache__Enabled | true/false | No |

### Database Settings (MSSQL)

| Name | Value | Secret? |
|---------------------------------------|-----------------------|----------|
| ConnectionStrings__DatabaseConnection | `<ConnectionString>` | Yes |

### Tenant API Settings

| Name | Value | Secret? |
|-----------------------------------------------|-------------------------------------|---------|
| TenantApiSettings__TenantServiceBaseEndpoint | `<TenantServiceUrl>/api` | No |

## Consumed Events

- **NONE**

## Produced Events

- **Event**: `AuditableEventOccurred`
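
A hedged sketch of producing this event with the Confluent.Kafka client; the key and payload shape are illustrative assumptions, not the service's actual schema:

```csharp
using System;
using Confluent.Kafka; // NuGet: Confluent.Kafka

class AuditEventProducer
{
    static void Main()
    {
        var config = new ProducerConfig { BootstrapServers = "localhost:9092" }; // placeholder
        using var producer = new ProducerBuilder<string, string>(config).Build();

        producer.Produce("AuditableEventOccurred", new Message<string, string>
        {
            Key = "Test-Hospital", // tenant/facility identifier (assumed keying)
            Value = "{\"action\":\"Create\",\"resource\":\"Report\",\"user\":\"admin\"}"
        });
        producer.Flush(TimeSpan.FromSeconds(10));
    }
}
```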
28 changes: 28 additions & 0 deletions docs/service_specs/admin_ui.md
[← Back Home](../README.md)

## Admin UI

> ⚠️ **Note:** This service is currently called "demo app" and is planned to be renamed.

## Admin UI Overview

- **Technology**: JavaScript (TypeScript) & Angular
- **Image Name**: link-admin-ui
- **Port**: 80
- **Database**: NONE
- **Scale**: 0-5

## Volumes

| Volume | Mount Path | Sub-path |
|-------------------------------|------------------------------------------------------|-------------------------|
| Azure Storage Account | `/usr/share/nginx/html/assets/app.config.local.json` | `app.config.local.json` |

## app.config.local.json

```json
{
"baseApiUrl": "<DEMO-API-GATEWAY-BASE-URL>/api",
"idpIssuer": "https://oauth.nhsnlink.org/realms/NHSNLink",
"idpClientId": "link-botw"
}
```
35 changes: 35 additions & 0 deletions docs/service_specs/audit.md
[← Back Home](../README.md)

## Audit Overview

The Audit service is responsible for persisting auditable events that are generated by the Link Cloud services.

- **Technology**: .NET Core
- **Image Name**: link-audit
- **Port**: 8080
- **Database**: MSSQL

## Environment Variables

| Name | Value | Secret? |
|---------------------------------------------|-------------------------------|---------|
| Link__Audit__ExternalConfigurationSource | AzureAppConfiguration | No |
| ConnectionStrings__AzureAppConfiguration | `<AzureAppConfigEndpoint>` | Yes |

## App Config

### Kafka Connection

| Name | Value | Secret? |
|---------------------------------------------------|--------------------------|---------|
| Link__Audit__KafkaConnection__BootstrapServers__0 | `<KafkaBootstrapServer>` | No |
| Link__Audit__KafkaConnection__GroupId | audit-events | No |
| Link__Audit__KafkaConnection__ClientId | audit-events | No |

## Consumed Events

- **AuditableEventOccurred**

## Produced Events

- **NONE**
