refactor: env and secrets + test: front-to-end #65

Merged · 5 commits · Dec 20, 2024
3 changes: 0 additions & 3 deletions docker-compose.env.example → .env.example
@@ -5,6 +5,3 @@ TIMELINE_FILE=./../../../data/sr2silo/demo_real/timeline.tsv
PRIMER_FILE=./tests/data/samples_large/primers.yaml
NEXTCLADE_REFERENCE=sars-cov2
RESULTS_DIR=./../results/A1_10_2024_09_30/20241018_AAG55WNM5/1000/
AWS_ACCESS_KEY_ID="XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
AWS_SECRET_ACCESS_KEY="XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
AWS_DEFAULT_REGION="eu-central-1"
54 changes: 53 additions & 1 deletion .github/workflows/docker-build-test.yml
@@ -42,10 +42,62 @@ jobs:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}

- name: Create .env file
run: |
echo SAMPLE_DIR=${{ secrets.SAMPLE_DIR }} >> .env
echo SAMPLE_ID=${{ secrets.SAMPLE_ID }} >> .env
echo BATCH_ID=${{ secrets.BATCH_ID }} >> .env
echo TIMELINE_FILE=${{ secrets.TIMELINE_FILE }} >> .env
echo PRIMER_FILE=${{ secrets.PRIMER_FILE }} >> .env
echo NEXTCLADE_REFERENCE=${{ secrets.NEXTCLADE_REFERENCE }} >> .env
echo RESULTS_DIR=${{ secrets.RESULTS_DIR }} >> .env
echo AWS_ACCESS_KEY_ID=${{ secrets.AWS_ACCESS_KEY_ID }} >> .env
echo AWS_SECRET_ACCESS_KEY=${{ secrets.AWS_SECRET_ACCESS_KEY }} >> .env
echo AWS_DEFAULT_REGION=${{ secrets.AWS_DEFAULT_REGION }} >> .env

- name: Build Docker image
run: docker-compose --env-file docker-compose.env build
run: docker-compose --env-file .env build

- name: Push to DockerHub
if: github.ref == 'refs/heads/main'
run: |
docker-compose push

test:
needs: build
runs-on: ubuntu-latest

steps:
- name: Checkout code
uses: actions/checkout@v3

- name: Set up Docker Compose
run: |
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
docker-compose --version

- name: Create .env file
run: |
echo SAMPLE_DIR=${{ secrets.SAMPLE_DIR }} >> .env
echo SAMPLE_ID=${{ secrets.SAMPLE_ID }} >> .env
echo BATCH_ID=${{ secrets.BATCH_ID }} >> .env
echo TIMELINE_FILE=${{ secrets.TIMELINE_FILE }} >> .env
echo PRIMER_FILE=${{ secrets.PRIMER_FILE }} >> .env
echo NEXTCLADE_REFERENCE=${{ secrets.NEXTCLADE_REFERENCE }} >> .env
echo RESULTS_DIR=${{ secrets.RESULTS_DIR }} >> .env


- name: Create Docker secrets files
run: |
mkdir -p ./secrets
echo "${{ secrets.AWS_ACCESS_KEY_ID }}" > ./secrets/aws_access_key_id.txt
echo "${{ secrets.AWS_SECRET_ACCESS_KEY }}" > ./secrets/aws_secret_access_key.txt
echo "${{ secrets.AWS_DEFAULT_REGION }}" > ./secrets/aws_default_region.txt


- name: Run Docker Compose
run: docker-compose --env-file .env up

- name: Tear down Docker Compose
run: docker-compose down
43 changes: 42 additions & 1 deletion README.md
@@ -71,7 +71,48 @@ $ poetry shell
$ pytest
```

#### Tool Sections
### [WIP]: Run V-Pipe to SILO Transformation
This is currently implemented as a script and is under heavy development.
To run it, we recommend building it with Docker Compose, as it relies on other Rust components.

#### Configuration

Create a `.env` file in the repository root (see `.env.example`) with the following paths:

```env
SAMPLE_DIR=../../../data/sr2silo/daemon_test/samples/A1_05_2024_10_08/20241024_2411515907/alignments/
SAMPLE_ID=A1_05_2024_10_08
BATCH_ID=20241024_2411515907
TIMELINE_FILE=../../../data/sr2silo/daemon_test/timeline.tsv
NEXTCLADE_REFERENCE=sars-cov2
RESULTS_DIR=./results
```
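
For orientation, the sketch below shows one way the container-side script could pick these values up; the variable names match the compose file, but the `load_config` helper is illustrative and not part of this PR.

```python
# Illustrative sketch only: reading the configuration that docker-compose
# passes into the container as environment variables. `load_config` is a
# hypothetical helper, not part of sr2silo.
import os
from pathlib import Path


def load_config() -> dict:
    """Collect and validate the pipeline settings from the environment."""
    required = [
        "SAMPLE_DIR",
        "SAMPLE_ID",
        "BATCH_ID",
        "TIMELINE_FILE",
        "NEXTCLADE_REFERENCE",
        "RESULTS_DIR",
    ]
    missing = [name for name in required if name not in os.environ]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")

    config: dict = {name: os.environ[name] for name in required}
    # Path-like settings are resolved inside the container's working directory.
    config["SAMPLE_DIR"] = Path(config["SAMPLE_DIR"])
    config["TIMELINE_FILE"] = Path(config["TIMELINE_FILE"])
    config["RESULTS_DIR"] = Path(config["RESULTS_DIR"])
    return config
```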


#### Docker Secrets
Uploading the processed outputs requires S3 storage.

For sensitive information such as AWS credentials, use Docker secrets. Create the following files in the `secrets/` directory:

- `secrets/aws_access_key_id.txt` containing `YourAWSAccessKeyId`
- `secrets/aws_secret_access_key.txt` containing `YourAWSSecretAccessKey`
- `secrets/aws_default_region.txt` containing `YourAWSRegion`
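
docker-compose mounts each of these files at `/run/secrets/<name>` inside the container, which is where `get_aws_credentials()` in `src/sr2silo/s3.py` reads them (see the diff below). A minimal sketch of that pattern, using a hypothetical `read_secret` helper:

```python
# Minimal sketch of consuming Docker secrets inside the container, following
# the same pattern as get_aws_credentials() in src/sr2silo/s3.py.
# `read_secret` is a hypothetical helper used here for illustration.
from pathlib import Path

SECRETS_DIR = Path("/run/secrets")


def read_secret(name: str) -> str:
    """Return the content of a mounted Docker secret, without the trailing newline."""
    secret_file = SECRETS_DIR / name
    if not secret_file.exists():
        raise RuntimeError(f"Required secret is missing: {name}")
    return secret_file.read_text().strip()


aws_access_key_id = read_secret("aws_access_key_id")
aws_secret_access_key = read_secret("aws_secret_access_key")
aws_default_region = read_secret("aws_default_region")
```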

#### Run Transformation

To process a single sample, run the following command:

```sh
docker-compose --env-file .env up --build
```


### Tool Sections
The code quality checks run on GitHub can be seen in
- ``.github/workflows/test.yml`` for the python package CI/CD,

7 changes: 0 additions & 7 deletions docker-compose.env

This file was deleted.

19 changes: 15 additions & 4 deletions docker-compose.yml
@@ -1,3 +1,5 @@
version: '3.8'

services:
sr2silo:
build: .
@@ -7,7 +9,7 @@ services:
- ${PRIMER_FILE}:/app/primers.yaml
- ${RESULTS_DIR}:/app/results
- ./scripts/database_config.yaml:/app/scripts/database_config.yaml
- ./scripts/database_config.yaml:/app/scripts/reference_genomes.json
- ./scripts/reference_genomes.json:/app/scripts/reference_genomes.json
environment:
- PYTHONUNBUFFERED=1
- SAMPLE_DIR=${SAMPLE_DIR}
@@ -16,9 +18,18 @@ services:
- TIMELINE_FILE=${TIMELINE_FILE}
- PRIMER_FILE=${PRIMER_FILE}
- RESULTS_DIR=${RESULTS_DIR}
- AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
- AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
- AWS_DEFAULT_REGION=${AWS_DEFAULT_REGION}
secrets:
- aws_access_key_id
- aws_secret_access_key
- aws_default_region

secrets:
aws_access_key_id:
file: ./secrets/aws_access_key_id.txt
aws_secret_access_key:
file: ./secrets/aws_secret_access_key.txt
aws_default_region:
file: ./secrets/aws_default_region.txt

volumes:
results:
22 changes: 0 additions & 22 deletions scripts/README.md

This file was deleted.

28 changes: 24 additions & 4 deletions src/sr2silo/s3.py
@@ -3,7 +3,6 @@
from __future__ import annotations

import bz2
import os
import shutil
from pathlib import Path

@@ -23,13 +22,34 @@ def compress_bz2(input_fp: Path, output_fp: Path) -> None:
shutil.copyfileobj(f_in, f_out)


def get_aws_credentials():
    """Get AWS credentials from Docker secrets.

    Returns:
        Tuple[str, str, str]: AWS access key ID, AWS secret access key,
        and AWS default region.

    Raises:
        RuntimeError: If any of the required secrets are missing.
    """
    try:
        with open("/run/secrets/aws_access_key_id") as f:
            aws_access_key_id = f.read().strip()
        with open("/run/secrets/aws_secret_access_key") as f:
            aws_secret_access_key = f.read().strip()
        with open("/run/secrets/aws_default_region") as f:
            aws_default_region = f.read().strip()
    except FileNotFoundError as e:
        raise RuntimeError("Required secret is missing: " + str(e))

    return aws_access_key_id, aws_secret_access_key, aws_default_region


def get_s3_client():
    """Get an S3 client using AWS credentials from environment variables."""

    # Get AWS credentials from environment variables
    aws_access_key_id = os.getenv("AWS_ACCESS_KEY_ID")
    aws_secret_access_key = os.getenv("AWS_SECRET_ACCESS_KEY")
    aws_default_region = os.getenv("AWS_DEFAULT_REGION")
    aws_access_key_id, aws_secret_access_key, aws_default_region = get_aws_credentials()

    # Create an S3 client
    s3_client = boto3.client(
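
The rest of `get_s3_client` is truncated in this diff. For context only, a typical way to hand these credentials to `boto3.client` looks like the sketch below; this is an assumption about the call, not a quote of the hidden lines.

```python
# Illustrative only: constructing the S3 client from the secrets-derived
# credentials. The exact continuation of get_s3_client() is not shown in
# this diff.
import boto3

from sr2silo.s3 import get_aws_credentials  # added in this PR

aws_access_key_id, aws_secret_access_key, aws_default_region = get_aws_credentials()

s3_client = boto3.client(
    "s3",
    aws_access_key_id=aws_access_key_id,
    aws_secret_access_key=aws_secret_access_key,
    region_name=aws_default_region,
)
```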