Django application for the DDP platform's management backend. Exposes API endpoints for the management frontend to communicate with, for the purposes of
- Onboarding an NGO client
- Adding users from the client-organization
- Creating a client's workspace in our Airbyte installation
- Configuring that workspace i.e. setting up sources, destinations and connections
- Configuring data ingest jobs in our Prefect setup
- Connecting to the client's dbt GitHub repository
- Configuring dbt run jobs in our Prefect setup
- REST conventions are being followed.
- CRUD end points for a User resource would look like:
- GET /api/users/
- GET /api/users/:user_id
- POST /api/users/
- PUT /api/users/:user_id
- DELETE /api/users/:user_id
- Route parameters should be named in snake_case, as shown above.
- All API docs are at http://localhost:8002/api/docs
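For example, listing users against a local backend would look like the following (a sketch that only illustrates the conventions above; the bearer-token header is an assumption, not something this README specifies):

```sh
# hypothetical request following the REST conventions above
curl -s http://localhost:8002/api/users/ \
  -H "Authorization: Bearer <access-token>"
```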
PEP8 has been used to standardize variable names, class names, module names, etc. Pylint is the linting tool used to analyze the code against the PEP8 style. Black is used as the code formatter.
- The recommended IDE is VS Code.
- Install the pylint extension in VS Code and enable it.
- Set the default format provider in VS Code to black.
- Update the VS Code settings.json as follows:

```json
{
    "editor.defaultFormatter": null,
    "python.linting.enabled": true,
    "python.formatting.provider": "black",
    "editor.formatOnSave": true
}
```
- Run "pre-commit install" after activating your virtual env
- Run "pre-commit run --all-files" to run the formatter
- In your virtual environment, run `celery -A ddpui worker -n ddpui`
- On Windows, run `celery -A ddpui worker -n ddpui -P solo`
- To start Celery Beat, run `celery -A ddpui beat`
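To confirm the worker is up and has registered the project's tasks, Celery's standard inspection command can be used (generic Celery CLI, not specific to this repo):

```sh
# lists the tasks registered with the running worker
celery -A ddpui inspect registered
```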
- `pyenv local 3.10`
- `pyenv exec python -m venv venv`
- `source venv/bin/activate`
- `pip install --upgrade pip`
- `pip install -r requirements.txt`
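As a quick sanity check that the virtual environment is active and the dependencies are installed (not part of the README's steps):

```sh
python --version     # should report Python 3.10.x
pip freeze | head    # should list packages from requirements.txt
```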
- Create `.env` from `.env.template`
- Create a SQL database and populate its credentials into `.env`
- You can use a postgresql Docker image for local development:
  `docker run --name postgres-db -e POSTGRES_PASSWORD=<password> -e POSTGRES_DB=<db name> -p 5432:5432 -d postgres`
- Add the following environment variables to `.env`:

```
DBNAME=<db name>
DBHOST=localhost
DBPORT=5432
DBUSER=postgres
DBPASSWORD=<password>
DBADMINUSER=postgres
DBADMINPASSWORD=<password>
```
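Before running migrations you can verify the credentials by connecting with psql (assumes a local Postgres client; the values mirror the variables above):

```sh
# prompts for <password>; a successful connection confirms host, port and user are correct
psql -h localhost -p 5432 -U postgres -d <db name>
```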
- Open a new terminal
- Download `run-ab-platform.sh` for Airbyte 0.58.0
- Run `./run-ab-platform.sh` to start Airbyte. This is a self-contained application which includes the configuration database.
- Populate the Airbyte connection credentials in the `.env` from Step 2:
```
AIRBYTE_SERVER_HOST=localhost
AIRBYTE_SERVER_PORT=8000
AIRBYTE_SERVER_APIVER=v1
AIRBYTE_API_TOKEN=<token> # base64 encoding of username:password. The default username and password are airbyte:password, so the token will be YWlyYnl0ZTpwYXNzd29yZA==
AIRBYTE_DESTINATION_TYPES="Postgres,BigQuery"
```
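The token is just the base64-encoded `username:password` pair, so it can be regenerated for non-default credentials:

```sh
# encodes the default credentials; substitute your own username:password
echo -n 'airbyte:password' | base64   # -> YWlyYnl0ZTpwYXNzd29yZA==
```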
- Start Prefect Proxy and populate the connection info in `.env`:

```
PREFECT_PROXY_API_URL=
```
- Set `DEV_SECRETS_DIR` in `.env` unless you want to use Amazon's Secrets Manager
- Open a new terminal
- Create a local `venv`, install `dbt` and put its location into `DBT_VENV` in `.env`:

```sh
pyenv local 3.10
pyenv exec python -m venv <env-name>
source <env-name>/bin/activate
python -m pip install \
  dbt-core \
  dbt-postgres \
  dbt-bigquery
```
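Once the install finishes, a quick check (not from the README) confirms the adapters are available in this venv before you record its location:

```sh
dbt --version    # should list dbt-core plus the postgres and bigquery adapters
```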
- Create empty directories for `CLIENTDBT_ROOT`:

```
CLIENTDBT_ROOT=
DBT_VENV=<env-name>/bin/activate
```
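A concrete example of filling these in, with hypothetical paths (use whichever locations you actually created):

```sh
# create the empty directory that client dbt projects will live under
mkdir -p /home/<you>/clientdbt

# then in .env:
# CLIENTDBT_ROOT=/home/<you>/clientdbt
# DBT_VENV=/home/<you>/<env-name>/bin/activate
```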
- The `SIGNUPCODE` in `.env` is for signing up using the frontend. If you are running the frontend, set its URL in `FRONTEND_URL`.

```
DJANGOSECRET=
```
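If `DJANGOSECRET` maps to Django's `SECRET_KEY` (an assumption based on its name), one way to generate a value is Django's own helper:

```sh
# prints a random secret key suitable for pasting into .env
python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
```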
- Create a `logs` folder in `ddpui`
- Create `whitelist.py` from `whitelist.template.py` in the `ddpui/assets` folder
- Run DB migrations: `python manage.py migrate`
- Seed the DB: `python manage.py loaddata seed/*.json`
- Create the system user: `python manage.py create-system-orguser`
- Start the server: `uvicorn ddpui.asgi:application --port <PORT_TO_LISTEN_ON>`
- Run `python manage.py createorganduser <Org Name> <Email address> --role super-admin`
- The above command creates a user with the super admin role. If no role is provided, the default role is account manager.
Follow the steps below:
- Install docker
- Install docker-compose
- Create `.env.docker` from `.env.template` inside the Docker folder
- Copy the file in `ddpui/assets/` to `Docker/mount`
- If using an M1-based MacBook, run this before building the image:
  `export DOCKER_DEFAULT_PLATFORM=linux/amd64`
- Build the main image:
  `docker build -f Docker/Dockerfile.main --build-arg BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ') -t dalgo_backend_main_image:0.1 .`
- Then build the backend image from `Dockerfile.dev.deploy`:
  `docker build -f Docker/Dockerfile.dev.deploy --build-arg BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ') -t dalgo_backend:0.1 .`
- Start the containers:
  `docker compose -p dalgo_backend -f Docker/docker-compose.yml --env-file Docker/.env.docker up -d`
- Stop the containers:
  `docker compose -p dalgo_backend -f Docker/docker-compose.yml --env-file Docker/.env.docker down`
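After `up -d`, standard docker compose subcommands (not specific to this repo) can confirm the containers are running and tail their logs:

```sh
docker compose -p dalgo_backend -f Docker/docker-compose.yml --env-file Docker/.env.docker ps
docker compose -p dalgo_backend -f Docker/docker-compose.yml --env-file Docker/.env.docker logs -f
```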