diff --git a/manuscript/09.2-Deployment-Usage_Pipeline-Workflow.md b/manuscript/09.2-Deployment-Usage_Pipeline-Workflow.md
index abe08fe..7093ee0 100644
--- a/manuscript/09.2-Deployment-Usage_Pipeline-Workflow.md
+++ b/manuscript/09.2-Deployment-Usage_Pipeline-Workflow.md
@@ -16,7 +16,7 @@ To execute the pipeline steps, the Airflow Docker Operator is employed, which en
 
 Once the model is in the serving phase, a Streamlit app is deployed for applying inference on new data.
 
-![DAG pipeline](images/09-Deployment-Usage/DAG-pipeline.png)
+![](images/09-Deployment-Usage/use-case-pipeline-graph.png)
 
 The code below defines the `ml_pipeline_dag` function as an Airflow DAG using the `dag` decorator. Each step of the pipeline, including data preprocessing, model training, model comparison, and serving the best model, is represented as a separate task with the `@task` decorator. Dependencies between these tasks are established by passing the output of one task as an argument to the next task. The `ml_pipeline` object serves as a representation of the entire DAG.
 
diff --git a/manuscript/images/09-Deployment-Usage/use-case-pipeline-graph.png b/manuscript/images/09-Deployment-Usage/use-case-pipeline-graph.png
new file mode 100644
index 0000000..b28bdea
Binary files /dev/null and b/manuscript/images/09-Deployment-Usage/use-case-pipeline-graph.png differ
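
For reference, here is a minimal sketch of the kind of DAG the hunk's context paragraph describes, written with the Airflow 2.4+ TaskFlow API (`@dag`/`@task`). The task names, return values, and scheduling arguments are illustrative assumptions, not the manuscript's actual code, and the real pipeline executes each step in its own container via the Docker Operator rather than as plain Python tasks.

```python
# Illustrative sketch only: the manuscript's pipeline runs each step through
# the Airflow Docker Operator; plain Python tasks are used here to keep the
# example self-contained.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2023, 1, 1), catchup=False)
def ml_pipeline_dag():
    @task
    def preprocess_data():
        # Clean and split the raw data; return a reference to the result.
        return "data/processed"  # hypothetical path

    @task
    def train_models(data_path):
        # Train the candidate models on the preprocessed data.
        return ["model_a", "model_b"]  # hypothetical model identifiers

    @task
    def compare_models(model_ids):
        # Evaluate the candidates and return the best one.
        return model_ids[0]

    @task
    def serve_best_model(best_model):
        # Promote the winning model to the serving stage.
        print(f"Serving {best_model}")

    # Passing each task's output to the next task defines the dependencies.
    serve_best_model(compare_models(train_models(preprocess_data())))


# The DAG object Airflow discovers, analogous to `ml_pipeline` in the text.
ml_pipeline = ml_pipeline_dag()
```

Chaining the task calls this way is what produces the linear graph shown in the new `use-case-pipeline-graph.png` image: Airflow infers one edge per passed return value, so no explicit `>>` dependency operators are needed.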