MLflow On-Premise Deployment using Docker Compose

Easily deploy an MLflow tracking server with 1 command.

The main difference between this fork and the original is that the MLflow tracking server proxies requests to the artifact store; those requests were improperly handled in the original code base.

Additionally, I have replaced MinIO with S3 for convenience. The proxying improvement can still be layered on top of MinIO if needed. A sketch of what a proxied server invocation looks like is shown below.
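
With proxied artifact access, clients only talk to the tracking server, and the server reads and writes artifacts in S3 on their behalf. A minimal sketch of such a server invocation, assuming a MySQL backend store and an S3 bucket named mlflow (the real values live in docker-compose.yml and may differ):

    mlflow server \
      --backend-store-uri mysql+pymysql://user:password@db:3306/mlflow \
      --artifacts-destination s3://mlflow \
      --serve-artifacts \
      --host 0.0.0.0 \
      --port 5000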

How to run

  1. Clone (download) this repository

    git clone https://github.com/sachua/mlflow-docker-compose.git
  2. cd into the mlflow-docker-compose directory

  3. Build and run the containers with docker-compose

    docker-compose up -d --build
  4. Access the MLflow UI at http://localhost:5000

  5. Access the MinIO UI at http://localhost:9000
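
Once the stack is up, a quick sanity check is to confirm that both endpoints respond (plain curl, nothing repo-specific assumed):

    # MLflow UI should return HTTP 200
    curl -s -o /dev/null -w "MLflow: %{http_code}\n" http://localhost:5000
    # MinIO may answer with a redirect or 403 at the root path; any response means the container is up
    curl -s -o /dev/null -w "MinIO:  %{http_code}\n" http://localhost:9000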

Containerization

The MLflow tracking server deployment is composed of the following Docker containers:

  • MLflow server
  • MinIO object storage server
  • MySQL database server
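
You can confirm that all services started by listing the containers; the service names come from docker-compose.yml, so the name used in the log command below is only illustrative:

    # Show the state of every service defined in docker-compose.yml
    docker-compose ps
    # Tail the tracking server logs (replace "mlflow" with the actual service name)
    docker-compose logs -f mlflow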

Example

  1. Install conda

  2. Install MLflow with extra dependencies, including scikit-learn

    pip install mlflow[extras]
  3. Set environment variables

    export MLFLOW_TRACKING_URI=http://localhost:5000
    export MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
  4. Set MinIO credentials

    mkdir -p ~/.aws  # make sure the directory exists before writing the file
    cat <<EOF > ~/.aws/credentials
    [default]
    aws_access_key_id=minio
    aws_secret_access_key=minio123
    EOF
  5. Train a sample MLflow model

    mlflow run https://github.com/mlflow/mlflow-example.git -P alpha=0.42
    • Note: To fix ModuleNotFoundError: No module named 'boto3'

      # Switch to the conda env created by mlflow run
      conda env list
      conda activate mlflow-3eee9bd7a0713cf80a17bc0a4d659bc9c549efac  # replace with your own generated mlflow environment
      pip install boto3
  6. Serve the model (replace with your model's actual path)

    mlflow models serve -m s3://mlflow/0/98bdf6ec158145908af39f86156c347f/artifacts/model -p 1234
  7. You can query the served model for a prediction with this command

    curl -X POST -H "Content-Type:application/json; format=pandas-split" --data '{"columns":["alcohol", "chlorides", "citric acid", "density", "fixed acidity", "free sulfur dioxide", "pH", "residual sugar", "sulphates", "total sulfur dioxide", "volatile acidity"],"data":[[12.8, 0.029, 0.48, 0.98, 6.2, 29, 3.33, 1.2, 0.39, 75, 0.66]]}' http://127.0.0.1:1234/invocations
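
The format=pandas-split content type shown above is the legacy scoring payload. If your client and the serving container run MLflow 2.x, the /invocations endpoint expects the dataframe_split key instead; a hedged equivalent of the same request:

    curl -X POST -H "Content-Type: application/json" \
      --data '{"dataframe_split": {"columns": ["alcohol", "chlorides", "citric acid", "density", "fixed acidity", "free sulfur dioxide", "pH", "residual sugar", "sulphates", "total sulfur dioxide", "volatile acidity"], "data": [[12.8, 0.029, 0.48, 0.98, 6.2, 29, 3.33, 1.2, 0.39, 75, 0.66]]}' \
      http://127.0.0.1:1234/invocations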
