This folder contains the following examples for Code Llama 7B models:

File | Description | Model Used | GPU Minimum Requirement |
---|---|---|---|
01_load_inference | Environment setup and suggested configurations for running inference with Code Llama 7B models on Databricks (a minimal loading sketch follows the table). | CodeLlama-7b-hf CodeLlama-7b-Instruct-hf CodeLlama-7b-Python-hf | 1xA10-24GB |
02_mlflow_logging_inference | Save, register, and load Code Llama 7B models with MLflow, and create a Databricks model serving endpoint (a minimal logging sketch follows the table). | CodeLlama-7b-hf | 1xA10-24GB |
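
For orientation, the following is a minimal sketch of the kind of loading and inference flow covered by `01_load_inference`, assuming the Hugging Face repo `codellama/CodeLlama-7b-hf` and a single A10 GPU; the prompt and generation parameters are illustrative and not taken from the notebook.

```python
# Minimal sketch: load CodeLlama-7b-hf and run code-completion inference on one GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-hf"  # Hugging Face repo for the base model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision fits within 24 GB of A10 memory
    device_map="auto",           # place the model on the available GPU
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```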
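
Similarly, a rough sketch of the MLflow logging step in `02_mlflow_logging_inference` is shown below, assuming MLflow's `transformers` flavor; the registered model name and input example are placeholders, and serving-endpoint creation is left to the notebook.

```python
# Minimal sketch: log and register CodeLlama-7b-hf with MLflow's transformers flavor.
import mlflow
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "codellama/CodeLlama-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
text_gen = pipeline("text-generation", model=model, tokenizer=tokenizer)

with mlflow.start_run():
    mlflow.transformers.log_model(
        transformers_model=text_gen,
        artifact_path="model",
        input_example="def fibonacci(n):",
        registered_model_name="codellama-7b-hf",  # placeholder registry name
    )

# The registered model can later be loaded back for inference, e.g.:
# loaded = mlflow.transformers.load_model("models:/codellama-7b-hf/1")
```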