Prepare release 0.56.0 (#2546)
* Prepare release 0.56.0

* Auto-update of LLM Finetuning template

* Apply suggestions from code review

Co-authored-by: Alex Strick van Linschoten <strickvl@users.noreply.github.com>

* Refactor model deployment process and enable continuous deployment

* Apply suggestions from code review

Co-authored-by: Alex Strick van Linschoten <strickvl@users.noreply.github.com>

* Update model deployment process and service functions

* Auto-update of LLM Finetuning template

* Fix model deployment issue

* Add admin user notion, protection for certain operations, and rate limiting for login API

* Add support for LLM template in ZenML

---------

Co-authored-by: GitHub Actions <actions@github.com>
Co-authored-by: Alex Strick van Linschoten <strickvl@users.noreply.github.com>
3 people authored Mar 20, 2024
1 parent 56f5ba8 commit 90c5bda
Showing 8 changed files with 201 additions and 6 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -92,7 +92,7 @@
<a href="https://github.com/zenml-io/zenml-projects">Projects Showcase</a>
<br />
<br />
🎉 Version 0.55.5 is out. Check out the release notes
🎉 Version 0.56.0 is out. Check out the release notes
<a href="https://github.com/zenml-io/zenml/releases">here</a>.
<br />
<br />
172 changes: 172 additions & 0 deletions RELEASE_NOTES.md
@@ -1,4 +1,176 @@
<!-- markdown-link-check-disable -->
# 0.56.0

ZenML 0.56.0 introduces a wide array of new features, enhancements, and bug fixes,
with a strong emphasis on elevating the user experience and streamlining machine
learning workflows. Most notably, you can now deploy models using Hugging Face inference endpoints, thanks to an open-source community contribution of this model deployer stack component!

This release also comes with a breaking change to the services
architecture.

## Breaking Change

A significant change in this release is the migration of `Service` (ZenML's technical term for a deployment)
registration and deployment from local or remote environments to the ZenML server.
This change will be reflected in an upcoming dashboard tab that will let users explore
deployed models along with their live status and metadata. This architectural shift also
simplifies the model deployer abstraction and streamlines the model deployment process
by moving from limited built-in steps to a more documented and flexible approach.

Important note: If you have models that you previously deployed with ZenML, you might
want to redeploy them to have them stored in the ZenML server and tracked by ZenML,
ensuring they appear in the dashboard.

Additionally, the `find_model_server` method now retrieves models (services) from the
ZenML server instead of local or remote deployment environments. As a result, any
usage of `find_model_server` will only return newly deployed models stored in the server.
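
For example, you can list the services now tracked by the server by querying the active
model deployer. This is a minimal sketch: the filter arguments shown (`pipeline_name`,
`running`) follow the model deployer interface, and the pipeline name is a hypothetical
placeholder.

```python
from zenml.client import Client

# The model deployer of the active stack
model_deployer = Client().active_stack.model_deployer

# `find_model_server` now queries the ZenML server for registered services
services = model_deployer.find_model_server(
    pipeline_name="my_deployment_pipeline",  # hypothetical pipeline name
    running=True,
)
for service in services:
    print(service.status)
```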

It is also no longer recommended to call service functions like `service.start()`.
Instead, use `model_deployer.start_model_server(service_id)`, which allows ZenML
to keep the status of the service up to date in the server.

### Starting a service
**Old syntax:**
```python
from zenml import step
from zenml.integrations.bentoml.services.bentoml_deployment import BentoMLDeploymentService

@step
def predictor(
    service: BentoMLDeploymentService,
) -> None:
    # starting the service
    service.start(timeout=10)
```

**New syntax:**
```python
from zenml import step
from zenml.integrations.bentoml.model_deployers import BentoMLModelDeployer
from zenml.integrations.bentoml.services.bentoml_deployment import BentoMLDeploymentService

@step
def predictor(
    service: BentoMLDeploymentService,
) -> None:
    # starting the service through the model deployer so the server can track its status
    model_deployer = BentoMLModelDeployer.get_active_model_deployer()
    model_deployer.start_model_server(service_id=service.service_id, timeout=10)
```

### Enabling continuous deployment

The `deploy_model` method no longer relies on a parameter that simply replaced the
existing service whenever the pipeline name and step name matched exactly (without
taking other parameters or configurations into account). Instead, a new parameter,
`continuous_deployment_mode`, allows you to enable continuous deployment for
the service. This ensures that the service is updated to the latest version
if it belongs to the same pipeline and step and is not already running; otherwise,
any new deployment with a different configuration will create a new service.

```python
from typing import Optional

from zenml import step, get_step_context
from zenml.client import Client
from zenml.integrations.mlflow.services.mlflow_deployment import (
    MLFlowDeploymentConfig,
    MLFlowDeploymentService,
)
from zenml.logger import get_logger

logger = get_logger(__name__)

@step
def deploy_model() -> Optional[MLFlowDeploymentService]:
    # Deploy a model using the MLflow Model Deployer
    zenml_client = Client()
    model_deployer = zenml_client.active_stack.model_deployer
    mlflow_deployment_config = MLFlowDeploymentConfig(
        name="mlflow-model-deployment-example",
        description="An example of deploying a model using the MLflow Model Deployer",
        pipeline_name=get_step_context().pipeline_name,
        pipeline_step_name=get_step_context().step_name,
        # either a run URI ("runs:/<run_id>/model") or a model registry URI
        # ("models:/<model_name>/<model_version>")
        model_uri="runs:/<run_id>/model",
        model_name="model",
        workers=1,
        mlserver=False,
        timeout=300,  # seconds to wait for the deployment server to start/stop
    )
    service = model_deployer.deploy_model(
        mlflow_deployment_config, continuous_deployment_mode=True
    )
    logger.info(
        f"The deployed service info: {model_deployer.get_model_server_info(service)}"
    )
    return service
```


## Major Features and Enhancements:

* A new `Huggingface Model Deployer` has been introduced, allowing you to seamlessly
deploy your Huggingface models using ZenML; see the sketch after this list. (Thank you so much @dudeperf3ct for the contribution!)
* Faster Integration and Dependency Management: ZenML now leverages the `uv` library,
significantly improving the speed of integration installations and dependency management,
resulting in a more streamlined and efficient workflow.
* Enhanced Logging and Status Tracking: Logging has been improved, providing better
visibility into the state of your ZenML services.
* Improved Artifact Store Isolation: ZenML now prevents unsafe operations that access
data outside the scope of the artifact store, ensuring better isolation and security.
* Added an admin user notion for user accounts and restricted certain operations
performed via the REST interface to admin users only.
* Added rate limiting to the login API to prevent abuse and protect the server from potential
security threats.
* The LLM template is now supported in ZenML, so you can use it as a starting point
for your own LLM pipelines.
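
Below is a minimal sketch of what deploying a model to Hugging Face inference endpoints
could look like with the new model deployer, assuming a stack that uses it. The import path,
the `HuggingFaceServiceConfig` class, and the field names are assumptions modeled on the
Hugging Face Inference Endpoints API; check the Huggingface Model Deployer documentation
for the exact interface.

```python
from zenml.client import Client

# Assumed import path for the Huggingface deployment config
from zenml.integrations.huggingface.services import HuggingFaceServiceConfig

# Requires a stack whose model deployer is the Huggingface Model Deployer
model_deployer = Client().active_stack.model_deployer

# Field names below mirror Hugging Face Inference Endpoints and are assumptions
hf_config = HuggingFaceServiceConfig(
    endpoint_name="my-endpoint",
    repository="my-org/my-model",
    framework="pytorch",
    task="text-classification",
    accelerator="cpu",
    instance_size="medium",
    instance_type="c6i",
    region="us-east-1",
    vendor="aws",
)
service = model_deployer.deploy_model(hf_config)
print(model_deployer.get_model_server_info(service))
```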


## 🥳 Community Contributions 🥳

We'd like to give special thanks to @dudeperf3ct, who contributed to this release
by introducing the Huggingface Model Deployer. We'd also like to thank @moesio-f
for contributing a new attribute to the `Kaniko` image builder.
Additionally, we'd like to thank @christianversloot for his contributions to this release.


## All changes:

* Upgrading SQLModel to the latest version by @bcdurak in https://github.com/zenml-io/zenml/pull/2452
* Remove KServe integration by @safoinme in https://github.com/zenml-io/zenml/pull/2495
* Upgrade migration testing with 0.55.5 by @avishniakov in https://github.com/zenml-io/zenml/pull/2501
* Relax azure, gcfs and s3 dependencies by @strickvl in https://github.com/zenml-io/zenml/pull/2498
* Use HTTP forwarded headers to detect the real origin of client devices by @stefannica in https://github.com/zenml-io/zenml/pull/2499
* Update README.md for quickstart colab link by @strickvl in https://github.com/zenml-io/zenml/pull/2505
* Add sequential migration tests for MariaDB and MySQL by @strickvl in https://github.com/zenml-io/zenml/pull/2502
* Huggingface Model Deployer by @dudeperf3ct in https://github.com/zenml-io/zenml/pull/2376
* Use `uv` to speed up pip installs & the CI in general by @strickvl in https://github.com/zenml-io/zenml/pull/2442
* Handle corrupted or empty global configuration file by @stefannica in https://github.com/zenml-io/zenml/pull/2508
* Add admin users notion by @avishniakov in https://github.com/zenml-io/zenml/pull/2494
* Remove dashboard from gitignore by @safoinme in https://github.com/zenml-io/zenml/pull/2517
* Colima / Homebrew fix by @strickvl in https://github.com/zenml-io/zenml/pull/2512
* [HELM] Remove extra environment variable assignment by @wjayesh in https://github.com/zenml-io/zenml/pull/2518
* Allow installing packages using UV by @schustmi in https://github.com/zenml-io/zenml/pull/2510
* Additional fields for track events by @bcdurak in https://github.com/zenml-io/zenml/pull/2507
* Check if environment key is set before deleting in HyperAI orchestrator by @christianversloot in https://github.com/zenml-io/zenml/pull/2511
* Fix the pagination in the database backup by @stefannica in https://github.com/zenml-io/zenml/pull/2522
* Bump mlflow to version 2.11.1 by @christianversloot in https://github.com/zenml-io/zenml/pull/2524
* Add docs for uv installation by @schustmi in https://github.com/zenml-io/zenml/pull/2527
* Fix bug in HyperAI orchestrator depends_on parallelism by @christianversloot in https://github.com/zenml-io/zenml/pull/2523
* Upgrade pip in docker images by @schustmi in https://github.com/zenml-io/zenml/pull/2528
* Fix node selector and other fields for DB job in helm chart by @stefannica in https://github.com/zenml-io/zenml/pull/2531
* Revert "Upgrading SQLModel to the latest version" by @bcdurak in https://github.com/zenml-io/zenml/pull/2515
* Add `pod_running_timeout` attribute to `Kaniko` image builder by @moesio-f in https://github.com/zenml-io/zenml/pull/2509
* Add test to install dashboard script by @strickvl in https://github.com/zenml-io/zenml/pull/2521
* Sort pipeline namespaces by last run by @schustmi in https://github.com/zenml-io/zenml/pull/2514
* Add support for LLM template by @schustmi in https://github.com/zenml-io/zenml/pull/2519
* Rate limiting for login API by @avishniakov in https://github.com/zenml-io/zenml/pull/2484
* Try/catch for Docker client by @christianversloot in https://github.com/zenml-io/zenml/pull/2513
* Fix config file in starter guide by @schustmi in https://github.com/zenml-io/zenml/pull/2534
* Log URL for pipelines and model versions when running a pipeline by @wjayesh in https://github.com/zenml-io/zenml/pull/2506
* Add security exclude by @schustmi in https://github.com/zenml-io/zenml/pull/2541
* Update error message around notebook use by @strickvl in https://github.com/zenml-io/zenml/pull/2536
* Cap `fsspec` for Huggingface integration by @avishniakov in https://github.com/zenml-io/zenml/pull/2542
* Fix integration materializers' URLs in docs by @strickvl in https://github.com/zenml-io/zenml/pull/2538
* Bug fix HyperAI orchestrator: Offload scheduled pipeline execution to bash script by @christianversloot in https://github.com/zenml-io/zenml/pull/2535
* Update `pip check` command to use `uv` by @strickvl in https://github.com/zenml-io/zenml/pull/2520
* Implemented bitbucket webhook event source by @AlexejPenner in https://github.com/zenml-io/zenml/pull/2481
* Add ZenMLServiceType and update service registration by @safoinme in https://github.com/zenml-io/zenml/pull/2471

## New Contributors
* @dudeperf3ct made their first contribution in https://github.com/zenml-io/zenml/pull/2376
* @moesio-f made their first contribution in https://github.com/zenml-io/zenml/pull/2509

**Full Changelog**: https://github.com/zenml-io/zenml/compare/0.55.5...0.56.0

# 0.55.5

This patch contains a number of bug fixes and security improvements.
Binary file modified examples/llm_finetuning/.assets/model.png
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "zenml"
version = "0.55.5"
version = "0.56.0"
packages = [{ include = "zenml", from = "src" }]
description = "ZenML: Write production-ready ML code."
authors = ["ZenML GmbH <info@zenml.io>"]
2 changes: 1 addition & 1 deletion src/zenml/VERSION
@@ -1 +1 @@
0.55.5
0.56.0
2 changes: 1 addition & 1 deletion src/zenml/zen_server/deploy/helm/Chart.yaml
@@ -1,6 +1,6 @@
apiVersion: v2
name: zenml
version: "0.55.5"
version: "0.56.0"
description: Open source MLOps framework for portable production ready ML pipelines
keywords:
- mlops
4 changes: 2 additions & 2 deletions src/zenml/zen_server/deploy/helm/README.md
@@ -20,8 +20,8 @@ ZenML is an open-source MLOps framework designed to help you create robust, main
To install the ZenML chart directly from Amazon ECR, use the following command:

```bash
# example command for version 0.55.5
helm install my-zenml oci://public.ecr.aws/zenml/zenml --version 0.55.5
# example command for version 0.56.0
helm install my-zenml oci://public.ecr.aws/zenml/zenml --version 0.56.0
```

Note: Ensure you have OCI support enabled in your Helm client and that you are authenticated with Amazon ECR.
23 changes: 23 additions & 0 deletions src/zenml/zen_stores/migrations/versions/0.56.0_release.py
@@ -0,0 +1,23 @@
"""Release [0.56.0].
Revision ID: 0.56.0
Revises: 1a9a9d2a836d
Create Date: 2024-03-20 13:30:40.013587
"""

# revision identifiers, used by Alembic.
revision = "0.56.0"
down_revision = "1a9a9d2a836d"
branch_labels = None
depends_on = None


def upgrade() -> None:
    """Upgrade database schema and/or data, creating a new revision."""
    pass


def downgrade() -> None:
    """Downgrade database schema and/or data back to the previous revision."""
    pass
