Prepare release 0.55.2 #2401

Merged 2 commits on Feb 6, 2024
2 changes: 1 addition & 1 deletion README.md
@@ -91,7 +91,7 @@
<a href="https://www.zenml.io/company#team">Meet the Team</a>
<br />
<br />
- 🎉 Version 0.55.1 is out. Check out the release notes
+ 🎉 Version 0.55.2 is out. Check out the release notes
<a href="https://github.com/zenml-io/zenml/releases">here</a>.
<br />
<br />
37 changes: 37 additions & 0 deletions RELEASE_NOTES.md
@@ -1,5 +1,42 @@
<!-- markdown-link-check-disable -->

# 0.55.2

This patch comes with a variety of new features, bug fixes, and documentation updates.

Some of the most important changes include:

- The ability to add tags to outputs through the step context
- Allowing the secret stores to utilize the implicit authentication method of AWS/GCP/Azure Service Connectors
- [Lazy loading client methods](https://docs.zenml.io/v/docs/user-guide/advanced-guide/data-management/late-materialization) in a pipeline context
- Updates on the Vertex orchestrator to switch to the native VertexAI scheduler
- The new [HyperAI](https://hyperai.ai) integration featuring a new orchestrator and service connector
- Bumping the mlflow version to 2.10.0
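
The first bullet above, tagging outputs from inside a running step, can be sketched as follows. This is a hypothetical illustration only: `StepContext` here is a local stand-in, and `add_output_tags` is an assumed method name inferred from the release-note bullet, not a verified ZenML signature.

```python
# Hypothetical sketch of tagging step outputs via a step context.
# `StepContext` and `add_output_tags` are local stand-ins inspired by
# the release note, not verified ZenML APIs.
from collections import defaultdict


class StepContext:
    """Minimal stand-in for a pipeline step's runtime context."""

    def __init__(self) -> None:
        # Maps an output name to the list of tags attached to it.
        self.output_tags: "dict[str, list[str]]" = defaultdict(list)

    def add_output_tags(self, tags: "list[str]", output_name: str = "output") -> None:
        """Attach tags to one of the step's named outputs."""
        self.output_tags[output_name].extend(tags)


ctx = StepContext()
ctx.add_output_tags(["sklearn", "0.55.2"], output_name="dataset_trn")
print(ctx.output_tags["dataset_trn"])  # ['sklearn', '0.55.2']
```

In the real feature, the tags would end up on the artifact versions produced by the step rather than on a local dictionary; see the linked docs for the supported API.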

We'd like to give a special thanks to @christianversloot and @francoisserra for their contributions.

## What's Changed
* `0.55.1` in migration testing by @avishniakov in https://github.com/zenml-io/zenml/pull/2368
* Credential-less AWS/GCP/Azure Secrets Store support by @stefannica in https://github.com/zenml-io/zenml/pull/2365
* Small docs updates by @strickvl in https://github.com/zenml-io/zenml/pull/2359
* generic `Client()` getters lazy loading by @avishniakov in https://github.com/zenml-io/zenml/pull/2323
* Added slack settings OSSK-382 by @htahir1 in https://github.com/zenml-io/zenml/pull/2378
* Label triggered slow ci by @avishniakov in https://github.com/zenml-io/zenml/pull/2379
* Remove unused `is-slow-ci` input from fast and slow integration testing by @strickvl in https://github.com/zenml-io/zenml/pull/2382
* Add deprecation warning for `ExternalArtifact` non-value features by @avishniakov in https://github.com/zenml-io/zenml/pull/2375
* Add telemetry pipeline run ends by @htahir1 in https://github.com/zenml-io/zenml/pull/2377
* Updating the `update_model` decorator by @bcdurak in https://github.com/zenml-io/zenml/pull/2136
* Mocked API docs building by @avishniakov in https://github.com/zenml-io/zenml/pull/2360
* Add outputs tags function by @avishniakov in https://github.com/zenml-io/zenml/pull/2383
* Bump mlflow to v2.10.0 by @christianversloot in https://github.com/zenml-io/zenml/pull/2374
* Fix sharing of model versions by @schustmi in https://github.com/zenml-io/zenml/pull/2380
* Fix GCP service connector login to overwrite existing valid credentials by @stefannica in https://github.com/zenml-io/zenml/pull/2392
* Update `has_custom_name` for legacy artifacts by @avishniakov in https://github.com/zenml-io/zenml/pull/2384
* Use native VertexAI scheduler capability instead of old GCP official workaround by @francoisserra in https://github.com/zenml-io/zenml/pull/2310
* HyperAI integration: orchestrator and service connector by @christianversloot in https://github.com/zenml-io/zenml/pull/2372

**Full Changelog**: https://github.com/zenml-io/zenml/compare/0.55.1...0.55.2

# 0.55.1

**If you are actively using the Model Control Plane features, we suggest that you directly upgrade to 0.55.1, bypassing 0.55.0.**
10 changes: 5 additions & 5 deletions examples/quickstart/quickstart.ipynb
@@ -628,8 +628,8 @@
" dataset_trn, dataset_tst = feature_engineering()\n",
" else:\n",
" # Load the datasets from an older pipeline\n",
- "        dataset_trn = client.get_artifact_version(id=train_dataset_id)\n",
- "        dataset_tst = client.get_artifact_version(id=test_dataset_id) \n",
+ "        dataset_trn = client.get_artifact_version(name_id_or_prefix=train_dataset_id)\n",
+ "        dataset_tst = client.get_artifact_version(name_id_or_prefix=test_dataset_id) \n",
"\n",
" trained_model = model_trainer(\n",
" dataset_trn=dataset_trn,\n",
@@ -970,8 +970,8 @@
"@pipeline\n",
"def inference(preprocess_pipeline_id: UUID):\n",
" \"\"\"Model batch inference pipeline\"\"\"\n",
- "    # random_state = client.get_artifact_version(id=preprocess_pipeline_id).metadata[\"random_state\"].value\n",
- "    # target = client.get_artifact_version(id=preprocess_pipeline_id).run_metadata['target'].value\n",
+ "    # random_state = client.get_artifact_version(name_id_or_prefix=preprocess_pipeline_id).metadata[\"random_state\"].value\n",
+ "    # target = client.get_artifact_version(name_id_or_prefix=preprocess_pipeline_id).run_metadata['target'].value\n",
" random_state = 42\n",
" target = \"target\"\n",
"\n",
@@ -981,7 +981,7 @@
" df_inference = inference_preprocessor(\n",
" dataset_inf=df_inference,\n",
" # We use the preprocess pipeline from the feature engineering pipeline\n",
" preprocess_pipeline=client.get_artifact_version(id=preprocess_pipeline_id),\n",
" preprocess_pipeline=client.get_artifact_version(name_id_or_prefix=preprocess_pipeline_id),\n",
" target=target,\n",
" )\n",
" inference_predict(\n",
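
The quickstart edits above replace the `id` keyword with `name_id_or_prefix` when fetching artifact versions. A minimal sketch of the renamed call, using a local stub in place of `zenml.client.Client` (only the keyword name is taken from the diff; the stub's body is invented for illustration):

```python
from uuid import UUID, uuid4


class Client:
    """Local stub standing in for zenml.client.Client.

    Only the `name_id_or_prefix` keyword is faithful to the diff above;
    the return behavior is invented so the sketch runs standalone.
    """

    def get_artifact_version(self, name_id_or_prefix: "UUID | str"):
        # The real client resolves a name, UUID, or ID prefix to an
        # artifact version; the stub just echoes the lookup key.
        return name_id_or_prefix


client = Client()
train_dataset_id = uuid4()

# Post-0.55.2 quickstart style: the keyword is `name_id_or_prefix`, not `id`.
dataset_trn = client.get_artifact_version(name_id_or_prefix=train_dataset_id)
assert dataset_trn == train_dataset_id
```

The same pattern applies to the `preprocess_pipeline_id` lookups in the inference pipeline cell.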
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "zenml"
- version = "0.55.1"
+ version = "0.55.2"
packages = [{ include = "zenml", from = "src" }]
description = "ZenML: Write production-ready ML code."
authors = ["ZenML GmbH <info@zenml.io>"]
2 changes: 1 addition & 1 deletion src/zenml/VERSION
@@ -1 +1 @@
- 0.55.1
+ 0.55.2
2 changes: 1 addition & 1 deletion src/zenml/zen_server/deploy/helm/Chart.yaml
@@ -1,6 +1,6 @@
apiVersion: v2
name: zenml
- version: "0.55.1"
+ version: "0.55.2"
description: Open source MLOps framework for portable production ready ML pipelines
keywords:
- mlops
4 changes: 2 additions & 2 deletions src/zenml/zen_server/deploy/helm/README.md
@@ -20,8 +20,8 @@
To install the ZenML chart directly from Amazon ECR, use the following command:

```bash
- # example command for version 0.55.1
- helm install my-zenml oci://public.ecr.aws/zenml/zenml --version 0.55.1
+ # example command for version 0.55.2
+ helm install my-zenml oci://public.ecr.aws/zenml/zenml --version 0.55.2
```

Note: Ensure you have OCI support enabled in your Helm client and that you are authenticated with Amazon ECR.
24 changes: 24 additions & 0 deletions src/zenml/zen_stores/migrations/versions/0.55.2_release.py
@@ -0,0 +1,24 @@
"""Release [0.55.2].

Revision ID: 0.55.2
Revises: 0.55.1
Create Date: 2024-02-06 11:32:02.715174

"""


# revision identifiers, used by Alembic.
revision = "0.55.2"
down_revision = "0.55.1"
branch_labels = None
depends_on = None


def upgrade() -> None:
"""Upgrade database schema and/or data, creating a new revision."""
pass


def downgrade() -> None:
"""Downgrade database schema and/or data back to the previous revision."""
pass