A KedroPipelineModel cannot be loaded from mlflow if its catalog contains non deepcopy-able DataSets #122
Comments
Does removing the faulty line and using the initial_catalog directly make the model loadable again? If yes, we have two options:
After some investigation, the issue comes from the MLflowAbstractModelDataSet. Note that this problem occurs only when the DataSet itself is not deepcopy-able (not the underlying value the DataSet can load()), so we can quite safely assume it should not occur often. If it does, we should consider a more radical solution among the ones you suggest.
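The failure mode described above can be reproduced outside kedro: `copy.deepcopy` falls back to pickling for objects it cannot copy member-wise, so any member that cannot be pickled (a thread lock, for instance, as found inside many ML objects such as a keras tokenizer) raises a `TypeError`. A minimal sketch, with a hypothetical `TokenizerLikeDataSet` standing in for the real dataset:

```python
import copy
import threading

class TokenizerLikeDataSet:
    """Stand-in for a DataSet holding a non-deepcopy-able member
    (hypothetical class, for illustration only)."""

    def __init__(self):
        self._lock = threading.Lock()  # thread locks cannot be pickled

def try_deepcopy(obj):
    """Return the error message raised by deepcopy, or None on success."""
    try:
        copy.deepcopy(obj)
        return None
    except TypeError as exc:
        return str(exc)

error = try_deepcopy(TokenizerLikeDataSet())
print(error)  # e.g. "cannot pickle '_thread.lock' object"
```

Deep-copying a whole catalog fails as soon as one entry behaves like this, which matches the "cannot pickle" error reported in the issue.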
Description
I tried to load a KedroPipelineModel from mlflow and got a "cannot pickle context artifacts" error, which is due to the line shown under "Potential solution" below.
Context
I cannot load a previously saved KedroPipelineModel generated by pipeline_ml_factory.
Steps to Reproduce
Save a KedroPipelineModel with a dataset that contains an object which cannot be deepcopied (in my case, a keras tokenizer).
Expected Result
The model should load successfully.
Actual Result
An error is raised
Your Environment
Include as many relevant details about the environment in which you experienced the bug:
- kedro and kedro-mlflow versions used: 0.16.5 and 0.4.0
- Python version used (python -V): 3.6.8
- Does the bug also happen with the latest version on develop? Yes
Potential solution
The faulty line is:
kedro-mlflow/kedro_mlflow/mlflow/kedro_pipeline_model.py, line 45 (commit 63dcd50)