Model <optimum.neuron.modeling.NeuronModelForCausalLM object at 0x7fa4f8131d20> is not supported. Please provide a valid model either as string or NeuronModel. You can also provide non model then a default one will be used
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/mms/model_loader.py", line 145, in load
    model_service.initialize(service.context)
  File "/opt/conda/lib/python3.10/site-packages/sagemaker_huggingface_inference_toolkit/handler_service.py", line 77, in initialize
    self.model = self.load(self.model_dir)
  File "/opt/ml/model/code/inference.py", line 18, in model_fn
    pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
  File "/opt/conda/lib/python3.10/site-packages/optimum/neuron/pipelines/transformers/base.py", line 229, in pipeline
    model, model_id, tokenizer, feature_extractor = load_pipeline(
  File "/opt/conda/lib/python3.10/site-packages/optimum/neuron/pipelines/transformers/base.py", line 148, in load_pipeline
    raise ValueError(
ValueError: Model <optimum.neuron.modeling.NeuronModelForCausalLM object at 0x7fa4f8131d20> is not supported. Please provide a valid model either as string or NeuronModel.
You can also provide non model then a default one will be used
I cannot pass a model initiated from NeuronModelForCausalLM directly into the pipeline. See the error above.
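Since the error message says the pipeline accepts a model "either as string or NeuronModel", one possible workaround is to pass the model directory path as a string and let load_pipeline construct the Neuron model itself. The sketch below is a minimal hypothetical inference.py for the SageMaker Hugging Face toolkit; it assumes optimum-neuron's pipeline helper will load a compiled model from a local path string, and it is not a confirmed fix for this issue.

```python
# inference.py -- hedged workaround sketch, not the documented fix.
# Assumption: optimum.neuron's pipeline() accepts a model id / local path
# string (as the error message suggests), so we avoid pre-building the
# NeuronModelForCausalLM object that load_pipeline rejected.
from optimum.neuron.pipelines import pipeline


def model_fn(model_dir):
    # Passing the directory path as a string sidesteps the type check in
    # load_pipeline that raised the ValueError above.
    return pipeline("text-generation", model=model_dir)


def predict_fn(data, pipe):
    # Standard SageMaker toolkit entry point: run the pipeline on the
    # request payload, forwarding any generation parameters.
    return pipe(data["inputs"], **data.get("parameters", {}))
```

If the string path also fails, the mismatch may instead be a version skew between transformers' pipeline and optimum-neuron's, in which case importing pipeline from optimum.neuron rather than transformers (as shown here) is the relevant part of the sketch.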