TypeError when trying to import Petals #368
Comments
Hi @JohnDevlopment, can you please try using …
Okay, I made the change and it worked. Thanks!
@borzunov I just had the same issue when trying the tutorial code:

```python
import torch
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "enoch/llama-65b-hf"
# You could also use "meta-llama/Llama-2-70b-hf", "meta-llama/Llama-2-70b-chat-hf", or
# "bigscience/bloom" - basically, any Hugging Face Hub repo with a supported model architecture

tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False, add_bos_token=False)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)
model = model.cuda()
```
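For context, the tutorial usually follows this snippet with a short generation step along these lines — a sketch, not part of the original comment, assuming `model` and `tokenizer` loaded successfully and a CUDA device is available; the prompt string is illustrative:

```python
# Sketch of the usual next step: run a short generation to confirm the
# distributed model works. Assumes `model` and `tokenizer` from the snippet
# above and an available CUDA device.
inputs = tokenizer("A cat sat on", return_tensors="pt")["input_ids"].cuda()
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```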
But when I try to install …
And if I can give my 2 cents on this issue: I'm not a Python dev, but I am a Rust & Node one, and a good practice there is to pin dependencies to exact versions to avoid this kind of issue when distributing a package. However, I don't know whether that's the norm in Python.
Hi @ivangabriele,

What you see is an unrelated issue in hivemind (not related to transformers) - note that the exception is different. It happens when the script finishes, but the code should still work. The relevant issue is #237 (I'm surprised that your bug is still not fixed).

The original issue was about an older Petals release (< 2.0.0); since then, we've moved on to require an even newer transformers>=4.31.0 (a version with Llama 2 support, released just a couple of days ago).

Re: pinning exact dependency versions - it may be okay for a Python web app (which runs in its own separate env), but it's not okay for a Python library like Petals. This is because people may build their own apps on top of the Petals client, and those apps may require an even newer version of transformers (e.g., one with a model released just a few days ago). In fact, we initially pinned the transformers version (since it's a library that often introduces breaking changes), but we got many complaints from users for exactly this reason (people needed newer models, sampling algorithms, fine-tuning methods, etc.). So now we just do our best to support the latest version within a few days/hours after it is released.
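To illustrate the difference between the two styles, here is a minimal setuptools sketch — the package name and version bounds are illustrative examples, not Petals' actual packaging metadata:

```python
# Illustrative setup.py - the package name and version bounds are examples,
# not Petals' actual packaging metadata.
from setuptools import setup

setup(
    name="my-petals-like-library",  # hypothetical package
    version="0.1.0",
    install_requires=[
        # Library style: a lower bound (optionally capped at the next major
        # version) lets apps built on top pull in newer transformers releases.
        "transformers>=4.31.0,<5.0.0",
        # App style (the exact pinning suggested above) is reproducible but
        # would block users who need a model added in a newer release:
        # "transformers==4.31.0",
    ],
)
```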
Should be fixed in learning-at-home/hivemind#579
Fix bigscience-workshop/petals#237, bigscience-workshop/petals#368 (comment). (cherry picked from commit 6f5c471)
Following the instructions from the getting started guide, I installed Petals like so:
```
pip3 install git+https://github.com/bigscience-workshop/petals
```
But when I tried to import something from petals, I got this error:

```
TypeError: unsupported operand type(s) for +: 'NoneType' and 'list'
```
If I do `import petals` by itself, I get the same error.
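When debugging import errors like this one, a quick way to confirm a dependency mismatch is to print the installed versions without importing petals itself — a minimal diagnostic sketch, not taken from the thread:

```python
# Print installed package versions via package metadata, avoiding the
# `import petals` call that triggers the TypeError in the first place.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("petals", "transformers", "hivemind", "torch"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```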