
TypeError when trying to import Petals #368

Closed
JohnDevlopment opened this issue Jul 18, 2023 · 6 comments

Comments

@JohnDevlopment

Following the instructions from the getting started guide, I installed Petals like so:
pip3 install git+https://github.com/bigscience-workshop/petals

But when I tried to import something from petals:

from petals import AutoDistributedModelForCausalLM

I get this error:
TypeError: unsupported operand type(s) for +: 'NoneType' and 'list'

If I import petals by itself, I get the same error.

@borzunov
Collaborator

Hi @JohnDevlopment,

Can you please try running pip install transformers==4.30.2? We have a small compatibility issue with transformers 4.31.0, which was released just an hour ago; we're going to fix it soon!
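A quick, stdlib-only way to confirm which transformers version your environment actually resolves before downgrading (a minimal sketch; the helper name `vtuple` is illustrative, not part of any library):

```python
# Minimal sketch (stdlib only): check whether the installed transformers
# falls in the range that triggers the petals 1.x incompatibility.
import importlib.metadata


def vtuple(v: str):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())


try:
    installed = importlib.metadata.version("transformers")
except importlib.metadata.PackageNotFoundError:
    installed = None  # transformers is not installed in this environment

if installed is not None and vtuple(installed) >= vtuple("4.31.0"):
    print(f"transformers {installed} is >= 4.31.0; downgrade to 4.30.2 for petals 1.x")
```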

@JohnDevlopment
Author

Okay, I made the change and it worked. Thanks!

@ivangabriele

ivangabriele commented Jul 22, 2023

@borzunov I just had the same issue when trying the tutorial code:

import torch
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "enoch/llama-65b-hf"
# You could also use "meta-llama/Llama-2-70b-hf", "meta-llama/Llama-2-70b-chat-hf", or
# "bigscience/bloom" - basically, any Hugging Face Hub repo with a supported model architecture

tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False, add_bos_token=False)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)
model = model.cuda()
➜  Binaries python petals-client.py         
You are using the legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This means that tokens that come after special tokens will not be properly handled. We recommend you to read the related pull request available at https://github.com/huggingface/transformers/pull/24565
Jul 22 17:12:14.097 [INFO] Make sure you follow the LLaMA's terms of use: https://bit.ly/llama2-license for LLaMA 2, https://bit.ly/llama-license for LLaMA 1
Jul 22 17:12:14.097 [INFO] Using DHT prefix: llama-65b-hf
Exception ignored in: <function P2P.__del__ at 0x7f3ae93503a0>
Traceback (most recent call last):
  File "/home/ivan/.local/lib/python3.10/site-packages/hivemind/p2p/p2p_daemon.py", line 636, in __del__
  File "/home/ivan/.local/lib/python3.10/site-packages/hivemind/p2p/p2p_daemon.py", line 664, in _terminate
  File "/home/ivan/.local/lib/python3.10/site-packages/multiaddr/multiaddr.py", line 254, in value_for_protocol
TypeError: 'NoneType' object is not callable

But when I try to install transformers v4.30.2, it tells me that petals v2.0.0.post3 requires transformers >=v4.31.0:

➜  Binaries pip install transformers==4.30.2
Defaulting to user installation because normal site-packages is not writeable
Collecting transformers==4.30.2
  Obtaining dependency information for transformers==4.30.2 from https://files.pythonhosted.org/packages/5b/0b/e45d26ccd28568013523e04f325432ea88a442b4e3020b757cf4361f0120/transformers-4.30.2-py3-none-any.whl.metadata
  Using cached transformers-4.30.2-py3-none-any.whl.metadata (113 kB)
Requirement already satisfied: filelock in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (3.12.0)
Requirement already satisfied: huggingface-hub<1.0,>=0.14.1 in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (0.14.1)
Requirement already satisfied: numpy>=1.17 in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (1.24.3)
Requirement already satisfied: packaging>=20.0 in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (23.1)
Requirement already satisfied: pyyaml>=5.1 in /usr/lib/python3/dist-packages (from transformers==4.30.2) (5.4.1)
Requirement already satisfied: regex!=2019.12.17 in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (2023.5.5)
Requirement already satisfied: requests in /usr/lib/python3/dist-packages (from transformers==4.30.2) (2.25.1)
Requirement already satisfied: tokenizers!=0.11.3,<0.14,>=0.11.1 in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (0.13.3)
Requirement already satisfied: safetensors>=0.3.1 in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (0.3.1)
Requirement already satisfied: tqdm>=4.27 in /home/ivan/.local/lib/python3.10/site-packages (from transformers==4.30.2) (4.65.0)
Requirement already satisfied: fsspec in /home/ivan/.local/lib/python3.10/site-packages (from huggingface-hub<1.0,>=0.14.1->transformers==4.30.2) (2023.5.0)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /home/ivan/.local/lib/python3.10/site-packages (from huggingface-hub<1.0,>=0.14.1->transformers==4.30.2) (4.5.0)
Using cached transformers-4.30.2-py3-none-any.whl (7.2 MB)
DEPRECATION: distro-info 1.1build1 has a non-standard version number. pip 23.3 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of distro-info or contact the author to suggest that they release a version with a conforming version number. Discussion can be found at https://github.com/pypa/pip/issues/12063
DEPRECATION: gpg 1.16.0-unknown has a non-standard version number. pip 23.3 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of gpg or contact the author to suggest that they release a version with a conforming version number. Discussion can be found at https://github.com/pypa/pip/issues/12063
Installing collected packages: transformers
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
petals 2.0.0.post3 requires transformers<5.0.0,>=4.31.0, but you have transformers 4.30.2 which is incompatible.
Successfully installed transformers-4.30.2
➜  Binaries python petals-client.py         
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /home/ivan/Binaries/petals-client.py:3 in <module>                                               │
│                                                                                                  │
│    1 import torch                                                                                │
│    2 from transformers import AutoTokenizer                                                      │
│ ❱  3 from petals import AutoDistributedModelForCausalLM                                          │
│    4                                                                                             │
│    5 model_name = "enoch/llama-65b-hf"                                                           │
│    6 # You could also use "meta-llama/Llama-2-70b-hf", "meta-llama/Llama-2-70b-chat-hf", or      │
│                                                                                                  │
│ /home/ivan/.local/lib/python3.10/site-packages/petals/__init__.py:18 in <module>                 │
│                                                                                                  │
│   15                                                                                             │
│   16                                                                                             │
│   17 if not os.getenv("PETALS_IGNORE_DEPENDENCY_VERSION"):                                       │
│ ❱ 18 │   assert (                                                                                │
│   19 │   │   version.parse("4.31.0") <= version.parse(transformers.__version__) < version.par    │
│   20 │   ), "Please install a proper transformers version: pip install transformers>=4.31.0,<    │
│   21                                                                                             │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
AssertionError: Please install a proper transformers version: pip install transformers>=4.31.0,<5.0.0
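For what it's worth, the traceback itself shows an escape hatch: the assert in `petals/__init__.py` only runs when `PETALS_IGNORE_DEPENDENCY_VERSION` is unset. A minimal sketch of using it, assuming you accept the risk of running Petals against an untested transformers version:

```python
import os

# The version assert in petals/__init__.py only runs when this env var
# is unset, so it must be set *before* `import petals`.
os.environ["PETALS_IGNORE_DEPENDENCY_VERSION"] = "1"

# from petals import AutoDistributedModelForCausalLM  # version check now skipped
print(os.environ["PETALS_IGNORE_DEPENDENCY_VERSION"])  # -> 1
```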

@ivangabriele

ivangabriele commented Jul 22, 2023

And if I can give my 2 cents on this issue: I'm not a Python developer, but as a Rust and Node one, I consider it good practice to pin dependencies to exact versions to avoid this kind of issue when distributing a package. I don't know whether that's the norm in Python, though.

@borzunov
Collaborator

borzunov commented Jul 22, 2023

Hi @ivangabriele,

What you see is an unrelated issue in hivemind (not related to transformers); note that the exception is different. It happens when the script finishes, but the code should still work. The relevant issue is #237 (I'm surprised this bug is still not fixed).

The original issue was about an older Petals release (< 2.0.0); since then, we've moved on to require the even newer transformers>=4.31.0 (the version with Llama 2 support, released just a couple of days ago).

Re fixing exact dependency versions: it may be okay for a Python web app (running in its own separate env), but it's not okay for a Python library like Petals. This is because people may build their own apps on top of the Petals client, and those apps may require an even newer version of transformers (e.g., with a model released just a few days ago).

In fact, we initially pinned the transformers version (since it's a library that often introduces breaking changes) but got many complaints from users for exactly this reason (people needed newer models, sampling algorithms, fine-tuning methods, etc.). So now we just do our best to support the latest version within a few days or hours after it is released.
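The range-based approach described above can be sketched with a stdlib-only check mirroring the assert shown in the traceback earlier (function names here are illustrative):

```python
# Sketch of a range-based dependency check, like the one petals performs
# at import time (accept transformers >=4.31.0,<5.0.0 rather than one exact pin).
def version_tuple(v: str):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())


def compatible(installed: str, low: str = "4.31.0", high: str = "5.0.0") -> bool:
    """True if `installed` falls within [low, high)."""
    return version_tuple(low) <= version_tuple(installed) < version_tuple(high)


print(compatible("4.30.2"))  # -> False (too old for petals 2.x)
print(compatible("4.31.0"))  # -> True
print(compatible("5.0.0"))   # -> False (next major version may break things)
```

The half-open range rejects the next major version while still letting users pick up newer minor releases of transformers.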

@borzunov
Collaborator

borzunov commented Jul 22, 2023

Should be fixed in learning-at-home/hivemind#579
