
Index out of bounds for BART and other model architectures #6

Open
Gimperion opened this issue Apr 20, 2023 · 3 comments
Comments

@Gimperion

I keep getting an "index 50264 is out of bounds for dimension 0 with size 50264" error (or something similar) when converting BART and some other models to LSG.

The issue seems to be this line of code in the update_global method:
positions[1:] += u[mask_id].unsqueeze(0)
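The error message itself is the standard PyTorch out-of-range indexing failure. A minimal stand-alone reproduction (the sizes are the ones from the report; the embedding matrix here is just an illustrative stand-in for the model's token embeddings):

```python
import torch

vocab_size = 50264                       # size reported in the error message
embeddings = torch.randn(vocab_size, 8)  # stand-in for the token embedding matrix
mask_id = 50264                          # valid row indices are 0 .. 50263

try:
    row = embeddings[mask_id]
except IndexError as e:
    # index 50264 is out of bounds for dimension 0 with size 50264
    print(e)
```

Any id equal to (or larger than) the matrix's row count triggers exactly this failure, which is why a mask token id of 50264 cannot be looked up in a 50264-row embedding.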

@ccdv-ai
Owner

ccdv-ai commented Apr 20, 2023

Hi @Gimperion

Can you share your transformers version and the code snippet you used?

@Gimperion
Author

I replicated the error with both transformers 4.26.0 and 4.28.1. Here's the snippet:

from lsg_converter import LSGConverter

converter = LSGConverter(max_sequence_length=4096)

model, tokenizer = converter.convert_from_pretrained("sshleifer/distilbart-cnn-6-6", block_size=256, sparsity_factor=2)

@ccdv-ai
Owner

ccdv-ai commented Apr 20, 2023

I think I found the problem @Gimperion
Something is wrong with the model and the tokenizer: the <mask> token has index 50264, while the model config states "vocab_size": 50264.
Since the first token has index 0, the vocabulary actually contains 50265 tokens, so the mask token's index is out of bounds.

If you try to do an inference with the <mask> token, it fails.
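A quick way to see the mismatch is to compare the tokenizer's mask token id against the config's declared vocab_size. A minimal sketch (the helper name is mine; the 50264 values are the ones reported above, and the commented lines show how you would read the real numbers with transformers):

```python
def mask_id_in_vocab(mask_token_id: int, vocab_size: int) -> bool:
    """Valid embedding rows are 0 .. vocab_size - 1."""
    return 0 <= mask_token_id < vocab_size

# Values reported for sshleifer/distilbart-cnn-6-6:
print(mask_id_in_vocab(50264, 50264))  # False: the mask id falls outside the matrix

# With a live checkpoint, the two numbers come from transformers, e.g.:
#   tokenizer = AutoTokenizer.from_pretrained(name); tokenizer.mask_token_id
#   config = AutoConfig.from_pretrained(name); config.vocab_size
```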

If you really need to convert the model, you have two options:

  • expand the token embedding matrix
  • use random_global_init=True or --random_global_init to skip the step with the mask token
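The two options above can be sketched as follows. This is a hedged sketch, not the project's documented workflow: it assumes transformers and lsg_converter are installed, reuses the checkpoint and converter arguments from the snippet earlier in the thread, and the local directory name is mine.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from lsg_converter import LSGConverter

name = "sshleifer/distilbart-cnn-6-6"
converter = LSGConverter(max_sequence_length=4096)

# Option 1: expand the token embedding matrix so the <mask> id fits,
# save the resized checkpoint, then convert that instead.
model = AutoModelForSeq2SeqLM.from_pretrained(name)
tokenizer = AutoTokenizer.from_pretrained(name)
model.resize_token_embeddings(len(tokenizer))  # len(tokenizer) covers the mask id
model.save_pretrained("distilbart-resized")
tokenizer.save_pretrained("distilbart-resized")
lsg_model, lsg_tokenizer = converter.convert_from_pretrained(
    "distilbart-resized", block_size=256, sparsity_factor=2
)

# Option 2: skip the mask-token step entirely with a random global init,
# as suggested above.
lsg_model, lsg_tokenizer = converter.convert_from_pretrained(
    name, block_size=256, sparsity_factor=2, random_global_init=True
)
```

Option 1 changes the checkpoint on disk (the embedding matrix gains a row), so the resized copy should be the one you ship; option 2 leaves the checkpoint untouched but initializes the global tokens randomly instead of from the mask embedding.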
