
LlamaRotaryEmbedding Input Argument Is Inconsistent with Hugging Face #525

Open
austin362667 opened this issue Jan 16, 2025 · 0 comments · May be fixed by #526
@austin362667 (Collaborator) commented:
🐛 Describe the bug

In recent transformers versions, LlamaRotaryEmbedding takes a LlamaConfig as its input argument, instead of the original head_dim.

https://github.com/huggingface/transformers/blob/99e0ab6ed888136ea4877c6d8ab03690a1478363/src/transformers/models/llama/modeling_llama.py#L83
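The signature change can be sketched with pure-Python stand-ins (hypothetical classes, not the real transformers code): the new-style `__init__` reads attributes off a config object, so the old-style call with a bare `head_dim` integer fails exactly as in the traceback below.

```python
class FakeLlamaConfig:
    """Stand-in for transformers.LlamaConfig with only the fields the
    new-style __init__ reads (names mirror the real config)."""
    def __init__(self, head_dim=64, max_position_embeddings=2048, rope_scaling=None):
        self.head_dim = head_dim
        self.max_position_embeddings = max_position_embeddings
        self.rope_scaling = rope_scaling

class RotaryEmbeddingSketch:
    """Sketch of the new-style constructor: it expects a config object,
    not an int, and accesses attributes on it."""
    def __init__(self, config, device=None):
        if getattr(config, "rope_scaling", None) is not None:
            self.rope_type = config.rope_scaling.get(
                "rope_type", config.rope_scaling.get("type")
            )
        else:
            self.rope_type = "default"
        # This is the line that blows up when config is a bare int:
        self.max_seq_len_cached = config.max_position_embeddings

# Old-style call with head_dim as an int reproduces the AttributeError:
try:
    RotaryEmbeddingSketch(64)
except AttributeError as e:
    print("old-style int argument fails:", e)

# New-style call with a config object works:
emb = RotaryEmbeddingSketch(FakeLlamaConfig(head_dim=64))
print(emb.rope_type, emb.max_seq_len_cached)
```

The fix on the test side is therefore to build a config (or config-like object) carrying `head_dim` and `max_position_embeddings`, and pass that to the constructor instead of the raw integer.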

Reproduce

>       rotary_emb = LlamaRotaryEmbedding(head_dim, device=device)

test/transformers/test_rope.py:136: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = LlamaRotaryEmbedding(), config = 41, device = 'cuda'

    def __init__(self, config: LlamaConfig, device=None):
        super().__init__()
        # BC: "rope_type" was originally "type"
        if hasattr(config, "rope_scaling") and config.rope_scaling is not None:
            self.rope_type = config.rope_scaling.get("rope_type", config.rope_scaling.get("type"))
        else:
            self.rope_type = "default"
>       self.max_seq_len_cached = config.max_position_embeddings
E       AttributeError: 'int' object has no attribute 'max_position_embeddings'

/usr/local/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py:90: AttributeError
=========================== short test summary info ============================
FAILED test/transformers/test_rope.py::test_correctness[True-dtype0-1e-05-1e-05-1-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype0-1e-05-1e-05-2-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype0-1e-05-1e-05-1-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype0-1e-05-1e-05-2-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype0-1e-05-1e-05-3-423-73-213-92]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype0-1e-05-1e-05-3-423-73-155-92]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype1-0.1-1e-05-1-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype1-0.1-1e-05-2-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype1-0.1-1e-05-1-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype1-0.1-1e-05-2-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype1-0.1-1e-05-3-423-73-213-92]
FAILED test/transformers/test_rope.py::test_correctness[True-dtype1-0.1-1e-05-3-423-73-155-92]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype0-1e-05-1e-05-1-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype0-1e-05-1e-05-2-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype0-1e-05-1e-05-1-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype0-1e-05-1e-05-2-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype0-1e-05-1e-05-3-423-73-213-92]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype0-1e-05-1e-05-3-423-73-155-92]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype1-0.1-1e-05-1-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype1-0.1-1e-05-2-128-32-32-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype1-0.1-1e-05-1-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype1-0.1-1e-05-2-128-32-8-64]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype1-0.1-1e-05-3-423-73-213-92]
FAILED test/transformers/test_rope.py::test_correctness[False-dtype1-0.1-1e-05-3-423-73-155-92]
FAILED test/transformers/test_rope.py::test_functional_correctness[True-dtype0-1e-05-1e-05-1-2-2-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[True-dtype0-1e-05-1e-05-1-2-1-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[True-dtype0-1e-05-1e-05-9-7-41-41-41]
FAILED test/transformers/test_rope.py::test_functional_correctness[True-dtype1-0.1-1e-05-1-2-2-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[True-dtype1-0.1-1e-05-1-2-1-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[True-dtype1-0.1-1e-05-9-7-41-41-41]
FAILED test/transformers/test_rope.py::test_functional_correctness[False-dtype0-1e-05-1e-05-1-2-2-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[False-dtype0-1e-05-1e-05-1-2-1-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[False-dtype0-1e-05-1e-05-9-7-41-41-41]
FAILED test/transformers/test_rope.py::test_functional_correctness[False-dtype1-0.1-1e-05-1-2-2-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[False-dtype1-0.1-1e-05-1-2-1-2-8]
FAILED test/transformers/test_rope.py::test_functional_correctness[False-dtype1-0.1-1e-05-9-7-41-41-41]
===== 36 failed, 798 passed, 215 skipped, 42 warnings in 225.22s (0:03:45) =====

Versions

v0.5.2

@austin362667 changed the title from "Upate LlamaRotaryEmbedding inputs" to "LlamaRotaryEmbedding Input Argument Is Inconsistent with Hugging Face" on Jan 16, 2025.
@austin362667 linked a pull request on Jan 16, 2025 that will close this issue.
@austin362667 self-assigned this on Jan 16, 2025.