[BugFix] Fix test breakages from transformers 4.45 upgrade #8829
@@ -1740,7 +1740,7 @@ def _get_and_verify_max_len(
                 "with rope_scaling. Please raise an issue so we can "
                 "investigate.")

-        if rope_type == "mrope":
+        if rope_type in ("mrope", "default"):
             scaling_factor = 1
         else:
             assert "factor" in rope_scaling

Review conversation on this hunk:

- "mrope" gets renamed to "default" in the Qwen2-VL config class.
- Qwen2-VL cannot be run in [...]. But if we install the older version of transformers mentioned in the docs ([comment truncated in source]).
- Thanks @DarkLight1337, yeah I was just making a change to update the scaling type in the config back to "mrope" if it's "default". @ywang96 found this open issue: huggingface/transformers#33401
- Change made in ce1d477.
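The conversation above converged on normalizing the scaling type in the config rather than only widening the check. Below is a minimal sketch of that idea, not the PR's actual code: it assumes a plain dict-style rope_scaling config and uses the "mrope_section" key as the marker of a Qwen2-VL-style config, which is an assumption on my part.

```python
# Sketch only: transformers 4.45 rewrites Qwen2-VL's rope_scaling type from
# "mrope" to "default" when the config is loaded, so downstream code that
# branches on "mrope" can map it back when the config clearly carries
# multimodal rope sections.

def normalize_rope_scaling(rope_scaling: dict) -> dict:
    """Restore the renamed 'default' rope type to 'mrope' for Qwen2-VL-style configs."""
    rope_type = rope_scaling.get("rope_type", rope_scaling.get("type"))
    # "mrope_section" is assumed here to be present only in mrope-style configs.
    if rope_type == "default" and "mrope_section" in rope_scaling:
        rope_scaling = dict(rope_scaling, rope_type="mrope")
    return rope_scaling


# Example: a config as loaded by transformers >= 4.45
cfg = {"rope_type": "default", "mrope_section": [16, 24, 24]}
print(normalize_rope_scaling(cfg)["rope_type"])  # -> "mrope"
```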
@@ -167,7 +167,7 @@ def get_lora_tokenizer(lora_request: LoRARequest, *args,
         return None
     try:
         tokenizer = get_tokenizer(lora_request.lora_path, *args, **kwargs)
-    except OSError as e:
+    except (OSError, ValueError) as e:
         # No tokenizer was found in the LoRA folder,
         # use base model tokenizer
         logger.warning(
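This hunk widens the exception handling so that a LoRA adapter directory without a usable tokenizer still falls back to the base model's tokenizer; with transformers 4.45 the failure can surface as a ValueError rather than an OSError. A rough sketch of the same fallback pattern, using plain transformers calls and hypothetical names (tokenizer_with_fallback, lora_path, base_model) rather than vLLM's actual API:

```python
import logging
from typing import Optional

from transformers import AutoTokenizer, PreTrainedTokenizerBase

logger = logging.getLogger(__name__)


def tokenizer_with_fallback(lora_path: str,
                            base_model: str) -> Optional[PreTrainedTokenizerBase]:
    # Try the adapter directory first; on failure (OSError in older
    # transformers, ValueError in 4.45+) warn and fall back to the base
    # model tokenizer, mirroring the pattern in the diff above.
    try:
        return AutoTokenizer.from_pretrained(lora_path)
    except (OSError, ValueError) as e:
        logger.warning(
            "No tokenizer found in %s, using base model tokenizer instead "
            "(original error: %s)", lora_path, e)
        return AutoTokenizer.from_pretrained(base_model)
```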
A separate review comment notes that _get_logits_warper was rolled into _get_logits_processor.
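For context on that comment: code that reaches for the private _get_logits_warper helper breaks once the warpers are built inside _get_logits_processor. Since these are private transformers internals whose signatures may change, the sketch below only checks for the attribute's presence rather than assuming a call signature; it is an illustration, not the PR's fix.

```python
from transformers import GenerationMixin


def has_separate_logits_warper(model: GenerationMixin) -> bool:
    # transformers < 4.45 exposes a standalone `_get_logits_warper` on
    # GenerationMixin; from 4.45 on, the warpers are constructed inside
    # `_get_logits_processor`, so a simple attribute check can be used to
    # branch between the two code paths.
    return hasattr(model, "_get_logits_warper")
```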