
DRY sampler improvements #6053

Merged: 7 commits into oobabooga:dev on Jun 13, 2024
Conversation

@belladoreai (Contributor) commented on May 25, 2024:

I was asked by @p-e-w to split some of the changes from #6047 into a separate PR here.

This PR contains the data type performance improvement for DRY, and a minor fix to prevent a crash on large repetitive inputs.

Edit: it now also contains a change to cap the maximum match length at 50.

See the main PR for more info.
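For context, DRY's penalty grows roughly exponentially with the length of the repeated sequence (something like `multiplier * base ** (match_length - allowed_length)`), so an uncapped match length on a highly repetitive input can blow up the exponent. Below is a minimal, hypothetical sketch of the capped-match-length idea; the names (`MAX_MATCH_LEN`, `match_length`, `dry_penalty`) and default values are illustrative and are not the actual `modules/sampler_hijack.py` code:

```python
# Hypothetical sketch of a DRY-style penalty with a capped match length.
# All names and defaults here are illustrative, not the PR's actual API.

MAX_MATCH_LEN = 50  # cap from this PR: bounds the exponent on huge repeats


def match_length(tokens, idx):
    """Length of the match between the suffix of `tokens` and the
    sequence ending just before position `idx`, capped at MAX_MATCH_LEN."""
    length = 0
    while length < MAX_MATCH_LEN:
        i = idx - 1 - length
        j = len(tokens) - 1 - length
        if i < 0 or tokens[i] != tokens[j]:
            break
        length += 1
    return length


def dry_penalty(length, multiplier=0.8, base=1.75, allowed_length=2):
    """Penalty grows exponentially with match length beyond allowed_length."""
    if length < allowed_length:
        return 0.0
    return multiplier * base ** (length - allowed_length)
```

With the cap, `base ** (length - allowed_length)` stays bounded even when the input is one long repetition, instead of overflowing a float.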

@belladoreai requested a review from @p-e-w on May 27, 2024.
@p-e-w (Contributor) left a comment:

LGTM modulo style nits.

@belladoreai (Contributor, PR author) commented:
@oobabooga Ready for merge

@p-e-w (Contributor) left a comment:

Looks good now! It probably makes sense to merge this before DRY is merged into master.

@Hunterius8 commented:
Quickly compared this version to the previous two, looks like it has still gotten a little faster.

[Benchmark chart: drycomparison2]

Tokens per second now decrease by only ~5% at a range of about 21,500 tokens, compared to ~11% with the previous version.

@jojje commented on Jun 3, 2024:

LGTM.

Seems @p-e-w is also good with it. What's your view, @oobabooga?

It is an improvement over the original. If further optimization turns out to be necessary, it can be done in separate PRs.

@Vhallo (Contributor) commented on Jun 10, 2024:

Good to see the performance issues being solved. Seems like it might be worthwhile to integrate this into Exllamav2 / TabbyAPI now as well?

@oobabooga (Owner) commented:

Thanks for the reviews, merging now before merging DRY to the main branch.

@oobabooga oobabooga merged commit 3abafee into oobabooga:dev Jun 13, 2024
PoetOnTheRun pushed a commit to PoetOnTheRun/text-generation-webui that referenced this pull request on Oct 22, 2024.