Extend the token length if at all possible! #6
WebUI splits long prompts into chunks of 75 tokens each. It tries to do this intelligently: when it reaches the 75th token, it backtracks to the nearest comma and splits there, rather than in the middle of a sentence. (You can control how far it backtracks with a dedicated option in the WebUI Settings; the default is 20.) When you cross the limit, your xx/75 token counter becomes xx/150; that's how you can check how many "chunks" you have.

I don't recommend relying on this behavior blindly. It is much better to split the prompt explicitly with the dedicated keyword. That way your final result improves, because those chunks work separately but together: it is good to keep similar things in one chunk, rather than split at random points in your long prompt.

Now, where can Embedding Merge help? You might try to save tokens with it when you are already slightly over the limit. An embedding alone cannot be larger than 75 tokens (that is physically impossible!), but you can try to fit many things into it.

Actually, my Embedding Merge is a big failure, because I thought it would solve a different problem: binding properties to objects. Here are some more ideas on the subject:

There are only two unique features that Embedding Merge actually gives:
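To make the chunking behavior concrete, here is a minimal sketch of the splitting logic described above: fill a chunk up to 75 tokens, and if the cut would land mid-sentence, backtrack up to 20 tokens looking for a comma to split at instead. This is an illustration only, not WebUI's actual implementation; the function name and the token-list representation are hypothetical.

```python
def split_into_chunks(tokens, chunk_size=75, backtrack=20):
    """Split a token list into chunks of at most `chunk_size` tokens,
    preferring to cut just after a comma found within the last
    `backtrack` tokens of the chunk (hypothetical sketch, not WebUI code)."""
    chunks = []
    i = 0
    while i < len(tokens):
        end = min(i + chunk_size, len(tokens))
        if end < len(tokens):
            # Backtrack toward the start of the chunk, at most `backtrack`
            # tokens, searching for a comma to split at.
            for j in range(end - 1, max(i, end - backtrack - 1), -1):
                if tokens[j] == ",":
                    end = j + 1  # keep the comma in the current chunk
                    break
        chunks.append(tokens[i:end])
        i = end
    return chunks
```

For example, a 81-token prompt with a comma at position 70 would be split into a 71-token chunk (ending at the comma) and a 10-token chunk, matching the xx/75 → xx/150 counter behavior described above.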
Generally, EM is more for research than for everyday usage.
Related: #4 (comment)
True! And actually lmao, I use EM a TON in practice - XD I'm one of those "WORK DUMBER NOT SMARTER" model makers :D
Closing this because I think I FINALLY UNDERSTAND how to get around this :D
Dunno if it IS POSSIBLE, and I'm aware this is sort of updated at will, but you saved me a TON OF TIME and I made like 10+ embeds last night with this.
Some prompts have 100-200 tokens, and if possible it'd be interesting to see if you could, in theory, extend the token length with this plugin.
<3
Much adoration.
THANK YOU!!