Aggregate function to help concatenate prompts #4
Labels: enhancement (New feature or request)
This was referenced Jan 12, 2023

simonw added a commit that referenced this issue on Jan 12, 2023

simonw added a commit that referenced this issue on Jan 12, 2023
I'm going to ship an alpha with this now so I can deploy and test it a bit before writing the docs.
simonw added a commit that referenced this issue on Jan 12, 2023
Example query using the new function:

```sql
with texts as (
  select body from blog_entry order by id desc limit 5
)
select json_object(
  'pre',
  openai_build_prompt(
    openai_strip_tags(body),
    'Context:
------------
',
    '
------------
Given the above context, answer the following question: ' || :question,
    30,
    1000
  )
) from texts
```
simonw added a commit to simonw/simonwillisonblog-backup that referenced this issue on Jan 12, 2023
simonw added a commit that referenced this issue on Jan 13, 2023
I keep getting errors because I send prompts that are too large for the model.
With GPT-3 I believe it's possible to count tokens in advance. I want an aggregate function to help me assemble a prompt, truncating each text as necessary to fit.
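The core idea can be sketched as a plain Python helper that trims each text against a shared token budget. This is an assumption-laden sketch: `truncate_to_fit` is a hypothetical name, and it uses whitespace splitting as a stand-in tokenizer where a real implementation would count GPT tokens:

```python
def truncate_to_fit(texts, max_tokens, tokenize=str.split):
    # Trim each text so the combined token count never exceeds max_tokens.
    # tokenize defaults to whitespace splitting; swap in a GPT tokenizer
    # for accurate counts against a real model's limit.
    result, used = [], 0
    for text in texts:
        tokens = tokenize(text)[: max_tokens - used]
        if not tokens:
            break
        used += len(tokens)
        result.append(" ".join(tokens))
    return result


print(truncate_to_fit(["a b c", "d e f g"], 5))  # ['a b c', 'd e']
```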