Add PaLM-2 models [EXPERIMENTAL] #2087

Merged: 9 commits merged into main from joss-bison on Dec 6, 2023
Conversation

JosselinSomervilleRoberts (Contributor)

This adds 5 PaLM-2 models and a VertexAI client:

  • text-bison@001
  • text-bison-32k
  • text-unicorn@001
  • code-bison@001
  • code-bison-32k

Other models are available, but I only added those necessary and relevant to HELM (although Imagen could be added for HEIM); a minimal sketch of the underlying SDK call is included below.
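
For illustration, here is a minimal sketch of how one of these models is reached through the Vertex AI Python SDK (`google-cloud-aiplatform`); the project and location values are placeholder assumptions, and this is not the HELM VertexAI client itself.

```python
# Minimal sketch, not HELM's VertexAIClient: calling text-bison@001 via the Vertex AI SDK.
# "my-gcp-project" and "us-central1" are placeholder assumptions.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")

model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    "Summarize what the HELM benchmark evaluates.",
    temperature=0.0,
    max_output_tokens=128,
)
print(response.text)
```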

Several issues related to the lack of documentation have been opened.

I have not been able to verify the context lengths because of my very small quota, so I used the values provided in the API reference.
We should add a max_output_tokens field to WindowService, as some models have a very low output limit that would likely break some scenarios (code-gecko@001 had a limit of 64 output tokens). (See #2086)
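
To make the max_output_tokens concern concrete, here is a hypothetical sketch of the kind of clamp such a field would enable; `ModelLimits` and `clamp_max_tokens` are illustrative names and not existing HELM code, since #2086 is still open.

```python
# Hypothetical sketch (not existing HELM code): clamp a requested completion length
# to a per-model output cap such as the 64-token limit of code-gecko@001.
from dataclasses import dataclass

@dataclass
class ModelLimits:
    max_sequence_length: int   # prompt + completion tokens
    max_output_tokens: int     # hard cap on generated tokens

def clamp_max_tokens(requested: int, limits: ModelLimits) -> int:
    """Return the largest completion length the model can actually serve."""
    return min(requested, limits.max_output_tokens)

limits = ModelLimits(max_sequence_length=2048, max_output_tokens=64)
print(clamp_max_tokens(512, limits))  # -> 64, instead of failing at request time
```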

yifanmai merged commit 7faab61 into main on Dec 6, 2023
6 checks passed
yifanmai deleted the joss-bison branch on December 6, 2023 at 18:33