
Naming for Gemma2B and Others #384

Open
smokestacklightnin opened this issue Apr 7, 2024 · 1 comment

Comments

@smokestacklightnin
Contributor

The Gemma2B model is provided by Ollama, vLLM, and possibly others. Originally, we included it as ragna.assistants.Gemma2B via Ollama, but that name would conflict if vLLM support were added later. To rule out that possibility, I renamed it to ragna.assistants.OllamaGemma2B. The same issue applies to other models provided by Ollama, but Gemma2B is currently the only one included in Ragna.

It would be helpful to get others' opinions and ideas on this naming scheme.

@pmeier mentioned that we should make a decision before the next Ragna release.

@pmeier
Member

pmeier commented Jul 12, 2024

I have an RFD in #450 that, if accepted, would make this discussion obsolete. We would have an OllamaAssistant and a VllmAssistant and provide the respective model as a parameter.
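A minimal sketch of that parameterized design, assuming the class names from the comment above; the constructor signature, `display_name` property, and model identifiers are illustrative assumptions, not Ragna's actual API:

```python
# Hypothetical sketch of the RFD's one-class-per-backend design.
# Names and signatures are assumptions for illustration only.

class OllamaAssistant:
    """One assistant class per backend; the model is a constructor parameter."""

    def __init__(self, model: str) -> None:
        self.model = model

    @property
    def display_name(self) -> str:
        return f"Ollama/{self.model}"


class VllmAssistant:
    """Same pattern for vLLM; no per-model class like OllamaGemma2B needed."""

    def __init__(self, model: str) -> None:
        self.model = model

    @property
    def display_name(self) -> str:
        return f"vLLM/{self.model}"


# Both backends can serve the same model with no class-name conflict:
ollama_gemma = OllamaAssistant(model="gemma:2b")
vllm_gemma = VllmAssistant(model="google/gemma-2b")
```

Under this scheme, adding a new backend or model never requires minting (and potentially renaming) a combined class such as OllamaGemma2B; the backend picks the class and the model picks the argument.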
