
Added various settings for ollama and llamacpp to increase configurability. #1677

Conversation

icsy7867 (Contributor)

@icsy7867 icsy7867 commented Mar 4, 2024

llama-cpp - https://llama-cpp-python.readthedocs.io/en/latest/api-reference/
https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html#

ollama - https://github.com/run-llama/llama_index/blob/eeb2a60387b8ae1994005ad0eebb672ee02074ff/llama-index-integrations/llms/llama-index-llms-ollama/llama_index/llms/ollama/base.py

openailike - no configurable changes. https://docs.llamaindex.ai/en/stable/examples/llm/localai.html#localai

I'm not sure about model_kwargs. The value is referenced for openai, but I could not find documentation on which values are allowed.
openai - https://github.com/run-llama/llama_index/blob/eeb2a60387b8ae1994005ad0eebb672ee02074ff/llama-index-integrations/llms/llama-index-llms-openai/llama_index/llms/openai/base.py
https://docs.llamaindex.ai/en/stable/examples/llm/openai.html

For the text/description I used the values found here:
https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values

Where LlamaCPP uses the same keys/values, I applied the same descriptions. However, my setup currently runs ollama, so the LlamaCPP side still needs some testing (see the sketches below).
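
For illustration only (this is not the actual diff in this PR), a minimal sketch of how ollama's modelfile-style parameters can be forwarded through llama-index's Ollama wrapper; the model name and values are placeholders:

```python
# Hypothetical sketch, not the PR diff: forwarding ollama modelfile-style
# options through llama-index's Ollama wrapper. Values are placeholders.
from llama_index.llms.ollama import Ollama

llm = Ollama(
    model="mistral",                    # example model name
    base_url="http://localhost:11434",
    temperature=0.1,                    # also surfaced as a top-level llm setting in this PR
    context_window=4096,                # sent to ollama as num_ctx
    additional_kwargs={                 # extra options passed through to the Ollama API
        "top_k": 40,                    # sample from the k most likely next tokens
        "top_p": 0.9,                   # nucleus sampling threshold
        "repeat_penalty": 1.1,          # penalize repeated tokens
        "repeat_last_n": 64,            # lookback window for the repeat penalty
    },
)
```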

I also added temperature under the main llm settings. This should allow models that support this value to have it edited/changed.
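
For LlamaCPP, a comparable (and, per the note above, untested) sketch of passing the same kind of options through llama-index's LlamaCPP wrapper; the model path and values are placeholders:

```python
# Hypothetical sketch, untested: the equivalent pass-through for llama-cpp-python
# via llama-index's LlamaCPP wrapper. Path and values are placeholders.
from llama_index.llms.llama_cpp import LlamaCPP

llm = LlamaCPP(
    model_path="./models/example-7b.Q4_K_M.gguf",  # example local GGUF path
    temperature=0.1,
    context_window=4096,
    generate_kwargs={                   # per-request sampling options for llama-cpp-python
        "top_k": 40,
        "top_p": 0.9,
        "repeat_penalty": 1.1,
    },
    model_kwargs={"n_gpu_layers": -1},  # load-time options (e.g. GPU offload)
)
```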

@imartinez imartinez deleted the branch zylon-ai:feature/upgrade-llamaindex March 6, 2024 16:51
@imartinez imartinez closed this Mar 6, 2024