
Add local LLM support (with function calling) #97

Merged
merged 21 commits into main from localllm on Oct 23, 2023
Conversation

@cpacker (Collaborator) commented Oct 23, 2023

Closes #18
(discussion will then move to #67)

See the README in https://github.com/cpacker/MemGPT/tree/localllm/memgpt/local_llm

Included: a basic proof of concept, tested on airoboros 70b 2.1 hosted locally with textgen webui (non-streaming endpoint)
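For reference, here is a minimal sketch of hitting textgen webui's non-streaming completion endpoint, assuming the server exposes the legacy blocking API at /api/v1/generate and that OPENAI_API_BASE points at it (the endpoint path, request fields, and helper name are illustrative assumptions, not necessarily what the PR's webui backend does):

import os
import requests  # third-party: pip install requests

def webui_completion(prompt: str, max_new_tokens: int = 512) -> str:
    """Send a single (non-streaming) completion request to textgen webui.

    Assumes the legacy blocking API at /api/v1/generate; the field names
    below are illustrative, not taken from the PR.
    """
    base = os.environ["OPENAI_API_BASE"].rstrip("/")
    resp = requests.post(
        f"{base}/api/v1/generate",
        json={"prompt": prompt, "max_new_tokens": max_new_tokens},
        timeout=120,
    )
    resp.raise_for_status()
    # The legacy webui API replies with {"results": [{"text": "..."}]}
    return resp.json()["results"][0]["text"]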


Example using prompt formatting for JSON output only (so no inner monologue, hence the --no_verify flag):

# running airoboros behind a textgen webui server

export OPENAI_API_BASE=<pointing at webui server>
export BACKEND_TYPE=webui

$ python3 main.py --no_verify

Running... [exit by typing '/exit']
💭 Bootup sequence complete. Persona activated. Testing messaging functionality.

💭 None
🤖 Welcome! My name is Sam. How can I assist you today?
Enter your message: My name is Brad, not Chad...

💭 None
⚡🧠 [function] updating memory with core_memory_replace:
         First name: Chad
        → First name: Brad
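For context on how the [function] step above can work without native function calling: the model is prompted to reply with a JSON-formatted function call, which the wrapper then parses out of the raw completion. A minimal sketch of that parsing step, assuming the model emits a single JSON object with "function" and "params" keys (the field names and helper are illustrative, not necessarily what llm_chat_completion_wrappers/airoboros.py uses):

import json

def parse_function_call(raw_output: str) -> dict:
    """Parse a local model's raw completion into a function-call dict.

    Sketch only: assumes the model was prompted to reply with a single
    JSON object like {"function": "...", "params": {...}}. Field names
    are illustrative, not taken from the PR.
    """
    # Local models often wrap the JSON in extra text; grab the outermost braces.
    start, end = raw_output.find("{"), raw_output.rfind("}")
    if start == -1 or end == -1:
        raise ValueError(f"no JSON object found in model output: {raw_output!r}")
    call = json.loads(raw_output[start : end + 1])
    if "function" not in call:
        raise ValueError("model output is missing a 'function' field")
    return call

# Example mirroring the core_memory_replace call in the transcript above
raw = '{"function": "core_memory_replace", "params": {"old_content": "First name: Chad", "new_content": "First name: Brad"}}'
print(parse_function_call(raw)["function"])  # core_memory_replace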

@cpacker changed the title from "[WIP] Add local LLM support (with function calling)" to "Add local LLM support (with function calling)" on Oct 23, 2023
@cpacker requested a review from @vivi on October 23, 2023 06:27
@vivi (Contributor) left a comment:

🔥

Review threads (all resolved):
memgpt/local_llm/llm_chat_completion_wrappers/airoboros.py (outdated)
memgpt/openai_tools.py
memgpt/local_llm/webui_settings.py (outdated)
@vivi (Contributor) left a comment:

LGTM!!

@cpacker merged commit 027c25e into main on Oct 23, 2023
@cpacker deleted the localllm branch on October 23, 2023 19:34
mattzh72 pushed a commit referencing this pull request on Oct 9, 2024: Add local LLM support (with function calling)
Development

Successfully merging this pull request may close these issues.

[Feature Request] Support for local LLMs like Ollama