
feat: add support for llama models proper messaging #114

Merged: 29 commits into main on Oct 19, 2024

Conversation

@jorgeantonio21 (Contributor) commented Oct 15, 2024

Add support for Llama 2 models and update prompt formatting

Motivation

As Llama models continue to evolve, it's crucial to support both Llama 2 and Llama 3 models in our codebase. This PR aims to add support for Llama 2 models and update the prompt formatting logic to accommodate both model families, ensuring our application can work seamlessly with the latest advancements in language models.

Description

This PR introduces the following changes:

  1. Added support for Llama 2 models:

    • Added new model variants: meta-llama/Meta-Llama-2-7b, meta-llama/Llama-2-7b-chat-hf, and meta-llama/Llama-2-70b-hf
    • Updated the Model enum to include these new variants
  2. Implemented model-specific prompt formatting (see the sketch after this list):

    • Added a new messages_to_prompt method to the Model enum
    • Implemented messages_to_llama2_prompt function for Llama 2 models
    • Implemented messages_to_llama3_prompt function for Llama 3 models
    • Updated the RequestBody::to_generate_request method to use the new model-specific prompt formatting
  3. Updated the ToolCall struct:

    • Added a function_call_string method to generate a string representation of function calls for Llama 3 prompts
  4. Added comprehensive unit tests:

    • Tests for both Llama 2 and Llama 3 prompt formatting
    • Tests for various message combinations and edge cases
  5. Updated documentation and comments to reflect the new changes
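
Below is a minimal Rust sketch of the shapes described in items 1–3. The variant names, the `Message` type, and the output of `function_call_string` are assumptions for illustration, not the PR's exact code; the real Llama 2 variants map to the Hugging Face model ids listed in item 1.

```rust
/// A chat message; field names are assumed for this sketch.
pub struct Message {
    pub role: String, // "system", "user", or "assistant"
    pub content: String,
}

/// Tool call payload; fields and rendering are assumed.
pub struct ToolCall {
    pub name: String,
    pub arguments: String, // assumed to hold JSON-encoded arguments
}

impl ToolCall {
    /// String representation of a function call, embedded in Llama 3 prompts
    /// (the exact rendering used by the PR may differ).
    pub fn function_call_string(&self) -> String {
        format!("{}({})", self.name, self.arguments)
    }
}

/// Model variants; the Llama 3 names here are placeholders.
pub enum Model {
    // Llama 2 variants added by this PR
    Llama2_7b,
    Llama2_7bChatHf,
    Llama2_70bHf,
    // Pre-existing Llama 3 variants (names assumed)
    Llama3_8bInstruct,
    Llama3_70bInstruct,
}

impl Model {
    /// Dispatch to the model-family-specific prompt builder.
    pub fn messages_to_prompt(&self, messages: &[Message]) -> String {
        match self {
            Model::Llama2_7b | Model::Llama2_7bChatHf | Model::Llama2_70bHf => {
                messages_to_llama2_prompt(messages)
            }
            Model::Llama3_8bInstruct | Model::Llama3_70bInstruct => {
                messages_to_llama3_prompt(messages)
            }
        }
    }
}

// Placeholder implementations; the template logic itself is sketched after the
// documentation links below.
fn messages_to_llama2_prompt(messages: &[Message]) -> String {
    messages.iter().map(|m| m.content.as_str()).collect::<Vec<_>>().join("\n")
}

fn messages_to_llama3_prompt(messages: &[Message]) -> String {
    messages.iter().map(|m| m.content.as_str()).collect::<Vec<_>>().join("\n")
}
```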

The prompt formatting follows the official Llama documentation:

  • Llama 2 message formatting: docs
  • Llama 3 message formatting: docs
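
As an illustration of those documented templates, here is a hedged sketch of what the two builders could look like, replacing the placeholders in the previous sketch. It follows the published Llama 2 (`[INST]` / `<<SYS>>`) and Llama 3 (`<|start_header_id|>` / `<|eot_id|>`) formats and is not the PR's exact implementation:

```rust
// Same `Message` shape as in the previous sketch (field names assumed).
pub struct Message {
    pub role: String,
    pub content: String,
}

/// Llama 2 chat template: the system prompt is wrapped in `<<SYS>> ... <</SYS>>`
/// inside the first `[INST]` block, each user turn opens `<s>[INST] ... [/INST]`,
/// and each assistant reply is appended and closed with `</s>`.
pub fn messages_to_llama2_prompt(messages: &[Message]) -> String {
    let mut prompt = String::new();
    let mut system = String::new();
    let mut first_user = true;

    for msg in messages {
        match msg.role.as_str() {
            "system" => system = msg.content.clone(),
            "user" => {
                prompt.push_str("<s>[INST] ");
                if first_user && !system.is_empty() {
                    prompt.push_str(&format!("<<SYS>>\n{}\n<</SYS>>\n\n", system));
                }
                first_user = false;
                prompt.push_str(&msg.content);
                prompt.push_str(" [/INST]");
            }
            "assistant" => prompt.push_str(&format!(" {} </s>", msg.content)),
            _ => {}
        }
    }
    prompt
}

/// Llama 3 template: each message is wrapped in a role header and terminated
/// with `<|eot_id|>`; the prompt ends with an open assistant header so the
/// model generates the next reply.
pub fn messages_to_llama3_prompt(messages: &[Message]) -> String {
    let mut prompt = String::from("<|begin_of_text|>");
    for msg in messages {
        prompt.push_str(&format!(
            "<|start_header_id|>{}<|end_header_id|>\n\n{}<|eot_id|>",
            msg.role, msg.content
        ));
    }
    prompt.push_str("<|start_header_id|>assistant<|end_header_id|>\n\n");
    prompt
}
```

With this in place, callers such as `RequestBody::to_generate_request` only need `model.messages_to_prompt(&messages)`; the family-specific template is selected by the enum variant rather than at the call site.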

Breaking Changes

There are no breaking changes in this PR. The existing functionality for Llama 3 models remains unchanged, and the new Llama 2 support is added in a backwards-compatible manner.

However, users should be aware that the prompt formatting for Llama 2 models differs from that of Llama 3 models. When switching between model families, ensure that your input adheres to the correct format for the chosen model.

@jorgeantonio21 merged commit cf10063 into main on Oct 19, 2024 (1 check failed).