
LLM Workflow Engine (LWE) Chat Together Provider plugin

Chat Together Provider plugin for LLM Workflow Engine

Access to Together chat models.

Installation

From packages

Install the latest version of this software directly from GitHub with pip:

pip install git+https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-together

From source (recommended for development)

Clone the latest version of this software directly from git:

git clone https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-together.git

Install the development package:

cd lwe-plugin-provider-chat-together
pip install -e .

Configuration

Add the following to config.yaml in your profile:

plugins:
  enabled:
    - provider_chat_together
    # Any other plugins you want enabled...
  # THIS IS OPTIONAL -- By default the plugin loads all model data via an API
  # call on startup. This does make startup time longer, and the CLI completion
  # for selecting models is very long!
  # You can instead provide a 'models' object here with the relevant data,
  # and it will be used instead of an API call.
  provider_chat_together:
    models:
      # 'id' parameter of the model as it appears in the API.
      # This is also listed on the model's summary page on the Together
      # website.
      "meta-llama/Llama-3-8b-chat-hf":
        # The only parameter, and it's required.
        max_tokens: 8192
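
If you want to populate the optional 'models' map programmatically instead of hand-writing it, you can generate it from Together's model list. The following is a hypothetical helper, not part of this plugin: it assumes Together's OpenAI-compatible GET https://api.together.xyz/v1/models endpoint, a TOGETHER_API_KEY environment variable, and 'id', 'type', and 'context_length' fields in the response.

import os

import requests
import yaml

# Assumed endpoint: Together's OpenAI-compatible model listing.
TOGETHER_MODELS_URL = "https://api.together.xyz/v1/models"

def build_models_block():
    headers = {"Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}"}
    response = requests.get(TOGETHER_MODELS_URL, headers=headers, timeout=30)
    response.raise_for_status()
    payload = response.json()
    # Depending on the API version, the model list may be returned bare
    # or wrapped in a 'data' key -- handle both shapes.
    models = payload.get("data", payload) if isinstance(payload, dict) else payload
    return {
        "models": {
            # 'type' and 'context_length' are assumed field names;
            # adjust them if the API response differs.
            m["id"]: {"max_tokens": m.get("context_length") or 8192}
            for m in models
            if m.get("type") == "chat"
        }
    }

if __name__ == "__main__":
    print(yaml.dump(build_models_block(), default_flow_style=False))

Paste the emitted 'models' block under the provider_chat_together key in config.yaml shown above.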

Usage

From a running LWE shell:

/provider chat_together
/model model_name meta-llama/Llama-3-8b-chat-hf
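
The same provider/model switch can also be scripted against LWE's Python API. This is a minimal sketch assuming LWE's documented Config and ApiBackend classes; the set_provider and set_model calls are assumptions mirroring the shell commands above, so verify them against your LWE version.

from lwe.core.config import Config
from lwe import ApiBackend

config = Config()
# Enable the plugin for this backend instance.
config.set('plugins.enabled', ['provider_chat_together'])
backend = ApiBackend(config)
# Assumed backend methods mirroring /provider and /model in the shell.
backend.set_provider('chat_together')
backend.set_model('meta-llama/Llama-3-8b-chat-hf')
success, response, user_message = backend.ask("Say hello!")
if success:
    print(response)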
