[FEAT][Together]
Your Name committed Oct 12, 2024
1 parent 79c056b commit cd418b6
Showing 9 changed files with 300 additions and 257 deletions.
47 changes: 47 additions & 0 deletions README.md
@@ -94,6 +94,53 @@ print(out)

---

## `TogetherLLM` Documentation

The `TogetherLLM` class simplifies interaction with Together's hosted language models. It provides a straightforward way to run tasks against these models, including support for concurrent and batch processing.

### Initialization

To use `TogetherLLM`, initialize it with your API key, the name of the model you want to use, and, optionally, a system prompt. The system prompt provides context that applies to every task you run.

Here's an example of how to initialize `TogetherLLM`:
```python
import os
from swarm_models import TogetherLLM

model_runner = TogetherLLM(
api_key=os.environ.get("TOGETHER_API_KEY"),
model_name="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
    system_prompt="You're Larry Fink",
)
```
### Running Tasks

Once initialized, you can run tasks on the model using the `run` method. This method takes a task string as an argument and returns the response from the model.

Here's an example of running a single task:
```python
task = "How do we allocate capital efficiently in your opinion Larry?"
response = model_runner.run(task)
print(response)
```
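Because `run` makes a network call, transient failures are possible. The helper below is an illustrative retry sketch, not part of `TogetherLLM`'s API; it assumes only the synchronous `run` method documented above, and the `Flaky` stub exists purely to demonstrate the behavior without an API key.

```python
import time

def run_with_retry(runner, task, attempts=3, backoff=2.0):
    """Retry a transiently failing `runner.run` call.

    Illustrative helper, not part of the swarm_models API. `runner`
    only needs a synchronous `run(task) -> str` method.
    """
    for attempt in range(attempts):
        try:
            return runner.run(task)
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the last error
            time.sleep(backoff * (attempt + 1))  # linear backoff

# Demo with a stand-in runner that fails twice, then succeeds:
class Flaky:
    def __init__(self):
        self.calls = 0

    def run(self, task):
        self.calls += 1
        if self.calls < 3:
            raise RuntimeError("transient error")
        return "ok"

result = run_with_retry(Flaky(), "hello", attempts=3, backoff=0.0)
print(result)  # ok
```

In production you would pass the real `model_runner` in place of the stub and keep a nonzero backoff.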
### Running Multiple Tasks Concurrently

`TogetherLLM` also supports running multiple tasks concurrently using the `run_concurrently` method. This method takes a list of task strings and returns a list of responses from the model.

Here's an example of running multiple tasks concurrently:
```python
tasks = [
"What are the top-performing mutual funds in the last quarter?",
"How do I evaluate the risk of a mutual fund?",
"What are the fees associated with investing in a mutual fund?",
"Can you recommend a mutual fund for a beginner investor?",
"How do I diversify my portfolio with mutual funds?",
]
responses = model_runner.run_concurrently(tasks)
for response in responses:
print(response)
```
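If you need finer control over the fan-out, such as a bounded number of worker threads, the same pattern can be sketched with Python's standard library. This helper is an illustrative sketch, not part of `TogetherLLM`'s API; it assumes only the synchronous `run` method documented above, and the `EchoRunner` stub is a stand-in so the example runs without an API key.

```python
from concurrent.futures import ThreadPoolExecutor

def run_tasks_concurrently(runner, tasks, max_workers=5):
    """Fan tasks out across a thread pool, preserving input order.

    Illustrative helper, not part of the swarm_models API. `runner`
    only needs a synchronous `run(task) -> str` method.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map yields results in the same order as `tasks`
        return list(pool.map(runner.run, tasks))

# Demo with a stand-in runner:
class EchoRunner:
    def run(self, task):
        return f"echo: {task}"

responses = run_tasks_concurrently(EchoRunner(), ["a", "b", "c"])
print(responses)  # ['echo: a', 'echo: b', 'echo: c']
```

A bounded `max_workers` keeps you from opening an unbounded number of simultaneous API connections when the task list is large.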


## **Enterprise-Grade Features**

2 changes: 1 addition & 1 deletion examples/ollama_example.py
Expand Up @@ -3,4 +3,4 @@
model = OllamaModel(model_name="", host="")


model.run("What is the theory of the universe")
model.run("What is the theory of the universe")
3 changes: 2 additions & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "poetry.core.masonry.api"

[tool.poetry]
name = "swarm-models"
version = "0.1.0"
version = "0.1.1"
description = "Swarm Models - Pytorch"
license = "MIT"
authors = ["Kye Gomez <kye@apac.ai>"]
@@ -30,6 +30,7 @@ diffusers = "*"
loguru = "*"
pydantic = "*"
langchain-community = "0.0.29"
together = "*"



3 changes: 2 additions & 1 deletion requirements.txt
@@ -3,4 +3,5 @@ transformers
diffusers
loguru
pydantic
langchain-community=="0.0.29"
langchain-community=="0.0.29"
together
2 changes: 1 addition & 1 deletion swarm_models/__init__.py
@@ -28,7 +28,6 @@
)
from swarm_models.popular_llms import ReplicateChat as Replicate
from swarm_models.qwen import QwenVLMultiModal # noqa: E402
from swarm_models.together import TogetherLLM # noqa: E402
from swarm_models.model_types import ( # noqa: E402
AudioModality,
ImageModality,
@@ -42,6 +41,7 @@
from swarm_models.ollama_model import OllamaModel
from swarm_models.sam_two import GroundedSAMTwo
from swarm_models.utils import * # NOQA
from swarm_models.together_llm import TogetherLLM

__all__ = [
"BaseLLM",