- Core - Fundamental tools for working with prompts and LLMs.
- Document Search - Handles vector search to retrieve relevant documents.
- CLI - The `ragbits` shell command, enabling tools such as GUI prompt management.
- Guardrails - Ensures response safety and relevance.
- Evaluation - Unified evaluation framework for Ragbits components.
- Flow Controls - Manages multi-stage chat flows for performing advanced actions (coming soon).
- Structured Querying - Queries structured data sources in a predictable manner (coming soon).
- Caching - Adds a caching layer to reduce costs and response times (coming soon).
To use the complete Ragbits stack, install the `ragbits` package:

```sh
pip install ragbits
```
Alternatively, you can use individual components of the stack by installing their respective packages: `ragbits-core`, `ragbits-document-search`, and `ragbits-cli`.
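For example, to pull in only the core and document search components (a sketch using the package names listed above):

```sh
pip install ragbits-core ragbits-document-search
```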
First, create a prompt and a model for the data used in the prompt:
```python
from pydantic import BaseModel

from ragbits.core.prompt import Prompt


class Dog(BaseModel):
    breed: str
    age: int
    temperament: str


class DogNamePrompt(Prompt[Dog, str]):
    system_prompt = """
    You are a dog name generator. You come up with funny names for dogs given the dog details.
    """
    user_prompt = """
    The dog is a {{ breed }} breed, {{ age }} years old, and has a {{ temperament }} temperament.
    """
```

Prompt templates in Ragbits are Jinja2 templates, so the input model's fields are referenced with double braces.
Next, create an instance of the LLM and the prompt:
```python
from ragbits.core.llms.litellm import LiteLLM

llm = LiteLLM("gpt-4o")
example_dog = Dog(breed="Golden Retriever", age=3, temperament="friendly")
prompt = DogNamePrompt(example_dog)
```
Finally, generate a response from the LLM using the prompt:
```python
response = await llm.generate(prompt)
print(f"Generated dog name: {response}")
```
Ragbits is licensed under the MIT License.
We welcome contributions! Please read CONTRIBUTING.md for more information.