
Merge pull request #16 from Agenta-AI/optional_params
Optional parameters can now be added in the code | Code can now also be run with the CLI without any change
mmabrouk authored May 19, 2023
2 parents 8de40c1 + 2139108 commit 9a04ba0
Showing 8 changed files with 105 additions and 97 deletions.
24 changes: 11 additions & 13 deletions README.md
@@ -1,10 +1,18 @@
# Agenta Lab: Streamline Your LLM-App Development

Agenta is an open-source CI/CD platform designed to simplify and accelerate the development and deployment of LLM-powered applications such as chatbots, agents, Q&A systems, and more. Agenta is targeted towards a technical audience bringing LLM-powered apps into production.

Building LLM-powered apps is currently very frustrating. It involves a significant amount of trial and error, a lot of parameters to tune, and countless iterations. Agenta simplifies this process, enabling you to quickly iterate, experiment, and optimize your LLM apps.
## What can you do with Agenta Lab?

- [x] Playground to test different parameters: With a couple of lines, modify your custom code to specify which parameters you want to experiment with. Then you (or your colleagues) can test your application and experiment with different parameters directly through a user-friendly web platform (see the sketch after this list).
- [x] Version evaluation: Create test sets, then evaluate and compare different versions of your app.
- [ ] Regression testing: Run regression tests based on real data whenever you deploy a new version.
- [x] Effortless API deployment: Agenta allows developers to deploy their LLM applications as an API without any extra effort (currently only locally).
- [ ] Monitoring and logging: Agenta provides a dashboard to monitor and log your app's performance and usage. You can also monitor the performance of your app in production and compare it to previous versions.
- [ ] A/B testing & user feedback: Experiment with different app versions and gather valuable user feedback for continuous improvement.
- [ ] Automated deployment: Push a commit to automatically deploy your app, saving time and minimizing human error.
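
For illustration, a minimal sketch of what an annotated app looks like, mirroring the `simple_prompt` template changed later in this commit; the function body here is a placeholder rather than a real LLM call:

```python
from agenta import post, TextParam, FloatParam

default_prompt = "What is a good name for a company that makes {product}?"


@post
def completion(product: str,
               temperature: FloatParam = 0.9,
               prompt_template: TextParam = default_prompt) -> str:
    # Placeholder body: a real app would call an LLM with these parameters.
    return f"{prompt_template.format(product=product)} (temperature={temperature})"
```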

## Why another platform for building LLM-apps?

@@ -16,16 +24,6 @@ There are a number of great platforms for building LLM apps, yet we find that no
- Collaboration with non-technical users: We realized that building LLM-powered apps involves collaboration between developers and domain experts who might not be technical. We wanted to build a tool that allows both to collaborate and build apps together. The developer writes the main code, while the domain expert can edit and modify parameters (e.g. prompts, hyperparameters, etc.) and label the results for evaluation.
- Open-source: We wanted to be able to contribute to the platform and extend it to our needs.

Follow the steps below for installation and testing instructions.

## Architecture

85 changes: 76 additions & 9 deletions agenta-cli/agenta/agenta.py
@@ -1,5 +1,14 @@
import argparse
import functools
import inspect
import os
import sys
from typing import Any, Callable, Optional

from dotenv import load_dotenv
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

origins = [
@@ -14,18 +23,76 @@
    allow_headers=["*"],
)

class TextParam(str):

    @classmethod
    def __modify_schema__(cls, field_schema):
        field_schema.update({"x-parameter": "text"})


class FloatParam(float):

    @classmethod
    def __modify_schema__(cls, field_schema):
        field_schema.update({"x-parameter": "float"})


def post(func: Callable[..., Any]):
    load_dotenv()  # TODO: remove later when we have a better way to inject env variables
    sig = inspect.signature(func)
    func_params = sig.parameters

    # find the optional parameters for the app
    app_params = {name: param for name, param in func_params.items()
                  if param.annotation in {TextParam, FloatParam}}
    # find the default values for the optional parameters
    for name, param in app_params.items():
        default_value = param.default if param.default is not param.empty else None
        app_params[name] = default_value

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        kwargs = {**app_params, **kwargs}
        return func(*args, **kwargs)

    new_params = []
    for name, param in sig.parameters.items():
        if name in app_params:
            new_params.append(
                inspect.Parameter(
                    name,
                    inspect.Parameter.KEYWORD_ONLY,
                    default=app_params[name],
                    annotation=Optional[param.annotation]
                )
            )
        else:
            new_params.append(param)

    wrapper.__signature__ = sig.replace(parameters=new_params)

    route = f"/{func.__name__}"
    app.post(route)(wrapper)

    # check if the module is being run as the main script
    if os.path.splitext(os.path.basename(sys.argv[0]))[0] == os.path.splitext(os.path.basename(inspect.getfile(func)))[0]:
        parser = argparse.ArgumentParser()
        # add arguments to the command-line parser
        for name, param in sig.parameters.items():
            if name in app_params:
                # For optional parameters, we add them as options
                parser.add_argument(f"--{name}", type=type(param.default),
                                    default=param.default)
            else:
                # For required parameters, we add them as arguments
                parser.add_argument(name, type=param.annotation)

        args = parser.parse_args()
        print(func(**vars(args)))

    return wrapper


def get(func):
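
The `TextParam` and `FloatParam` wrappers above rely on pydantic v1's `__modify_schema__` hook (the pydantic major version current at the time): when the schema for a parameter of one of these types is generated, an extra `"x-parameter"` entry is attached, which a frontend such as the playground can use to pick the right input widget. A minimal sketch of that mechanism, assuming pydantic v1 and that the classes are importable from `agenta`:

```python
from agenta import TextParam, FloatParam
from pydantic import BaseModel  # pydantic v1 assumed


class AppParams(BaseModel):
    prompt_template: TextParam = "What is a good name for a company that makes {product}?"
    temperature: FloatParam = 0.9


# __modify_schema__ runs while pydantic v1 builds the JSON schema, so each
# property should carry an "x-parameter" marker alongside its regular type.
print(AppParams.schema()["properties"])
```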
3 changes: 1 addition & 2 deletions agenta-cli/agenta/docker/docker-assets/main.py
@@ -2,8 +2,7 @@

import agenta
import app # This will register the routes with the FastAPI application
from dotenv import load_dotenv  # Import the load_dotenv function

if __name__ == "__main__":
    load_dotenv()  # Load the environment variables from .env
    run("agenta:app", host="0.0.0.0", port=80)
5 changes: 2 additions & 3 deletions agenta-cli/agenta/docker/docker_utils.py
@@ -1,10 +1,9 @@
import os
import shutil
from pathlib import Path
from tempfile import TemporaryDirectory

import docker
from agenta.config import settings
from docker.models.images import Image

16 changes: 7 additions & 9 deletions agenta-cli/agenta/templates/simple_prompt/app.py
@@ -1,22 +1,20 @@
from agenta import post, TextParam, FloatParam
from dotenv import load_dotenv
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

default_prompt = "What is a good name for a company that makes {product}?"


@post
def completion(product: str, temperature: FloatParam = 0.9, prompt_template: TextParam = default_prompt) -> str:
    llm = OpenAI(temperature=temperature)
    prompt = PromptTemplate(
        input_variables=["product"],
        template=prompt_template,
    )
    chain = LLMChain(llm=llm, prompt=prompt)
    output = chain.run(product=product)
    return output
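
Given the decorator changes in `agenta.py` above, decorating this template should rewrite the exposed signature so the annotated parameters become keyword-only with their defaults, which is what both the FastAPI route and the playground see. A small, hypothetical check of that behavior (not part of the commit):

```python
import inspect

from app import completion  # the decorated template above

# The wrapper's __signature__ was replaced by the @post decorator:
# `product` stays as-is, while `temperature` and `prompt_template`
# should show up as KEYWORD_ONLY with defaults 0.9 and the default prompt.
for name, param in inspect.signature(completion).parameters.items():
    print(name, param.kind, param.default)
```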
42 changes: 0 additions & 42 deletions examples/pitch_genius/agenta.py

This file was deleted.

18 changes: 8 additions & 10 deletions examples/pitch_genius/app.py
@@ -1,26 +1,24 @@
from agenta import post, TextParam, FloatParam
from dotenv import load_dotenv
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
import os

default_prompt = """
please write a short linkedin message (2 SENTENCES MAX) to an investor pitching the following startup:
startup name: {startup_name}
startup idea: {startup_idea}"""


@post
def generate(startup_name: str, startup_idea: str, prompt_template: TextParam = default_prompt, temperature: FloatParam = 0.5) -> str:
    llm = OpenAI(temperature=temperature)
    prompt = PromptTemplate(
        input_variables=["startup_name", "startup_idea"],
        template=prompt_template)

    chain = LLMChain(llm=llm, prompt=prompt)
    output = chain.run(startup_name=startup_name, startup_idea=startup_idea)
    return output
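
Because the updated `post` decorator parses command-line arguments whenever the decorated module is run as the main script, this example should also be runnable from the CLI without any extra code, which is what the commit message refers to. A hedged sketch of such an invocation driven from Python (it assumes the file is saved as `app.py`, that an `OPENAI_API_KEY` is available to `load_dotenv`, and that the option names follow the optional parameter names):

```python
import subprocess
import sys

# Roughly: python app.py "Agenta AI" "Developer tool for LLM-powered apps" --temperature 0.2
result = subprocess.run(
    [sys.executable, "app.py",
     "Agenta AI", "Developer tool for LLM-powered apps",
     "--temperature", "0.2"],
    capture_output=True, text=True,
)
print(result.stdout)  # the decorator prints the generated message for us
```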
9 changes: 0 additions & 9 deletions examples/pitch_genius/main.py

This file was deleted.
