tj202/Lambda Definition for Provider #79

Merged
merged 2 commits into from Feb 18, 2025
1 change: 1 addition & 0 deletions docs/source/index.rst
@@ -19,6 +19,7 @@ Table of Contents
Agent Catalog User Guide <guide>
Command Line Tool <cli>
Agent Catalog API <api>
Provider Configuration <provider>
Environment Variables <env>
On Catalog Entries <entry>
Frequently Asked Questions <faqs>
135 changes: 135 additions & 0 deletions docs/source/provider.rst
@@ -0,0 +1,135 @@
.. role:: python(code)
:language: python

Provider Configuration
======================

This section describes how to configure the ``agentc`` Provider's decorator for the agent framework in use. As mentioned in
the :doc:`api section <api>`, the Provider accepts a decorator (a function) that is applied to each result yielded by ``get_tools_for()``.

This function differs depending on the underlying agent framework. For example, Langchain agents take tools as
``langchain_core.tools.BaseTool`` instances, while LlamaIndex agents take tools as ``llama_index.core.tools.BaseTool`` instances.

The decorator is a lambda function that lets the Provider apply the necessary transformations, such as framework-specific
type conversion, to the tools before returning them. A minimal pass-through configuration is sketched below; the sections
that follow cover the framework-specific decorators.
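
As an illustrative sketch (reusing the ``t.func`` attribute seen in the framework examples below), a pass-through decorator
simply returns each tool's plain Python callable:

.. code-block:: python

    import agentc

    # Pass-through decorator: each catalog entry is returned as its
    # underlying Python callable, with no framework-specific wrapping.
    provider = agentc.Provider(decorator=lambda t: t.func)
    tools = provider.get_item(name="<<TOOL>>", item_type="tool")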

Langchain/Langgraph/CrewAI
--------------------------

When using these frameworks, the decorator is a lambda function that takes an Agent Catalog tool and returns an instance of the
``langchain_core.tools.BaseTool`` class, which the ReAct agent calls at runtime.

The following example shows how the Provider can be defined for use with Langgraph and CrewAI agents:

.. code-block:: python

import agentc
from langchain_core.tools import tool
from langchain_openai.chat_models import ChatOpenAI
from langgraph.prebuilt import create_react_agent

llm = ChatOpenAI(model="gpt-4o", openai_api_key="<<OPENAI_API_KEY>>", temperature=0)

provider = agentc.Provider(
    decorator=lambda t: tool(t.func)
)
tools = provider.get_item(name="<<TOOL>>", item_type="tool")


Langgraph agent:

.. code-block:: python

# Define ReAct Agent using Langgraph
research_agent = create_react_agent(
    model=llm,
    tools=tools,
    state_modifier="<<PROMPT>>",
)
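
The compiled ReAct agent can then be invoked with a message list. This is a brief usage sketch; the user message is a placeholder:

.. code-block:: python

    # Invoke the Langgraph ReAct agent; tool calls are handled internally.
    result = research_agent.invoke({"messages": [("user", "<<USER_QUERY>>")]})
    print(result["messages"][-1].content)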

CrewAI agent:

.. code-block:: python

from crewai import Agent, Crew, Process

# Define Agent using CrewAI
data_exploration_agent = Agent(
    role="Data Exploration Specialist",
    goal="Perform an exploratory data analysis (EDA) on the provided dataset ...",
    tools=tools,
    verbose=True
)

# Define the crew with agents and tasks
data_analysis_crew = Crew(
    agents=[data_exploration_agent, <<OTHER_AGENTS>>],
    tasks=[<<TASKS>>],
    manager_llm=llm,
    process=Process.hierarchical,
    verbose=True
)
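
With the agents and tasks in place, the crew is started with CrewAI's ``kickoff()`` method (a short usage sketch):

.. code-block:: python

    # Execute the crew's tasks under the hierarchical manager LLM.
    result = data_analysis_crew.kickoff()
    print(result)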

LlamaIndex
----------

For LlamaIndex agents, the decorator is a lambda function that takes an Agent Catalog tool and returns an instance of the
``llama_index.core.tools.BaseTool`` class, which the agent calls at runtime.

.. note::

Agent Catalog allows you to define your own tools, which should be used to implement the base logic. For example, to perform
vector search, use the Agent Catalog ``semantic_search`` tool instead of LlamaIndex's ``QueryEngineTool``; the
``semantic_search`` tool can then be wrapped as a ``llama_index.core.tools.BaseTool`` instance in the decorator.

The following example converts any Agent Catalog tool to a LlamaIndex ``FunctionTool`` instance and passes it to the ReAct agent:

.. code-block:: python

import agentc
from llama_index.core.tools.function_tool import FunctionTool
from llama_index.core.agent.react import ReActAgent
from llama_index.llms.openai.base import OpenAI

llm = OpenAI(model="gpt-4o")

provider = agentc.Provider(
    decorator=lambda t: FunctionTool.from_defaults(
        fn=t.func,
        description=t.meta.description,
        name=t.meta.name,
    )
)
tools = provider.get_item(name="<<TOOL>>", item_type="tool")

agent = ReActAgent.from_tools(tools=tools, llm=llm, verbose=True, context="<<PROMPT>>")
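
The resulting agent can then be queried with ``chat()`` (a short usage sketch; the query string is a placeholder):

.. code-block:: python

    # Ask the ReAct agent a question; it invokes the wrapped tools as needed.
    response = agent.chat("<<USER_QUERY>>")
    print(response)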

Controlflow
-----------

For Controlflow agents, the decorator is a lambda function that takes an Agent Catalog tool and returns an instance of the
``controlflow.tools.Tool`` class, which the agent calls at runtime.

The following example converts any Agent Catalog tool to a Controlflow ``Tool`` and passes it to the agent:

.. code-block:: python

import agentc
from controlflow.tools import Tool
from controlflow.agent import Agent
from langchain_openai.chat_models import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o", temperature=0)

provider = agentc.Provider(
    decorator=lambda t: Tool.from_function(t.func),
)
tools = provider.get_item(name="<<TOOL>>", item_type="tool")

agent = Agent(
    name="Starter Agent",
    model=llm,
    tools=tools
)
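
The agent can then be given work to do, for example through Controlflow's ``run`` entry point (a hedged sketch; the objective
string is a placeholder):

.. code-block:: python

    import controlflow as cf

    # Run a single task with the agent; the task result is returned.
    result = cf.run("<<TASK_OBJECTIVE>>", agents=[agent])
    print(result)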


Information on using the Provider with more frameworks will be added soon!