[docs]: LlamaIndex ReAct agent tutorial added #2036

Merged
merged 1 commit on Nov 20, 2024
172 changes: 172 additions & 0 deletions docs/examples/llama-index-mem0.mdx
@@ -0,0 +1,172 @@
---
title: LlamaIndex ReAct Agent
---
Create a ReAct Agent with LlamaIndex which uses Mem0 as the memory store.

### Overview
A ReAct agent combines reasoning and action capabilities, making it versatile for tasks that require both thought processes (reasoning) and interaction with tools or APIs (acting). Using Mem0 as the memory store enhances these capabilities by allowing the agent to store and retrieve contextual information from past interactions.
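
As a rough mental model of that loop (an illustrative sketch only, not LlamaIndex's or Mem0's actual API; `recall`, `select_action`, and `save` are hypothetical placeholders):

```python
# Illustrative ReAct-style loop with a memory layer.
# All objects and methods here are hypothetical placeholders.
def react_loop(user_message, llm, tools, memory, max_steps=5):
    context = memory.recall(user_message)           # retrieve relevant past facts
    history = [("memory", context), ("user", user_message)]

    for _ in range(max_steps):
        step = llm.select_action(history, tools)    # reason: call a tool or answer?
        if step.is_final_answer:
            memory.save(user_message, step.answer)  # persist anything worth remembering
            return step.answer
        result = tools[step.tool_name](**step.tool_args)  # act: run the chosen tool
        history.append(("tool", result))            # observe: feed the result back in

    return "Reached the step limit without a final answer."
```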

### Setup
```bash
pip install llama-index-core llama-index-memory-mem0
```

Initialize the LLM.
```python
import os
from llama_index.llms.openai import OpenAI

os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"
llm = OpenAI(model="gpt-4o")
```

Initialize the Mem0 client. You can find your API key [here](https://app.mem0.ai/dashboard/). Read about Mem0 [Open Source](https://docs.mem0.ai/open-source/quickstart).
```python
os.environ["MEM0_API_KEY"] = "<your-mem0-api-key>"

from llama_index.memory.mem0 import Mem0Memory

context = {"user_id": "david"}
memory_from_client = Mem0Memory.from_client(
    context=context,
    api_key=os.environ["MEM0_API_KEY"],
    search_msg_limit=4,  # optional, default is 5
)
```
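
If you are running Mem0 [Open Source](https://docs.mem0.ai/open-source/quickstart) instead of the managed platform, the integration also provides `Mem0Memory.from_config`. The sketch below assumes a local Qdrant vector store and OpenAI models; swap in whichever providers you actually use.

```python
# Alternative: open-source Mem0 configured in-process (no MEM0_API_KEY needed).
# The providers below (qdrant, openai) are example choices, not requirements.
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "collection_name": "llamaindex_mem0_demo",  # hypothetical collection name
            "host": "localhost",
            "port": 6333,
        },
    },
    "llm": {
        "provider": "openai",
        "config": {"model": "gpt-4o", "temperature": 0.2},
    },
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-small"},
    },
    "version": "v1.1",
}

memory_from_config = Mem0Memory.from_config(
    context=context,
    config=config,
    search_msg_limit=4,  # optional, default is 5
)
```

Either `memory_from_client` or `memory_from_config` can be passed to the agent below.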

Create the tools. These tools will be used by the agent to perform actions.
```python
from llama_index.core.tools import FunctionTool

def call_fn(name: str):
    """Call the provided name.
    Args:
        name: str (Name of the person)
    """
    return f"Calling... {name}"

def email_fn(name: str):
    """Email the provided name.
    Args:
        name: str (Name of the person)
    """
    return f"Emailing... {name}"

def order_food(name: str, dish: str):
    """Order food for the provided name.
    Args:
        name: str (Name of the person)
        dish: str (Name of the dish)
    """
    return f"Ordering {dish} for {name}"

call_tool = FunctionTool.from_defaults(fn=call_fn)
email_tool = FunctionTool.from_defaults(fn=email_fn)
order_food_tool = FunctionTool.from_defaults(fn=order_food)
```
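
`FunctionTool.from_defaults` derives each tool's name and description from the function name and docstring, and that is what the LLM sees when deciding which tool to call, so keep the docstrings descriptive. An optional sanity check:

```python
# Optional: inspect how each function is exposed to the LLM as a tool.
for tool in (call_tool, email_tool, order_food_tool):
    print(tool.metadata.name)         # e.g. "call_fn"
    print(tool.metadata.description)  # built from the function signature and docstring
```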

Initialize the agent with tools and memory.
```python
from llama_index.core.agent import FunctionCallingAgent

agent = FunctionCallingAgent.from_tools(
    [call_tool, email_tool, order_food_tool],
    llm=llm,
    memory=memory_from_client,  # or memory_from_config
    verbose=True,
)
```

Start the chat.
<Note> The agent will use Mem0 to store the relevant memories from the chat. </Note>

Input
```python
response = agent.chat("Hi, My name is David")
print(response)
```
Output
```text
> Running step bf44a75a-a920-4cf3-944e-b6e6b5695043. Step input: Hi, My name is David
Added user message to memory: Hi, My name is David
=== LLM Response ===
Hello, David! How can I assist you today?
```

Input
```python
response = agent.chat("I love to eat pizza on weekends")
print(response)
```
Output
```text
> Running step 845783b0-b85b-487c-baee-8460ebe8b38d. Step input: I love to eat pizza on weekends
Added user message to memory: I love to eat pizza on weekends
=== LLM Response ===
Pizza is a great choice for the weekend! If you'd like, I can help you order some. Just let me know what kind of pizza you prefer!
```
Input
```python
response = agent.chat("My preferred way of communication is email")
print(response)
```
Output
```text
> Running step 345842f0-f8a0-42ea-a1b7-612265d72a92. Step input: My preferred way of communication is email
Added user message to memory: My preferred way of communication is email
=== LLM Response ===
Got it! If you need any assistance or have any requests, feel free to let me know, and I can communicate with you via email.
```

### Using the agent WITHOUT memory
Input
```python
agent = FunctionCallingAgent.from_tools(
    [call_tool, email_tool, order_food_tool],
    # memory is not provided
    llm=llm,
    verbose=True,
)
response = agent.chat("I am feeling hungry, order me something and send me the bill")
print(response)
```
Output
```text
> Running step e89eb75d-75e1-4dea-a8c8-5c3d4b77882d. Step input: I am feeling hungry, order me something and send me the bill
Added user message to memory: I am feeling hungry, order me something and send me the bill
=== LLM Response ===
Please let me know your name and the dish you'd like to order, and I'll take care of it for you!
```
<Note> The agent is not able to remember the past preferences that the user shared in previous chats. </Note>

### Using the agent WITH memory
Input
```python
agent = FunctionCallingAgent.from_tools(
    [call_tool, email_tool, order_food_tool],
    llm=llm,
    # memory is provided
    memory=memory_from_client,  # or memory_from_config
    verbose=True,
)
response = agent.chat("I am feeling hungry, order me something and send me the bill")
print(response)
```

Output
```text
> Running step 5e473db9-3973-4cb1-a5fd-860be0ab0006. Step input: I am feeling hungry, order me something and send me the bill
Added user message to memory: I am feeling hungry, order me something and send me the bill
=== Calling Function ===
Calling function: order_food with args: {"name": "David", "dish": "pizza"}
=== Function Output ===
Ordering pizza for David
=== Calling Function ===
Calling function: email_fn with args: {"name": "David"}
=== Function Output ===
Emailing... David
> Running step 38080544-6b37-4bb2-aab2-7670100d926e. Step input: None
=== LLM Response ===
I've ordered a pizza for you, and the bill has been sent to your email. Enjoy your meal! If there's anything else you need, feel free to let me know.
```
<Note> The agent is able to remember the past preferences that the user shared and uses them to perform actions. </Note>
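
If you want to double-check what the agent actually persisted, you can query Mem0 directly. A minimal sketch, assuming the managed platform and the same `user_id` ("david") used in the context above:

```python
# Sketch: inspect the memories stored for this user on the Mem0 platform.
from mem0 import MemoryClient

mem0_client = MemoryClient(api_key=os.environ["MEM0_API_KEY"])

# All memories associated with the user_id from the agent's context.
print(mem0_client.get_all(user_id="david"))

# Memories most relevant to a specific query.
print(mem0_client.search("food preferences", user_id="david"))
```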
5 changes: 4 additions & 1 deletion docs/examples/overview.mdx
@@ -29,4 +29,7 @@ Here are some examples of how Mem0 can be integrated into various applications:
<Card title="Customer Support Agent" icon="square-4" href="/examples/customer-support-agent">
Develop a Personal AI Assistant that remembers user preferences, past interactions, and context to provide personalized and efficient assistance.
</Card>
-</CardGroup>
+<Card title="LlamaIndex Mem0" icon="square-5" href="/examples/llama-index-mem0">
+  Create a ReAct Agent with LlamaIndex which uses Mem0 as the memory store.
+</Card>
+</CardGroup>
3 changes: 2 additions & 1 deletion docs/mint.json
@@ -223,7 +223,8 @@
"examples/mem0-with-ollama",
"examples/personal-ai-tutor",
"examples/customer-support-agent",
"examples/personal-travel-assistant"
"examples/personal-travel-assistant",
"examples/llama-index-mem0"
]
}
],