Move LLM Caching docs to topics (microsoft#1950)
* Move LLM Caching docs to topics

* Update llm-caching.md
jackgerrits authored Mar 11, 2024
1 parent 758c3a5 commit 3c51d92
Showing 2 changed files with 49 additions and 50 deletions.
50 changes: 0 additions & 50 deletions website/docs/Use-Cases/agent_chat.md
@@ -327,56 +327,6 @@ With the pluggable auto-reply function, one can choose to invoke conversations w

Another approach involves LLM-based function calls, where the LLM decides whether a specific function should be invoked based on the conversation's status during each inference. This approach enables dynamic multi-agent conversations, as seen in scenarios like the [multi-user math problem solving scenario](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_two_users.ipynb), where a student assistant automatically seeks expertise via function calls.

### LLM Caching

Since version 0.2.8, a configurable context manager lets you easily set up LLM caching, using either DiskCache or Redis as the backend. All agents inside the
context manager will use the same cache.

```python
from autogen import Cache

# Use Redis as cache
with Cache.redis(redis_url="redis://localhost:6379/0") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)

# Use DiskCache as cache
with Cache.disk() as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```

You can vary the `cache_seed` parameter to get different LLM output while
still using the cache.

```python
# Setting the cache_seed to 1 will use a different cache from the default one
# and you will see different output.
with Cache.disk(cache_seed=1) as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```

By default DiskCache uses `.cache` for storage. To change the cache directory,
set `cache_path_root`:

```python
with Cache.disk(cache_path_root="/tmp/autogen_cache") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```

For backward compatibility, DiskCache is on by default with `cache_seed` set to 41.
To disable caching completely, set `cache_seed` to `None` in the `llm_config` of the agent.

```python
assistant = AssistantAgent(
"coding_agent",
llm_config={
"cache_seed": None,
"config_list": OAI_CONFIG_LIST,
"max_tokens": 1024,
},
)
```

### Diverse Applications Implemented with AutoGen

The figure below shows six examples of applications built using AutoGen.
49 changes: 49 additions & 0 deletions website/docs/topics/llm-caching.md
@@ -0,0 +1,49 @@
# LLM Caching

Since version [`0.2.8`](https://github.com/microsoft/autogen/releases/tag/v0.2.8), a configurable context manager lets you easily set up LLM caching, using either [`DiskCache`](/docs/reference/cache/disk_cache#diskcache) or [`RedisCache`](/docs/reference/cache/redis_cache#rediscache) as the backend. All agents inside the
context manager will use the same cache.

```python
from autogen import Cache

# Use Redis as cache
with Cache.redis(redis_url="redis://localhost:6379/0") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)

# Use DiskCache as cache
with Cache.disk() as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```
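
The examples above assume that `user`, `assistant`, and `coding_task` already exist. A minimal sketch of that setup, with illustrative agent names, config file name, and task (assumptions about the surrounding script, not part of the caching API):

```python
from autogen import AssistantAgent, UserProxyAgent, config_list_from_json

# Hypothetical setup for the `assistant`, `user`, and `coding_task` used above.
config_list = config_list_from_json("OAI_CONFIG_LIST")  # assumes a local OAI_CONFIG_LIST file
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user = UserProxyAgent(
    "user",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)
coding_task = "Write a Python function that returns the first 10 Fibonacci numbers."
```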

You can vary the `cache_seed` parameter to get different LLM output while
still using the cache.

```python
# Setting the cache_seed to 1 will use a different cache from the default one
# and you will see different output.
with Cache.disk(cache_seed=1) as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```
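
The same `cache_seed` parameter is also accepted by the Redis backend, so different seeds keep separate cache namespaces in one Redis instance as well. A small sketch, assuming the Redis server from the earlier example (the seed value here is arbitrary):

```python
# Different cache_seed values keep separate cache entries in the same Redis instance.
with Cache.redis(cache_seed=2, redis_url="redis://localhost:6379/0") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```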

By default [`DiskCache`](/docs/reference/cache/disk_cache#diskcache) uses `.cache` for storage. To change the cache directory,
set `cache_path_root`:

```python
with Cache.disk(cache_path_root="/tmp/autogen_cache") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```
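
To start from a cold cache, one simple option is to delete the cache directory between runs. This is a sketch using the standard library rather than a dedicated AutoGen API, and it assumes the `cache_path_root` from the example above:

```python
import shutil

# Remove the on-disk cache; the next run will query the LLM again and repopulate it.
shutil.rmtree("/tmp/autogen_cache", ignore_errors=True)
```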

For backward compatibility, [`DiskCache`](/docs/reference/cache/disk_cache#diskcache) is on by default with `cache_seed` set to 41.
To disable caching completely, set `cache_seed` to `None` in the `llm_config` of the agent.

```python
assistant = AssistantAgent(
"coding_agent",
llm_config={
"cache_seed": None,
"config_list": OAI_CONFIG_LIST,
"max_tokens": 1024,
},
)
```
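
If you want to keep the legacy agent-level caching but separate its entries from the default seed of 41, you can set `cache_seed` to a different integer instead of `None`. A sketch of that variant, reusing the (assumed, already defined) `OAI_CONFIG_LIST` from the example above:

```python
# Keep legacy caching enabled, but under a non-default seed.
assistant = AssistantAgent(
    "coding_agent",
    llm_config={
        "cache_seed": 42,
        "config_list": OAI_CONFIG_LIST,
        "max_tokens": 1024,
    },
)
```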
