Add testing for docs #151

Merged · 6 commits · Nov 2, 2023
7 changes: 6 additions & 1 deletion docs/how-tos/set-configuration.md
@@ -12,7 +12,7 @@ file.
Run the following command, and answer the questions when prompted:

```bash
ragna config
ragna init
```

![ragna config executed in the terminal showing questions and selections of the form: Which of the following statements describes best what you want to do? I want to try Ragna and its builtin components; How do you want to select the components? I want to manually select the builtin components I want to use. This continues to allow selecting the [Chroma] source storage and the [OpenAI/gpt-4] assistant.](images/ragna-config-wizard.png)
@@ -45,6 +45,11 @@ url = "http://127.0.0.1:31477"

You can use `ragna.toml` for setting configurations in your applications:

<!--
Using `py` instead of `python` allows for syntax highlighting without doctesting.
This is a workaround until https://github.com/koaning/mktestdocs/issues/7 is implemented.
-->

```py
from ragna import Config

37 changes: 19 additions & 18 deletions docs/tutorials/python-api.md
@@ -19,7 +19,7 @@ jupyter lab --version

Start JupyterLab:

```
```bash
jupyter lab
```

@@ -50,7 +50,7 @@ sources that you provide. In Ragna, you can use text files to share this information.

Create a `ragna.txt` and add some relevant text to it, for example:

```py
```python
path = "ragna.txt"

with open(path, "w") as file:
@@ -75,7 +75,7 @@ vector database[^1], and similar to assistants, Ragna has a few built-in options

You select the demo source storage:

```py
```python
from ragna.source_storages import RagnaDemoSourceStorage
```

@@ -93,22 +93,21 @@ Ragna has the following built-in options:

Pick the demo assistant for this tutorial:

```py
```python
from ragna.assistants import RagnaDemoAssistant
```

!!! note

You need to get API keys and set relevant environment variables
to use all Assistants (except the `RagnaDemoAssistant`).
!!! note

    The `RagnaDemoAssistant` is not a real assistant (LLM); instead, it replies with
    your prompt and a static message. It is only meant to help you understand the
    Ragna API. You need to get API keys and set relevant environment variables to
    use the supported assistants.

## Step 5: Start a chat

That completes the setup; you can now use Ragna to create a chat app.

### Create a `Rag` object with your configuration

```py
```python
from ragna.core import Rag

rag = Rag(config)
@@ -132,16 +131,18 @@ the `async` and `await` keywords with the function definition and call respectively.
You can provide your assistant, document, and source storage selections to the
`rag.chat` function, and share your prompt (question to the LLM) using `.answer()`:

```py
async with rag.chat(
    documents=[path],
    source_storage=RagnaDemoSourceStorage,
    assistant=RagnaDemoAssistant,
) as chat:
    prompt = "What is Ragna?"
    answer = await chat.answer(prompt)

print(answer)
```

```python
async def main():
    async with rag.chat(
        documents=[path],
        source_storage=RagnaDemoSourceStorage,
        assistant=RagnaDemoAssistant,
    ) as chat:
        prompt = "What is Ragna?"
        answer = await chat.answer(prompt)

    print(answer)
```
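Because `main` is now a coroutine, it still has to be driven by an event loop. A minimal sketch of the pattern, with a placeholder coroutine standing in for the chat logic above, since the real one needs a configured Ragna setup:

```python
import asyncio


async def main():
    # Placeholder for the chat logic above: the real coroutine would open
    # the chat with rag.chat(...) and await chat.answer(prompt).
    return "Placeholder answer"


# In a plain Python script, drive the coroutine with asyncio.run:
answer = asyncio.run(main())
print(answer)
```

In a Jupyter notebook an event loop is already running, so you would use top-level `await main()` instead of `asyncio.run(main())`.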

## Complete example script
47 changes: 27 additions & 20 deletions docs/tutorials/rest-api.md
@@ -13,13 +13,19 @@ Ragna workflows start with a configuration. You can select components such as the
assistant (LLM) and source storage (vector database) and set options like the API
endpoint and cache location.

To quickly try out Ragna, you can use the `demo` configuration:
Use the CLI wizard and pick the first option to generate a basic configuration file:

```py
config = Config.demo()
```

```bash
ragna init
```

It includes the `RagnaDemoAssistant` and `RagnaDemoSourceStorage`.
Create a configuration using the file:

```python
from ragna import Config

config = Config.from_file("ragna.toml")
```

Learn more in [How to set configuration](../how-tos/set-configuration.md).

@@ -29,11 +35,10 @@ You can use the [`ragna api`](../references/cli.md#ragna-api) command to start the
API from your terminal. The command includes a `--config` option where you can choose your
preferred configuration.

This tutorial is using the `demo` configuration, so you can start the API using the
built-in `demo` config option:
The `ragna.toml` file generated by the CLI wizard will be used by default:

```bash
ragna api --config demo
ragna api
```

Once started, use the displayed URL to connect to the running API. By default, Ragna
@@ -46,7 +51,9 @@ choice!

Let's connect to the API with an `AsyncClient`:

```py
```python
import httpx

client = httpx.AsyncClient(base_url=config.api.url)
```

@@ -61,7 +68,7 @@ For demonstration or exploration purposes, you can set the password as an environment variable
and use that for demo authentication:

```bash
export RAGNA_DEMO_AUTHENTICATION_PASSWORD="*****"
export RAGNA_DEMO_AUTHENTICATION_PASSWORD="my_password"
```

You can then use this password with any username.
@@ -74,7 +81,7 @@ token = (
"/token",
data={
"username": USERNAME,
"password": "*****",
"password": "my_password",
},
)
).json()
@@ -98,7 +105,7 @@ Assistants (LLMs).

### Create a document (optional)

```py
```python
path = "document.txt"

with open(path, "w") as file:
@@ -107,14 +114,14 @@

### Request upload information

```py
```python
response = await client.get("/document", params={"name": path})
document_info = response.json()
```

### Upload

```py
```python
response = await client.post(
    document_info["url"],
    data=document_info["data"],
@@ -129,7 +136,7 @@ Model) you want to use for the chat.

View options available as per your configuration:

```py
```python
response = await client.get("/components")
components = response.json()

@@ -148,7 +155,7 @@ components = response.json()
Select your preferred options. As per the demo configuration, the following snippet
selects the `RagnaDemoSourceStorage` and `RagnaDemoAssistant`.

```py
```python
SOURCE_STORAGE = components["source_storages"][0]["title"]
ASSISTANT = components["assistants"][0]["title"]
```
@@ -159,7 +166,7 @@ With selection and setup complete, you can start a Ragna chat.

### Create a new chat

```py
```python
response = await client.post(
    "/chats",
    json={
@@ -175,7 +182,7 @@ chat = response.json()

### Prepare the chat

```py
```python
CHAT_ID = chat["id"]

response = await client.post(f"/chats/{CHAT_ID}/prepare")
@@ -184,7 +191,7 @@ chat = response.json()

### Share prompts and get answers

```py
```python
response = await client.post(
    f"/chats/{CHAT_ID}/answer", params={"prompt": "What is Ragna?"}
)
@@ -197,7 +204,7 @@ print(answer["message"])

### List available chats

```py
```python
response = await client.get("/chats")
chats = response.json()

@@ -206,6 +213,6 @@ print(chats)

### Delete chats

```py
```python
await client.delete(f"/chats/{CHAT_ID}")
```
2 changes: 2 additions & 0 deletions environment-dev.yml
@@ -12,6 +12,7 @@ dependencies:
- python-dotenv
- boto3
- pytest >=6
- pytest-mock
- mypy ==1.6.1
- pre-commit
- types-aiofiles
@@ -22,3 +23,4 @@ dependencies:
- mkdocstrings[python]
- mkdocs-gen-files
- material-plausible-plugin
- mktestdocs
21 changes: 21 additions & 0 deletions tests/test_docs.py
@@ -0,0 +1,21 @@
from pathlib import Path

import pytest
from mktestdocs import check_md_file

HERE = Path(__file__).parent
DOCS_DIR = (HERE / ".." / "docs").resolve()


@pytest.mark.parametrize("path", DOCS_DIR.glob("**/*.md"), ids=str)
def test_files_good(mocker, path):
    mocker.patch("builtins.open", mocker.mock_open())

    # FIXME: The REST API tutorial uses await outside sync functions.
    # This would work in a Jupyter Notebook, but not in a Python script.
    # We'll also need to have the API running to test this properly.
    if path.relative_to(DOCS_DIR).as_posix() == "tutorials/rest-api.md":
        with pytest.raises(SyntaxError, match="'await' outside function"):
            check_md_file(path, memory=True)
    else:
        check_md_file(path, memory=True)