
# Usage

## Set your API key

When running readmeai with a third-party service, you must provide a valid API key. For example, the OpenAI API key is set as follows:

```sh
export OPENAI_API_KEY=<your_api_key>
```

For Windows users:

```bat
set OPENAI_API_KEY=<your_api_key>
```
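readmeai reads the key from the environment at runtime, so a quick sanity check before invoking it can save a failed run. A minimal sketch (the placeholder key value is hypothetical):

```shell
# Hypothetical placeholder value -- substitute your real key.
export OPENAI_API_KEY="sk-placeholder-123"

# Confirm the variable is visible to child processes before running readmeai.
if [ -n "$OPENAI_API_KEY" ]; then
  echo "OPENAI_API_KEY is set"
else
  echo "OPENAI_API_KEY is missing" >&2
fi
```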
The environment variables for Ollama, Anthropic, and Google Gemini are set as follows:
### Ollama

Refer to the Ollama documentation for more information on setting up the Ollama server.

To start, follow these steps:

1. Pull your model of choice from the Ollama repository:

   ```sh
   ❯ ollama pull llama3.2:latest
   ```

2. Start the Ollama server and set the `OLLAMA_HOST` environment variable:

   ```sh
   export OLLAMA_HOST=127.0.0.1 && ollama serve
   ```
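Once the server is running, readmeai reaches Ollama over its REST API, which listens on port 11434 by default. A small sketch of the endpoint the steps above configure:

```shell
export OLLAMA_HOST=127.0.0.1

# Ollama's REST API listens on port 11434 by default; this is the base URL
# readmeai will talk to once `ollama serve` is running.
echo "Ollama endpoint: http://${OLLAMA_HOST}:11434"
```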
### Anthropic

1. Export your Anthropic API key:

   ```sh
   export ANTHROPIC_API_KEY=<your_api_key>
   ```
### Google Gemini

1. Export your Google Gemini API key:

   ```sh
   export GOOGLE_API_KEY=<your_api_key>
   ```

## Using the CLI

### Running with an LLM API service

Below is the minimal command required to run readmeai using the OpenAI client:

```sh
❯ readmeai --api openai -o readmeai-openai.md -r https://github.com/eli64s/readme-ai
```

> [!IMPORTANT]
> The default model is `gpt-3.5-turbo`, offering the best balance between cost and performance. When using any model from the `gpt-4` series and up, please monitor your costs and usage to avoid unexpected charges.

ReadmeAI can easily switch between API providers and models. We can run the same command as above with the Anthropic client:

```sh
❯ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -o readmeai-anthropic.md -r https://github.com/eli64s/readme-ai
```

And finally, with the Google Gemini client:

```sh
❯ readmeai --api gemini -m gemini-1.5-flash -o readmeai-gemini.md -r https://github.com/eli64s/readme-ai
```
### Running with local models

We can also run readmeai with free and open-source locally hosted models using Ollama:

```sh
❯ readmeai --api ollama --model llama3.2 -r https://github.com/eli64s/readme-ai
```
### Running on a local codebase

To generate a README file from a local codebase, simply provide the full path to the project:

```sh
❯ readmeai --repository /users/username/projects/myproject --api openai
```

Adding more customization options:

```sh
❯ readmeai --repository https://github.com/eli64s/readme-ai \
           --output readmeai.md \
           --api openai \
           --model gpt-4 \
           --badge-color A931EC \
           --badge-style flat-square \
           --header-style compact \
           --navigation-style fold \
           --temperature 0.9 \
           --tree-depth 2 \
           --logo LLM \
           --emojis solar
```
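A long invocation like the one above is easy to mistype. One option is to keep your preferred flags in a small wrapper script; the script name and default values here are hypothetical:

```shell
#!/bin/sh
# generate_readme.sh -- hypothetical wrapper storing a preferred readmeai setup.
# Usage: ./generate_readme.sh [repository_url_or_path]
REPO="${1:-https://github.com/eli64s/readme-ai}"

# Build the command first so it can be inspected before running.
CMD="readmeai --repository $REPO --api openai --output readmeai.md"
echo "$CMD"
# Uncomment to execute:
# $CMD
```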
### Running in offline mode

ReadmeAI supports offline mode, allowing you to generate README files without using an LLM API service.

```sh
❯ readmeai --api offline -o readmeai-offline.md -r https://github.com/eli64s/readme-ai
```

## Docker

Run the readmeai CLI in a Docker container:

```sh
❯ docker run -it --rm \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v "$(pwd)":/app zeroxeli/readme-ai:latest \
    --repository https://github.com/eli64s/readme-ai \
    --api openai
```
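Passing the key with `-e` exposes it in the `docker run` arguments. An alternative sketch using Docker's `--env-file` flag; the file name and placeholder key are hypothetical:

```shell
# Hypothetical placeholder key -- substitute your real value.
OPENAI_API_KEY="sk-placeholder-123"

# Write the key to an env-file instead of passing it on the command line.
printf 'OPENAI_API_KEY=%s\n' "$OPENAI_API_KEY" > readmeai.env
cat readmeai.env

# Then hand the file to the container:
#   docker run -it --rm --env-file readmeai.env \
#       -v "$(pwd)":/app zeroxeli/readme-ai:latest \
#       --repository https://github.com/eli64s/readme-ai --api openai
```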

## Streamlit

Try readme-ai directly in your browser on Streamlit Cloud, no installation required.

See the readme-ai-streamlit repository on GitHub for more details about the application.

> [!WARNING]
> The readme-ai Streamlit web app may not always be up-to-date with the latest features. Please use the command-line interface (CLI) for the most recent functionality.

## From source


### Bash

If you installed the project from source with the bash script, run the following commands:

1. Activate the virtual environment:

   ```sh
   ❯ conda activate readmeai
   ```

2. Run the CLI:

   ```sh
   ❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
   ```

### Poetry

1. Activate the virtual environment:

   ```sh
   ❯ poetry shell
   ```

2. Run the CLI:

   ```sh
   ❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
   ```
