ReadmeAI requires Python 3.9 or higher, plus one installation method of your choice:
Requirement | Details |
---|---|
• Python ≥3.9 | Core runtime |
**Installation Method** (choose one) | |
• pip | Default Python package manager |
• pipx | Isolated environment installer |
• uv | High-performance package manager |
• docker | Containerized environment |
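To confirm your interpreter meets the requirement before installing, check the version:
❯ python3 --version   # should report Python 3.9 or newer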
ReadmeAI needs access to your repository to generate a README file. Currently supported platforms include:
Platform | Details |
---|---|
File System | Local repository access |
GitHub | Industry-standard hosting |
GitLab | Full DevOps integration |
Bitbucket | Atlassian ecosystem |
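For illustration, each platform maps to a repository argument of the corresponding form (the URLs below are placeholders, not real projects):
❯ readmeai --repository https://github.com/user/project      # GitHub
❯ readmeai --repository https://gitlab.com/user/project      # GitLab
❯ readmeai --repository https://bitbucket.org/user/project   # Bitbucket
❯ readmeai --repository /path/to/local/project               # File System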
ReadmeAI is model agnostic, with support for the following LLM API services:
Provider | Best For | Details |
---|---|---|
OpenAI | General use | Industry-leading models |
Anthropic | Advanced tasks | Claude language models |
Google Gemini | Multimodal AI | Latest Google technology |
Ollama | Open source | No API key needed |
Offline Mode | Local operation | No internet required |
ReadmeAI is available on PyPI as readmeai and can be installed as follows:
Install with pip (recommended for most users):
❯ pip install -U readmeai
With pipx, readmeai will be installed in an isolated environment:
❯ pipx install readmeai
The fastest way to install readmeai is with uv:
❯ uv tool install readmeai
To run readmeai in a containerized environment, pull the latest image from [Docker Hub][dockerhub-link]:
❯ docker pull zeroxeli/readme-ai:latest
Click to build readmeai from source

1. Clone the repository:

   ❯ git clone https://github.com/eli64s/readme-ai

2. Navigate to the project directory:

   ❯ cd readme-ai

3. Install dependencies:

   ❯ pip install -r setup/requirements.txt
Alternatively, run the [setup script][setup-script] to install dependencies:

❯ bash setup/setup.sh
Or, use poetry to build and install project dependencies:

❯ poetry install
Important
To use the Anthropic and Google Gemini clients, extra dependencies are required. Install the package with the following extras:
- Anthropic:

  ❯ pip install "readmeai[anthropic]"

- Google Gemini:

  ❯ pip install "readmeai[google-generativeai]"

- Multiple clients:

  ❯ pip install "readmeai[anthropic,google-generativeai]"
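As a quick sanity check that the extras installed correctly, try importing the client libraries they pull in (assuming the standard anthropic and google.generativeai package names):
❯ python -c "import anthropic, google.generativeai"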
When running readmeai with a third-party service, you must provide a valid API key. For example, the OpenAI client is set as follows:
❯ export OPENAI_API_KEY=<your_api_key>
# For Windows users:
❯ set OPENAI_API_KEY=<your_api_key>
Click to view environment variables for Ollama, Anthropic, and Google Gemini
Ollama
Refer to the Ollama documentation for more information on setting up the Ollama server.
To start, follow these steps:

1. Pull your model of choice from the Ollama repository:

   ❯ ollama pull llama3.2:latest

2. Start the Ollama server, setting the OLLAMA_HOST environment variable:

   ❯ export OLLAMA_HOST=127.0.0.1 && ollama serve
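Before running readmeai, you can confirm the server is reachable by querying Ollama's local API (this assumes Ollama's default port, 11434):
❯ curl http://127.0.0.1:11434/api/tags   # lists the models available locally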
Anthropic
- Export your Anthropic API key:

  ❯ export ANTHROPIC_API_KEY=<your_api_key>
Google Gemini
- Export your Google Gemini API key:

  ❯ export GOOGLE_API_KEY=<your_api_key>
Below is the minimal command required to run readmeai using the OpenAI client:
❯ readmeai --api openai -o readmeai-openai.md -r https://github.com/eli64s/readme-ai
Important
The default model is gpt-3.5-turbo, offering the best balance between cost and performance. When using any model from the gpt-4 series and up, please monitor your costs and usage to avoid unexpected charges.
ReadmeAI can easily switch between API providers and models. We can run the same command as above with the Anthropic client:
❯ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -o readmeai-anthropic.md -r https://github.com/eli64s/readme-ai
And finally, with the Google Gemini client:
❯ readmeai --api gemini -m gemini-1.5-flash -o readmeai-gemini.md -r https://github.com/eli64s/readme-ai
We can also run readmeai with free, open-source, locally hosted models using Ollama:
❯ readmeai --api ollama --model llama3.2 -r https://github.com/eli64s/readme-ai
To generate a README file from a local codebase, simply provide the full path to the project:
❯ readmeai --repository /users/username/projects/myproject --api openai
Adding more customization options:
❯ readmeai --repository https://github.com/eli64s/readme-ai \
    --output readmeai.md \
    --api openai \
    --model gpt-4 \
    --badge-color A931EC \
    --badge-style flat-square \
    --header-style compact \
    --navigation-style fold \
    --temperature 0.9 \
    --tree-depth 2 \
    --logo LLM \
    --emojis solar
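The full set of supported flags and their defaults can be printed from the CLI itself:
❯ readmeai --help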
ReadmeAI supports offline mode, allowing you to generate README files without using an LLM API service.
❯ readmeai --api offline -o readmeai-offline.md -r https://github.com/eli64s/readme-ai
Run the readmeai CLI in a Docker container:
❯ docker run -it --rm \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
--repository https://github.com/eli64s/readme-ai \
--api openai
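To generate a README for a local project from inside the container, a volume mount works as well. This is a sketch: /project is an arbitrary mount point, and the host path is a placeholder to replace with your own:
❯ docker run -it --rm \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v /path/to/local/project:/project \
    zeroxeli/readme-ai:latest \
    --repository /project \
    --api openai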
Try readme-ai directly in your browser on Streamlit Cloud, no installation required.
See the readme-ai-streamlit repository on GitHub for more details about the application.
Warning
The readme-ai Streamlit web app may not always be up-to-date with the latest features. Please use the command-line interface (CLI) for the most recent functionality.
Click to run readmeai from source
If you installed the project from source with the bash script, run the following commands:

1. Activate the virtual environment:

   ❯ conda activate readmeai

2. Run the CLI:

   ❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai

Or, if you installed with poetry:

1. Activate the virtual environment:

   ❯ poetry shell

2. Run the CLI:

   ❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
The pytest and nox frameworks are used for development and testing.
Install the dependencies with uv:
❯ uv pip install -r pyproject.toml --all-extras
Run the unit test suite using Pytest:
❯ make test
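If you prefer to bypass the Makefile, the suite can also be invoked with pytest directly (assuming the conventional tests/ directory layout):
❯ pytest tests/ -v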
Using nox, test the app against Python versions 3.9, 3.10, 3.11, and 3.12:
❯ make test-nox
Tip
Nox is an automation tool for testing applications in multiple environments. This helps ensure your project is compatible across Python versions and environments.
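When iterating on a single interpreter rather than the full matrix, nox's standard --python flag narrows the run (assuming the project's noxfile parametrizes sessions by Python version):
❯ nox --python 3.12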