Natural language interface for your command line.
A lightweight, portable autopilot utility for CLI tasks. It takes natural language as input and uses the LLM of your choice to take the appropriate actions by generating and parsing shell scripts. Find answers to questions, let the AI execute commands with your permission in Safe Mode, or enable Autopilot to automate tasks and script modules and microservices on the fly.
Warning: Giving LLMs shell-level access to your computer is dangerous and should only be done in sandboxed or otherwise expendable environments.
I made cli-FSD for experimenting and problem solving in low-stakes development environments. If you don't have access to a machine like that, you can try it below:
Pre-requisites:
- Python 3.10 or later (may work with earlier versions)
- pip 24.0 or later
- An OpenAI API key, an Anthropic API key, or Ollama running in the same environment as cli-FSD
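A quick way to confirm the first two prerequisites from a shell (the `python3` command name is an assumption; on Windows it may be `python` or `py`):

```shell
# Verify Python is 3.10+ (the assert fails with a nonzero exit otherwise)
python3 -c 'import sys; assert sys.version_info >= (3, 10), sys.version'
# Report the pip version bundled with that interpreter
python3 -m pip --version
```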
Upgrade pip:
python3 -m pip install --upgrade pip
One-line install using pip:
pip install cli-FSD
(If you are testing the package, set up and activate a virtual environment before running pip install.)
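A sketch of that virtual-environment setup (the directory name `venv` is arbitrary, not required by cli-FSD):

```shell
# Create and activate an isolated environment before installing
python3 -m venv venv
. venv/bin/activate              # on Windows: .\venv\Scripts\activate
python -m pip --version          # now points at the venv's own pip
# ...then run the one-line install inside it:  pip install cli-FSD
```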
Manual Installation
- Clone the repo:
git clone https://github.com/WazaCraft/cli-FSD
cd cli-FSD
- Set up a Python virtual environment:
python -m venv FSD
- Activate the virtual environment:
- On Windows:
.\FSD\Scripts\activate
- On Unix or macOS:
source FSD/bin/activate
- Install the cli-FSD Python package:
pip install .
- To start in Safe Mode in your terminal:
@ what time is it -s
- To run in companion mode and process a specific task using Autopilot, type '@' from anywhere in your terminal followed by a command:
@ what time is it
- For additional options, enter CMD mode by typing CMD at any prompt.
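Before the `@` commands above will work, cli-FSD needs a provider credential. A sketch using the providers' conventional environment-variable names — whether cli-FSD reads exactly these names is an assumption, so check the project source if a key isn't picked up:

```shell
# Conventional provider variable names (assumed, not confirmed by this README)
export OPENAI_API_KEY="sk-..."           # OpenAI backend (default)
export ANTHROPIC_API_KEY="sk-ant-..."    # Anthropic backend (-c flag)
# Ollama (-o flag) needs no key, only a local Ollama server running
```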
Letting an LLM execute code on your computer is objectively dangerous. I've used cli-FSD on every computer I own, but I think it's important for users to understand the risk associated with this concept.
If you don't want to run it locally:
v0.94
- Added support for Ollama (use -o to run cli-FSD using any supported local LLM model)
- support for a custom gpt-4-turbo script assistant via OpenAI's Assistants API
v0.87
- finish OpenAI Assistants integration (done, but needs to be made accessible in main.py)
- integrate other LLM providers with HTTP request APIs similar to /v1/completions (pass -c with your query to use the Anthropic API's Claude 3 Opus)
- improved error handling
v0.75
- overhauled and refactored the error-resolution function to address a bug that sometimes prevented the resolution from executing
- fixed niche text-handling errors and submodule implementation for the upcoming OpenAI Assistants API integration
v0.52
- implement LLM error handling and resolution flows
- refactor flags for Safe Mode, Autopilot
- refactor and expand the CMD module
- build advanced menu and config options
- passive error detection and resolution for CLI interactions
- voice control
- voice notation
- automation schedules and background states
Contributions to this project are welcome. Please fork the repository, make your changes, and submit a pull request for review.
Contributions to the main branch should aim to adhere to the project principles:
- portable (I'm avoiding unnecessary dependencies whenever possible)
- utility focused
This project is licensed under the GNU GPL - see the LICENSE.md file for details.
- OpenAI for providing the API used for generating chat responses.
- Flask and Flask-CORS for the web server functionality.