
Commit
Update readme with Medium article and simplify quickstarts by highlighting binaries
1runeberg committed Aug 23, 2024
1 parent 99ba84b commit c9bb4e8
Showing 3 changed files with 21 additions and 50 deletions.
7 changes: 5 additions & 2 deletions README.md
@@ -30,7 +30,7 @@ We provide [pre-built binaries/executables]() for various platforms, making it e

**Note for macOS and iOS users**: *Binaries are not provided due to platform restrictions. Please see the [Compiling on your own](docs/compiling.md) section.*

-**Note for Windows users**: *You may encounter a SmartScreen warning since the binaries aren't signed. Rest assured they are safely built via GitHub CI when downloaded directly from the [Releases section](https://github.com/1runeberg/confichat/releases). You can also view the [full build logs](https://github.com/1runeberg/confichat/actions/workflows/publish_release.yml). And of course you can [build from source](docs/compiling.md).*
+**Note for Windows users**: *You may encounter a SmartScreen warning since the binaries aren't signed. They are safely built via GitHub CI when downloaded directly from the [Releases section](https://github.com/1runeberg/confichat/releases). You can also view the [full build logs](https://github.com/1runeberg/confichat/actions/workflows/publish_release.yml). And of course you can [build from source](docs/compiling.md).*

❤️ If you find this app useful, consider sponsoring us in [GitHub Sponsors](https://github.com/sponsors/1runeberg) to help us secure necessary certificates and accounts for future binary distributions.

@@ -40,7 +40,10 @@ We provide [pre-built binaries/executables]() for various platforms, making it e

### 📖 2. Quick Start Guides

-Get started quickly with **ConfiChat** by following one of our [quick start guides](docs/quickstart.md) depending on whether you want to use local models, online models, or both.
+If you're completely new to offline LLMs, check out this easy [Three-Step guide to get started (including ConfiChat)](https://runeberg.medium.com/getting-started-with-local-ai-llms-in-three-easy-steps-bddebcf26570) - a no-coding, no-dependencies approach.
+
+You can also get started quickly with **ConfiChat** by following one of our [quick start guides](docs/quickstart.md) depending on whether you want to use local models, online models, or both.


<br/>

8 changes: 1 addition & 7 deletions confichat/.idea/workspace.xml

Some generated files are not rendered by default.

56 changes: 15 additions & 41 deletions docs/quickstart.md
@@ -62,31 +62,15 @@ ollama pull llama3.1

This command will download the Llama 3.1 model to your local machine.
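For context, Ollama also exposes the same functionality over its local REST API (by default at http://localhost:11434), which is how GUI clients can trigger model downloads. A minimal Python sketch of how such a pull request could be constructed — the endpoint path and field name are assumptions taken from Ollama's public API documentation, and nothing is actually sent here:

```python
import json

# Assumption: a stock local Ollama install listening on its default port.
OLLAMA_URL = "http://localhost:11434"

def build_pull_request(model: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for Ollama's /api/pull endpoint."""
    url = f"{OLLAMA_URL}/api/pull"
    body = json.dumps({"model": model}).encode("utf-8")
    return url, body

url, body = build_pull_request("llama3.1")
print(url, body.decode())
```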

-### 3. Set Up ConfiChat
+### 3. Run ConfiChat

-Next, download and set up the ConfiChat interface:
-
-- Clone the ConfiChat repository:
-```bash
-git clone https://github.com/your-repository/ConfiChat.git
-cd ConfiChat
-```
-
-- Install dependencies:
-```bash
-flutter pub get
-```
-
-- Run the application:
-```bash
-flutter run
-```
+Next, [download](https://github.com/1runeberg/confichat) and run ConfiChat.

Now, you're ready to start using ConfiChat with your local Llama 3.1 model!
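Under the hood, a chat client like ConfiChat talks to Ollama's /api/chat endpoint. As a rough illustration of what a single-turn message looks like on the wire (the payload shape is assumed from Ollama's API documentation; no request is sent here):

```python
import json

def build_chat_payload(model: str, user_message: str) -> dict:
    """Assemble a single-turn chat payload in the shape Ollama's /api/chat expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # request one complete JSON reply instead of a token stream
    }

payload = build_chat_payload("llama3.1", "Hello, Llama!")
print(json.dumps(payload, indent=2))
```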

### Additional Resources

-For more detailed instructions and troubleshooting, please visit the [Ollama documentation](https://ollama.com/docs) and the [ConfiChat repository](https://github.com/your-repository/ConfiChat).
+For more detailed instructions and troubleshooting, please visit the [Ollama documentation](https://ollama.com/docs)

---

@@ -104,25 +88,11 @@ To use OpenAI with ConfiChat, you first need to obtain an API key:

Keep your API key secure and do not share it publicly.
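A common way to keep the key out of source files and shell history is to store it in an environment variable and read it at runtime. A small sketch of that pattern — the variable name `OPENAI_API_KEY` is a widespread convention, not something ConfiChat itself mandates:

```python
import os

def load_api_key(var_name: str = "OPENAI_API_KEY") -> str:
    """Read the API key from the environment instead of hardcoding it."""
    key = os.environ.get(var_name, "").strip()
    if not key:
        raise RuntimeError(f"Environment variable {var_name} is not set")
    return key

# Demo only: a fake placeholder value so the example runs end to end.
os.environ["OPENAI_API_KEY"] = "sk-example-placeholder"
print(load_api_key()[:10] + "...")  # never print the full key
```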

-### 2. Set Up ConfiChat
+### 2. Run ConfiChat

-Next, download and set up the ConfiChat interface:
+Next, [download](https://github.com/1runeberg/confichat) and run ConfiChat.

-- Clone the ConfiChat repository:
-```bash
-git clone https://github.com/your-repository/ConfiChat.git
-cd ConfiChat
-```
-
-- Install dependencies:
-```bash
-flutter pub get
-```
-
-- Run the application:
-```bash
-flutter run
-```
+
+Note: There may be a warning during first run as the binaries are unsigned.

### 3. Configure ConfiChat with Your API Key

@@ -136,7 +106,7 @@ ConfiChat is now configured to use OpenAI for its language model capabilities!

### Additional Resources

-For more detailed instructions and troubleshooting, please visit the [OpenAI documentation](https://platform.openai.com/docs) and the [ConfiChat repository](https://github.com/your-repository/ConfiChat).
+For more detailed instructions and troubleshooting, please visit the [OpenAI documentation](https://platform.openai.com/docs).

---

@@ -152,9 +122,11 @@ Follow the instructions in the [Install Ollama](#1-install-ollama) section above

Follow the instructions in the [Download a Model](#2-download-a-model) section above to download the Llama 3.1 model.

-### 3. Set Up ConfiChat
+### 3. Run ConfiChat

-Follow the instructions in the [Set Up ConfiChat](#3-set-up-confichat) section above.
+[Download](https://github.com/1runeberg/confichat) and run ConfiChat.
+
+Note: There may be a warning during first run as the binaries are unsigned.

### 4. Get Your OpenAI API Key

@@ -198,9 +170,11 @@ llama-server -m /path/to/your/model --port 8080

This command will start the LlamaCpp server, which ConfiChat can connect to for processing language model queries.
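llama-server also speaks an OpenAI-compatible HTTP API on the port chosen above. A hedged sketch of the request a client could build against it — the endpoint path is an assumption based on llama.cpp's server documentation, and the request is only constructed here, not sent:

```python
import json

def build_completion_request(host: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and payload for llama-server's OpenAI-style chat endpoint."""
    url = f"http://{host}/v1/chat/completions"
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,  # illustrative sampling setting, not a required value
    }
    return url, payload

url, payload = build_completion_request("localhost:8080", "Hello!")
print(url, json.dumps(payload))
```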

-### 3. Set Up ConfiChat
+### 3. Run ConfiChat

-Follow the instructions in the [Set Up ConfiChat](#3-set-up-confichat) section above.
+[Download](https://github.com/1runeberg/confichat) and run ConfiChat.
+
+Note: There may be a warning during first run as the binaries are unsigned.

### Additional Resources

