diff --git a/README.md b/README.md
index 0189b6d1f3..9f47940ed1 100644
--- a/README.md
+++ b/README.md
@@ -298,7 +298,7 @@ To run local models, it is possible to use OpenAI compatible APIs, for instance
-To run local inference, you need to download the models first, for instance you can find `ggml` compatible models in [huggingface.com](https://huggingface.co/models?search=ggml) (for example vicuna, alpaca and koala).
+To run local inference, you need to download the models first, for instance you can find `ggml` compatible models in [huggingface.co](https://huggingface.co/models?search=ggml) (for example vicuna, alpaca and koala).
 
 ### Start the API server