Is there any way to use local LLaMA & Alpaca models instead of OpenAI? #914
Replies: 2 comments
-
I would like to combine things, but I think getting AutoGPT right first will enable all of us to code and adapt things the way each individual user likes. The models at https://vicuna.lmsys.org would definitely be interesting for a "reduced-cost or free mode" for AutoGPT! But fixing the constantly changing errors would already be a big step toward letting you go in that direction on your own or in a small group. Check out the prompt examples in the discussion forums!
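Building on the Vicuna suggestion above: one common workaround is to host the model behind a server that exposes an OpenAI-compatible endpoint (projects such as lmsys's FastChat can do this) and point the OpenAI client at it. Below is a minimal sketch assuming such a server is already running locally; the URL, API key placeholder, and model name are assumptions for illustration, not anything built into AutoGPT today.

```python
import openai

# Assumption: a locally hosted, OpenAI-compatible server (e.g. one serving a
# Vicuna/LLaMA model) is listening at this address. Adjust to your setup.
openai.api_base = "http://localhost:8000/v1"
openai.api_key = "not-needed-for-local"  # placeholder; local servers often ignore it

response = openai.ChatCompletion.create(
    model="vicuna-13b",  # hypothetical local model name
    messages=[{"role": "user", "content": "Summarize what AutoGPT does."}],
)
print(response["choices"][0]["message"]["content"])
```

Because only the base URL changes, the rest of the calling code stays the same whether it talks to OpenAI or to the local server.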
-
I know they are actively working on a plugin model (especially @BillSchumacher) that can extend AutoGPT with new commands and also give it the ability to connect to other local LLMs. I think once we have the plugin model, the community can explode with extensions without bloating the AutoGPT codebase.
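Purely as an illustration (the real plugin interface was not finalized at the time of this thread), a provider abstraction like the hypothetical sketch below is one way such a plugin model could route completions to either OpenAI or a local LLM; every class, method, and path name here is invented for the example.

```python
from abc import ABC, abstractmethod

from transformers import pipeline


class CompletionProvider(ABC):
    """Hypothetical interface a completion plugin might implement."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class LocalLLMProvider(CompletionProvider):
    """Illustrative provider backed by a local text-generation pipeline."""

    def __init__(self, model_path: str = "./models/alpaca-7b"):  # placeholder path
        self.generator = pipeline("text-generation", model=model_path)

    def complete(self, prompt: str) -> str:
        return self.generator(prompt, max_new_tokens=256)[0]["generated_text"]


def run_command(provider: CompletionProvider, prompt: str) -> str:
    """Core code depends only on the interface, so providers stay swappable."""
    return provider.complete(prompt)
```

The point of the sketch is the seam: if core AutoGPT code only ever sees `CompletionProvider`, plugins can add new backends without touching the main codebase.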
-
Since we can already get LoRA-quantized models from Hugging Face, and I am pretty sure a large number of open-source models will be coming up, it would be very good if we could use local models.
The same goes for Pinecone and the voice AI.
Any thoughts?
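For reference, here is a minimal sketch of loading a quantized open-source model locally with the transformers library. The repository name is only an example, and 8-bit loading additionally requires the bitsandbytes and accelerate packages.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example repository only; substitute whichever local or Hugging Face model you use.
model_id = "huggyllama/llama-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_8bit=True,   # 8-bit quantization via bitsandbytes, to fit consumer GPUs
    device_map="auto",   # place layers across available devices automatically
)

prompt = "List three tasks an autonomous agent could do offline."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```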