[Documentation] Topic Category for AutoGen + Non-OpenAI Models #1994
Comments
@lestan @marklysze would you like to take a stab at this?
Yep, I'm happy to give it a go. A category under Topics, with documentation for each proxy stack under that, sounds good. I feel people think more about how they're going to run it than about the specific model (and models may be mentioned throughout the different pages). If we go by proxy stack, do you see something like these under the new topic:
In terms of full examples (the Notebooks section), it would be good to add some full notebooks/examples for non-OpenAI models and the proxy stacks, which I can do in parallel with the documentation; I'm not sure whether a sub-section for non-OpenAI models would be useful. I can start tackling together.ai, LiteLLM+Ollama, LM Studio, vLLM. @lestan, if you are interested then you could do Ollama, as I noticed you already have code for that with #767; I can tackle it if you're not able to at this stage. I'm sure I'll have some questions, but I'll think them through and wait for your reply.
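To make the "one page per proxy stack" idea concrete, here is a minimal sketch (not taken from the thread) of how only the `config_list` entry would change between stacks; the ports are common defaults and the model names are placeholders, so treat both as assumptions:

```python
# Hypothetical sketch: the same AutoGen agent code, pointed at different
# OpenAI-compatible endpoints. Ports and model names below are assumed
# typical defaults, not values taken from this issue thread.
import autogen

# LiteLLM proxy in front of Ollama (LiteLLM commonly listens on :4000)
litellm_ollama = [{"model": "ollama/llama2",
                   "base_url": "http://localhost:4000",
                   "api_key": "NotRequired"}]

# LM Studio local server (default port 1234, OpenAI-compatible /v1 path)
lm_studio = [{"model": "local-model",
              "base_url": "http://localhost:1234/v1",
              "api_key": "lm-studio"}]

# vLLM's OpenAI-compatible server (default port 8000)
vllm = [{"model": "mistralai/Mistral-7B-Instruct-v0.2",
         "base_url": "http://localhost:8000/v1",
         "api_key": "NotRequired"}]

# The agent code itself stays the same regardless of the stack chosen.
assistant = autogen.AssistantAgent("assistant",
                                   llm_config={"config_list": lm_studio})
```

Each per-stack page would then mostly explain how to start the corresponding server and which base_url/model values to plug in.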
Would be glad to! Let me review what's already there and also look at Mark's suggestions. @marklysze, will be in touch.
Per Ollama, feel free to use this OllamaModelClient I made, or suggest the user use the OpenAI-compatible endpoint.
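As a hedged illustration of the OpenAI-compatible route (this is not the OllamaModelClient referenced above; the port, model name, and dummy API key are assumptions based on Ollama's usual defaults), a minimal example might look like:

```python
# Hedged sketch: drive Ollama through its OpenAI-compatible endpoint,
# served at /v1 by default. The model name is a placeholder -- use
# whatever `ollama list` shows locally.
import autogen

config_list = [{
    "model": "llama2",                        # any locally pulled model
    "base_url": "http://localhost:11434/v1",  # Ollama's default port + /v1
    "api_key": "ollama",                      # dummy value; Ollama ignores it
}]

assistant = autogen.AssistantAgent(
    "assistant", llm_config={"config_list": config_list}
)
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=0,   # stop after the assistant's first reply
    code_execution_config=False,
)
user_proxy.initiate_chat(assistant, message="Say hello in one sentence.")
```

Setting `max_consecutive_auto_reply=0` keeps the sketch to a single request/reply round; only `base_url` and `model` would change for another OpenAI-compatible server.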
That's awesome work, thanks @descention! I can't wait to try it out as well...
Yes, this is good. I can start with an LM Studio example for now, and lay out the parent folder structure.
@ekzhu, can you look at the repo + branch I created? I put in a folder structure:
Yes. Let's use that structure to get started.
Closing this issue thanks to the hard work of @marklysze.
Hey @lestan, just a note that the first round of non-OpenAI documentation has been merged. If you are still up for doing the Ollama example, please see the LiteLLM+Ollama, LM Studio and vLLM pages to see what has been done already. If you're short on time and need a hand to do it, I can help. Thanks!
Is your feature request related to a problem? Please describe.
Many LLMs are catching up with GPT-4. We want to have documentation on how to use them with AutoGen.
Describe the solution you'd like
A topic category (see the topics page) with a collection of pages on how to run an AutoGen conversation pattern or application scenario using non-OpenAI models. We can organize the pages with one page per model, or with one page per proxy stack.
Additional context
We aim to close the following issues by providing example solutions in the documentation.
You can find examples of notebooks in /website/docs/topics/. To build the documentation locally, see: https://microsoft.github.io/autogen/docs/Contribute#documentation