New provider: Amazon Bedrock (AWS) #1174
Comments
It might be supported later; if possible, you can also provide a PR.
Could you tell me when MetaGPT will add Bedrock, and how I can add a provider myself?
In about a week; you can refer to any LLM provider under
Thank you very much; I look forward to your updates. I have just started working with MetaGPT and am still getting familiar with some of the concepts, so I will read the code carefully. I also found another problem: examples/rag_search.py cannot run.
I resolved it.
Yes, RAG is an extra module, so you need to install it. You can refer to
Thank you! I am still waiting for Bedrock support. Is Bedrock available now?
@chosenlu @dgallitelli
Nice, thank you! I'll give it a try and provide some feedback. Quick question: are the Access key and Secret access key strict requirements? Or can they be inferred from environment variables and therefore omitted in the LLM config?
@dgallitelli No, for now you should configure them in the LLM config.
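For reference, an LLM config along these lines should select the Bedrock provider. The field names below are a sketch based on this thread, not an authoritative spec; check the `config2.example.yaml` shipped with your installed MetaGPT version for the exact keys, and note the model ID is just an illustrative Bedrock model:

```yaml
llm:
  api_type: "bedrock"
  model: "anthropic.claude-3-sonnet-20240229-v1:0"  # any Bedrock model ID you have access to
  region_name: "us-east-1"
  access_key: "YOUR_AWS_ACCESS_KEY_ID"      # per the maintainer's reply, required for now
  secret_key: "YOUR_AWS_SECRET_ACCESS_KEY"  # not yet inferred from environment variables
```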
This is the sample code's error message:

`Input should be 'openai', 'anthropic', 'claude', 'spark', 'zhipuai', 'fireworks', 'open_llm', 'gemini', 'metagpt', 'azure', 'ollama', 'qianfan', 'dashscope', 'moonshot', 'mistral' or 'yi' [type=enum, input_value='bedrock', input_type=str]`

Why am I not seeing 'bedrock' support by now?
What does this mean?
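The error above is an enum-validation failure: the installed MetaGPT release validates `api_type` against a provider enum, and that enum does not yet contain `'bedrock'` (note it is absent from the allowed values listed in the message), so upgrading to a release that ships Bedrock support resolves it. A minimal stdlib sketch of the same kind of check, with illustrative names rather than MetaGPT's actual code:

```python
from enum import Enum

# Illustrative subset of a provider enum; the real one lists many more
# providers, and 'bedrock' only exists in versions that ship Bedrock support.
class LLMType(str, Enum):
    OPENAI = "openai"
    ANTHROPIC = "anthropic"
    BEDROCK = "bedrock"

def validate_api_type(value: str) -> LLMType:
    # Enum lookup by value: an unknown provider string raises ValueError,
    # which pydantic surfaces as a [type=enum] validation error.
    return LLMType(value)

validate_api_type("bedrock")  # accepted, because the member exists here
# On an older enum without BEDROCK, the same call would raise:
# ValueError: 'bedrock' is not a valid LLMType
```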
Feature description
Please include support for Amazon Bedrock models. These models can be from Amazon, Anthropic, AI21, Cohere, Mistral, or Meta Llama 2.