
New provider: Amazon Bedrock (AWS) #1174

Closed
dgallitelli opened this issue Apr 10, 2024 · 13 comments
@dgallitelli

Feature description
Please include support for Amazon Bedrock models. These models can be from Amazon, Anthropic, AI21, Cohere, Mistral, or Meta Llama 2.

Your Feature

  1. Create a new LLM provider under metagpt/provider for Amazon Bedrock
  2. Add it to the available LLMType values
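The two steps above could be sketched roughly as follows. This is a hypothetical outline, not MetaGPT's actual provider API: the function name `build_anthropic_body` is illustrative, but the request shape (the `anthropic_version`/`max_tokens`/`messages` fields) matches what Bedrock's InvokeModel API expects for Anthropic models.

```python
import json

# Bedrock's InvokeModel API expects a model-family-specific JSON body;
# for Anthropic models it follows the Messages API format below.
ANTHROPIC_VERSION = "bedrock-2023-05-31"

def build_anthropic_body(prompt: str, max_tokens: int = 1024) -> str:
    """Build the InvokeModel JSON body for an Anthropic model on Bedrock."""
    return json.dumps({
        "anthropic_version": ANTHROPIC_VERSION,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

# The actual invocation would go through boto3's "bedrock-runtime" client, e.g.:
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   resp = client.invoke_model(
#       modelId="anthropic.claude-3-sonnet-20240229-v1:0",
#       body=build_anthropic_body("Hello"),
#   )
```

A real provider would subclass MetaGPT's `BaseLLM` and implement its abstract methods around calls like this, with separate body builders per model family (Amazon Titan, Cohere, Mistral, and Meta Llama each use different request schemas).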
@dgallitelli dgallitelli changed the title New LLMType: Amazon Bedrock New provider: Amazon Bedrock (AWS) Apr 10, 2024
@seehi
Contributor

seehi commented Apr 11, 2024

It may be supported later; if possible, you could also open a PR.
FYI: @better629

@better629 better629 self-assigned this Apr 11, 2024
@chosenlu

Could you tell me when MetaGPT will add Bedrock, and how I can add a provider myself?

@better629
Collaborator

About next week. You can refer to any LLM provider under metagpt/provider and implement the methods from BaseLLM.

@chosenlu

Thank you very much; I look forward to your updates. I have just started working with MetaGPT and am still getting familiar with some concepts. I will read the code carefully.

I found another problem: examples/rag_search.py cannot run. It looks like a llama-index update caused it:
Traceback (most recent call last):
  File "/Users/chosenlu/Job/Python/meta_gpt_agent_newst/user/rag_search/rag_search.py", line 7, in <module>
    from metagpt.rag.engines import SimpleEngine
  File "/Users/chosenlu/Job/Python/meta_gpt_agent_newst/lib/python3.9/site-packages/metagpt/rag/engines/__init__.py", line 3, in <module>
    from metagpt.rag.engines.simple import SimpleEngine
  File "/Users/chosenlu/Job/Python/meta_gpt_agent_newst/lib/python3.9/site-packages/metagpt/rag/engines/simple.py", line 31, in <module>
    from metagpt.rag.factories import (
  File "/Users/chosenlu/Job/Python/meta_gpt_agent_newst/lib/python3.9/site-packages/metagpt/rag/factories/__init__.py", line 3, in <module>
    from metagpt.rag.factories.retriever import get_retriever
  File "/Users/chosenlu/Job/Python/meta_gpt_agent_newst/lib/python3.9/site-packages/metagpt/rag/factories/retriever.py", line 15, in <module>
    from metagpt.rag.retrievers.bm25_retriever import DynamicBM25Retriever
  File "/Users/chosenlu/Job/Python/meta_gpt_agent_newst/lib/python3.9/site-packages/metagpt/rag/retrievers/bm25_retriever.py", line 8, in <module>
    from llama_index.retrievers.bm25 import BM25Retriever
ModuleNotFoundError: No module named 'llama_index.retrievers'

@chosenlu

I resolved it; it looks like I needed to install:
pip install llama-index-retrievers-bm25
pip install llama-index-embeddings-azure-openai
pip install llama-index-llms-azure-openai
pip install llama-index-embeddings-gemini
pip install llama-index-embeddings-ollama

@better629
Collaborator

Yes, RAG is an extra module, so you need to install it. You can refer to https://docs.deepwisdom.ai/main/en/guide/get_started/installation.html#install-submodules if you run into problems.

@chosenlu

Thank you! I am still waiting for the Bedrock support. Is Bedrock ready now?

@better629
Collaborator

@chosenlu @dgallitelli
See PR #1231.

@dgallitelli
Author

Nice, thank you! I'll give it a try and provide some feedback. Quick question: are the access key and secret access key strict requirements? Or can they be inferred from environment variables and therefore omitted in the LLM config?

@better629
Collaborator

@dgallitelli No; for now, you should configure them in the LLM config.
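For context on the env-var question: boto3's default credential chain does read `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` from the environment, so a provider could in principle fall back to them. The helper below is a hypothetical sketch of such a fallback (`resolve_aws_credentials` is not a MetaGPT function; per the comment above, the keys currently must be set in the LLM config).

```python
import os

def resolve_aws_credentials(access_key=None, secret_key=None):
    """Return (access_key, secret_key), falling back to the standard
    AWS environment variables when they are absent from the config.

    Hypothetical helper: explicit config values always win; None is
    returned for a key that is set in neither place.
    """
    access_key = access_key or os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = secret_key or os.environ.get("AWS_SECRET_ACCESS_KEY")
    return access_key, secret_key
```

Passing no credentials at all to `boto3.client(...)` triggers the same chain internally (env vars, shared credentials file, instance role), which is why omitting them from a config could work in principle.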

@xdstone1on163

This is the sample code's error message:

Input should be 'openai', 'anthropic', 'claude', 'spark', 'zhipuai', 'fireworks', 'open_llm', 'gemini', 'metagpt', 'azure', 'ollama', 'qianfan', 'dashscope', 'moonshot', 'mistral' or 'yi' [type=enum, input_value='bedrock', input_type=str]

Why am I not seeing 'bedrock' support by now?
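That enum error usually means the installed MetaGPT version predates the Bedrock provider from PR #1231, so `bedrock` is not yet in the list of valid `api_type` values. After upgrading, a config along these lines should be accepted (the exact field names here are assumptions based on typical MetaGPT `config2.yaml` provider entries; check the repo's example config for the authoritative keys):

```yaml
llm:
  api_type: "bedrock"
  model: "anthropic.claude-3-sonnet-20240229-v1:0"
  region_name: "us-east-1"      # assumed field name
  access_key: "YOUR_ACCESS_KEY"
  secret_key: "YOUR_SECRET_KEY"
```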

@seehi
Contributor

seehi commented Sep 30, 2024

@qingerVT

qingerVT commented Jan 2, 2025

What does this mean?
"botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the InvokeModelWithResponseStream operation: Invocation of model ID anthropic.claude-3-opus-20240229-v1:0 with on-demand throughput isn't supported. Retry your request with the ID or ARN of an inference profile that contains this model"
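That error means Bedrock refuses to serve this model via a bare model ID with on-demand throughput: newer Anthropic models must be invoked through a cross-region inference profile, whose ID is typically the model ID prefixed with a region group (e.g. `us.` or `eu.`), or through the profile's full ARN. A minimal sketch of the fix, assuming the provider lets you override the model ID:

```python
def to_inference_profile_id(model_id: str, region_group: str = "us") -> str:
    """Prefix a bare Bedrock model ID with a region group, producing a
    cross-region inference-profile ID (sketch; verify the profile exists
    in your account/region via the Bedrock console or ListInferenceProfiles).
    """
    return f"{region_group}.{model_id}"

# e.g. pass the result as modelId to invoke_model / invoke_model_with_response_stream
```

So configuring the model as `us.anthropic.claude-3-opus-20240229-v1:0` (for a US-region account) instead of the bare ID is the usual remedy.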
