
Functions not found for a group chat #252

Closed
MoRaouf opened this issue Oct 15, 2023 · 6 comments · Fixed by #294

MoRaouf commented Oct 15, 2023

I'm trying to create a group chat with functions passed as LangChain tools to a UserProxyAgent. The issue is that every function throws a "not found" error whenever the UserProxyAgent tries to execute it. I also tried plain function definitions instead of LangChain tools, and they give the same error.

Here is the code:

import json
from datetime import datetime
from typing import List, Type

import autogen
from langchain.tools import BaseTool
from pydantic import BaseModel, Field

config_list = autogen.config_list_from_json(
    env_or_file="OAI_CONFIG_LIST.json",
    file_location=".",
)

llm_config = {
    # "seed": 42,  
    "temperature": 0,
    "config_list": config_list,
    "request_timeout": 120,
}


# ==============================================================================
# Functions
#==============================================================================

# ------------------------------------------------------
class WebSearchToolInput(BaseModel):
    query: str = Field(description = "Query to search for on Google")

class WebSearchTool(BaseTool):
    name = "web_search"
    description = "A function used to search for a query on Google"
    args_schema: Type[BaseModel] = WebSearchToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, query: str):
        # logic here
        ...

# ------------------------------------------------------
class ScrapeToolInput(BaseModel):
    url: str = Field(description = "A website URL to scrape for data")

class ScrapeTool(BaseTool):
    name = "scrape"
    description = "A function to scrape a website for data"
    args_schema: Type[BaseModel] = ScrapeToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, url: str):
        # logic here
        ...

# ------------------------------------------------------
class KeywordResearchToolInput(BaseModel):
    keywords: List[str] = Field(description = "List of Keywords to search for SEO Optimization")

class KeywordResearchTool(BaseTool):
    name = "keyword_research"
    description = "A function used to search for SEO Optimization related keywords"
    args_schema: Type[BaseModel] = KeywordResearchToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, keywords: List[str]) -> str:
        # logic here
        ...


# ==============================================================================
# Agents
#==============================================================================

market_researcher_llm_config = llm_config.copy()
market_researcher_llm_config["functions"] = [generate_oai_func(WebSearchTool()), generate_oai_func(ScrapeTool())]

Market_Researcher = autogen.AssistantAgent(
   name="Market_Researcher",
   system_message=""" A Market Researcher. Reply "TERMINATE" in the end when everything is done.
    """,
   llm_config=market_researcher_llm_config,
   is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
)

# ------------------------------------------------------

seo_specialist_llm_config = llm_config.copy()
seo_specialist_llm_config["functions"] = [generate_oai_func(KeywordResearchTool())]

SEO_Specialist = autogen.AssistantAgent(
   name="SEO_Specialist",
   system_message="""An SEO Specialist. Reply "TERMINATE" in the end when everything is done.
    """,
   llm_config=seo_specialist_llm_config,
   is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
)

# --------------------------- Data Analyst ---------------------------

data_analyst_llm_config = llm_config.copy()

Data_Analyst = autogen.AssistantAgent(
   name="Data_Analyst",
   system_message="""A Data Analyst. Reply "TERMINATE" in the end when everything is done.
    """,
   llm_config=data_analyst_llm_config,
   is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
)

# ------------------------------------------------------

# research_assistant_llm_config = llm_config.copy()

Research_Assistant = autogen.UserProxyAgent(
    name="Research_Assistant",
    system_message='''Assistant for the Market Research team.''',
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    # llm_config=research_assistant_llm_config,
    code_execution_config={"work_dir": "coding"},
    human_input_mode = "TERMINATE",
    max_consecutive_auto_reply=10,
    function_map={
            "web_search": WebSearchTool()._run,
            "scrape": ScrapeTool()._run,
            "keyword_research": KeywordResearchTool()._run,
        },
)

# ------------------------------------------------------
class MarketResearchTeamToolInput(BaseModel):
    instructions: str = Field(description = "Detailed instructions for the Market Research Team")

class MarketResearchTeamTool(BaseTool):
    name = "market_research_team"
    description = ""
    args_schema: Type[BaseModel] = MarketResearchTeamToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, instructions: str):

        groupchat = autogen.GroupChat(
            agents=[Market_Researcher, SEO_Specialist, Data_Analyst, Research_Assistant],
            messages=[],
            max_round=10
            )
        
        new_llm_config = llm_config.copy()
        new_llm_config["functions"] = [generate_oai_func(WebSearchTool()),
                                       generate_oai_func(ScrapeTool()),
                                       generate_oai_func(KeywordResearchTool())]

        manager = autogen.GroupChatManager(groupchat=groupchat, 
                                           name = "Market_Research_Team_Chat_Manager",
                                           llm_config=new_llm_config)

        Research_Assistant.initiate_chat(
            manager, 
            message=instructions)

        timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
        with open(f"chat_history/Market_Research_Team_chat_{timestamp}.json", "w") as json_file:
            json.dump(groupchat.messages, json_file, indent=2)


MarketResearchTeamTool()._run(instructions="Do a quick research about AI?")
sonichi (Contributor) commented Oct 16, 2023

@kevin666aa are you the right person to answer this question?

yiranwu0 (Collaborator) commented:

What does generate_oai_func() do? Does the generated function schema's name match the keys in function_map ("web_search", etc.)?

MoRaouf (Author) commented Oct 16, 2023

@kevin666aa, Here is the code for generate_oai_func():

def generate_oai_func(tool):

    if hasattr(tool, "required_args"):
        function_schema = {
            "name": tool.name.lower().replace (' ', '_'),
            "description": tool.description,
            "parameters": {
                "type": "object",
                "properties": {},
                "required": [*tool.required_args],
            },
        }
        
    else:
        function_schema = {
            "name": tool.name.lower().replace (' ', '_'),
            "description": tool.description,
            "parameters": {
                "type": "object",
                "properties": {},
            },
        }

    if tool.args is not None:
        function_schema["parameters"]["properties"] = tool.args

    return function_schema

Here is the output of generate_oai_func(WebSearchTool()):

{'description': 'A function used to search for a query on Google',
 'name': 'web_search',
 'parameters': {'properties': {'query': {'description': 'Query to search for on Google',
                                         'title': 'Query',
                                         'type': 'string'}},
                'required': ['query'],
                'type': 'object'}}


Siafu commented Oct 16, 2023

I'm having the same issue.

MoRaouf (Author) commented Oct 16, 2023

@sonichi @kevin666aa Adding a function_map with the relevant tools (i.e., functions) to the relevant AssistantAgents in the group chat, regardless of whether it is also added to the UserProxyAgent, solves the issue. Whichever AssistantAgent suggests a function call then takes the lead in executing it (rather than the UserProxyAgent).

Here is the code:

import json
from datetime import datetime
from typing import List, Type

import autogen
from langchain.tools import BaseTool
from pydantic import BaseModel, Field

config_list = autogen.config_list_from_json(
    env_or_file="OAI_CONFIG_LIST.json",
    file_location=".",
)

llm_config = {
    # "seed": 42,  
    "temperature": 0,
    "config_list": config_list,
    "request_timeout": 120,
}


# ==============================================================================
# Functions
#==============================================================================

# ------------------------------------------------------
class WebSearchToolInput(BaseModel):
    query: str = Field(description = "Query to search for on Google")

class WebSearchTool(BaseTool):
    name = "web_search"
    description = "A function used to search for a query on Google"
    args_schema: Type[BaseModel] = WebSearchToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, query: str):
        # logic here
        ...

# ------------------------------------------------------
class ScrapeToolInput(BaseModel):
    url: str = Field(description = "A website URL to scrape for data")

class ScrapeTool(BaseTool):
    name = "scrape"
    description = "A function to scrape a website for data"
    args_schema: Type[BaseModel] = ScrapeToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, url: str):
        # logic here
        ...

# ------------------------------------------------------
class KeywordResearchToolInput(BaseModel):
    keywords: List[str] = Field(description = "List of Keywords to search for SEO Optimization")

class KeywordResearchTool(BaseTool):
    name = "keyword_research"
    description = "A function used to search for SEO Optimization related keywords"
    args_schema: Type[BaseModel] = KeywordResearchToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, keywords: List[str]) -> str:
        # logic here
        ...


# ==============================================================================
# Agents
#==============================================================================

market_researcher_llm_config = llm_config.copy()
market_researcher_llm_config["functions"] = [generate_oai_func(WebSearchTool()), generate_oai_func(ScrapeTool())]

Market_Researcher = autogen.AssistantAgent(
   name="Market_Researcher",
   system_message=""" A Market Researcher. Reply "TERMINATE" in the end when everything is done.
    """,
   llm_config=market_researcher_llm_config,
   is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
   function_map={
        "web_search": WebSearchTool()._run,
        "scrape": ScrapeTool()._run,
        "keyword_research": KeywordResearchTool()._run,
    }
)

# ------------------------------------------------------

seo_specialist_llm_config = llm_config.copy()
seo_specialist_llm_config["functions"] = [generate_oai_func(KeywordResearchTool())]

SEO_Specialist = autogen.AssistantAgent(
   name="SEO_Specialist",
   system_message="""An SEO Specialist. Reply "TERMINATE" in the end when everything is done.
    """,
   llm_config=seo_specialist_llm_config,
   is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
   function_map={
        "web_search": WebSearchTool()._run,
        "scrape": ScrapeTool()._run,
        "keyword_research": KeywordResearchTool()._run,
    }
)

# --------------------------- Data Analyst ---------------------------

data_analyst_llm_config = llm_config.copy()

Data_Analyst = autogen.AssistantAgent(
   name="Data_Analyst",
   system_message="""A Data Analyst. Reply "TERMINATE" in the end when everything is done.
    """,
   llm_config=data_analyst_llm_config,
   is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
   function_map={
        "web_search": WebSearchTool()._run,
        "scrape": ScrapeTool()._run,
        "keyword_research": KeywordResearchTool()._run,
    }
)

# ------------------------------------------------------

# research_assistant_llm_config = llm_config.copy()

Research_Assistant = autogen.UserProxyAgent(
    name="Research_Assistant",
    system_message='''Assistant for the Market Research team.''',
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    # llm_config=research_assistant_llm_config,
    code_execution_config={"work_dir": "coding"},
    human_input_mode = "TERMINATE",
    max_consecutive_auto_reply=10,   
)

# ------------------------------------------------------
class MarketResearchTeamToolInput(BaseModel):
    instructions: str = Field(description = "Detailed instructions for the Market Research Team")

class MarketResearchTeamTool(BaseTool):
    name = "market_research_team"
    description = ""
    args_schema: Type[BaseModel] = MarketResearchTeamToolInput
    required_args: List = [field for field in args_schema.__annotations__.keys()]

    def _run(self, instructions: str):

        groupchat = autogen.GroupChat(
            agents=[Market_Researcher, SEO_Specialist, Data_Analyst, Research_Assistant],
            messages=[],
            max_round=10
            )

        manager = autogen.GroupChatManager(groupchat=groupchat, 
                                           name = "Market_Research_Team_Chat_Manager",
                                           llm_config=llm_config)

        Research_Assistant.initiate_chat(
            manager, 
            message=instructions)

        timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
        with open(f"chat_history/Market_Research_Team_chat_{timestamp}.json", "w") as json_file:
            json.dump(groupchat.messages, json_file, indent=2)


MarketResearchTeamTool()._run(instructions="Do a quick research about AutoGen without any SEO keyword research, return only top 2 searches")

Here is the output:
Research_Assistant (to Market_Research_Team_Chat_Manager):

Do a quick research about AutoGen without any SEO keyword research, return only top 2 searches

--------------------------------------------------------------------------------
Market_Researcher (to Market_Research_Team_Chat_Manager):

***** Suggested function Call: web_search *****
Arguments: 
{
  "query": "AutoGen"
}
***********************************************

--------------------------------------------------------------------------------

>>>>>>>> EXECUTING FUNCTION web_search...
Market_Researcher (to Market_Research_Team_Chat_Manager):

***** Response from calling function "web_search" *****
{'searchParameters': {'q': 'AutoGen', 'type': 'search', 'engine': 'google'}, ...............................}
*******************************************************

--------------------------------------------------------------------------------
Market_Researcher (to Market_Research_Team_Chat_Manager):

Here are the top 2 search results for "AutoGen":

1. AutoGen - GitHub
   - Link: [AutoGen - GitHub](https://github.com/microsoft/autogen)
   - Description: AutoGen is a framework that enables the development of LLM applications using multiple agents that can converse with each other to solve tasks. It provides a customizable and conversable framework for simplifying the orchestration, optimization, and automation of LLM workflows.

2. AutoGen DNA and RNA Extraction Devices & Service
   - Link: [AutoGen DNA and RNA Extraction Devices & Service](https://autogen.com/)
   - Description: AutoGen specializes in DNA and RNA extraction devices and services. They offer a variety of instrumentation and consumable kits that meet your DNA and RNA extraction needs.

Please let me know if you need more information.

--------------------------------------------------------------------------------
SEO_Specialist (to Market_Research_Team_Chat_Manager):

Thank you for the information. These are the top 2 search results for "AutoGen":

1. AutoGen - GitHub: AutoGen is a framework that enables the development of LLM applications using multiple agents that can converse with each other to solve tasks. It provides a customizable and conversable framework for simplifying the orchestration, optimization, and automation of LLM workflows. You can find more information on their [GitHub page](https://github.com/microsoft/autogen).

2. AutoGen DNA and RNA Extraction Devices & Service: AutoGen specializes in DNA and RNA extraction devices and services. They offer a variety of instrumentation and consumable kits that meet your DNA and RNA extraction needs. You can learn more about their products and services on their [website](https://autogen.com/).        

Let me know if there's anything else I can help with.

--------------------------------------------------------------------------------
Data_Analyst (to Market_Research_Team_Chat_Manager):

TERMINATE

--------------------------------------------------------------------------------

In the source code of generate_function_call_reply(), the method takes in the sender of the message that contains the function_call and then executes the function. This is consistent with the workaround I found, since the senders are always AssistantAgents. On the other hand, the function-calling notebook found here shows the user_proxy executing the function suggested by the chatbot (i.e., an AssistantAgent), which I don't understand.
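
For intuition, the executor-side lookup boils down to a dictionary dispatch on the suggested function name. Here is a simplified sketch of that mechanism (my own simplification, not the actual AutoGen source): any agent whose function_map lacks the suggested name reports the function as not found, which is exactly what happens when the wrong agent is asked to reply to a function call.

```python
import json

# Simplified sketch of function-call execution (not the actual AutoGen source):
# the replying agent looks the suggested name up in its OWN function_map.
def execute_function_call(function_map, func_call):
    name = func_call.get("name")
    func = function_map.get(name)
    if func is None:
        return False, f"Error: Function {name} not found."
    args = json.loads(func_call.get("arguments", "{}"))
    return True, func(**args)

function_map = {"web_search": lambda query: f"results for {query!r}"}
ok, out = execute_function_call(
    function_map, {"name": "web_search", "arguments": '{"query": "AutoGen"}'}
)
print(ok, out)  # → True results for 'AutoGen'
```

With this model, registering the same function_map on every AssistantAgent simply guarantees that whichever agent replies has the name in its own lookup table.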

@MoRaouf MoRaouf closed this as completed Oct 16, 2023
@MoRaouf MoRaouf reopened this Oct 16, 2023
yiranwu0 (Collaborator) commented Oct 16, 2023

@MoRaouf Thanks. This is undesired behavior in group chat: only the UserProxyAgent is equipped for function-call execution, and the group manager might not select it as the next speaker after seeing a function call.

From your description, that should be the problem. A potential workaround is to put something like this in the user_proxy agent's system_message: "You will be responsible for executing any function calls." That makes the group manager more likely to select it as the next speaker, but it is not deterministic.
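
Applied to the code above, the suggested workaround would look roughly like this (a sketch of the earlier constructor call with only the system_message changed; as noted, it biases but does not guarantee speaker selection):

```python
# Sketch: bias the group manager toward selecting the proxy after a function
# call by stating its executor role in the system message (non-deterministic).
Research_Assistant = autogen.UserProxyAgent(
    name="Research_Assistant",
    system_message=(
        "Assistant for the Market Research team. "
        "You will be responsible for executing any function calls."
    ),
    code_execution_config={"work_dir": "coding"},
    human_input_mode="TERMINATE",
    max_consecutive_auto_reply=10,
    function_map={
        "web_search": WebSearchTool()._run,
        "scrape": ScrapeTool()._run,
        "keyword_research": KeywordResearchTool()._run,
    },
)
```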
