diff --git a/notebook/agentchat_microsoft_fabric.ipynb b/notebook/agentchat_microsoft_fabric.ipynb new file mode 100644 index 00000000000..f6e20d86cf6 --- /dev/null +++ b/notebook/agentchat_microsoft_fabric.ipynb @@ -0,0 +1,829 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "be5a8d87", + "metadata": {}, + "source": [ + "# Use AutoGen in Microsoft Fabric\n", + "\n", + "AutoGen offers conversable LLM agents, which can be used to solve various tasks with human or automatic feedback, including tasks that require using tools via code.\n", + "Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n", + "\n", + "[Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/get-started/microsoft-fabric-overview) is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. Its pre-built AI models include GPT-x models such as `gpt-4-turbo`, `gpt-4`, `gpt-4-8k`, `gpt-4-32k`, `gpt-35-turbo`, `gpt-35-turbo-16k`, and `gpt-35-turbo-instruct`. Note that the Azure OpenAI service is not supported on trial SKUs; only paid SKUs (F64 or higher, or P1 or higher) are supported. Azure OpenAI is being enabled in stages, with access for all users expected by March 2024.\n", + "\n", + "In this notebook, we demonstrate how to use `AssistantAgent` and `UserProxyAgent` to write code and execute the code. Here `AssistantAgent` is an LLM-based agent that can write Python code (in a Python coding block) for a user to execute for a given task. `UserProxyAgent` is an agent that serves as a proxy for the human user to execute the code written by `AssistantAgent`, or to execute it automatically. 
Depending on the settings of `human_input_mode` and `max_consecutive_auto_reply`, the `UserProxyAgent` either solicits feedback from the human user or returns auto-feedback based on the result of code execution (success or failure and corresponding outputs) to `AssistantAgent`. `AssistantAgent` will debug the code and suggest new code if the result contains errors. The two agents keep communicating with each other until the task is done.\n", + "\n", + "## Requirements\n", + "\n", + "AutoGen requires `Python>=3.8`. To run this notebook example, please install:\n", + "```bash\n", + "pip install \"pyautogen\"\n", + "```\n", + "\n", + "Also, this notebook depends on Microsoft Fabric pre-built LLM endpoints. Running it elsewhere may result in errors." ] }, { "cell_type": "markdown", "id": "34ce050c-134a-4787-9655-73d9bd7afb6b", "metadata": { "nteract": { "transient": { "deleting": false } } }, "source": [ + "## AutoGen version < 0.2.0\n", + "\n", + "For AutoGen version < 0.2.0, the Azure OpenAI endpoint is pre-configured." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6a6b4a95-5766-442d-9de5-b7fc1fb3d140", + "metadata": { + "jupyter": { + "outputs_hidden": false, + "source_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + }, + "outputs": [ + { + "data": { + "application/vnd.livy.statement-meta+json": { + "execution_finish_time": "2023-12-11T05:07:36.8889779Z", + "execution_start_time": "2023-12-11T05:07:36.8886587Z", + "livy_statement_state": "available", + "parent_msg_id": "4aa7c4ee-8126-4206-8a8b-b38491ff16dc", + "queued_time": "2023-12-11T05:07:11.6799575Z", + "session_id": null, + "session_start_time": null, + "spark_pool": null, + "state": "finished", + "statement_id": -1 + }, + "text/plain": [ + "StatementMeta(, , -1, Finished, Available)" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": {}, + "execution_count": null, + "metadata": {}, + "output_type": "execute_result" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Collecting pyautogen<0.2.0\n", + " Downloading pyautogen-0.1.14-py3-none-any.whl (88 kB)\n", + "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m88.8/88.8 kB\u001b[0m \u001b[31m6.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", + "\u001b[?25hRequirement already satisfied: diskcache in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen<0.2.0) (5.6.3)\n", + "Requirement already satisfied: flaml in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen<0.2.0) (2.1.1.dev2)\n", + "Requirement already satisfied: openai<1 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen<0.2.0) (0.27.8)\n", + "Collecting python-dotenv (from pyautogen<0.2.0)\n", + " Downloading python_dotenv-1.0.0-py3-none-any.whl (19 kB)\n", + "Requirement already satisfied: termcolor in 
/home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen<0.2.0) (2.3.0)\n", + "Requirement already satisfied: requests>=2.20 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai<1->pyautogen<0.2.0) (2.31.0)\n", + "Requirement already satisfied: tqdm in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai<1->pyautogen<0.2.0) (4.66.1)\n", + "Requirement already satisfied: aiohttp in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai<1->pyautogen<0.2.0) (3.8.6)\n", + "Requirement already satisfied: NumPy>=1.17.0rc1 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from flaml->pyautogen<0.2.0) (1.24.3)\n", + "Requirement already satisfied: charset-normalizer<4,>=2 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from requests>=2.20->openai<1->pyautogen<0.2.0) (3.3.1)\n", + "Requirement already satisfied: idna<4,>=2.5 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from requests>=2.20->openai<1->pyautogen<0.2.0) (3.4)\n", + "Requirement already satisfied: urllib3<3,>=1.21.1 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from requests>=2.20->openai<1->pyautogen<0.2.0) (1.26.17)\n", + "Requirement already satisfied: certifi>=2017.4.17 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from requests>=2.20->openai<1->pyautogen<0.2.0) (2023.7.22)\n", + "Requirement already satisfied: attrs>=17.3.0 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from aiohttp->openai<1->pyautogen<0.2.0) (23.1.0)\n", + "Requirement already satisfied: multidict<7.0,>=4.5 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from aiohttp->openai<1->pyautogen<0.2.0) (6.0.4)\n", + "Requirement already 
satisfied: async-timeout<5.0,>=4.0.0a3 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from aiohttp->openai<1->pyautogen<0.2.0) (4.0.3)\n", + "Requirement already satisfied: yarl<2.0,>=1.0 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from aiohttp->openai<1->pyautogen<0.2.0) (1.9.2)\n", + "Requirement already satisfied: frozenlist>=1.1.1 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from aiohttp->openai<1->pyautogen<0.2.0) (1.4.0)\n", + "Requirement already satisfied: aiosignal>=1.1.2 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from aiohttp->openai<1->pyautogen<0.2.0) (1.3.1)\n", + "Installing collected packages: python-dotenv, pyautogen\n", + "Successfully installed pyautogen-0.1.14 python-dotenv-1.0.0\n", + "\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m23.1.2\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m23.3.1\u001b[0m\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpython -m pip install --upgrade pip\u001b[0m\n", + "Note: you may need to restart the kernel to use updated packages.\n" + ] + }, + { + "data": {}, + "execution_count": null, + "metadata": {}, + "output_type": "execute_result" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Warning: PySpark kernel has been restarted to use updated packages.\n", + "\n" + ] + } + ], + "source": [ + "%pip install \"pyautogen<0.2.0\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "448f26d0-d1f7-4b2a-8dab-035ff2abbedc", + "metadata": { + "jupyter": { + "outputs_hidden": false, + "source_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + }, + "outputs": [ + { + "data": { + 
"application/vnd.livy.statement-meta+json": { + "execution_finish_time": "2023-12-11T05:18:00.2585542Z", + "execution_start_time": "2023-12-11T05:17:59.8269627Z", + "livy_statement_state": "available", + "parent_msg_id": "0c686a15-8b9c-4479-ac26-2cca81b21cf3", + "queued_time": "2023-12-11T05:17:59.3165049Z", + "session_id": "865e72a4-f70b-46cf-8421-9f25745bd9bd", + "session_start_time": null, + "spark_pool": null, + "state": "finished", + "statement_id": 27 + }, + "text/plain": [ + "StatementMeta(, 865e72a4-f70b-46cf-8421-9f25745bd9bd, 27, Finished, Available)" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Choose different models\n", + "config_list = [\n", + " {\n", + " 'model': 'gpt-4-turbo',\n", + " },\n", + "]\n", + "\n", + "# Set temperature, timeout and other LLM configurations\n", + "llm_config={\n", + " \"config_list\": config_list,\n", + " \"temperature\": 0,\n", + "}" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "793b6eb1-f8af-4b98-809d-21fd53f7de41", + "metadata": { + "jupyter": { + "outputs_hidden": false, + "source_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + }, + "outputs": [ + { + "data": { + "application/vnd.livy.statement-meta+json": { + "execution_finish_time": "2023-12-11T05:18:21.8907776Z", + "execution_start_time": "2023-12-11T05:18:01.7118817Z", + "livy_statement_state": "available", + "parent_msg_id": "a3a03b66-c113-4b91-872f-213880814fbd", + "queued_time": "2023-12-11T05:18:01.293131Z", + "session_id": "865e72a4-f70b-46cf-8421-9f25745bd9bd", + "session_start_time": null, + "spark_pool": null, + "state": "finished", + "statement_id": 28 + }, + "text/plain": [ + "StatementMeta(, 865e72a4-f70b-46cf-8421-9f25745bd9bd, 28, Finished, Available)" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[33muser_proxy\u001b[0m (to assistant):\n", + "\n", + "\n", + "Who 
should read this paper: https://arxiv.org/abs/2308.08155\n", + "\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[33massistant\u001b[0m (to user_proxy):\n", + "\n", + "To determine who should read the paper titled \"Learning to Prompt for Continual Learning\" available on arXiv, we need to first understand the abstract and the topics covered in the paper. I will fetch the abstract from the provided URL and analyze its content to suggest the target audience.\n", + "\n", + "```python\n", + "# filename: fetch_arxiv_abstract.py\n", + "import requests\n", + "from bs4 import BeautifulSoup\n", + "\n", + "# Function to get the abstract of the paper from arXiv\n", + "def get_arxiv_abstract(url):\n", + " response = requests.get(url)\n", + " if response.status_code == 200:\n", + " soup = BeautifulSoup(response.content, 'html.parser')\n", + " abstract_text = soup.find('blockquote', class_='abstract').text\n", + " # Clean up the abstract text\n", + " abstract_text = abstract_text.replace('Abstract: ', '').strip()\n", + " return abstract_text\n", + " else:\n", + " return \"Error: Unable to fetch the abstract from arXiv.\"\n", + "\n", + "# URL of the paper\n", + "paper_url = 'https://arxiv.org/abs/2308.08155'\n", + "\n", + "# Get the abstract of the paper\n", + "abstract = get_arxiv_abstract(paper_url)\n", + "print(abstract)\n", + "```\n", + "\n", + "Please run the above Python script to fetch the abstract of the paper. 
Once we have the abstract, I will analyze it to suggest the appropriate audience.\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[31m\n", + ">>>>>>>> EXECUTING CODE BLOCK 0 (inferred language is python)...\u001b[0m\n", + "\u001b[33muser_proxy\u001b[0m (to assistant):\n", + "\n", + "exitcode: 0 (execution succeeded)\n", + "Code output: \n", + "Abstract:AutoGen is an open-source framework that allows developers to build LLM applications via multiple agents that can converse with each other to accomplish tasks. AutoGen agents are customizable, conversable, and can operate in various modes that employ combinations of LLMs, human inputs, and tools. Using AutoGen, developers can also flexibly define agent interaction behaviors. Both natural language and computer code can be used to program flexible conversation patterns for different applications. AutoGen serves as a generic infrastructure to build diverse applications of various complexities and LLM capacities. Empirical studies demonstrate the effectiveness of the framework in many example applications, with domains ranging from mathematics, coding, question answering, operations research, online decision-making, entertainment, etc.\n", + "\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[33massistant\u001b[0m (to user_proxy):\n", + "\n", + "Based on the abstract provided, the paper titled \"AutoGen: An Open-Source Framework for Building LLM Applications with Conversable Agents\" seems to be focused on a framework that enables developers to create applications using large language models (LLMs) with agents that can interact through conversation to accomplish tasks.\n", + "\n", + "The target audience for this paper would likely include:\n", + "\n", + "1. 
**Software Developers and Engineers** who are interested in building applications that leverage large language models and conversational agents.\n", + "\n", + "2. **Researchers in Artificial Intelligence and Machine Learning** who are working on natural language processing, conversational AI, and the integration of human inputs with AI agents.\n", + "\n", + "3. **Product Managers and Technical Leads** who are looking to understand how conversational AI can be applied to various domains such as mathematics, coding, question answering, operations research, online decision-making, and entertainment.\n", + "\n", + "4. **Educators and Students** in computer science and related fields who are interested in the latest developments in AI frameworks and applications.\n", + "\n", + "5. **Innovators and Entrepreneurs** in the tech industry who are exploring new ways to incorporate AI into their products and services.\n", + "\n", + "6. **AI Enthusiasts and Hobbyists** who have a keen interest in the practical applications of large language models and conversational interfaces.\n", + "\n", + "The paper would be particularly relevant for those who are looking to understand or utilize the AutoGen framework to build complex applications that require the capabilities of LLMs.\n", + "\n", + "If you are part of or know someone who belongs to these groups, this paper would be a valuable read.\n", + "\n", + "TERMINATE\n", + "\n", + "--------------------------------------------------------------------------------\n" + ] + } + ], + "source": [ + "import autogen\n", + "\n", + "# create an AssistantAgent instance named \"assistant\"\n", + "assistant = autogen.AssistantAgent(\n", + " name=\"assistant\",\n", + " llm_config=llm_config,\n", + ")\n", + "\n", + "# create a UserProxyAgent instance named \"user_proxy\"\n", + "user_proxy = autogen.UserProxyAgent(\n", + " name=\"user_proxy\",\n", + " human_input_mode=\"NEVER\", # input() doesn't work, so needs to be \"NEVER\" here\n", + " 
max_consecutive_auto_reply=10,\n", + " is_termination_msg=lambda x: x.get(\"content\", \"\").rstrip().endswith(\"TERMINATE\"),\n", + " code_execution_config={\n", + " \"work_dir\": \"coding\",\n", + " \"use_docker\": False, # set to True or image name like \"python:3\" to use docker\n", + " },\n", + " llm_config=llm_config,\n", + " system_message=\"\"\"Reply TERMINATE if the task has been solved to full satisfaction.\n", + "Otherwise, reply CONTINUE, or the reason why the task is not solved yet.\"\"\"\n", + ")\n", + "\n", + "# the assistant receives a message from the user, which contains the task description\n", + "user_proxy.initiate_chat(\n", + " assistant,\n", + " message=\"\"\"\n", + "Who should read this paper: https://arxiv.org/abs/2308.08155\n", + "\"\"\",\n", + ")" ] }, { "cell_type": "markdown", "id": "a958cf54-23e8-46e8-be78-782c1a17bc82", "metadata": { "nteract": { "transient": { "deleting": false } } }, "source": [ + "## AutoGen version >= 0.2.0\n", + "\n", + "For AutoGen version >= 0.2.0, we need to set up the API endpoint manually, because the required version of the openai-python package differs from the pre-configured one." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "83867b85-6fb2-4ca1-8859-206f0b854b24", + "metadata": { + "jupyter": { + "outputs_hidden": false, + "source_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + }, + "outputs": [ + { + "data": { + "application/vnd.livy.statement-meta+json": { + "execution_finish_time": "2023-12-11T05:23:56.8983159Z", + "execution_start_time": "2023-12-11T05:23:56.8981286Z", + "livy_statement_state": "available", + "parent_msg_id": "cb272a67-8c4b-4e7f-8dfe-153b85d6b7fd", + "queued_time": "2023-12-11T05:23:43.2251661Z", + "session_id": null, + "session_start_time": null, + "spark_pool": null, + "state": "finished", + "statement_id": -1 + }, + "text/plain": [ + "StatementMeta(, , -1, Finished, Available)" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": {}, + "execution_count": null, + "metadata": {}, + "output_type": "execute_result" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Collecting pyautogen>=0.2.0\n", + " Downloading pyautogen-0.2.2-py3-none-any.whl (124 kB)\n", + "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m124.0/124.0 kB\u001b[0m \u001b[31m8.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", + "\u001b[?25hRequirement already satisfied: diskcache in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen>=0.2.0) (5.6.3)\n", + "Requirement already satisfied: flaml in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen>=0.2.0) (2.1.1.dev2)\n", + "Collecting openai~=1.3 (from pyautogen>=0.2.0)\n", + " Downloading openai-1.3.8-py3-none-any.whl (221 kB)\n", + "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m221.5/221.5 kB\u001b[0m \u001b[31m37.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", + "\u001b[?25hRequirement already satisfied: python-dotenv in 
/nfs4/pyenv-b962c9b1-be7a-4052-b362-e359a86c2a98/lib/python3.10/site-packages (from pyautogen>=0.2.0) (1.0.0)\n", + "Requirement already satisfied: termcolor in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen>=0.2.0) (2.3.0)\n", + "Requirement already satisfied: tiktoken in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from pyautogen>=0.2.0) (0.5.1)\n", + "Requirement already satisfied: anyio<5,>=3.5.0 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai~=1.3->pyautogen>=0.2.0) (3.7.1)\n", + "Collecting distro<2,>=1.7.0 (from openai~=1.3->pyautogen>=0.2.0)\n", + " Downloading distro-1.8.0-py3-none-any.whl (20 kB)\n", + "Collecting httpx<1,>=0.23.0 (from openai~=1.3->pyautogen>=0.2.0)\n", + " Downloading httpx-0.25.2-py3-none-any.whl (74 kB)\n", + "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.0/75.0 kB\u001b[0m \u001b[31m40.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", + "\u001b[?25hRequirement already satisfied: pydantic<3,>=1.9.0 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai~=1.3->pyautogen>=0.2.0) (1.10.9)\n", + "Requirement already satisfied: sniffio in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai~=1.3->pyautogen>=0.2.0) (1.3.0)\n", + "Requirement already satisfied: tqdm>4 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai~=1.3->pyautogen>=0.2.0) (4.66.1)\n", + "Requirement already satisfied: typing-extensions<5,>=4.5 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from openai~=1.3->pyautogen>=0.2.0) (4.5.0)\n", + "Requirement already satisfied: NumPy>=1.17.0rc1 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from flaml->pyautogen>=0.2.0) (1.24.3)\n", + "Requirement already satisfied: 
regex>=2022.1.18 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from tiktoken->pyautogen>=0.2.0) (2023.8.8)\n", + "Requirement already satisfied: requests>=2.26.0 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from tiktoken->pyautogen>=0.2.0) (2.31.0)\n", + "Requirement already satisfied: idna>=2.8 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from anyio<5,>=3.5.0->openai~=1.3->pyautogen>=0.2.0) (3.4)\n", + "Requirement already satisfied: exceptiongroup in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from anyio<5,>=3.5.0->openai~=1.3->pyautogen>=0.2.0) (1.1.3)\n", + "Requirement already satisfied: certifi in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from httpx<1,>=0.23.0->openai~=1.3->pyautogen>=0.2.0) (2023.7.22)\n", + "Collecting httpcore==1.* (from httpx<1,>=0.23.0->openai~=1.3->pyautogen>=0.2.0)\n", + " Downloading httpcore-1.0.2-py3-none-any.whl (76 kB)\n", + "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m76.9/76.9 kB\u001b[0m \u001b[31m39.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", + "\u001b[?25hRequirement already satisfied: h11<0.15,>=0.13 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->openai~=1.3->pyautogen>=0.2.0) (0.14.0)\n", + "Requirement already satisfied: charset-normalizer<4,>=2 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from requests>=2.26.0->tiktoken->pyautogen>=0.2.0) (3.3.1)\n", + "Requirement already satisfied: urllib3<3,>=1.21.1 in /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages (from requests>=2.26.0->tiktoken->pyautogen>=0.2.0) (1.26.17)\n", + "Installing collected packages: httpcore, distro, httpx, openai, pyautogen\n", + " Attempting uninstall: openai\n", + " Found 
existing installation: openai 0.27.8\n", + " Not uninstalling openai at /home/trusted-service-user/cluster-env/trident_env/lib/python3.10/site-packages, outside environment /nfs4/pyenv-b962c9b1-be7a-4052-b362-e359a86c2a98\n", + " Can't uninstall 'openai'. No files were found to uninstall.\n", + " Attempting uninstall: pyautogen\n", + " Found existing installation: pyautogen 0.1.14\n", + " Uninstalling pyautogen-0.1.14:\n", + " Successfully uninstalled pyautogen-0.1.14\n", + "Successfully installed distro-1.8.0 httpcore-1.0.2 httpx-0.25.2 openai-1.3.8 pyautogen-0.2.2\n", + "\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m23.1.2\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m23.3.1\u001b[0m\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpython -m pip install --upgrade pip\u001b[0m\n", + "Note: you may need to restart the kernel to use updated packages.\n" + ] + }, + { + "data": {}, + "execution_count": null, + "metadata": {}, + "output_type": "execute_result" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Warning: PySpark kernel has been restarted to use updated packages.\n", + "\n" + ] + } + ], + "source": [ + "%pip install \"pyautogen>=0.2.0\"" + ] + }, + { + "cell_type": "markdown", + "id": "c485fcab", + "metadata": {}, + "source": [ + "## Set your API endpoint" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "13005ac5-7f2a-4ba6-85b9-d45671093be2", + "metadata": { + "jupyter": { + "outputs_hidden": false, + "source_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + }, + "outputs": [ + { + "data": { + "application/vnd.livy.statement-meta+json": { + "execution_finish_time": "2023-12-11T05:27:12.0400654Z", + "execution_start_time": "2023-12-11T05:27:10.9380797Z", + "livy_statement_state": "available", + 
"parent_msg_id": "8429d912-c8af-41c2-bfde-697adb0bbf46", + "queued_time": "2023-12-11T05:27:10.4608238Z", + "session_id": "865e72a4-f70b-46cf-8421-9f25745bd9bd", + "session_start_time": null, + "spark_pool": null, + "state": "finished", + "statement_id": 36 + }, + "text/plain": [ + "StatementMeta(, 865e72a4-f70b-46cf-8421-9f25745bd9bd, 36, Finished, Available)" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "2023-12-11:05:27:11,251 WARNING [synapse_mlflow_utils.py:244] To save or load Apache Spark model files, please attach a Lakehouse.\n" + ] + } + ], + "source": [ + "from synapse.ml.mlflow import get_mlflow_env_config\n", + "\n", + "mlflow_env_configs = get_mlflow_env_config()\n", + "access_token = mlflow_env_configs.driver_aad_token\n", + "prebuilt_AI_base_url = mlflow_env_configs.workload_endpoint + \"cognitive/openai/\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1470b833-9cf2-4735-a28d-57d30714f562", + "metadata": { + "jupyter": { + "outputs_hidden": false, + "source_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + }, + "outputs": [ + { + "data": { + "application/vnd.livy.statement-meta+json": { + "execution_finish_time": "2023-12-11T05:27:12.9516846Z", + "execution_start_time": "2023-12-11T05:27:12.5600767Z", + "livy_statement_state": "available", + "parent_msg_id": "7512dc56-5ad2-46eb-a0f7-3a62d15e7385", + "queued_time": "2023-12-11T05:27:11.574982Z", + "session_id": "865e72a4-f70b-46cf-8421-9f25745bd9bd", + "session_start_time": null, + "spark_pool": null, + "state": "finished", + "statement_id": 37 + }, + "text/plain": [ + "StatementMeta(, 865e72a4-f70b-46cf-8421-9f25745bd9bd, 37, Finished, Available)" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "config_list = [\n", + " {\n", + " 'model': 'gpt-4-turbo',\n", + " 'api_key': access_token,\n", + " 'base_url': 
prebuilt_AI_base_url,\n", + " 'api_type': 'azure',\n", + " 'api_version': '2023-08-01-preview',\n", + " },\n", + "]" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "951c0d05-1d58-4b42-88ea-7303c1da88aa", + "metadata": {}, + "outputs": [ + { + "data": { + "application/vnd.livy.statement-meta+json": { + "execution_finish_time": "2023-12-11T05:28:09.3148816Z", + "execution_start_time": "2023-12-11T05:27:37.4931459Z", + "livy_statement_state": "available", + "parent_msg_id": "4c9275dc-25d3-4204-8641-fc8ed22b7d54", + "queued_time": "2023-12-11T05:27:37.0516131Z", + "session_id": "865e72a4-f70b-46cf-8421-9f25745bd9bd", + "session_start_time": null, + "spark_pool": null, + "state": "finished", + "statement_id": 38 + }, + "text/plain": [ + "StatementMeta(, 865e72a4-f70b-46cf-8421-9f25745bd9bd, 38, Finished, Available)" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[33muser_proxy\u001b[0m (to assistant):\n", + "\n", + "What date is today? Compare the year-to-date gain for META and TESLA.\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[33massistant\u001b[0m (to user_proxy):\n", + "\n", + "To get the current date, we can write a simple Python script to print out today's date using the `datetime` module. Then, to compare the year-to-date (YTD) gain for META (Meta Platforms Inc.) and TESLA (Tesla, Inc.), we need to retrieve the stock prices from the beginning of the current year and the most recent closing price for both companies and calculate the percentage change.\n", + "\n", + "Here's the plan to solve the task step by step:\n", + "1. Write and execute a Python script to get today's date.\n", + "2. Use a Python script to retrieve the opening stock price for both Meta Platforms Inc. (META) and Tesla, Inc. (TSLA) as of the first trading day of the current year.\n", + "3. 
Retrieve the most recent closing stock price for both companies.\n", + "4. Calculate the percentage change from the opening price to the latest closing price for both stocks.\n", + "5. Compare the YTD gains and display the result.\n", + "\n", + "First, let's start with step 1 by getting today's date:\n", + "\n", + "```python\n", + "# filename: get_current_date.py\n", + "import datetime\n", + "\n", + "def get_current_date():\n", + " # Get today's date\n", + " return datetime.date.today()\n", + "\n", + "# Print the current date\n", + "print(f\"Today's date is: {get_current_date()}\")\n", + "```\n", + "\n", + "Please execute the above script to get today's date. After that, we will proceed to the next steps of retrieving stock prices and comparing YTD gains.\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[31m\n", + ">>>>>>>> EXECUTING CODE BLOCK 0 (inferred language is python)...\u001b[0m\n", + "\u001b[33muser_proxy\u001b[0m (to assistant):\n", + "\n", + "exitcode: 0 (execution succeeded)\n", + "Code output: \n", + "Today's date is: 2023-12-11\n", + "\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[33massistant\u001b[0m (to user_proxy):\n", + "\n", + "It seems there might be a typo in your output since today cannot be December 11, 2023, considering the knowledge cutoff date is in early 2023. However, I will proceed assuming today's date is correctly given as December 11, 2023.\n", + "\n", + "To move forward with the next steps, I will utilize Python code to do the following:\n", + "- Fetch the historical stock data for META and TESLA.\n", + "- Extract the relevant opening prices at the start of the current year and the latest available closing prices.\n", + "- Calculate the YTD gains for both stocks.\n", + "\n", + "This will require accessing financial data through an API such as Yahoo Finance. 
We'll use the `yfinance` library to fetch the stock data. This library must be installed in your Python environment. If it's not already installed, please install it by executing `pip install yfinance` before running the following script.\n", + "\n", + "Let's fetch the stock data and calculate the YTD gains:\n", + "\n", + "```python\n", + "# filename: compare_ytd_gains.py\n", + "import yfinance as yf\n", + "from datetime import datetime\n", + "\n", + "# Function to calculate the YTD gain of a stock\n", + "def calculate_ytd_gain(ticker):\n", + " # Get data from the start of the year to the current date\n", + " start_of_year = datetime(datetime.now().year, 1, 1)\n", + " current_date = datetime.now().strftime('%Y-%m-%d')\n", + " data = yf.download(ticker, start=start_of_year.strftime('%Y-%m-%d'), end=current_date)\n", + "\n", + " # Ensure we have data to compute the gain\n", + " if data.empty:\n", + " return None\n", + "\n", + " # Get the first available opening price of the year and the most recent available closing price\n", + " opening_price = data['Open'].iloc[0]\n", + " closing_price = data['Close'].iloc[-1]\n", + "\n", + " # Calculate YTD gain and return it\n", + " ytd_gain = ((closing_price - opening_price) / opening_price) * 100\n", + " return ytd_gain\n", + "\n", + "# Get the YTD gains\n", + "meta_ytd_gain = calculate_ytd_gain('META')\n", + "tesla_ytd_gain = calculate_ytd_gain('TSLA')\n", + "\n", + "# Output the YTD gains\n", + "print(f\"Year-to-Date gain for Meta Platforms Inc. (META): {meta_ytd_gain:.2f}%\")\n", + "print(f\"Year-to-Date gain for Tesla, Inc. 
(TSLA): {tesla_ytd_gain:.2f}%\")\n", + "\n", + "# Compare the YTD gains\n", + "if meta_ytd_gain is not None and tesla_ytd_gain is not None:\n", + " if meta_ytd_gain > tesla_ytd_gain:\n", + " print(\"META has a higher YTD gain than TESLA.\")\n", + " elif meta_ytd_gain < tesla_ytd_gain:\n", + " print(\"TESLA has a higher YTD gain than META.\")\n", + " else:\n", + " print(\"META and TESLA have the same YTD gain.\")\n", + "else:\n", + " print(\"Unable to calculate YTD gains, possibly due to missing data.\")\n", + "```\n", + "\n", + "Please execute the above code to compare the Year-to-Date gains of META and TESLA. Remember, the actual output will depend on the stock prices on the dates fetched. If there are difficulties or errors encountered when executing the code, please inform me so that we can troubleshoot accordingly.\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[31m\n", + ">>>>>>>> EXECUTING CODE BLOCK 0 (inferred language is python)...\u001b[0m\n", + "\u001b[33muser_proxy\u001b[0m (to assistant):\n", + "\n", + "exitcode: 0 (execution succeeded)\n", + "Code output: \n", + "\n", + "[*********************100%%**********************] 1 of 1 completed\n", + "\n", + "[*********************100%%**********************] 1 of 1 completed\n", + "Year-to-Date gain for Meta Platforms Inc. (META): 170.92%\n", + "Year-to-Date gain for Tesla, Inc. (TSLA): 105.82%\n", + "META has a higher YTD gain than TESLA.\n", + "\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[33massistant\u001b[0m (to user_proxy):\n", + "\n", + "The executed code has successfully calculated the Year-to-Date (YTD) gains for Meta Platforms Inc. (META) and Tesla, Inc. (TSLA). According to the output you provided:\n", + "\n", + "- META has a YTD gain of 170.92%.\n", + "- TESLA has a YTD gain of 105.82%.\n", + "\n", + "Based on these results, Meta Platforms Inc. 
(META) has a higher YTD gain compared to Tesla, Inc. (TSLA) as of the current date.\n", + "\n", + "Please let me know if you need assistance with any other queries.\n", + "\n", + "TERMINATE\n", + "\n", + "--------------------------------------------------------------------------------\n" + ] + } + ], + "source": [ + "import autogen\n", + "\n", + "# create an AssistantAgent named \"assistant\"\n", + "assistant = autogen.AssistantAgent(\n", + " name=\"assistant\",\n", + " llm_config={\n", + " # \"cache_seed\": 42, # seed for caching and reproducibility\n", + " \"config_list\": config_list, # a list of OpenAI API configurations\n", + " # \"temperature\": 0, # temperature for sampling\n", + " }, # configuration for autogen's enhanced inference API which is compatible with OpenAI API\n", + ")\n", + "# create a UserProxyAgent instance named \"user_proxy\"\n", + "user_proxy = autogen.UserProxyAgent(\n", + " name=\"user_proxy\",\n", + " human_input_mode=\"NEVER\",\n", + " max_consecutive_auto_reply=10,\n", + " is_termination_msg=lambda x: x.get(\"content\", \"\").rstrip().endswith(\"TERMINATE\"),\n", + " code_execution_config={\n", + " \"work_dir\": \"coding\",\n", + " \"use_docker\": False, # set to True or image name like \"python:3\" to use docker\n", + " },\n", + ")\n", + "# the assistant receives a message from the user_proxy, which contains the task description\n", + "user_proxy.initiate_chat(\n", + " assistant,\n", + " message=\"\"\"What date is today? 
Compare the year-to-date gain for META and TESLA.\"\"\",\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1006fec8-87c6-43cd-a857-4ecd37fbfa86", + "metadata": { + "jupyter": { + "outputs_hidden": false, + "source_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + }, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernel_info": { + "name": "synapse_pyspark" + }, + "kernelspec": { + "display_name": "Synapse PySpark", + "language": "Python", + "name": "synapse_pyspark" + }, + "language_info": { + "name": "python" + }, + "notebook_environment": {}, + "nteract": { + "version": "nteract-front-end@1.0.0" + }, + "save_output": true, + "spark_compute": { + "compute_id": "/trident/default", + "session_options": { + "conf": {}, + "enableDebugMode": false + } + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/website/docs/Ecosystem.md b/website/docs/Ecosystem.md index eec567d4418..63cdd93fcd7 100644 --- a/website/docs/Ecosystem.md +++ b/website/docs/Ecosystem.md @@ -5,8 +5,17 @@ This page lists libraries that have integrations with Autogen for LLM applicatio ## MemGPT + AutoGen -![Agent Chat Example](img/ecosystem-memgpt.png) +![MemGPT Example](img/ecosystem-memgpt.png) MemGPT enables LLMs to manage their own memory and overcome limited context windows. You can use MemGPT to create perpetual chatbots that learn about you and modify their own personalities over time. You can connect MemGPT to your own local filesystems and databases, as well as connect MemGPT to your own tools and APIs. The MemGPT + AutoGen integration allows you to equip any AutoGen agent with MemGPT capabilities. 
- [MemGPT + AutoGen Documentation with Code Examples](https://memgpt.readme.io/docs/autogen) + + +## Microsoft Fabric + AutoGen + +![Fabric Example](img/ecosystem-fabric.png) + +[Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/get-started/microsoft-fabric-overview) is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. In this notebook, we give a simple example of using AutoGen in Microsoft Fabric. + +- [Microsoft Fabric + AutoGen Code Examples](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_microsoft_fabric.ipynb) diff --git a/website/docs/img/ecosystem-fabric.png b/website/docs/img/ecosystem-fabric.png new file mode 100644 index 00000000000..181302dcdc1 Binary files /dev/null and b/website/docs/img/ecosystem-fabric.png differ