LangChain with LCEL and the latest APIs
This is a major release!
It includes the following updates:
- Demo Site is UP again!
- Latest Azure AI Search API (2023-10-01-preview)
- Latest Azure OpenAI API (2023-12-01-preview)
- Uses Integrated chunking and vectorization in Azure AI Search
- Newest version of LangChain, which uses LCEL and the Runnable protocol (see the LCEL sketch after this list)
- Everything works with GPT-3.5-Turbo and of course GPT-4-Turbo (1106 versions and newer). This unblocks customers who:
- Cannot afford GPT-4
- Are not satisfied with the speed of GPT-4 and are willing to trade some response quality for lower cost and latency.
- All agents use the newest OpenAI tools API (preferred over the Functions API, which is being deprecated). This enables parallel calling of functions/tools (see the agent sketch after this list).
- New Notebook: "Building our First RAG bot - Skill: talk to Search Engine". Works good with GPT3.5-Turbo-1106
- Dynamic selection of the LLM at runtime (see the model-selection sketch after this list)
- Streaming and async work easily using LCEL (see the streaming sketch after this list)
- New Notebook: "Using the Bot Service API programmatically." With this notebook now you can access the Bot Service endpoint programmatically in any language. This allows customers to connect the bot to their ReactJS or AngularJS front end.