The main goal of this document is to provide a detailed guide for setting up and running the demo project using GitHub Codespaces, including implementing AI search with Semantic Kernel and improving the response message.
Important: We strongly advise also watching the recorded demo videos to get better insights into how to run the demo.
- Clone the repository.
- Create a new Codespace in the repository.
- You should have Azure OpenAI keys and endpoints with the following models deployed:
  - gpt-4o
  - text-embedding-ada-002
- Check the requirements and preparation steps to add the LLM keys to the projects.
- Start by running both projects: Products and Store.
- Run Products first.
- Select C# and Default Configuration.
- From VSCode Web, change the port from Private to Public.
- Enable pop-ups if needed.
- Test the URL with the suffix /api/product, e.g., https://fluffy-waffle-6444gxrpqq2x5v7-5228.app.github.dev/api/product.
- With the Products project running, also run the Store project.
- Select C# and Default Configuration.
- From VSCode Web, change the port from Private to Public.
- The Products page should now be ready to use.
- Select Products and see the list of products.
- Test the Current Search: Keyword mode (a sketch of how keyword matching works follows below).
  - Search "Camping": 4 results.
  - Search "something for rainy days": 0 results.
- Stop both projects.
- Add a better search experience using a GPT model and an embedding model.
- Use Semantic Kernel to orchestrate this.
- Add NuGet packages:
  - Microsoft.SemanticKernel
  - Microsoft.SemanticKernel.Plugins.Memory
  - System.Linq.Async
- Use the commands:
  dotnet add package Microsoft.SemanticKernel --version 1.16.0
  dotnet add package Microsoft.SemanticKernel.Plugins.Memory --version 1.16.0-alpha
  dotnet add package System.Linq.Async --version 6.0.1
- Explain the process of a vector search (see the sketch below).
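
To support that explanation, here is a small self-contained sketch of the core idea: product text and the user query are both converted into embedding vectors, and the best match is the product whose vector is closest to the query vector (cosine similarity). This is illustrative only; in the demo, the embedding and the lookup are handled by Semantic Kernel's memory store.

```csharp
using System;

static class VectorSearchDemo
{
    // Cosine similarity: values near 1.0 mean the vectors point in the same
    // direction (very similar meaning); values near 0 mean unrelated content.
    public static double CosineSimilarity(float[] a, float[] b)
    {
        double dot = 0, magA = 0, magB = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            magA += a[i] * a[i];
            magB += b[i] * b[i];
        }
        return dot / (Math.Sqrt(magA) * Math.Sqrt(magB));
    }

    // Given pre-computed product embeddings, return the index of the product
    // whose embedding is closest to the query embedding.
    public static int FindClosest(float[][] productEmbeddings, float[] queryEmbedding)
    {
        int best = -1;
        double bestScore = double.MinValue;
        for (int i = 0; i < productEmbeddings.Length; i++)
        {
            double score = CosineSimilarity(productEmbeddings[i], queryEmbedding);
            if (score > bestScore) { best = i; bestScore = score; }
        }
        return best;
    }
}
```
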
- Include the hidden file .\src\Products\Memory\MemoryContext.cs.
- Edit the .\src\Products\Products.csproj file and delete the lines:
  <ItemGroup>
    <Compile Remove="Memory\**" />
    <Content Remove="Memory\**" />
    <EmbeddedResource Remove="Memory\**" />
    <None Remove="Memory\**" />
  </ItemGroup>
- Build the Products project.
- Add User Secrets with the Azure OpenAI keys:
  dotnet user-secrets init
  dotnet user-secrets set "AZURE_OPENAI_MODEL" "gpt-4o"
  dotnet user-secrets set "AZURE_OPENAI_ENDPOINT" "<>.openai.azure.com/"
  dotnet user-secrets set "AZURE_OPENAI_APIKEY" "Key"
  dotnet user-secrets set "AZURE_OPENAI_ADA02" "text-embedding-ada-002"
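
For reference, user secrets are merged into the app's IConfiguration when running in Development, so the values stored above can be read like any other configuration key. A minimal sketch, assuming the Products project reads them in Program.cs (variable names are illustrative):

```csharp
// Sketch: read the Azure OpenAI settings stored with `dotnet user-secrets set`.
var builder = WebApplication.CreateBuilder(args);

string azureOpenAiModel    = builder.Configuration["AZURE_OPENAI_MODEL"]!;    // e.g. gpt-4o
string azureOpenAiEndpoint = builder.Configuration["AZURE_OPENAI_ENDPOINT"]!;
string azureOpenAiApiKey   = builder.Configuration["AZURE_OPENAI_APIKEY"]!;
string azureOpenAiAda      = builder.Configuration["AZURE_OPENAI_ADA02"]!;    // embedding deployment
```
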
- Open the file MemoryContext.cs and explain the file (a conceptual sketch follows after this list):
  - #pragma warning disable is used because the Semantic Kernel memory features are still experimental.
  - InitMemoryContext() creates the ChatCompletion service and the Embedding Generation support, and initializes the chat history.
  - FillProductsAsync() creates an in-memory vector database with the current list of products.
  - Search() performs the search in the vector memory and, if a product is found, returns the product details from the Product DB.
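
To make that explanation concrete, below is a minimal sketch of how an in-memory vector store can be wired up with the Semantic Kernel 1.16 experimental memory APIs (hence the #pragma). It mirrors what the file does conceptually; the names, thresholds, and the Product record are illustrative, not a copy of the repository's MemoryContext.cs.

```csharp
#pragma warning disable SKEXP0001, SKEXP0010, SKEXP0050 // SK memory APIs are experimental
using System.Collections.Generic;
using System.Linq;                    // System.Linq.Async: FirstOrDefaultAsync on IAsyncEnumerable
using System.Threading.Tasks;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.Memory;

public static class MemorySketch
{
    // Hypothetical product shape used only for this illustration.
    public record Product(int Id, string Name, string Description);

    // Build an in-memory vector store backed by the Azure OpenAI embedding model.
    public static ISemanticTextMemory BuildMemory(string endpoint, string apiKey) =>
        new MemoryBuilder()
            .WithAzureOpenAITextEmbeddingGeneration("text-embedding-ada-002", endpoint, apiKey)
            .WithMemoryStore(new VolatileMemoryStore())
            .Build();

    // FillProductsAsync equivalent: embed each product and save it to a collection.
    public static async Task FillProductsAsync(ISemanticTextMemory memory, IEnumerable<Product> products)
    {
        foreach (var product in products)
        {
            await memory.SaveInformationAsync(
                collection: "products",
                text: $"{product.Name} - {product.Description}",
                id: product.Id.ToString());
        }
    }

    // Search equivalent: embed the query and return the id of the closest
    // product (or null); the caller then loads the product from the Product DB.
    public static async Task<string?> SearchAsync(ISemanticTextMemory memory, string query)
    {
        var match = await memory
            .SearchAsync("products", query, limit: 1, minRelevanceScore: 0.6)
            .FirstOrDefaultAsync();
        return match?.Metadata.Id;
    }
}
```
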
- Add MemoryContext to app.Services, uncomment the code, and uncomment app.InitSemanticMemory().
- Enable the AI Search endpoint in .\src\Products\Endpoints\ProductEndpoints.cs by uncommenting it (a hypothetical sketch of such an endpoint follows below).
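
For context, a minimal-API AI search endpoint generally has the shape sketched below. The route, parameter names, and the Search signature are assumptions for illustration only; the actual endpoint is the commented-out code already present in ProductEndpoints.cs.

```csharp
// Hypothetical sketch: the endpoint hands the free-text query to the
// MemoryContext, which runs the vector search and returns product data.
// "ProductDataContext" is a placeholder name for the Products DbContext.
app.MapGet("/api/aisearch/{search}", async (string search, ProductDataContext db, MemoryContext mc) =>
{
    var response = await mc.Search(search, db);
    return Results.Ok(response);
});
```
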
- Update the frontend to use the new endpoint in .\src\Store\Services\ProductService.cs.
- Run both projects again.
- Test Search:
  - Search "Camping": 1 result.
  - Search "something for rainy days": 1 result.
- The current logic shows a product result if one is found; if nothing is found, the question is passed to the GPT model. Test this with these sentences:
  - "Hi, my name is Bruno, can you help me with math operations?"
  - "What is my name?"
- Update the InitMemoryContext() code:
  // create chat history
  _chatHistory = new ChatHistory();
  _chatHistory.AddSystemMessage("You are a useful assistant. You always reply with a short and funny message. If you don't know an answer, you say 'I don't know that.' You only answer questions related to outdoor camping products. For any other type of questions, explain to the user that you only answer outdoor camping products questions. Do not store memory of the chat conversation.");
- Test these sentences again and see the difference in the response:
  - "Hi, my name is Bruno, can you help me with math operations?"
  - "What is my name?"
- In the Search() function, add this code before the return:
  // let's improve the response message
  var prompt = @$"You are an intelligent assistant helping Contoso Inc clients with their search about outdoor product.
  Generate a catchy and friendly message using the following information:
  - User Question: {search}
  - Found Product Name: {firstProduct.Name}
  Include the found product information in the response to the user question.";
  _chatHistory.AddUserMessage(prompt);
  var resultPrompt = await _chat.GetChatMessageContentsAsync(_chatHistory);
  responseText = resultPrompt[^1].Content;
- Explain the process of changing and improving the prompt, for example by adding the product description and price (see the sketch below). Prompty is here to help!
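
One straightforward way to improve the prompt before switching to Prompty is to give the model more of the found product's fields. A sketch building on the code added above (the Description and Price properties are the same ones used later as Prompty arguments):

```csharp
// Enriched prompt: same pattern as before, but the model now also sees the
// product description and price, so it can ground its answer in more detail.
var prompt = @$"You are an intelligent assistant helping Contoso Inc clients with their search about outdoor product.
Generate a catchy and friendly message using the following information:
- User Question: {search}
- Found Product Name: {firstProduct.Name}
- Found Product Description: {firstProduct.Description}
- Found Product Price: {firstProduct.Price}
Include the found product information in the response to the user question.";
```
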
- Add the NuGet package: Microsoft.SemanticKernel.Prompty
- Use the command:
  dotnet add package Microsoft.SemanticKernel.Prompty --version 1.16.0-alpha
- Copy the supporting prompty files from .\srcDemo\Products\ to .\src\Products\.
  - Note: The .env file should already be completed with the Azure OpenAI information.
- Install the Prompty extension (if it is not installed).
- Open the file aisearchresponse.prompty.
- Run the prompt.
- Add changes to the prompt to get a better response.
- Change the improve message code to this one:
  // let's improve the response message
  KernelArguments kernelArguments = new()
  {
      { "productid", $"{firstProduct.Id.ToString()}" },
      { "productname", $"{firstProduct.Name}" },
      { "productdescription", $"{firstProduct.Description}" },
      { "productprice", $"{firstProduct.Price}" },
      { "question", $"{search}" }
  };
  var prompty = _kernel.CreateFunctionFromPromptyFile("aisearchresponse.prompty");
  responseText = await prompty.InvokeAsync<string>(_kernel, kernelArguments);
- Open the solution .\src BackUp\02 Aspire\eShopLite-Aspire.sln.
- Run, using AppHost as the startup project.
- Perform a search.
- Perform a general review of the traces, telemetry, and more.
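
For reference, the AppHost in a .NET Aspire solution like this one is typically a small Program.cs that registers the projects and wires their references; the Aspire dashboard then surfaces the traces and telemetry reviewed above. A sketch (resource names and the generated Projects.* types are assumptions based on the project names):

```csharp
// Sketch of an Aspire AppHost Program.cs for a two-project solution.
var builder = DistributedApplication.CreateBuilder(args);

// Register the Products backend as an Aspire resource.
var products = builder.AddProject<Projects.Products>("products");

// Register the Store frontend and give it a reference to Products,
// so service discovery and telemetry are wired up automatically.
builder.AddProject<Projects.Store>("store")
       .WithReference(products);

builder.Build().Run();
```
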