The website search experience is being transformed by conversational AI, driven by OpenAI's GPT models. With the new Sitecore + Azure OpenAI connector, you can harness the natural language processing capabilities of GPT models to build an intelligent chatbot powered by your real-time website content and data. This lets you craft compelling chatbots that not only enhance user engagement but also improve user retention.
This blog details the approach and the steps involved in setting up an AI-driven Chat Experience powered by your real-time business content/data.
This solution follows the Retrieval-Augmented Generation (RAG) approach, an AI framework that retrieves data from external sources to ground Large Language Models (LLMs) in the most current business content. This approach enables a chatbot that seamlessly combines the natural language capabilities of LLMs with data drawn from our enterprise systems.
To realize this integration, the essential components are as follows:
Cognitive Search: Serving as our external data source, Cognitive Search houses all business content and data, complete with vector embeddings. Whenever content authors publish content in Sitecore, the published content, along with vector embeddings, must be pushed to Cognitive Search via APIs. Data or media can also be ingested into Cognitive Search from any existing or new Blob Storage.
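As a sketch of this push step, here is how a post-publish job could build and POST a document batch to Cognitive Search. The field names (`url`, `title`, `content`, `contentVector`), the key-derivation rule, the API version, and the helper names are all illustrative assumptions, not the module's actual code; they must match whatever schema your search index defines:

```python
import json
import urllib.request


def build_index_payload(page_url: str, title: str, content: str, embedding: list) -> dict:
    """Build the document batch for the Cognitive Search 'index documents' action.
    The page URL doubles as the basis for the document key (illustrative scheme)."""
    return {
        "value": [
            {
                "@search.action": "mergeOrUpload",
                "id": page_url.strip("/").replace("/", "-") or "home",
                "url": page_url,
                "title": title,
                "content": content,
                "contentVector": embedding,
            }
        ]
    }


def push_to_cognitive_search(service: str, index: str, api_key: str, payload: dict) -> None:
    """POST the batch to the Cognitive Search documents endpoint via REST."""
    endpoint = (
        f"https://{service}.search.windows.net/indexes/{index}"
        "/docs/index?api-version=2023-11-01"
    )
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()
```

`mergeOrUpload` makes the push idempotent: republishing the same page updates the existing document instead of creating a duplicate.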
Azure OpenAI Service: This approach requires an embedding model that transforms our content into embeddings and a GPT model to respond to user queries. The GPT model uses grounding (retrieving documents that match the user's query from Cognitive Search and including them as input data in the prompt) to generate its response.
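To illustrate what retrieval by embeddings means, here is a minimal sketch that ranks candidate documents by cosine similarity to the query embedding. In the real solution, Cognitive Search performs this vector search server-side at scale; the function names here are hypothetical:

```python
import math


def cosine_similarity(a: list, b: list) -> float:
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def rank_documents(query_embedding: list, documents: list) -> list:
    """documents: list of (text, embedding) pairs.
    Returns the texts ordered by similarity to the query, best match first."""
    ranked = sorted(
        documents,
        key=lambda doc: cosine_similarity(query_embedding, doc[1]),
        reverse=True,
    )
    return [text for text, _ in ranked]
```

The top-ranked texts are what gets injected into the prompt as grounding data.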
Chat WebApp: Hosting the chat experience, this web application acts as an orchestrator. It generates a customized prompt for the GPT model (incorporating the user's input, grounding data, and conversation history) and sends the completion from the GPT model back to the user.
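A minimal sketch of the orchestrator's prompt assembly, assuming the chat-completions message format; the function name and the system-message wording are illustrative, not the generated web app's actual code:

```python
def build_grounded_prompt(user_query: str, retrieved_docs: list, history: list) -> list:
    """Assemble the message list for a chat-completion call.
    retrieved_docs: list of (url, text) pairs returned from Cognitive Search;
    history: prior messages as {"role": ..., "content": ...} dicts."""
    sources = "\n\n".join(f"[Source: {url}]\n{text}" for url, text in retrieved_docs)
    system = (
        "Answer only using the sources below. Cite the source URL for each fact. "
        "If the answer is not in the sources, say you don't know.\n\n" + sources
    )
    # System message first, then conversation history, then the new user turn.
    return [
        {"role": "system", "content": system},
        *history,
        {"role": "user", "content": user_query},
    ]
```

Carrying the source URLs into the prompt is what lets the model emit citations that link back to the originating Sitecore pages.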
Here’s a high-level diagram outlining the solution:
Fortunately, there’s no need to manually build the Chat Web App and the other components within the box in the diagram above. You can generate all of them with just a few clicks using Azure AI Studio. The Sitecore + Azure OpenAI Connector (AI Chatbot Module) simplifies generating content and embeddings in the required format and pushing them into Cognitive Search, which is the driving force behind the entire chat experience.
FEATURES OF THE MODULE:
- Customizable & Extendable
- Indexes Published Content
- Select Templates & Fields to be indexed
- Generates Embeddings
- Sanitizes Content
- Adds Page URL as unique identifier for Citations
- Works with individual & bulk publish (sub-items)
- Executes indexing as a post-publish background job
- Follows Security Best Practices like RBAC
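To illustrate the "Sanitizes Content" feature above, here is one way publishing code might strip markup before indexing and generating embeddings. This is a sketch under assumed requirements, not the module's actual implementation:

```python
import html
import re


def sanitize_content(raw: str) -> str:
    """Reduce rendered field markup to plain text suitable for embedding:
    drop script/style blocks, strip remaining tags, decode HTML entities,
    and collapse runs of whitespace."""
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", raw, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)      # strip remaining tags
    text = html.unescape(text)                # decode entities such as &amp;
    return re.sub(r"\s+", " ", text).strip()  # collapse whitespace
```

Cleaning the text this way keeps the embeddings focused on the page's meaning rather than its markup, and shrinks the token count sent to the embedding model.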
Here are the high-level steps involved in setting up this module. For detailed instructions, please refer to the GitHub repository.
- Create an Azure OpenAI Service and deploy an embedding model & a GPT model using Azure AI Studio
- Configure data source and parameters in Azure AI Studio
- Deploy the chat experience as a new Azure Web App or Power Virtual Agent bot from Azure AI Studio
- Deploy & configure the Sitecore + Azure OpenAI SPE Module for your implementation
- Implement the Security Best Practices covered in the documentation
- Embed the chat experience within your website
For more details on the solution and demo, please refer to this video:
Building an Intelligent Chat Experience powered by Real-time Business Content/Data with OpenAI – PGHSUG – YouTube
The source code for this module is available on GitHub. The module is built for the Sitecore community, doesn’t require any license, and doesn’t collect any of your information. Please check it out and let me know if you have any feedback, issues, or feature requests here. Thank you for using this module!