Use declarative agents to extend Microsoft 365 Copilot. You ingest external content into Microsoft 365 and create agents that use that content to answer questions authoritatively.
Suppose you have an external system where you store knowledge base articles. These articles describe different processes in your organization. You want to easily find relevant information from Microsoft 365, and you want Microsoft 365 Copilot to include information from these knowledge base articles in its responses.
Here, you create a declarative agent that answers questions using external content ingested into Microsoft 365.
By the end of this module, you'll demonstrate your ability to create a Microsoft Graph connector and use it as a data source in a declarative agent for Microsoft 365 Copilot that answers questions using information from documents.
You'll build a declarative agent for Microsoft 365 Copilot that's grounded in custom content, and a Microsoft Graph connector that ingests that external content into Microsoft 365. Here, we discuss the project's business logic and target behavior. We also cover the accounts and software you'll need.
The Microsoft Graph connector:
- ingests external content into Microsoft 365 one time
- makes the ingested content available to everyone in the organization
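When a connector pushes content, each document becomes an external item payload sent to Microsoft Graph. The sketch below shapes one of the sample Markdown policy files into such a payload; the `toExternalItem` helper, the `title`/`content` property names, and the everyone-ACL value are illustrative assumptions, not the module's actual code or schema.

```typescript
// Sketch: shape a Markdown policy file into a Graph external item payload.
// Assumes a connection schema with "title" and "content" string properties.
interface ExternalItemPayload {
  id: string;
  acl: { type: string; value: string; accessType: string }[];
  properties: { title: string; content: string };
  content: { type: string; value: string };
}

function toExternalItem(fileName: string, markdown: string): ExternalItemPayload {
  // Use the first Markdown heading as the title, falling back to the file name.
  const heading = markdown.match(/^#\s+(.+)$/m);
  const title = heading ? heading[1].trim() : fileName.replace(/\.md$/, "");
  return {
    // Item ids must be unique within the connection; derive one from the file name.
    id: fileName.replace(/\.md$/, "").replace(/[^a-zA-Z0-9]/g, "-"),
    // Grant access to everyone in the organization (hypothetical ACL value).
    acl: [{ type: "everyone", value: "everyone", accessType: "grant" }],
    properties: { title, content: markdown },
    // Full-text content to index for search and grounding.
    content: { type: "text", value: markdown },
  };
}

const item = toExternalItem("byod-policy.md", "# BYOD Policy\nEmployees may...");
console.log(item.id, item.properties.title); // → byod-policy BYOD Policy
```

A real connector would then PUT each payload to the connection's items endpoint; the shaping step above is the part you'd adapt to your own schema.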
The declarative agent:
- is available within Microsoft 365 Copilot
- provides examples of how users can ask it questions
- answers a user's questions using the information from the ingested content
- in the answer, includes a reference to the relevant piece of information
- politely refuses to answer questions not related to its function
- admits when it doesn't have relevant information to answer a user's question
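The behaviors listed above are expressed in the agent's manifest through its instructions and conversation starters. The fragment below is a sketch only: the agent name, instruction wording, and starter text are illustrative, and the schema URL and field names should be checked against the current declarative agent schema.

```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/copilot/declarative-agent/v1.0/schema.json",
  "version": "v1.0",
  "name": "Policies",
  "description": "Answers questions about organizational policies",
  "instructions": "Answer questions using only the ingested policy documents. Include a reference to the source document in every answer. If a question is unrelated to organizational policies, politely decline to answer. If no relevant information is available, say you don't have the information.",
  "conversation_starters": [
    { "title": "BYOD", "text": "What is our BYOD policy?" }
  ]
}
```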
To complete this challenge, you need a Microsoft 365 tenant with a Microsoft 365 Copilot license, Visual Studio Code, and the Teams Toolkit extension.
To complete this exercise, you need a Microsoft 365 tenant with a Microsoft 365 Copilot license. You access Microsoft 365 Copilot through the Microsoft 365 tenant. You need the Microsoft 365 Copilot license to get access to the semantic index and to be able to ground agents in external content ingested into Microsoft 365.
Here, we provide sample Markdown files for you to use in the challenge. The Markdown files represent company policies stored in an external system.
You'll build the agent using Visual Studio Code. Visual Studio Code is an IDE that provides you with tooling to build agents for Microsoft 365 Copilot. To install Visual Studio Code for your operating system, visit https://code.visualstudio.com/.
Teams Toolkit is a Visual Studio Code extension that helps developers build agents for Microsoft 365 Copilot. It provides you with the project boilerplate and tasks to automate building and deploying agents. To install the Teams Toolkit extension for Visual Studio Code visit https://marketplace.visualstudio.com/items?itemName=TeamsDevApp.ms-teams-vscode-extension.
To get started, use Teams Toolkit to create a new app from the declarative agent template, choosing the option without an action.
The organization needs an agent that answers employees' questions using information from policy documents ingested into Microsoft 365. Here, you build a Microsoft Graph connector that ingests the external content into Microsoft 365. You'll validate how it works against the requirements to check your work.
You can find external content using Microsoft Search and Microsoft 365 Copilot chat.
At this point, you have a Microsoft Graph connector that ingests external content into your Microsoft 365 tenant. You can find the external content using Microsoft Search.
- Navigate to the Microsoft 365 admin center and open the Search & intelligence settings. Is the external connection listed as a data source?
- Is the external connection in the "Ready" state?
- Select the external connection. Does it show the correct number of items indexed?
- Go to the Microsoft 365 app. Use the search box to search for a policy. Is it listed among the search results?
- Go to Microsoft 365 Copilot chat. Ask it a question related to a policy. Does the answer contain a reference to the ingested content?
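Besides the checks above, you can spot-check the ingested items programmatically with the Microsoft Search API by sending a request body like the following to `POST https://graph.microsoft.com/v1.0/search/query`. This is a sketch: the connection id `policieslocal` is a hypothetical placeholder for your own connection's id.

```json
{
  "requests": [
    {
      "entityTypes": ["externalItem"],
      "contentSources": ["/external/connections/policieslocal"],
      "query": { "queryString": "BYOD" }
    }
  ]
}
```

A non-empty `hitsContainers` array in the response indicates the connector's items are indexed and searchable.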
The organization needs an agent that answers employees' questions using information from policy documents ingested into Microsoft 365. Here, you start building a declarative agent. You'll validate how it works against the requirements to check your work.
The declarative agent:
- is available within Microsoft 365 Copilot
- provides examples of how users can ask it questions
- politely refuses to answer questions not related to its function
At this point, you have a declarative agent with custom instructions and examples of questions that the users might ask. The agent is visible in Microsoft 365 Copilot chat.
- Navigate to Microsoft 365 Copilot chat. Is your agent visible?
- In Microsoft 365 Copilot chat, navigate to the agent. Does it show examples of the questions that a user might ask?
- Ask a question not related to organization policies. Does the agent politely refuse to answer it?
The organization needs an agent that answers employees' questions using information from policy documents ingested into Microsoft 365. Here, you ground the declarative agent in the external content that the Microsoft Graph connector ingested. You'll validate how it works against the requirements to check your work.
The declarative agent:
- answers a user's questions using the information from the ingested content
- in the answer, includes a reference to the relevant content
- politely refuses to answer questions not related to its function
- admits when it doesn't have relevant information to answer a user's question
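To ground the agent in the ingested content, the manifest declares a Graph connectors capability that points at the connection. The fragment below is a sketch; `policieslocal` is a hypothetical connection id and must match the id your connector actually created.

```json
"capabilities": [
  {
    "name": "GraphConnectors",
    "connections": [
      { "connection_id": "policieslocal" }
    ]
  }
]
```

With this capability in place, Copilot scopes the agent's answers to items from the listed connections, which is what makes the reference-and-refuse behaviors above possible.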
At this point, you have a fully functional declarative agent grounded in external content ingested into Microsoft 365. The agent answers questions about organizational policies using the information from the referenced content.
- Navigate to your agent. Ask it a question about the BYOD (Bring Your Own Device) policy. Does the agent answer using the information from the document?
- Does the answer include a reference to the document?
- Is the reference pointing to the correct external document?
- Ask another question, this time about something that's not in the referenced policies. Does the agent admit that it doesn't have the information to answer the question?
- Ask a question not related to policies. Does the agent politely refuse to answer it?