Instructor: Elie Schoppik
Explore how MCP standardizes access to tools and data for AI applications, its underlying architecture, and how it simplifies the integration of new tools and connections to external systems (e.g., GitHub repos, Google Docs, local files).
Build and deploy an MCP server that provides tools, resources, and prompts, and add it to the configuration of AI applications, such as Claude Desktop, to extend them.
Build an MCP-compatible application that hosts multiple MCP clients, each maintaining a 1-to-1 connection to an MCP server.
Join MCP: Build Rich-Context AI Apps with Anthropic, a short course created in partnership with Anthropic and taught by Elie Schoppik.
Connecting AI applications to external systems to bring rich context to LLMs has often meant writing custom integrations for each use case. This has fragmented AI development between teams within a company and across the industry.
The Model Context Protocol (MCP) is an open protocol that standardizes how LLMs access tools, data, and prompts from external sources, simplifying how new context is integrated into AI applications. MCP, developed by Anthropic, is based on a client-server architecture. It defines the communication details between an MCP client, hosted inside the AI application, and an MCP server that exposes tools, resources, and prompt templates. The server can be a subprocess launched by the client and running locally, or an independent process running remotely.
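To make the client-server pattern concrete, here is a minimal sketch of what such a server can look like, assuming the FastMCP interface from the official MCP Python SDK; the server name and the specific tool, resource, and prompt below are illustrative examples, not part of the course materials. The server exposes one of each primitive and runs over the stdio transport, so a client can launch it as a local subprocess.

```python
# server.py -- a minimal MCP server sketch (names below are illustrative).
# Assumes the official MCP Python SDK is installed: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("research")  # hypothetical server name


@mcp.tool()
def search_papers(topic: str) -> str:
    """Tool: return a (stubbed) list of papers about a topic."""
    return f"Papers about {topic}: ..."


@mcp.resource("docs://readme")
def readme() -> str:
    """Resource: read-only context a client can attach to a conversation."""
    return "Project README contents go here."


@mcp.prompt()
def summarize(topic: str) -> str:
    """Prompt template: reusable instructions parameterized by the user."""
    return f"Summarize the latest findings on {topic} in three bullet points."


if __name__ == "__main__":
    # stdio transport: the client starts this script as a subprocess and
    # exchanges protocol messages with it over stdin/stdout.
    mcp.run(transport="stdio")
```

Because the protocol, rather than any one application, defines these primitives, an MCP-compatible host can discover and use them without custom integration code; the SDK also provides HTTP-based transports for the remote-server case.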
In this hands-on course, you'll learn the core concepts of MCP and how to implement it in your AI application. Here's what you'll do: make a chatbot MCP-compatible, build and deploy an MCP server, and connect the chatbot to your MCP server and other open-source servers.
By the end of the course, you'll be able to build rich-context AI applications that can connect to a growing ecosystem of MCP servers with minimal integration work.
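On the other side of the connection, here is a rough sketch of a host application's client logic, again assuming the MCP Python SDK and the hypothetical server.py above. Each ClientSession wraps a single 1-to-1 connection to one server, which the chatbot uses to discover and call tools on behalf of the LLM.

```python
# client.py -- a minimal MCP client sketch (assumes the server.py sketch above).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server as a local subprocess over stdio.
server_params = StdioServerParameters(command="python", args=["server.py"])


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        # One ClientSession per server: the 1-to-1 connection described above.
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call the hypothetical search_papers tool defined by the server.
            result = await session.call_tool("search_papers", {"topic": "MCP"})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```

A host that talks to several servers would simply hold several such sessions, one per server.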
It's helpful to be familiar with Python and have a basic understanding of LLM prompting and LLM application development.
Introduction
Why MCP
MCP Architecture
Chatbot Example
Creating an MCP Server
Creating an MCP Client
Connecting the MCP Chatbot to Reference Servers
Adding Prompt and Resource Features
Configuring Servers for Claude Desktop
Creating and Deploying Remote Servers
Conclusion
Appendix – Tips and Help
Course access is free for a limited time during the DeepLearning.AI learning platform beta!