---
title: Deploy Your First MCP Server and Use it on Microsoft Foundry
date: 2026-02-01
images: ["post-cover.png"]
description: Step-by-step tutorial to build and deploy your first MCP (Model Context Protocol) server on Azure Container Apps, then connect it to Microsoft Foundry.
categories: ["Tutorials", "AI"]
tags: ["MCP", "Microsoft Foundry", "Azure", "AI", "Tutorial"]
---

Have you ever asked your Copilot, ChatGPT, or Gemini for information it simply couldn't access?

This is a common limitation of LLMs: they only know what they were trained on.

Yesterday's news? Tomorrow's weather? They can't help you there.

But what if we could supercharge our LLMs with these powers? That's exactly what MCP (Model Context Protocol) enables.

Curious to learn how? Let's go!

**In this tutorial, you'll learn how to:**

- Understand what MCP is and how it works
- Build an MCP server with Python
- Deploy it to Azure Container Apps
- Connect it to Microsoft Foundry
## Meet MCP

To unlock these superpowers, LLMs use MCP.

MCP stands for **[Model Context Protocol](https://modelcontextprotocol.io)**, an open-source standard developed by Anthropic to connect LLMs to external systems and extend their capabilities.

**Think of MCP as a universal API for LLMs.**

It follows a client-server architecture. Clients are AI applications like Microsoft Foundry, Claude Desktop, or Visual Studio Code. They connect to MCP servers that provide additional context and capabilities.

For example, MCP servers can connect to your Gmail, query your database, or fetch live weather data. This gives your LLM access to real, up-to-date information.

For developers, MCP offers a standardized way to extend LLM capabilities: build once, work with any compatible client.

## How MCP works

![MCP client-server architecture diagram showing data and transport layers](images/mcp.png)

MCP is composed of two layers: a data layer and a transport layer.

The data layer defines **primitives**, which describe what clients and servers can offer each other:

- **Tools**: functions that clients can execute to perform actions like API calls or file operations
- **Resources**: data exposed to clients, such as file contents or database records
- **Prompts**: reusable templates that structure interactions with the LLM, like system prompts

The transport layer manages communication and authentication between clients and servers. MCP supports two transport mechanisms:

- **Stdio**: for local communication
- **Streamable HTTP**: for remote server communication

All messages use [JSON-RPC 2.0](https://www.jsonrpc.org/) as the communication protocol.
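
To make this concrete, here is what a tool invocation looks like on the wire. The `tools/call` method and the envelope fields come from the MCP specification; the tool name and arguments are illustrative, matching the weather tool we build later:

```python
import json

# An illustrative MCP "tools/call" request as it travels over the wire.
# Every MCP message carries the JSON-RPC 2.0 envelope: jsonrpc, id, method, params.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",  # the tool to invoke
        "arguments": {"latitude": 48.85, "longitude": 2.35},
    },
}

# Serialize the message exactly as it would be sent over streamable HTTP
wire_message = json.dumps(request)
print(wire_message)
```

The server replies with a matching JSON-RPC response carrying the same `id`, so the client can correlate requests and results.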

## Build your first MCP server

First, let's see what happens when we ask a weather question in Microsoft Foundry without MCP.

![Microsoft Foundry agent response without MCP tool - generic weather answer](images/answer-without-tool.png)

As expected, the agent using GPT-4.1 cannot access live data and returns a generic response.

Let's fix that!

### Prerequisites

- Python 3.12+
- [uv](https://docs.astral.sh/uv/) (recommended) or pip
- [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli) installed and logged in
- A Dockerfile (included in the [GitHub repository](https://github.com/johanrin/my-first-mcp-server))
### Write the code
78+
79+
We will build a simple MCP server using the Python SDK to provide weather information.
80+
81+
All the code is available in this GitHub repository: [johanrin/my-first-mcp-server](https://github.com/johanrin/my-first-mcp-server)
82+
83+
First, let's create the project:
84+
85+
```shell
86+
# Create a new directory for our project
87+
uv init my-first-mcp-server
88+
cd my-first-mcp-server
89+
90+
# Create virtual environment and activate it
91+
uv venv
92+
.venv\Scripts\activate
93+
94+
# Install dependencies
95+
uv add mcp[cli] httpx
96+
```
97+

Edit **main.py** to initialize your MCP server:

```python
import os

import httpx
from mcp.server.fastmcp import FastMCP

# Constants
OPEN_METEO_API_URL = "https://api.open-meteo.com/v1/forecast"

# Configuration from environment variables
HOST = os.getenv("MCP_HOST", "0.0.0.0")
PORT = int(os.getenv("MCP_PORT", "8000"))

# Initialize MCP server
mcp = FastMCP("weather", host=HOST, port=PORT)
```

Next, add a helper function to fetch weather data from the Open-Meteo API:

```python
async def fetch_weather(latitude: float, longitude: float) -> dict | str:
    """Fetch weather data from Open-Meteo API."""
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                OPEN_METEO_API_URL,
                params={
                    "latitude": latitude,
                    "longitude": longitude,
                    "current": "temperature_2m,windspeed_10m",
                },
            )
            response.raise_for_status()
            return response.json()
    except httpx.HTTPError as e:
        return f"Failed to fetch weather data: {e}"
```

Now, implement the tool that the LLM will call:

```python
@mcp.tool()
async def get_weather(latitude: float, longitude: float) -> str:
    """Get current weather for a location."""
    data = await fetch_weather(latitude, longitude)

    if isinstance(data, str):
        return data

    current = data["current"]
    temperature = current["temperature_2m"]
    wind_speed = current["windspeed_10m"]

    return (
        f"Current Weather ({latitude}, {longitude}):\n"
        f"- Temperature: {temperature}°C\n"
        f"- Wind Speed: {wind_speed} km/h"
    )
```
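
Because the tool returns a plain string, that text is exactly what the model receives as the tool result. Here is the same formatting applied to a payload shaped like Open-Meteo's response for the fields we requested (the values are made up):

```python
# Hypothetical payload mimicking the "current" fields requested from Open-Meteo
sample = {"current": {"temperature_2m": 18.4, "windspeed_10m": 12.3}}

current = sample["current"]
report = (
    f"Current Weather (48.85, 2.35):\n"
    f"- Temperature: {current['temperature_2m']}°C\n"
    f"- Wind Speed: {current['windspeed_10m']} km/h"
)
print(report)
```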

Finally, run the server. We use **streamable-http** transport since the server needs to be remotely accessible:

```python
if __name__ == "__main__":
    mcp.run(transport="streamable-http")
```

### Deploy on Azure

To make our MCP server publicly accessible, we'll deploy it to Azure Container Apps.

First, define the variables:

```shell
RESOURCE_GROUP="mcp-weather-rg"
LOCATION="eastus"
CONTAINER_ENV="mcp-weather-env"
CONTAINER_APP="weather-mcp-server"
```

Create the resource group:

```shell
az group create --name "$RESOURCE_GROUP" --location "$LOCATION"
```

Create the Container Apps environment:

```shell
az containerapp env create \
  --name "$CONTAINER_ENV" \
  --resource-group "$RESOURCE_GROUP" \
  --location "$LOCATION"
```

Deploy the container app from source:

```shell
az containerapp up \
  --name "$CONTAINER_APP" \
  --resource-group "$RESOURCE_GROUP" \
  --environment "$CONTAINER_ENV" \
  --source . \
  --ingress external \
  --target-port 8000
```

![MCP server deployed on Azure Container Apps](images/container-app.png)

Your MCP server is successfully deployed and ready to be used by MCP clients!

## Use Your MCP Server in Microsoft Foundry

Now that we've deployed our MCP server, let's connect it to Microsoft Foundry.

To do that, go to the Agents section and click **Add** in the Tools section:

![Microsoft Foundry agent tools section with Add button](images/add-tool.png)

Select **Model Context Protocol (MCP)** in the Custom section and click **Create**:

![Selecting Model Context Protocol MCP option in Microsoft Foundry](images/select-custom-mcp.png)

Complete the **Name**, **MCP endpoint**, and **Authentication** sections, then click **Connect**:

![MCP server connection configuration in Microsoft Foundry](images/connect-custom-mcp-tool.png)

In my case, I entered:

- **Name**: weather
- **Remote MCP Server endpoint**: https://weather-mcp-server.thankfulplant-71632a26.eastus.azurecontainerapps.io/mcp
- **Authentication**: Unauthenticated

> **Note:** Don't forget to add **/mcp** at the end of the endpoint.

> **Warning:** Never use unauthenticated MCP servers in production. This is for demo purposes only.
Now you're all set to test your MCP server!

Save your configuration and ask about the weather in your favorite place.

For security reasons, you'll be asked to **Approve** each tool call before it runs.

![MCP tool approval prompt in Microsoft Foundry](images/approve-mcp-call.png)

And there it is: a real-time weather response powered by your MCP server!

![Microsoft Foundry agent with MCP tool returning real-time weather data](images/answer-with-tool.png)

## Conclusion

You've successfully built an MCP server, deployed it to Azure Container Apps, and connected it to Microsoft Foundry. Your LLM can now access real-time weather data.

From here, you can extend your MCP server with additional tools, add authentication for production use, or explore other MCP primitives like Resources and Prompts.

That's it for me. Hope you learned something!

If you have any questions, feel free to reach out on my social networks!