Many blogs and YouTube channels are currently emphasizing the relevance of Anthropic's Model Context Protocol (MCP), and they do so for a reason: MCP is the first attempt at an open standard that connects LLM applications with data sources and services in a standardized way. Unfortunately, most explanations of MCP are either too basic or too difficult and do not help grasp the idea behind it. In this article you'll therefore mostly find a code example that answers weather requests from a user prompt. Play with it, change it, extend it, and use it as a template for your own project.
MCP introduces distinct roles for software components. On the client side there are LLMs hosted by inference tools such as Ollama, Hugging Face, or llama.cpp. MCP clients call remote services via the MCP protocol. A common prerequisite is that the LLM, or its host, supports tool calling: a tool (in effect, a method) implements the interaction between the MCP client and the MCP server over the MCP protocol. The tool is called whenever the LLM recognizes a location in the user prompt, such as "How is the weather in San Francisco today?"
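To make "tool calling" concrete: hosts typically advertise available tools to the model as JSON function schemas. The following is a minimal, hypothetical sketch of such a definition, loosely following the common function-calling convention used by Ollama and similar tools; the names `get_weather` and `location` are illustrative and not mandated by MCP:

```python
# Hypothetical tool definition a host could advertise to a tool-calling LLM.
# The structure follows the widely used JSON function-calling schema;
# all names here are illustrative, not part of the MCP specification.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "A place name, e.g. 'San Francisco'",
                }
            },
            "required": ["location"],
        },
    },
}
```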
Below is an example that implements an MCP‑conformant weather service using real REST calls between a client and a server. In this example the server exposes an MCP endpoint that expects a JSON request with the following structure:
```json
{
  "mcp_version": "1.0",
  "service": "weather",
  "payload": {
    "prompt": "What is the weather in Paris?"
  }
}
```
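For quick testing, such a request can also be posted directly, e.g. with curl, once the server shown below is running on its default port 5000:

```bash
curl -X POST http://localhost:5000/mcp \
  -H "Content-Type: application/json" \
  -d '{"mcp_version": "1.0", "service": "weather", "payload": {"prompt": "What is the weather in Paris?"}}'
```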
The server uses a local model (Hugging Face's NER pipeline, a Transformer-based named-entity-recognition model) to extract a location entity (e.g. a geopolitical entity) from the prompt and then performs a tool call to a real external weather provider (wttr.in) to retrieve the current conditions. If the request is successful, the server responds with an MCP‑compliant JSON payload; for example:
```json
{
  "mcp_version": "1.0",
  "status": "ok",
  "result": {
    "location": "Paris",
    "weather": "Sunny, 25°C"
  },
  "debug": "Extracted location and retrieved weather data successfully."
}
```
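To see what the extraction step is working with, the NER pipeline can be run on its own. A minimal sketch; the model that is downloaded by default, and therefore the exact scores, may differ on your machine:

```python
from transformers import pipeline

# Downloads a default English NER model on first use.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("What is the weather in Paris?"))
# Illustrative output (score and offsets may vary by model):
# [{'entity_group': 'LOC', 'score': 0.999, 'word': 'Paris', 'start': 23, 'end': 28}]
```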
The client, which might be hosted by Ollama in a real deployment, collects a prompt from the user, assembles an MCP‑compliant request, posts it via REST to the server, and prints the JSON result.
Below is the full source code. Save it as, for example, `mcp_weather.py`. (Make sure you install the dependencies with `pip install flask requests transformers` prior to running.)
```python
import sys

import requests
from flask import Flask, request, jsonify
from transformers import pipeline

app = Flask(__name__)

# MCP configuration
MCP_VERSION = "1.0"
SERVICE_NAME = "weather"

# Create a local NER pipeline via Hugging Face Transformers.
# This loads a Named Entity Recognition model that is used to extract
# location entities from the user prompt.
# (aggregation_strategy="simple" replaces the deprecated
# grouped_entities=True and merges sub-word tokens into whole entities.)
ner_pipeline = pipeline("ner", aggregation_strategy="simple")


def extract_location(prompt):
    """
    Extracts a location from the prompt using NER.
    It scans for entity groups 'LOC' or 'GPE' and returns the first match.
    """
    entities = ner_pipeline(prompt)
    for ent in entities:
        # The default CoNLL-03 model tags places as 'LOC'; 'GPE' is
        # checked as well for models trained on OntoNotes-style labels.
        if ent["entity_group"] in ["LOC", "GPE"]:
            # Clean the extracted word (sometimes artifacts occur).
            return ent["word"].strip()
    return None


def get_weather(location):
    """
    Retrieves current weather data using the wttr.in service.
    Returns a formatted string with the weather description and temperature (°C).
    """
    url = f"http://wttr.in/{location}?format=j1"
    try:
        response = requests.get(url, timeout=5)
        response.raise_for_status()
    except Exception as e:
        return f"Error fetching weather data for {location}: {e}"
    data = response.json()
    # Extract the current condition: temperature and weather description.
    current = data.get("current_condition", [{}])[0]
    temp = current.get("temp_C", "Unknown")
    desc = current.get("weatherDesc", [{}])[0].get("value", "Unknown")
    return f"{desc}, {temp}°C"


@app.route("/mcp", methods=["POST"])
def mcp_endpoint():
    """
    MCP endpoint for weather queries.
    Expects an MCP-conformant JSON payload in the following form:

        {
            "mcp_version": "1.0",
            "service": "weather",
            "payload": { "prompt": "What is the weather in ..." }
        }

    Processes the request by extracting the location via the local
    NER model and then calling the weather tool.
    """
    # silent=True makes get_json() return None for a missing or malformed
    # JSON body instead of raising, so the MCP error response below is sent.
    data = request.get_json(silent=True)
    if not data:
        return jsonify({
            "mcp_version": MCP_VERSION,
            "status": "error",
            "error": "Empty MCP request payload."
        }), 400
    if data.get("mcp_version") != MCP_VERSION:
        return jsonify({
            "mcp_version": MCP_VERSION,
            "status": "error",
            "error": "Unsupported MCP version."
        }), 400
    if data.get("service") != SERVICE_NAME:
        return jsonify({
            "mcp_version": MCP_VERSION,
            "status": "error",
            "error": f"Unsupported service: {data.get('service')}"
        }), 400
    payload = data.get("payload")
    if not payload or "prompt" not in payload:
        return jsonify({
            "mcp_version": MCP_VERSION,
            "status": "error",
            "error": "Missing required payload: 'prompt'."
        }), 400
    prompt = payload["prompt"]
    location = extract_location(prompt)
    if not location:
        return jsonify({
            "mcp_version": MCP_VERSION,
            "status": "error",
            "error": "No valid location identified in prompt."
        }), 400
    weather_result = get_weather(location)
    return jsonify({
        "mcp_version": MCP_VERSION,
        "status": "ok",
        "result": {
            "location": location,
            "weather": weather_result
        },
        "debug": "Extracted location and retrieved weather data successfully."
    })


def run_server():
    """
    Runs the MCP-compliant weather server on port 5000.
    """
    print("Starting MCP-compliant Weather Server on port 5000 with MCP version", MCP_VERSION)
    app.run(port=5000)


def run_client():
    """
    Runs the MCP client that collects a prompt from the user,
    assembles an MCP request, posts it to the server, and prints the response.
    """
    print("MCP-compliant Weather Client")
    prompt = input("Enter your weather query (e.g., 'What is the weather in London?'): ")
    mcp_request = {
        "mcp_version": MCP_VERSION,
        "service": SERVICE_NAME,
        "payload": {
            "prompt": prompt
        }
    }
    url = "http://localhost:5000/mcp"
    try:
        # No raise_for_status() here: MCP error responses arrive as
        # HTTP 400 with a JSON body, which we still want to print.
        response = requests.post(url, json=mcp_request, timeout=10)
    except Exception as e:
        print("Error connecting to MCP server:", e)
        return
    try:
        response_json = response.json()
        print("MCP Server Response:")
        print(response_json)
    except Exception as e:
        print("Error parsing response JSON:", e)
        print("Raw response:", response.text)


if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1].lower() == "server":
        run_server()
    else:
        run_client()
```
---
How to Run the Example
1. Start the MCP Server:
Open a terminal and run:
```bash
python mcp_weather.py server
```
This starts the Flask server on port 5000.
2. Run the MCP Client:
Open a second terminal and run:
```bash
python mcp_weather.py
```
You’ll be prompted to enter a weather query. For example:
```
What is the weather in New York?
```
3. View the Result:
The client sends an MCP‑compliant JSON payload to the server. The server uses the local NER pipeline to extract the requested location, fetches the current weather from wttr.in, and returns an MCP response which the client prints.
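A complete session then looks roughly like this (the weather values are illustrative, Flask prints a few additional startup lines, and jsonify may order the keys alphabetically):

```
$ python mcp_weather.py server
Starting MCP-compliant Weather Server on port 5000 with MCP version 1.0
 * Running on http://127.0.0.1:5000

$ python mcp_weather.py
MCP-compliant Weather Client
Enter your weather query (e.g., 'What is the weather in London?'): What is the weather in New York?
MCP Server Response:
{'debug': 'Extracted location and retrieved weather data successfully.', 'mcp_version': '1.0', 'result': {'location': 'New York', 'weather': 'Sunny, 25°C'}, 'status': 'ok'}
```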
This example demonstrates how to integrate a local model with real tool invocation while adhering to a REST‑based MCP protocol.