Artificial intelligence is developing at a remarkable pace. Models today can reason, write, code, and analyze information in ways that once seemed impossible.
But there’s one major limitation that still holds them back: context.
Most AI models don’t have access to your system, files, APIs, or live data. They only know what you tell them in a prompt.
The Model Context Protocol, also known as MCP, was created to solve this problem. It lets AI models securely connect to your own tools, APIs, and systems through small, structured servers known as MCP servers.
In this guide, you will learn how to create your own MCP server using Python. We’ll go through each part of the code and I’ll explain how it works.
By the end, you’ll have a running MCP server that can add numbers, return random words, and fetch live weather data from the Internet. We will also see how to host this MCP server on the cloud.
What is Model Context Protocol?
Before diving into the code, it’s important to understand what the Model Context Protocol actually is.
MCP is an open standard that defines how AI models and external systems communicate. You can think of it as an API designed specifically for AI assistants.
If an API lets two software programs exchange data, MCP lets an AI model talk to your system. This opens up endless possibilities.
You can create an MCP server that lets ChatGPT read files from your local machine, or one that calls your company’s internal APIs to fetch data. You can even expose your own Python functions so that the model can use them as tools.
MCP makes this communication organized, secure, and scalable. It runs on familiar web technologies such as Server-Sent Events (SSE), which allow the server to send a real-time data stream to the client.
Setting up your environment
To follow along, you’ll need Python version 3.9 or higher. You can find the code for this example in this collection.
We will use a library called FastMCP, which simplifies the process of building MCP servers. You can install it along with requests using pip:
pip install fastmcp requests
The requests library will be used later to make HTTP calls in the weather example. Once installed, you are ready to create your first MCP server.
Creating a project
Create a new file called server.py and start by importing the necessary modules:
import logging
import os
import random
import sys
import requests
from mcp.server.fastmcp import FastMCP
Here’s what each one does:
- logging records what your server is doing.
- os is used to access environment variables such as the port number.
- random will help us generate random words.
- sys allows the script to exit gracefully in case of errors.
- requests lets us fetch data directly from APIs.
- Finally, FastMCP converts our Python functions into tools that can be called via the MCP protocol.
Logging configuration
Logging gives you visibility into what your server is doing. This helps during development and is very important when you deploy your server to production.
name = "demo-mcp-server"
logging.basicConfig(
level=logging.INFO,
format='%(name)s - %(levelname)s - %(message)s',
handlers=[logging.StreamHandler()]
)
logger = logging.getLogger(name)
This configuration prints log messages to the console in a simple format showing the server name, log level, and message. Every time a tool is run, a message will appear in the log like:
demo-mcp-server - INFO - Tool called: add(3, 5)
Creating an MCP Server
Next, we’ll create the server instance that will host our tools.
port = int(os.environ.get('PORT', 8080))
mcp = FastMCP(name, logger=logger, port=port)
The server will run on the port specified by the PORT environment variable. If this variable is not set, it defaults to 8080. The FastMCP object now represents your MCP server.
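The PORT fallback is plain Python, so you can check the pattern on its own. This is a minimal sketch of the same lookup, run outside the server:

```python
import os

# Simulate the server's port lookup: use PORT when set, else default to 8080.
os.environ.pop("PORT", None)           # make sure PORT is unset
default_port = int(os.environ.get("PORT", 8080))

os.environ["PORT"] = "9090"            # a deployment platform would set this
custom_port = int(os.environ.get("PORT", 8080))
```

Deployment platforms typically inject PORT for you, which is why the code reads it instead of hard-coding a value.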
Each function you decorate with @mcp.tool() becomes a tool that clients can call. Let’s start with a simple example: an addition tool.
Example 1: Adding two numbers
@mcp.tool()
def add(a: int, b: int) -> int:
"""Add two numbers"""
logger.info(f"Tool called: add({a}, {b})")
return a + b
This tool takes two numbers, logs the call, and returns their sum. Calling add(3, 5) returns 8.
Although this is simple, it shows the basic structure of every MCP tool: input parameters, logging statement, and return value.
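Because the tool body is ordinary Python, you can sanity-check the logic before registering it with the server. In this sketch the @mcp.tool() decorator is omitted so the function runs standalone:

```python
import logging

logging.basicConfig(level=logging.INFO, format='%(name)s - %(levelname)s - %(message)s')
logger = logging.getLogger("demo-mcp-server")

def add(a: int, b: int) -> int:
    """Add two numbers"""
    logger.info(f"Tool called: add({a}, {b})")
    return a + b

result = add(3, 5)  # returns 8
```

Testing the plain function first makes it easy to isolate logic bugs from protocol or transport issues later.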
Example 2: Returning a random word
Let’s create another tool that returns a random word from a small list.
@mcp.tool()
def get_secret_word() -> str:
"""Get a random secret word"""
logger.info("Tool called: get_secret_word()")
return random.choice(("apple", "banana", "cherry"))
When you call this function, it picks one of the three words at random. Each time you call it, you may get a different result. This demonstrates that MCP tools can use logic or randomness just like any regular Python function.
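Since the tool just wraps random.choice, you can verify standalone that every result comes from the fixed word list (decorator omitted again for this sketch):

```python
import random

WORDS = ("apple", "banana", "cherry")

def get_secret_word() -> str:
    """Get a random secret word"""
    return random.choice(WORDS)

# Every call yields one of the three known words.
samples = [get_secret_word() for _ in range(20)]
```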
Example 3: Fetching weather data
Now let’s do something more practical. We will build a tool that fetches weather data directly from the web using the requests library.
@mcp.tool()
def get_current_weather(city: str) -> str:
"""Get current weather for a city"""
logger.info(f"Tool called: get_current_weather({city})")
try:
endpoint = "https://wttr.in"
response = requests.get(f"{endpoint}/{city}", timeout=10)
response.raise_for_status()
return response.text
except requests.RequestException as e:
logger.error(f"Error fetching weather data: {str(e)}")
return f"Error fetching weather data: {str(e)}"
This tool accepts a city name, sends a request to the public weather service wttr.in, and returns a text-based weather report. If there is a problem, such as a network timeout or an invalid city name, the function logs an error and returns an explanatory message.
Calling get_current_weather("London") will return a brief summary of the weather for that city.
Running the server
Once all your tools are defined, you can start the server. Add the following code to the bottom of your file:
if __name__ == "__main__":
logger.info(f"Starting MCP Server on port {port}...")
try:
mcp.run(transport="sse")
except Exception as e:
logger.error(f"Server error: {str(e)}")
sys.exit(1)
finally:
logger.info("Server terminated")
This block starts the server using the Server-Sent Events (SSE) transport. If something goes wrong, it logs the error and shuts down cleanly.
Now you can run the server from your terminal:
python server.py
If everything is working, you will see:
demo-mcp-server - INFO - Starting MCP Server on port 8080...
Your MCP server is now live and ready to accept requests.
To test your tools, you need an MCP-compliant client such as ChatGPT with developer features or another app that supports the protocol. Once connected, the client will list your available tools.
For example, you can send a request like this:
{
  "tool": "add",
  "args": { "a": 5, "b": 7 }
}
The server will respond with:
{
"result": 12
}
The same applies to other tools such as get_secret_word or get_current_weather.
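The exact wire format depends on your MCP client, but the request and response shapes above are plain JSON. Here is a sketch of how a call might be encoded and dispatched; the envelope keys are illustrative, not a formal part of the MCP spec, while the argument names a and b match the add() signature:

```python
import json

# Encode a tool call the way a client might send it.
request = json.dumps({"tool": "add", "args": {"a": 5, "b": 7}})

# On the server side, decode the payload and dispatch to the matching tool.
call = json.loads(request)
answer = {"result": call["args"]["a"] + call["args"]["b"]}

# Encode the response back to the client.
response = json.dumps(answer)
```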
If you want to test the server directly without an MCP client, you can still send HTTP requests manually (although this bypasses the full protocol logic).
For example, to test your weather tool, you can send a simple get request:
curl http://localhost:8080/tool/get_current_weather?city=London
or in Python:
import requests
response = requests.get("http://localhost:8080/tool/get_current_weather", params={"city": "London"})
print(response.text)
This does not use the MCP structures (e.g. SSE streaming), but it’s a quick sanity check that your server works.
Deploying your MCP server on Sevalla
You can run this server locally for development. But if you want to use it in production applications, you’ll need to deploy it to a server.
You can choose any cloud provider, such as AWS, Heroku, or others, for this project. But I will use Sevalla.
Sevalla is a modern, developer-friendly platform-as-a-service. It offers application hosting, databases, object storage, and static site hosting for your projects.
I’m using Sevalla for hosting for two reasons:
Every platform charges you to create cloud resources. Sevalla comes with a $50 credit, so we won’t incur any costs for this example.
Sevalla has a Python MCP Server template, which simplifies the manual installation and setup of each resource you would otherwise need to provision.
Log in to Sevalla and click on Templates. You will see Python MCP Server as one of the templates.

Click on the “Python MCP Server” template. You will see the resources required to deploy the application. Click on “Deploy Template”.

You can see that the resources are provisioned. If the deployment does not start automatically, click “Deploy Now”.

Wait a few minutes. Once the deployment is complete, you will see a green check mark.

Once the configuration is complete, click “View App”. You will get a cloud URL like https://python-mcp-server-rlfdk.sevalla.app. Use this as the base URL instead of the localhost:8080 URLs.
You now have a production-grade MCP server running on the cloud. You can plug it into any LLM application that needs to fetch data through tools.
Why Build Your Own MCP Server?
Building an MCP server gives you control and flexibility.
You can connect AI models directly to your database or internal systems, automate repetitive steps, and decide what data the AI model can access.
It also allows you to experiment quickly. You can start small with a few simple tools and later expand to complex workflows.
By building your own MCP server, you’re not just writing code—you’re defining how intelligent systems interact with the real world through your logic and data.
Extend the server
Once you’ve mastered the basics, growing your server is easy. You can add tools that read and write files, query databases, communicate with APIs like GitHub or Slack, or monitor your system. Each new function becomes another tool your AI can use.
This modular approach allows you to build an entire ecosystem of AI-aware tools, each performing a specific task but working together through the same MCP interface.
The result
In this tutorial, you learned how to create an MCP server in Python using the FastMCP library. You configured logging, set up the server, defined several tools, and learned how to run and test them. You also saw how easily these tools can expose real functionality, such as fetching live weather data or performing basic calculations.
This structure is simple and powerful. With just a few lines of Python code, you can build a bridge between your system and an intelligent model. The Model Context Protocol represents a step toward AI systems that can truly understand and interact with real-world data and functions.
I hope you enjoyed this article. Sign up for my free newsletter at turingtalks.ai for more tutorials on AI. You can also visit my website.