MCP Multi-Server Demo: Real-Time Streaming & Scalable Architecture

Explore MCP's multi-server architecture with real-time stdio/SSE integration and LangChain-MCP, enabling seamless scaling and high-performance async data streaming for enterprise workloads.


About MCP Multi-Server Demo

What is MCP Multi-Server Demo: Real-Time Streaming & Scalable Architecture?

This demo showcases the Model Context Protocol (MCP) implementation across multiple servers using different transport mechanisms. It combines a math service (via stdio) and a weather service (via Server-Sent Events/SSE) to demonstrate real-time inter-server communication and scalable system design. The solution integrates with LangChain agents to enable hybrid tool utilization from both backend services.

How to use MCP Multi-Server Demo: Real-Time Streaming & Scalable Architecture?

  1. Install dependencies using pip install -r requirements.txt
  2. Configure OpenAI API key in .env
  3. Run python main.py to start the agent with pre-configured servers
  4. Test queries like "what's (3 + 5) x 12?" for math operations or "what is the weather in nyc?" for simulated weather data

Full source code is available in the project repository.

MCP Multi-Server Demo Features

Key Features of MCP Multi-Server Demo: Real-Time Streaming & Scalable Architecture

  • Hybrid transport protocols: stdio for batch processing + SSE for real-time updates
  • Modular server architecture allowing independent scaling
  • LangChain agent integration for cross-service decision making
  • Simulated weather data with optional real-world API replacement
  • Standardized MCP protocol adherence for future expansion

Use Cases of MCP Multi-Server Demo: Real-Time Streaming & Scalable Architecture

Applications include:

  • Real-time data processing pipelines combining multiple services
  • AI agents requiring access to both static and dynamic data sources
  • API gateway implementations with protocol-agnostic backend communication
  • Education/training platforms demonstrating microservices communication patterns

MCP Multi-Server Demo FAQ

FAQ for MCP Multi-Server Demo: Real-Time Streaming & Scalable Architecture

Why use both stdio and SSE?

stdio handles stateless batch operations, while SSE enables real-time data streaming, demonstrating MCP's protocol flexibility.

Can I add more servers?

Yes. The architecture supports adding new services through MCP client registration and protocol implementation.
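For instance, registering a hypothetical third service amounts to adding one entry to the client's connection map. The "stock" entry below is purely illustrative, not part of the demo:

```python
# Connection map passed to the MCP client. The "math" and "weather" entries
# match the demo's two servers; "stock" is a hypothetical third service.
connections = {
    "math": {"command": "python", "args": ["math_server.py"], "transport": "stdio"},
    "weather": {"url": "http://localhost:8000/sse", "transport": "sse"},
    "stock": {"url": "http://localhost:8001/sse", "transport": "sse"},  # hypothetical
}
```

Each new server only needs to speak MCP over one of the supported transports; the client merges its tools with those of the existing servers.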

What dependencies are required?

Python 3.8+, an OpenAI API key, and the packages listed in requirements.txt.

How does real-time streaming work?

The weather server pushes updates over HTTP's text/event-stream protocol, and the MCP client's SSE handler processes them.

What error handling exists?

Basic connection retries are implemented, with error logging for failed server communications.

Content

MCP Multi-Server Demo with SSE Transport

This project demonstrates how to use the Model Context Protocol (MCP) with multiple servers using different transport methods (stdio and Server-Sent Events).

It is based on examples from: https://github.com/langchain-ai/langchain-mcp-adapters

Overview

The project consists of:

  1. A math server that provides basic arithmetic operations (add, multiply)
  2. A weather server that provides simulated weather information for different locations
  3. A main application that connects to both servers using the MultiServerMCPClient
  4. Integration with LangChain and OpenAI to create an agent that can use tools from both servers
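Put together, the components above can be sketched as a minimal main.py. This is a sketch under assumptions, not the demo's exact code: the model name is illustrative, and the langchain-mcp-adapters client API has changed between releases (older versions used MultiServerMCPClient as an async context manager).

```python
import asyncio

# Connection map for the two demo servers; the weather URL assumes the
# SSE server is already listening on port 8000.
SERVERS = {
    "math": {"command": "python", "args": ["math_server.py"], "transport": "stdio"},
    "weather": {"url": "http://localhost:8000/sse", "transport": "sse"},
}

async def main():
    # Third-party imports kept inside main() so the module can be read
    # without the MCP/LangChain packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(SERVERS)
    tools = await client.get_tools()  # tools from both servers, merged
    agent = create_react_agent("openai:gpt-4o", tools)  # model name is illustrative
    reply = await agent.ainvoke({"messages": "what's (3 + 5) x 12?"})
    print(reply["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```

The agent sees one flat tool list, so it can route a math question to the stdio server and a weather question to the SSE server without knowing which transport backs each tool.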

Requirements

  • Python 3.8+
  • OpenAI API key

Installation

  1. Clone the repository:

    git clone https://github.com/yourusername/mcp-sse.git
    cd mcp-sse

  2. Install the required dependencies:

    pip install -r requirements.txt

  3. Set up your OpenAI API key in a .env file:

    OPENAI_API_KEY=your-api-key-here

Project Structure

  • main.py: The main application that connects to both servers and runs the agent
  • math_server.py: A simple MCP server that provides math operations using stdio transport
  • weather_server.py: A simple MCP server that provides weather information using SSE transport
  • requirements.txt: List of Python dependencies
  • .env: Environment variables (contains your OpenAI API key)

How It Works

The application demonstrates how to use the MultiServerMCPClient to connect to multiple MCP servers with different transport methods:

  1. The math server uses stdio transport, which is a simple pipe-based communication method
  2. The weather server uses Server-Sent Events (SSE) transport, which is an HTTP-based protocol for server-to-client push notifications

The main application:

  1. Starts the weather server as a separate process
  2. Connects to both servers using the MultiServerMCPClient
  3. Creates a LangChain agent that can use tools from both servers
  4. Demonstrates using the agent to perform math calculations and get weather information
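The first step, launching the weather server as a separate process, can be sketched with the standard library (the helper name and the crude one-second readiness wait are assumptions, not the demo's exact code):

```python
import subprocess
import sys
import time

def start_server(script: str) -> subprocess.Popen:
    """Launch a server script (e.g. weather_server.py) as a child process."""
    proc = subprocess.Popen([sys.executable, script])
    time.sleep(1.0)  # crude wait for the SSE endpoint to come up
    return proc
```

main.py can then start the weather server before connecting and call proc.terminate() on shutdown; polling the SSE endpoint until it responds would be more robust than a fixed sleep.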

Usage

Run the main application:

    python main.py

This will:

  1. Start the weather server on port 8000
  2. Connect to both the math and weather servers
  3. Run the agent with example queries for both math and weather

Example Queries

The demo includes two example queries:

  1. Math query: "what's (3 + 5) x 12?"
  2. Weather query: "what is the weather in nyc?"

Extending the Project

You can extend this project by:

  1. Adding more tools to the math or weather servers
  2. Creating additional MCP servers with different functionality
  3. Modifying the agent to handle more complex queries
  4. Connecting to real weather APIs instead of using simulated data

License

MIT

Acknowledgments

This project is based on examples from the langchain-mcp-adapters repository.
