
MCP Proxy: Slash Dev Time, Simplify API Workflows

Slash dev time and complexity: MCP Proxy dynamically bridges AI agents to APIs via standardized tools, turning OpenAPI specs into seamless workflows.


About MCP Proxy

What is MCP Proxy?

MCP Proxy is an open-source middleware that streamlines integrating APIs with LLM clients such as Cursor, Windsurf, and Claude Desktop. It automates configuration workflows, handles parameter type conversion, and provides dry-run capabilities, reducing development time by an estimated 50-70%. By parsing OpenAPI specifications, it eliminates manual setup steps while remaining compatible with major LLM orchestrators.

Key Features of MCP Proxy

  • Automatic Type Conversion: Converts string inputs to integers, floats, booleans, and dates based on API specifications
  • Built-in Workflow Tools: Preconfigured methods for server initialization, tool discovery, and parameter validation
  • Dry-Run Mode: Simulate API calls without execution to validate parameters and endpoints
  • Cross-Platform Compatibility: Works with Cursor, Windsurf, and Claude Desktop via standardized configuration profiles
  • Environment Abstraction: Centralized management of API credentials and server parameters through env vars


How to Use MCP Proxy

  1. Configure environment variables with your OpenAPI endpoint and server name
  2. Launch the proxy server in a Python runtime environment
  3. Register the server in your LLM orchestrator's MCP configuration file
  4. Call APIs through standardized JSON-RPC 2.0 interfaces with automatic parameter handling
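Steps 1 and 2 can be sketched from Python; the spec URL and server name below are the Petstore values used later in this document, and the launch command is illustrative:

```python
import os

# Step 1: configure the proxy via environment variables
# (names from the Environment Configuration table in this document).
env = dict(os.environ)
env["OPENAPI_URL"] = "https://petstore3.swagger.io/api/v3/openapi.json"
env["SERVER_NAME"] = "petstore3"

# Step 2: launch the proxy server (path is illustrative); steps 3-4
# happen in the orchestrator's MCP config file and at call time.
# subprocess.run(["python", "src/server.py"], env=env)
```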

Advanced usage includes query string injection and parameter overrides for complex API requirements.

Use Cases of MCP Proxy

  • Rapid API prototyping during development sprints
  • Automating repetitive configuration tasks for multiple API endpoints
  • Creating sandbox environments for API testing without production impact
  • Integrating legacy systems into modern LLM-driven workflows


FAQ: MCP Proxy Best Practices

Q: How does parameter conversion work?
A: The proxy analyzes OpenAPI schema definitions and applies the appropriate conversion (e.g., "true" → boolean, "2023-09-01" → date).

Q: Can I use custom error handling?
A: Yes, through extended JSON response formats in your API definitions.

Q: What logging capabilities exist?
A: Built-in request/response logging with optional verbosity levels for debugging.

Full documentation and source code are available on GitHub.


OpenAPI to Model Context Protocol (MCP)

License: MIT

OpenAPI-MCP

The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools. This simplifies integration, eliminating the need for custom API wrappers.


Why MCP?

The Model Context Protocol (MCP), developed by Anthropic, standardizes communication between Large Language Models (LLMs) and external tools. By acting as a universal adapter, MCP enables AI agents to interface with external APIs seamlessly.


Key Features

  • OpenAPI Integration: Parses and registers OpenAPI operations as callable tools.
  • OAuth2 Support: Handles machine authentication via Client Credentials flow.
  • Dry Run Mode: Simulates API calls without execution for inspection.
  • JSON-RPC 2.0 Support: Fully compliant request/response structure.
  • Auto Metadata: Derives tool names, summaries, and schemas from OpenAPI.
  • Sanitized Tool Names: Ensures compatibility with MCP name constraints.
  • Query String Parsing: Supports direct passing of query parameters as a string.
  • Enhanced Parameter Handling: Automatically converts parameters to correct data types.
  • Extended Tool Metadata: Includes detailed parameter information for better LLM understanding.
  • FastMCP Transport: Optimized for stdio, works out-of-the-box with agents.
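As one concrete example of the metadata handling above, tool-name sanitization might look like this sketch; the exact character rules the proxy enforces are an assumption here (alphanumerics, underscores, and hyphens, which typical MCP clients accept):

```python
import re

def sanitize_tool_name(operation_id: str) -> str:
    # Replace any character outside alphanumerics/underscore/hyphen,
    # so an OpenAPI operationId becomes a valid MCP tool name.
    return re.sub(r"[^A-Za-z0-9_-]", "_", operation_id)

print(sanitize_tool_name("pets.get/by-id"))  # pets_get_by-id
```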

Quick Start

Installation

git clone https://github.com/gujord/OpenAPI-MCP.git
cd OpenAPI-MCP
pip install -r requirements.txt

Environment Configuration

| Variable            | Description                      | Required | Default              |
|---------------------|----------------------------------|----------|----------------------|
| OPENAPI_URL         | URL to the OpenAPI specification | Yes      | -                    |
| SERVER_NAME         | MCP server name                  | No       | openapi_proxy_server |
| OAUTH_CLIENT_ID     | OAuth client ID                  | No       | -                    |
| OAUTH_CLIENT_SECRET | OAuth client secret              | No       | -                    |
| OAUTH_TOKEN_URL     | OAuth token endpoint URL         | No       | -                    |
| OAUTH_SCOPE         | OAuth scope                      | No       | api                  |
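A sketch of how these settings could be resolved, using the variable names and defaults from the table above (the proxy's actual loader may differ):

```python
import os

def load_config() -> dict:
    # OPENAPI_URL is the only required setting; everything else
    # falls back to the documented default or None.
    openapi_url = os.environ.get("OPENAPI_URL")
    if not openapi_url:
        raise RuntimeError("OPENAPI_URL is required")
    return {
        "openapi_url": openapi_url,
        "server_name": os.environ.get("SERVER_NAME", "openapi_proxy_server"),
        "oauth_client_id": os.environ.get("OAUTH_CLIENT_ID"),
        "oauth_client_secret": os.environ.get("OAUTH_CLIENT_SECRET"),
        "oauth_token_url": os.environ.get("OAUTH_TOKEN_URL"),
        "oauth_scope": os.environ.get("OAUTH_SCOPE", "api"),
    }
```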

How It Works

  1. Parses the OpenAPI spec using httpx, with PyYAML for YAML-formatted specs.
  2. Extracts operations and generates MCP-compatible tools with proper names.
  3. Authenticates using OAuth2 (if credentials are present).
  4. Builds input schemas based on OpenAPI parameter definitions.
  5. Handles calls via JSON-RPC 2.0 protocol with automatic error responses.
  6. Supports extended parameter information for improved LLM understanding.
  7. Handles query string parsing for easier parameter passing.
  8. Performs automatic type conversion based on OpenAPI schema definitions.
  9. Supports dry_run to inspect outgoing requests without invoking them.
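Step 4 (schema building) can be sketched like this, using OpenAPI 3 parameter objects; this is a simplification of whatever the proxy actually emits:

```python
def build_input_schema(parameters: list[dict]) -> dict:
    """Turn OpenAPI parameter definitions into a JSON Schema object
    suitable for an MCP tool's input schema."""
    schema = {"type": "object", "properties": {}, "required": []}
    for param in parameters:
        name = param["name"]
        schema["properties"][name] = {
            "type": param.get("schema", {}).get("type", "string"),
            "description": param.get("description", ""),
        }
        if param.get("required"):
            schema["required"].append(name)
    return schema

# Illustrative parameters in the style of the Petstore spec.
params = [
    {"name": "status", "in": "query", "required": True,
     "schema": {"type": "string"}, "description": "Pet status"},
    {"name": "limit", "in": "query", "schema": {"type": "integer"}},
]
schema = build_input_schema(params)
```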
The communication flow, as a Mermaid sequence diagram:

sequenceDiagram
    participant LLM as LLM (Claude/GPT)
    participant MCP as OpenAPI-MCP Proxy
    participant API as External API

    Note over LLM, API: Communication Process
    
    LLM->>MCP: 1. Initialize (initialize)
    MCP-->>LLM: Metadata and tool list
    
    LLM->>MCP: 2. Request tools (tools_list)
    MCP-->>LLM: Detailed tool list from OpenAPI specification
    
    LLM->>MCP: 3. Call tool (tools_call)
    
    alt With OAuth2
        MCP->>API: Request OAuth2 token
        API-->>MCP: Access Token
    end
    
    MCP->>API: 4. Execute API call with proper formatting
    API-->>MCP: 5. API response (JSON)
    
    alt Type Conversion
        MCP->>MCP: 6. Convert parameters to correct data types
    end
    
    MCP-->>LLM: 7. Formatted response from API
    
    alt Dry Run Mode
        LLM->>MCP: Call with dry_run=true
        MCP-->>LLM: Display request information without executing call
    end

Built-in Tools

These tools are always available:

  • initialize – Returns server metadata and protocol version.
  • tools_list – Lists all registered tools (from OpenAPI and built-in) with extended metadata.
  • tools_call – Calls any tool by name with arguments.

Advanced Usage

Query String Passing

You can pass query parameters as a string in the kwargs parameter:

{
  "jsonrpc": "2.0",
  "method": "tools_call",
  "params": {
    "name": "get_pets",
    "arguments": {
      "kwargs": "status=available&limit=10"
    }
  },
  "id": 1
}
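A sketch of how such a kwargs string can be expanded into named arguments, using Python's standard urllib.parse (the proxy's own parsing may differ):

```python
from urllib.parse import parse_qs

def parse_kwargs(kwargs: str) -> dict:
    # parse_qs returns lists of values; unwrap single-valued keys.
    return {k: v[0] if len(v) == 1 else v
            for k, v in parse_qs(kwargs).items()}

print(parse_kwargs("status=available&limit=10"))
# {'status': 'available', 'limit': '10'}
```

Note that the values arrive as strings, which is where the automatic type conversion described next comes in.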

Parameter Type Conversion

The server automatically converts parameter values to the appropriate type based on the OpenAPI specification:

  • String parameters remain as strings
  • Integer parameters are converted using int()
  • Number parameters are converted using float()
  • Boolean parameters are converted from strings like "true", "1", "yes", "y" to True
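These rules can be sketched as a single conversion helper; the truthy-string set comes from the list above, while treating every other string as False is an assumption:

```python
TRUTHY = {"true", "1", "yes", "y"}

def convert(value: str, openapi_type: str):
    """Convert a string argument to the type named by the OpenAPI schema."""
    if openapi_type == "integer":
        return int(value)
    if openapi_type == "number":
        return float(value)
    if openapi_type == "boolean":
        # Anything outside TRUTHY becomes False (assumed behavior).
        return value.strip().lower() in TRUTHY
    return value  # strings pass through unchanged
```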

LLM Orchestrator Configuration

Cursor (~/.cursor/mcp.json)

{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}


Windsurf (~/.codeium/windsurf/mcp_config.json)

{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}


Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json)

{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}

Contributing

  • Fork this repo
  • Create a new branch
  • Submit a pull request with a clear description

License

MIT License


If you find it useful, give it a ⭐ on GitHub!
