Petstore3: Premium Care & Unmatched Love - MCP Implementation

Petstore3: Where premium pet care meets unmatched love. Your furry friend’s joy is our mission. Shop smart, spoil right. 🐾


About Petstore3

What is Petstore3: Premium Care & Unmatched Love?

Petstore3 is an MCP-based bridge between pet care services and AI agents. Using the Model Context Protocol (MCP), it dynamically translates a pet care API's OpenAPI operations into standardized tools, so AI agents can interact with veterinary systems without custom API wrappers or manual API configuration, while keeping pet welfare and user experience front and center.

How to Use Petstore3: Premium Care & Unmatched Love?

Begin by configuring the MCP proxy server with your pet care API specifications:

  1. Clone the repository and install dependencies
  2. Set environment variables for your OpenAPI endpoint and OAuth credentials
  3. Deploy the server via your preferred LLM orchestrator (Cursor/Windsurf/Claude Desktop)
  4. Interact through standardized tool calls for pet records, medication tracking, and emergency services

Use dry_run mode to preview requests before execution, ensuring precision in critical care scenarios.
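
For example, a dry-run request might look like the sketch below. The getPetById tool name and petId argument are illustrative (actual tool names are derived from the operations in your OpenAPI specification), and placing dry_run alongside the other arguments is an assumption about how the flag is passed:

{
  "jsonrpc": "2.0",
  "method": "tools_call",
  "params": {
    "name": "getPetById",
    "arguments": {
      "petId": 42,
      "dry_run": true
    }
  },
  "id": 1
}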

Petstore3 Features

Key Features of Petstore3: Premium Care & Unmatched Love

  • Automated Compliance: Converts pet health data formats to match veterinary API requirements
  • Secure Authentication: OAuth2 support for HIPAA-compliant access to sensitive pet records
  • Smart Parameter Handling: Automatically validates and sanitizes medication dosages and treatment schedules
  • Emergency Protocols: Pre-configured tools for urgent care workflows with priority routing
  • User-Friendly Queries: Pass vet notes or symptom descriptions directly as query strings

Use Cases of Petstore3: Premium Care & Unmatched Love

Power intelligent veterinary workflows such as:

  • Real-time allergy tracking across multiple pet databases
  • Automated prescription refills with dosage validation
  • Emergency care coordination between clinics and pet owners
  • Pet behavioral analysis through activity API integration

Example: A vet can instantly query vaccination histories across clinics using tools_call, as sketched below.
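
A minimal sketch of such a call, assuming a hypothetical get_vaccination_records tool generated from a clinic's OpenAPI spec (the tool name and pet_id parameter are illustrative, not part of the Swagger Petstore sample):

{
  "jsonrpc": "2.0",
  "method": "tools_call",
  "params": {
    "name": "get_vaccination_records",
    "arguments": {
      "pet_id": 42
    }
  },
  "id": 2
}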

Petstore3 FAQ

FAQ for Petstore3: Premium Care & Unmatched Love

How does Petstore3 ensure data security?
It authenticates with OAuth2 client (machine) credentials and encrypts sensitive pet health information during transmission.

Can I test requests without affecting real pets?
Yes. Enable dry_run to simulate API calls for training or validation.

What if my API uses non-standard parameters?
Automatic type conversion resolves data format mismatches based on the OpenAPI definitions.

Does this work with existing vet systems?
Yes. It is compatible with any API that exposes an OpenAPI 3.0 specification.


OpenAPI to Model Context Protocol (MCP)


OpenAPI-MCP

The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools. This simplifies integration, eliminating the need for custom API wrappers.


Why MCP?

The Model Context Protocol (MCP), developed by Anthropic, standardizes communication between Large Language Models (LLMs) and external tools. By acting as a universal adapter, MCP enables AI agents to interface with external APIs seamlessly.


Key Features

  • OpenAPI Integration: Parses and registers OpenAPI operations as callable tools.
  • OAuth2 Support: Handles machine authentication via Client Credentials flow.
  • Dry Run Mode: Simulates API calls without execution for inspection.
  • JSON-RPC 2.0 Support: Fully compliant request/response structure.
  • Auto Metadata: Derives tool names, summaries, and schemas from OpenAPI.
  • Sanitized Tool Names: Ensures compatibility with MCP name constraints.
  • Query String Parsing: Supports direct passing of query parameters as a string.
  • Enhanced Parameter Handling: Automatically converts parameters to correct data types.
  • Extended Tool Metadata: Includes detailed parameter information for better LLM understanding.
  • FastMCP Transport: Optimized for stdio, works out-of-the-box with agents.

Quick Start

Installation

git clone https://github.com/gujord/OpenAPI-MCP.git
cd OpenAPI-MCP
pip install -r requirements.txt

Environment Configuration

  • OPENAPI_URL – URL to the OpenAPI specification (required)
  • SERVER_NAME – MCP server name (optional; default: openapi_proxy_server)
  • OAUTH_CLIENT_ID – OAuth client ID (optional)
  • OAUTH_CLIENT_SECRET – OAuth client secret (optional)
  • OAUTH_TOKEN_URL – OAuth token endpoint URL (optional)
  • OAUTH_SCOPE – OAuth scope (optional; default: api)
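
With dependencies installed, a minimal way to start the proxy against the Swagger Petstore spec is sketched below (assuming your virtual environment or system Python has the requirements installed; the server speaks JSON-RPC over stdio, so in practice it is usually launched by your LLM orchestrator, as in the configs further below):

export SERVER_NAME="petstore3"
export OPENAPI_URL="https://petstore3.swagger.io/api/v3/openapi.json"
python src/server.py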

How It Works

  1. Parses the OpenAPI specification, fetching it with httpx and parsing YAML specs with PyYAML when needed.
  2. Extracts operations and generates MCP-compatible tools with proper names.
  3. Authenticates using OAuth2 (if credentials are present).
  4. Builds input schemas based on OpenAPI parameter definitions.
  5. Handles calls via JSON-RPC 2.0 protocol with automatic error responses.
  6. Supports extended parameter information for improved LLM understanding.
  7. Handles query string parsing for easier parameter passing.
  8. Performs automatic type conversion based on OpenAPI schema definitions.
  9. Supports dry_run to inspect outgoing requests without invoking them.

The following sequence diagram (Mermaid source) illustrates the communication flow:

sequenceDiagram
    participant LLM as LLM (Claude/GPT)
    participant MCP as OpenAPI-MCP Proxy
    participant API as External API

    Note over LLM, API: Communication Process
    
    LLM->>MCP: 1. Initialize (initialize)
    MCP-->>LLM: Metadata and tool list
    
    LLM->>MCP: 2. Request tools (tools_list)
    MCP-->>LLM: Detailed tool list from OpenAPI specification
    
    LLM->>MCP: 3. Call tool (tools_call)
    
    alt With OAuth2
        MCP->>API: Request OAuth2 token
        API-->>MCP: Access Token
    end
    
    MCP->>API: 4. Execute API call with proper formatting
    API-->>MCP: 5. API response (JSON)
    
    alt Type Conversion
        MCP->>MCP: 6. Convert parameters to correct data types
    end
    
    MCP-->>LLM: 7. Formatted response from API
    
    alt Dry Run Mode
        LLM->>MCP: Call with dry_run=true
        MCP-->>LLM: Display request information without executing call
    end

Built-in Tools

These tools are always available:

  • initialize – Returns server metadata and protocol version.
  • tools_list – Lists all registered tools (from OpenAPI and built-in) with extended metadata.
  • tools_call – Calls any tool by name with arguments.
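
For example, listing the registered tools is a plain JSON-RPC 2.0 request (the empty params object is an assumption; the response returns the extended tool metadata in its result field):

{
  "jsonrpc": "2.0",
  "method": "tools_list",
  "params": {},
  "id": 1
}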

Advanced Usage

Query String Passing

You can pass query parameters as a string in the kwargs parameter:

{
  "jsonrpc": "2.0",
  "method": "tools_call",
  "params": {
    "name": "get_pets",
    "arguments": {
      "kwargs": "status=available&limit=10"
    }
  },
  "id": 1
}

Parameter Type Conversion

The server automatically converts parameter values to the appropriate type based on the OpenAPI specification:

  • String parameters remain as strings
  • Integer parameters are converted using int()
  • Number parameters are converted using float()
  • Boolean parameters are converted from strings like "true", "1", "yes", "y" to True
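
As a simplified illustration of that behavior (not the project's actual code; the real server derives the target type from the parsed OpenAPI schema), the coercion works roughly like this:

def convert_value(value, schema_type):
    """Coerce a raw (often string) argument to the type declared in the OpenAPI schema."""
    if schema_type == "integer":
        return int(value)
    if schema_type == "number":
        return float(value)
    if schema_type == "boolean":
        # Accept the common truthy spellings listed above
        return str(value).strip().lower() in ("true", "1", "yes", "y")
    return value  # string parameters (and unknown types) pass through unchanged

# e.g. convert_value("10", "integer") -> 10, convert_value("yes", "boolean") -> True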

LLM Orchestrator Configuration

Cursor (~/.cursor/mcp.json)

{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}


Windsurf (~/.codeium/windsurf/mcp_config.json)

{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}


Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json)

{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}

Contributing

  • Fork this repo
  • Create a new branch
  • Submit a pull request with a clear description

License

MIT License


If you find it useful, give it a ⭐ on GitHub!
