BigQuery MCP Server: Lightning-fast Analytics, Seamless Scaling

BigQuery MCP Server: Lightning-fast analytics, seamless scaling, and cloud-agnostic power. Your data's new superhero: no capes, just results. 🚀

✨ Research And Data
4.0 (115 reviews)
172 saves
80 comments

Ranked in the top 8% of all AI tools in its category

About BigQuery MCP Server

What is BigQuery MCP Server: Lightning-fast Analytics, Seamless Scaling?

BigQuery MCP Server is a robust middleware solution designed to streamline interaction with Google BigQuery. It enables developers and analysts to execute complex queries, manage datasets, and retrieve table metadata with optimized performance. By abstracting low-level API operations, it provides a secure, scalable interface for integrating BigQuery capabilities into applications. Key features include real-time query execution, dataset enumeration, and automated error handling to ensure reliable data processing.

How to Use BigQuery MCP Server: Lightning-fast Analytics, Seamless Scaling?

Deployment Workflow

  1. Install dependencies via package manager
  2. Configure authentication using ADC or service account keys
  3. Set location and quota limits via CLI arguments
  4. Run server using provided scripts or direct execution

API Interaction

Invoke the server's MCP tools with JSON payloads:

  • Query Execution: query and dry_run_query
  • Dataset Discovery: list_all_datasets
  • Table Metadata: list_all_tables_with_dataset and get_table_information

Client Implementation

Refer to the included Node.js example demonstrating query execution patterns and error handling strategies.
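
The bundled example is examples/sample-query.js; the TypeScript sketch below shows the same pattern, assuming the official MCP SDK (@modelcontextprotocol/sdk). The binary path and project ID are placeholders, and the bundled example's actual contents may differ.

// sample-query.ts -- illustrative sketch, not the bundled example itself
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the server binary over stdio (path and project ID are placeholders).
  const transport = new StdioClientTransport({
    command: "/path/to/dist/bigquery-mcp-server",
    args: ["--project-id", "your-project-id"],
  });

  const client = new Client({ name: "sample-query", version: "1.0.0" });
  await client.connect(transport);

  try {
    // Call the `query` tool with a read-only SELECT statement.
    const result = await client.callTool({
      name: "query",
      arguments: { query: "SELECT 1 AS answer", maxResults: 10 },
    });
    console.log(JSON.stringify(result, null, 2));
  } catch (err) {
    // BigQuery-specific diagnostics are surfaced in the tool error message.
    console.error("Query failed:", err);
  } finally {
    await client.close();
  }
}

main();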

BigQuery MCP Server Features

Key Features of BigQuery MCP Server: Lightning-fast Analytics, Seamless Scaling

Security Framework

Role-based access control with support for roles/bigquery.user and custom IAM roles. Enforces partition filter validation to prevent unintended data scans.

Performance Optimization

Automatic query dry-run analysis, concurrent request handling, and result pagination management. Default 500GB processing limit with configurable overrides.
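
The dry-run analysis described above corresponds to BigQuery's standard dry-run mechanism. A minimal sketch with the @google-cloud/bigquery client follows; the 500GB default mirrors this server's limit, while the pricing figure is an illustrative assumption rather than a value taken from the server.

import { BigQuery } from "@google-cloud/bigquery";

// Validate a query and estimate its scan size before running it.
// Illustrative sketch; not the server's exact implementation.
async function estimateQuery(sql: string, maxBytes = 500_000_000_000) {
  const bigquery = new BigQuery();

  // dryRun: true validates the query and returns statistics without executing it.
  const [job] = await bigquery.createQueryJob({ query: sql, dryRun: true });
  const bytes = Number(job.metadata.statistics.totalBytesProcessed);

  if (bytes > maxBytes) {
    throw new Error(`Query would scan ${bytes} bytes, above the ${maxBytes}-byte limit`);
  }

  // Rough on-demand pricing assumption (~$6.25 per TiB scanned).
  return { bytes, estimatedUsd: (bytes / 2 ** 40) * 6.25 };
}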

Development Agility

Type-safe API with TypeScript definitions, CLI execution wrapper, and detailed error response formatting including BigQuery-specific diagnostics.
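
As an illustration of the type-safe surface, the tool arguments documented later on this page can be expressed roughly as the interfaces below (names are assumptions, not the actual contents of src/types.ts):

// Hypothetical argument shapes for the documented tools.
export interface QueryArgs {
  query: string;        // read-only SELECT statement
  maxResults?: number;  // falls back to the --max-results CLI value
}

export interface ListTablesArgs {
  datasetId: string;
}

export interface TableInfoArgs {
  datasetId: string;
  tableId: string;
  partition?: string;   // e.g. "20250101" for partitioned tables
}

export interface DryRunArgs {
  query: string;
}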

Use Cases of BigQuery MCP Server: Lightning-fast Analytics, Seamless Scaling

Data Analysis Pipelines

Power ETL workflows with automated query execution and result caching

Real-time Dashboards

Enable low-latency data exploration through pre-validated query templates

Automated Reporting

Schedule recurring query executions with metadata tracking and error notifications

Multi-tenancy Solutions

Isolate dataset access using server-side permissions validation

BigQuery MCP Server FAQ

FAQ: BigQuery MCP Server

What permissions are required?

roles/bigquery.jobUser for query execution and roles/bigquery.dataViewer for dataset discovery and table metadata access; alternatively, roles/bigquery.user covers both

How to handle quota errors?

Raise the --max-bytes-billed argument, or adjust your BigQuery quotas and reservations in the Google Cloud Console

Can I customize error responses?

Yes, through middleware hooks in the server configuration

What's the recommended monitoring setup?

Integrate with Cloud Monitoring (formerly Stackdriver) using the built-in metric exporter module

Content

BigQuery MCP Server

A Model Context Protocol (MCP) server for accessing Google BigQuery. This server enables Large Language Models (LLMs) to understand BigQuery dataset structures and execute SQL queries.

Features

Authentication and Connection Management

  • Supports Application Default Credentials (ADC) or service account key files
  • Configurable project ID and location settings
  • Authentication verification on startup
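
A minimal sketch of how client construction and startup verification might look with @google-cloud/bigquery; whether the server does exactly this is an assumption:

import { BigQuery } from "@google-cloud/bigquery";

interface AuthOptions {
  projectId: string;
  location: string;
  keyFile?: string; // omit to fall back to Application Default Credentials
}

// Build a BigQuery client from CLI-style options. Without keyFilename the
// client library resolves Application Default Credentials automatically.
function createBigQueryClient({ projectId, location, keyFile }: AuthOptions): BigQuery {
  return new BigQuery({
    projectId,
    location,
    ...(keyFile ? { keyFilename: keyFile } : {}),
  });
}

// Startup verification: a cheap metadata call that fails fast when
// credentials or permissions are wrong.
async function verifyAuth(client: BigQuery): Promise<void> {
  await client.getDatasets({ maxResults: 1 });
}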

Tools

  1. query
     * Execute read-only (SELECT) BigQuery SQL queries
     * Configurable maximum results and bytes billed
     * Security checks to prevent non-SELECT queries
  2. list_all_datasets
     * List all datasets in the project
     * Returns an array of dataset IDs
  3. list_all_tables_with_dataset
     * List all tables in a specific dataset with their schemas
     * Requires a datasetId parameter
     * Returns table IDs, schemas, time partitioning information, and descriptions
  4. get_table_information
     * Get table schema and sample data (up to 20 rows)
     * Support for partitioned tables with partition filters
     * Warnings for queries on partitioned tables without filters
  5. dry_run_query
     * Check query validity and estimate cost without execution
     * Returns processing size and estimated cost
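
As a concrete illustration, the simplest of these tools, list_all_datasets, could be registered with the MCP TypeScript SDK roughly as follows; the server's actual wiring lives in src/server.ts and src/tools/list-datasets.ts, so treat this as a sketch of the shape rather than the real code:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { BigQuery } from "@google-cloud/bigquery";

const bigquery = new BigQuery({ projectId: "your-project-id" });
const server = new McpServer({ name: "BigQuery", version: "0.1.0" });

// list_all_datasets: no parameters, returns the project's dataset IDs as JSON text.
server.tool("list_all_datasets", async () => {
  const [datasets] = await bigquery.getDatasets();
  const ids = datasets.map((dataset) => dataset.id);
  return { content: [{ type: "text", text: JSON.stringify(ids) }] };
});

// Serve the tool over stdio, as expected by the MCP configuration shown below.
await server.connect(new StdioServerTransport());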

Security Features

  • Only SELECT queries are allowed (read-only access); see the validation sketch after this list
  • Default limit of 500GB for query processing to prevent excessive costs
  • Partition filter recommendations for partitioned tables
  • Secure handling of authentication credentials
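
A minimal sketch of the read-only check; the real validation lives in src/utils/query-utils.ts and may well be stricter, so treat this as an illustration of the idea rather than the server's exact rules:

// Reject anything that is not a plain SELECT (or WITH ... SELECT) statement.
function assertReadOnly(sql: string): void {
  const normalized = sql
    .replace(/--.*$/gm, "")           // strip line comments
    .replace(/\/\*[\s\S]*?\*\//g, "") // strip block comments
    .trim()
    .toUpperCase();

  if (!normalized.startsWith("SELECT") && !normalized.startsWith("WITH")) {
    throw new Error("Only SELECT queries are allowed");
  }

  // Also reject obvious write/DDL keywords embedded in the statement.
  if (/\b(INSERT|UPDATE|DELETE|MERGE|DROP|CREATE|ALTER|TRUNCATE)\b/.test(normalized)) {
    throw new Error("Only read-only queries are allowed");
  }
}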

Installation

Local Installation

# Clone the repository
git clone https://github.com/yourusername/bigquery-mcp-server.git
cd bigquery-mcp-server

# Install dependencies
bun install

# Build the server
bun run build

# Copy the binary to a location of your choice
cp dist/bigquery-mcp-server /path/to/your_place

Docker Installation

You can also run the server in a Docker container:

# Build the Docker image
docker build -t bigquery-mcp-server .

# Run the container
docker run -it --rm \
  bigquery-mcp-server \
  --project-id=your-project-id

Or using Docker Compose:

# Edit docker-compose.yml to set your project ID and other options
# Then run:
docker-compose up

MCP Configuration

To use this server with an MCP-enabled LLM, add it to your MCP configuration:

{
  "mcpServers": {
    "BigQuery": {
      "command": "/path/to/dist/bigquery-mcp-server",
      "args": [
        "--project-id",
        "your-project-id",
        "--location",
        "asia-northeast1",
        "--max-results",
        "1000",
        "--max-bytes-billed",
        "500000000000"
      ],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/service-account-key.json"
      }
    }
  }
}

You can also use Application Default Credentials instead of a service account key file:

{
  "mcpServers": {
    "BigQuery": {
      "command": "/path/to/dist/bigquery-mcp-server",
      "args": [
        "--project-id",
        "your-project-id",
        "--location",
        "asia-northeast1",
        "--max-results",
        "1000",
        "--max-bytes-billed",
        "500000000000"
      ]
    }
  }
}

Setting up Application Default Credentials

To authenticate using Application Default Credentials:

  1. Install the Google Cloud SDK if you haven't already:

# For macOS
brew install --cask google-cloud-sdk

# For other platforms, see: https://cloud.google.com/sdk/docs/install

  2. Run the authentication command:

gcloud auth application-default login

  3. Follow the prompts to log in with your Google account that has access to the BigQuery project.

  4. The credentials will be saved to your local machine and automatically used by the BigQuery MCP server.

Testing

You can use the MCP Inspector for testing and debugging:

npx @modelcontextprotocol/inspector dist/bigquery-mcp-server --project-id={{your_own_project}}

Usage

Using the Helper Script

The included run-server.sh script makes it easy to start the server with common configurations:

# Make the script executable
chmod +x run-server.sh

# Run with Application Default Credentials
./run-server.sh --project-id=your-project-id

# Run with a service account key file
./run-server.sh \
  --project-id=your-project-id \
  --location=asia-northeast1 \
  --key-file=/path/to/service-account-key.json \
  --max-results=1000 \
  --max-bytes-billed=500000000000

Manual Execution

You can also run the compiled binary directly:

# Run with Application Default Credentials
./dist/bigquery-mcp-server --project-id=your-project-id

# Run with a service account key file
./dist/bigquery-mcp-server \
  --project-id=your-project-id \
  --location=asia-northeast1 \
  --key-file=/path/to/service-account-key.json \
  --max-results=1000 \
  --max-bytes-billed=500000000000

Example Client

An example Node.js client is included in the examples directory:

# Make the example executable
chmod +x examples/sample-query.js

# Edit the example to set your project ID
# Then run it
cd examples
./sample-query.js

Command Line Options

  • --project-id: Google Cloud project ID (required)
  • --location: BigQuery location (default: asia-northeast1)
  • --key-file: Path to service account key file (optional)
  • --max-results: Maximum rows to return (default: 1000)
  • --max-bytes-billed: Maximum bytes to process (default: 500000000000, 500GB)
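
These options might be parsed roughly as in the sketch below; the project's actual parser is src/utils/args-parser.ts, and the defaults here simply mirror the list above:

export interface ServerOptions {
  projectId: string;
  location: string;
  keyFile?: string;
  maxResults: number;
  maxBytesBilled: number;
}

// Parse `--flag value` and `--flag=value` forms from process.argv.
// Simplified sketch, not the project's actual parser.
export function parseArgs(argv: string[] = process.argv.slice(2)): ServerOptions {
  const values = new Map<string, string>();
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (!arg.startsWith("--")) continue;
    const eq = arg.indexOf("=");
    if (eq !== -1) {
      values.set(arg.slice(2, eq), arg.slice(eq + 1));
    } else {
      values.set(arg.slice(2), argv[++i] ?? "");
    }
  }

  const projectId = values.get("project-id");
  if (!projectId) throw new Error("--project-id is required");

  return {
    projectId,
    location: values.get("location") ?? "asia-northeast1",
    keyFile: values.get("key-file"),
    maxResults: Number(values.get("max-results") ?? 1000),
    maxBytesBilled: Number(values.get("max-bytes-billed") ?? 500_000_000_000),
  };
}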

Required Permissions

The service account or user credentials should have one of the following:

  • roles/bigquery.user (recommended)

Or both of these:

  • roles/bigquery.dataViewer (for reading table data)
  • roles/bigquery.jobUser (for executing queries)

Example Usage

Query Tool

{
  "query": "SELECT * FROM `project.dataset.table` LIMIT 10",
  "maxResults": 100
}

List All Datasets Tool

// No parameters required

List All Tables With Dataset Tool

{
  "datasetId": "your_dataset"
}

Get Table Information Tool

{
  "datasetId": "your_dataset",
  "tableId": "your_table",
  "partition": "20250101"
}

Dry Run Query Tool

{
  "query": "SELECT * FROM `project.dataset.table` WHERE date = '2025-01-01'"
}

Error Handling

The server provides detailed error messages for the following; a formatting sketch follows the list:

  • Authentication failures
  • Permission issues
  • Invalid queries
  • Missing partition filters
  • Excessive data processing requests
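
One way such failures can be surfaced as MCP tool results is sketched below; the numeric code property is how the Node.js BigQuery client reports HTTP-style API errors, and the wording is illustrative rather than the server's exact output:

// Convert a thrown BigQuery/auth error into an MCP tool result with isError set.
function toToolError(err: unknown) {
  let message = "Unknown error";

  if (err instanceof Error) {
    message = err.message;
    // The Node.js BigQuery client attaches an HTTP-style numeric `code` to API errors.
    const code = (err as { code?: number }).code;
    if (code === 401 || code === 403) {
      message = `Authentication or permission issue: ${message}`;
    } else if (code === 400) {
      message = `Invalid query: ${message}`;
    }
  }

  return { isError: true, content: [{ type: "text", text: message }] };
}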

Code Structure

The server is organized into the following structure:

src/
├── index.ts              # Entry point
├── server.ts             # BigQueryMcpServer class
├── types.ts              # Type definitions
├── tools/                # Tool implementations
│   ├── query.ts          # query tool
│   ├── list-datasets.ts  # list_all_datasets tool
│   ├── list-tables.ts    # list_all_tables_with_dataset tool
│   ├── table-info.ts     # get_table_information tool
│   └── dry-run.ts        # dry_run_query tool
└── utils/                # Utility functions
    ├── args-parser.ts    # Command line argument parser
    └── query-utils.ts    # Query validation and response formatting

License

MIT
