A fully compliant Model Context Protocol (MCP) server that surfaces Google BigQuery functionality to LLM agents and other MCP clients.
- Implements MCP specification rev 2025-03-26
- Supports stdio transport (default) and optional HTTP transport
- Exposes BigQuery operations through MCP Tools
- Includes direct stdio implementation optimized for Claude Desktop
- Supports pagination for long result sets
- Implements logging utilities
- Handles errors according to JSON-RPC standards
- Supports environment variable configuration for project ID and location
- Optimized INFORMATION_SCHEMA query handling
- Docker support for easy deployment
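To illustrate the JSON-RPC error handling: a failed request is answered with a standard JSON-RPC 2.0 error object. A minimal sketch — the error code and message below are illustrative placeholders, not the server's actual values:

```python
import json

# A JSON-RPC 2.0 error response, as required for failed requests.
# The code/message are illustrative, not the server's actual values.
error_response = {
    "jsonrpc": "2.0",
    "id": 42,  # echoes the id of the failing request
    "error": {
        "code": -32602,  # standard JSON-RPC "Invalid params" code
        "message": "Invalid params: 'query' must be a non-empty string",
    },
}

print(json.dumps(error_response, indent=2))
```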
```bash
# Install from source
git clone https://github.com/haginot/bigquery-mcp.git
cd bigquery-mcp
pip install .

# Or using Poetry
poetry install
```

The server uses Google Cloud authentication. You need to set up authentication credentials:
```bash
# Set the environment variable to your service account key file
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/service-account-key.json

# Or use gcloud to authenticate
gcloud auth application-default login
```

```bash
# Start with stdio transport (default)
mcp-bigquery-server

# Start with HTTP transport
mcp-bigquery-server --http --port 8000

# Enable resource exposure
mcp-bigquery-server --expose-resources

# Set query timeout
mcp-bigquery-server --query-timeout-ms 60000
```

The server can also be started from Python:

```python
from mcp_bigquery_server.server import BigQueryMCPServer

# Create and start the server
server = BigQueryMCPServer(
    expose_resources=True,
    http_enabled=True,
    host="localhost",
    port=8000,
    query_timeout_ms=30000,
)
server.start()
```

You can run the MCP server in a Docker container:
```bash
# Build the Docker image
docker build -t mcp-bigquery-server .

# Run with stdio transport (for use with Claude Desktop)
docker run -i --rm \
  -v /path/to/credentials:/credentials \
  -e GOOGLE_APPLICATION_CREDENTIALS=/credentials/service-account-key.json \
  -e PROJECT_ID=your-project-id \
  -e LOCATION=US \
  mcp-bigquery-server --stdio

# Run with HTTP transport
docker run -p 8000:8000 --rm \
  -v /path/to/credentials:/credentials \
  -e GOOGLE_APPLICATION_CREDENTIALS=/credentials/service-account-key.json \
  -e PROJECT_ID=your-project-id \
  -e LOCATION=US \
  mcp-bigquery-server
```

You can also use docker-compose:
```bash
# Create a credentials directory and copy your service account key
mkdir -p credentials
cp /path/to/your/service-account-key.json credentials/service-account-key.json

# Start the HTTP server with docker-compose
docker-compose up

# Start the stdio server with docker-compose (for Claude Desktop)
docker-compose up mcp-bigquery-server-stdio
```

To use this MCP server with Claude Desktop:
1. Run the server with stdio transport:

   ```bash
   # Using Docker directly
   docker run -i --rm \
     -v /path/to/credentials:/credentials \
     -e GOOGLE_APPLICATION_CREDENTIALS=/credentials/service-account-key.json \
     -e PROJECT_ID=your-project-id \
     -e LOCATION=US \
     mcp-bigquery-server --stdio

   # Or using docker-compose (recommended)
   docker-compose up mcp-bigquery-server-stdio
   ```

2. In Claude Desktop:

   - Go to Settings > Tools
   - Select "Add Tool" > "Add MCP Tool"
   - Choose "Connect to running MCP server"
   - Select "stdio" as the transport
   - Click "Connect" and select the terminal running your Docker container

Claude will now have access to all BigQuery operations through the MCP server.
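Once connected, a client such as Claude Desktop discovers the server's tools via the MCP tools/list method. A sketch of the JSON-RPC request involved (the id is arbitrary):

```python
import json

# JSON-RPC 2.0 request an MCP client sends to enumerate the server's tools.
# "tools/list" is the standard MCP method name.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}

wire_message = json.dumps(list_request)
print(wire_message)
# The server answers with a result whose "tools" array describes each
# tool's name, description, and input schema.
```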
The server exposes the following BigQuery operations as MCP tools:
- execute_query: Submit a SQL query to BigQuery, optionally as a dry run
- execute_query_with_results: Submit a SQL query to BigQuery and return results immediately
- get_job_status: Poll job execution state
- cancel_job: Cancel a running BigQuery job
- fetch_results_chunk: Page through query results
- list_datasets: Enumerate datasets visible to the service account
- get_table_schema: Retrieve the schema for a table
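Invoking one of these tools is a standard MCP tools/call request. A sketch for execute_query — the argument names "query" and "dry_run" are assumptions for illustration; check the tool's input schema for the actual parameters:

```python
import json

# JSON-RPC "tools/call" request invoking the execute_query tool.
# The argument names ("query", "dry_run") are assumed for illustration.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "execute_query",
        "arguments": {
            "query": "SELECT 1 AS x",
            "dry_run": True,
        },
    },
}

body = json.dumps(call_request).encode("utf-8")
# With the HTTP transport enabled, this body would be POSTed to the
# server (e.g. on localhost:8000, per the examples above).
print(body.decode("utf-8"))
```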
When enabled with --expose-resources, the server exposes:
- Dataset & table schemas as read-only resources (bq://<project>/<dataset>/schema)
- Query result sets (chunk URIs)
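A resource URI of the form above can be split into its components with a few lines of Python. This is a client-side convenience sketch, not part of the server's API:

```python
from urllib.parse import urlparse

def parse_schema_uri(uri: str) -> dict:
    """Split a bq://<project>/<dataset>/schema URI into its components."""
    parsed = urlparse(uri)
    if parsed.scheme != "bq":
        raise ValueError(f"not a bq:// URI: {uri}")
    project = parsed.netloc  # the host part holds the project id
    dataset, kind = parsed.path.strip("/").split("/", 1)
    return {"project": project, "dataset": dataset, "kind": kind}

print(parse_schema_uri("bq://my-project/my_dataset/schema"))
# → {'project': 'my-project', 'dataset': 'my_dataset', 'kind': 'schema'}
```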
MIT