A Model Context Protocol server that provides access to BigQuery. This server enables LLMs to inspect database schemas and execute queries.
Features:

- Execute SELECT queries on BigQuery datasets
- List all accessible tables across datasets
- Retrieve detailed table schemas
- Service account authentication support
- Dataset filtering for security and performance
- Dual transport support (stdio for local, HTTP/SSE for cloud deployment)
This server can be deployed in multiple ways to suit different use cases:

- **PyPI Package** - Install via `uvx` or `uv` for local use with Claude Desktop or other MCP clients
- **Docker Hub** - Pre-built multi-architecture images available at `timoschd/mcp-server-bigquery`
- **Google Cloud Run** - Deploy as a serverless HTTP/SSE endpoint with automatic scaling
- **Local Development** - Use Podman Compose for containerized local development

All deployment methods support both stdio (for local MCP clients) and HTTP/SSE (for cloud/remote access) transports.
The server implements three tools:

- `execute-query`: Executes a SQL query using the BigQuery dialect
  - Input: `query` (string) - SELECT SQL query to execute
  - Returns: Query results as a list of dictionaries
- `list-tables`: Lists all tables in the BigQuery database
  - Input: None
  - Returns: List of fully-qualified table names (format: `dataset.table`)
- `describe-table`: Describes the schema of a specific table
  - Input: `table_name` (string) - Fully-qualified table name (e.g., `my_dataset.my_table`)
  - Returns: Table DDL (Data Definition Language) with complete schema information
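Under the hood, an MCP client invokes these tools over JSON-RPC. As a hedged illustration, a `tools/call` request for `execute-query` could look like this (the request envelope follows the MCP specification; the query text itself is a made-up example):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute-query",
    "arguments": {
      "query": "SELECT user_id, COUNT(*) AS events FROM my_dataset.my_table GROUP BY user_id LIMIT 10"
    }
  }
}
```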
Once connected to an MCP client (like Claude Desktop), you can ask questions like:
- "What tables are available in my BigQuery project?"
- "Show me the schema for the `analytics.user_events` table"
- "Query the top 10 users by activity from the `analytics.user_events` table"
The LLM will automatically use the appropriate tools to answer your questions.
The server can be configured either with command line arguments or environment variables.
| Argument | Environment Variable | Required | Description |
|---|---|---|---|
| `--project` | `BIGQUERY_PROJECT` | Yes | The GCP project ID. |
| `--location` | `BIGQUERY_LOCATION` | Yes | The GCP location (e.g. `europe-west4`, `us-central1`). |
| `--dataset` | `BIGQUERY_DATASETS` | No | Only take specific BigQuery datasets into consideration. Several datasets can be specified by repeating the argument (e.g. `--dataset my_dataset_1 --dataset my_dataset_2`) or by joining them with a comma in the environment variable (e.g. `BIGQUERY_DATASETS=my_dataset_1,my_dataset_2`). If not provided, all datasets in the project will be considered. |
| `--key-file` | `BIGQUERY_KEY_FILE` | No | Path to a service account key file for BigQuery. If not provided, the server uses Application Default Credentials (ADC). |
| `--transport` | `MCP_TRANSPORT` | No | Transport type: `stdio` (default), `http`, or `sse`. Use `stdio` for local MCP clients, `http`/`sse` for cloud deployments. |
| `--port` | `PORT` or `MCP_PORT` | No | Port number for HTTP/SSE transport (default: 8080). Ignored when using stdio transport. |
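As a sketch of how the `--dataset` / `BIGQUERY_DATASETS` filter described above could be resolved, here is a minimal Python function (the function name and exact precedence are illustrative assumptions, not taken from the server's source):

```python
def parse_dataset_filter(cli_values=None, env_value=None):
    """Resolve the dataset filter: repeated --dataset flags yield a list,
    while BIGQUERY_DATASETS joins names with commas. An empty result
    means every dataset in the project is considered."""
    if cli_values:
        return list(cli_values)
    if env_value:
        return [name.strip() for name in env_value.split(",") if name.strip()]
    return []

# e.g. parse_dataset_filter(env_value="my_dataset_1,my_dataset_2")
# -> ["my_dataset_1", "my_dataset_2"]
```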
To install BigQuery Server for Claude Desktop automatically via Smithery:

```shell
npx -y @smithery/cli install mcp-server-bigquery --client claude
```

The Claude Desktop configuration file is located at:

- On MacOS: `~/Library/Application\ Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
Using a local clone of the repository:

```json
"mcpServers": {
  "bigquery": {
    "command": "uv",
    "args": [
      "--directory",
      "{{PATH_TO_REPO}}",
      "run",
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}
```

Using the published PyPI package:

```json
"mcpServers": {
  "bigquery": {
    "command": "uvx",
    "args": [
      "mcp-server-bigquery",
      "--project",
      "{{GCP_PROJECT_ID}}",
      "--location",
      "{{GCP_LOCATION}}"
    ]
  }
}
```

To connect to a remotely deployed server (e.g., on Cloud Run):

```json
"mcpServers": {
  "bigquery": {
    "transport": "sse",
    "url": "https://your-server-url.run.app/messages"
  }
}
```

Replace `{{PATH_TO_REPO}}`, `{{GCP_PROJECT_ID}}`, `{{GCP_LOCATION}}`, and `https://your-server-url.run.app` with the appropriate values.
The server can be deployed as a Docker container for cloud environments (e.g., Google Cloud Run, Kubernetes).
Docker images are automatically built and published to Docker Hub via GitHub Actions. You can use the pre-built images or build your own.
```shell
# Pull the latest image from Docker Hub
docker pull timoschd/mcp-server-bigquery:latest

# Or pull a specific version
docker pull timoschd/mcp-server-bigquery:v0.3.0
```

To build the image yourself:

```shell
docker build -t mcp-server-bigquery .
# or with Podman
podman build -t mcp-server-bigquery .
```

The repository includes a GitHub Actions workflow that automatically builds and publishes multi-architecture images (amd64/arm64) to Docker Hub. See `.github/workflows/README.md` for setup instructions.
Local stdio mode:

```shell
docker run -it \
  -e BIGQUERY_PROJECT=your-project-id \
  -e BIGQUERY_LOCATION=us-central1 \
  timoschd/mcp-server-bigquery:latest
```

HTTP/SSE mode (for cloud deployment):

```shell
docker run -p 8080:8080 \
  -e BIGQUERY_PROJECT=your-project-id \
  -e BIGQUERY_LOCATION=us-central1 \
  -e MCP_TRANSPORT=http \
  -e PORT=8080 \
  timoschd/mcp-server-bigquery:latest
```

With service account authentication:

```shell
docker run -p 8080:8080 \
  -v /path/to/key.json:/app/secrets/key.json \
  -e BIGQUERY_PROJECT=your-project-id \
  -e BIGQUERY_LOCATION=us-central1 \
  -e BIGQUERY_KEY_FILE=/app/secrets/key.json \
  -e MCP_TRANSPORT=http \
  timoschd/mcp-server-bigquery:latest
```

A `podman-compose.yml` file is provided for easy local development:
```shell
# Copy and customize the environment file
cp .env.example .env

# Start the service
podman-compose up
```

or

```shell
docker-compose up
```

The compose file supports configurable environment variables:

- `PORT`: External port mapping (default: 8085)
- `BIGQUERY_PROJECT`: Your GCP project ID
- `BIGQUERY_LOCATION`: BigQuery location/region
- `BIGQUERY_KEY_FILE`: Optional path to service account key
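A minimal `.env` along these lines should work with the compose file (all values below are placeholders; check `.env.example` for the authoritative variable names and defaults):

```shell
PORT=8085
BIGQUERY_PROJECT=your-project-id
BIGQUERY_LOCATION=us-central1
# Optional: only needed when not using Application Default Credentials
BIGQUERY_KEY_FILE=/path/to/service-account-key.json
```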
Manual deployment:
```shell
# Build and push to Google Container Registry
gcloud builds submit --tag gcr.io/YOUR_PROJECT_ID/mcp-server-bigquery

# Deploy to Cloud Run
gcloud run deploy mcp-server-bigquery \
  --image gcr.io/YOUR_PROJECT_ID/mcp-server-bigquery \
  --platform managed \
  --region us-central1 \
  --set-env-vars BIGQUERY_PROJECT=your-project-id,BIGQUERY_LOCATION=us-central1,MCP_TRANSPORT=http \
  --allow-unauthenticated \
  --port 8080
```

Automated deployment with GitHub Actions:

An example GitHub Actions workflow is provided for automated deployments. See `.github/workflows/README.md` for detailed setup instructions.

```shell
# Copy the example workflow
cp .github/workflows/deploy-cloud-run.yml.example .github/workflows/deploy-cloud-run.yml

# Configure GitHub Secrets (see workflow README for details)
# Then push to trigger deployment
git push origin main
```

To prepare the package for distribution:
1. Increase the version number in `pyproject.toml`
2. Sync dependencies and update the lockfile: `uv sync`
3. Build package distributions: `uv build` (this creates source and wheel distributions in the `dist/` directory)
4. Publish to PyPI: `uv publish`

Note: You'll need to set PyPI credentials via environment variables or command flags:

- Token: `--token` or `UV_PUBLISH_TOKEN`
- Or username/password: `--username`/`UV_PUBLISH_USERNAME` and `--password`/`UV_PUBLISH_PASSWORD`
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:
```shell
npx @modelcontextprotocol/inspector uv --directory {{PATH_TO_REPO}} run mcp-server-bigquery --project {{GCP_PROJECT_ID}} --location {{GCP_LOCATION}}
```

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
For testing the HTTP/SSE transport locally:
```shell
# Start the server in HTTP mode
uv run mcp-server-bigquery --project {{GCP_PROJECT_ID}} --location {{GCP_LOCATION}} --transport http --port 8080

# In another terminal, test the health endpoint
curl http://localhost:8080/health
```

The server logs to both stdout and `/tmp/mcp_bigquery_server.log`. When running in Docker:
```shell
# View container logs
docker logs <container-id>

# Or access the log file
docker exec <container-id> cat /tmp/mcp_bigquery_server.log
```

The server supports two transport modes:
**stdio (default)**

- Use case: Local MCP clients (Claude Desktop, CLI tools)
- Communication: Standard input/output streams
- Configuration: Default mode, no additional setup required

**HTTP/SSE**

- Use case: Cloud deployments (Google Cloud Run, Kubernetes, remote servers)
- Communication: Server-Sent Events over HTTP
- Endpoints:
  - `GET /`: Health check endpoint
  - `GET /health`: Health check endpoint
  - `GET /messages`: SSE connection for receiving events
  - `POST /messages`: Send tool invocation requests
- Configuration: Set `--transport http` or `MCP_TRANSPORT=http`
To connect an MCP client (like Claude Desktop or Windsurf) to a remotely deployed server using SSE transport:
Configuration example (e.g., in mcp_config.json or Claude Desktop config):
```json
{
  "mcpServers": {
    "bigquery": {
      "disabled": false,
      "transport": "sse",
      "url": "https://your-server-url.run.app/messages"
    }
  }
}
```

Replace `https://your-server-url.run.app` with your actual deployment URL:

- Cloud Run: `https://mcp-server-bigquery-xxxxx-uc.a.run.app`
- Custom domain: `https://bigquery-mcp.yourdomain.com`
- Local testing: `http://localhost:8080`

The `/messages` path is required for SSE communication.
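Since forgetting the `/messages` suffix is an easy mistake, a small hypothetical helper (not part of the server; shown purely for illustration) can normalize a deployment URL into the SSE endpoint:

```python
def sse_endpoint(base_url: str) -> str:
    """Append the required /messages path to a deployment URL,
    tolerating a trailing slash on the base URL."""
    return base_url.rstrip("/") + "/messages"

# e.g. sse_endpoint("https://your-server-url.run.app/")
# -> "https://your-server-url.run.app/messages"
```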
The server supports multiple authentication methods:

1. Service Account Key File (Recommended for production):

   ```shell
   --key-file /path/to/service-account-key.json
   # or
   export BIGQUERY_KEY_FILE=/path/to/service-account-key.json
   ```

2. Application Default Credentials (ADC):
   - Used automatically when no key file is provided
   - Works with `gcloud auth application-default login`
   - Automatically available in Google Cloud environments (Cloud Run, GCE, etc.)
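For local development with ADC, a typical sequence (assuming the gcloud CLI is installed and the published PyPI package is used) would be:

```shell
# Authenticate once; credentials are cached locally
gcloud auth application-default login

# Run the server without a key file - ADC is picked up automatically
uvx mcp-server-bigquery --project {{GCP_PROJECT_ID}} --location {{GCP_LOCATION}}
```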
For questions, issues, or feedback:

- Email: [email protected]
- Issues: GitHub Issues
- Discussions: GitHub Discussions
MIT