Integration · BigQuery
Warehouse-native analytics as MCP queries.
BigQuery is Google's serverless data warehouse. The official managed MCP gives AI agents direct, governed access to schemas, queries, and forecasting via OAuth + IAM.
Google's first-party managed MCP server (December 2025) connects AI agents directly to BigQuery without extra infrastructure. No data leaves the warehouse; queries execute in place. OAuth 2.0 plus IAM for authentication and authorization. Tools cover schema inspection, GoogleSQL execution, dataset discovery, and time-series forecasting via Cortex. Pairs cleanly with Mixpanel, dbt, and Hex MCPs for a unified data agent stack.
What you can do via MCP
Example prompts the agent runs.
“List my BigQuery datasets in project acme-prod.”
Calls the dataset-discovery tool, returns the datasets the OAuth scope grants access to with their location and creation date.
“Show me the schema for dataset events_v2.user_actions.”
Inspects the table schema via the MCP, returns column names, types, modes (NULLABLE, REQUIRED, REPEATED), and any policy tags applied.
“Run a query that shows daily active users for the last 30 days, grouped by acquisition channel.”
Generates a GoogleSQL query against the events table, executes via the MCP, returns the result rows and the bytes-scanned cost so the team knows the query economics.
“Find queries from the last week that scanned more than 1TB.”
Queries the INFORMATION_SCHEMA.JOBS view, filters for total_bytes_processed above 1TB, returns the queries with their cost so the team can refactor expensive ones.
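That audit can be expressed directly in GoogleSQL; a sketch against the US-region JOBS view (the region qualifier, the 1 TiB threshold, and the required job-listing permissions are assumptions that vary by project):

```sql
-- Sketch: expensive queries from the last 7 days.
SELECT
  user_email,
  query,
  ROUND(total_bytes_processed / POW(1024, 4), 2) AS tib_scanned
FROM `region-us`.INFORMATION_SCHEMA.JOBS
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
  AND total_bytes_processed > POW(1024, 4)  -- more than 1 TiB scanned
ORDER BY total_bytes_processed DESC
```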
“Forecast next quarter's signup volume using the auto-forecast on cohort table cohort_2026q1.”
Calls the BigQuery ML.FORECAST function via the MCP, returns the prediction with confidence intervals so the team can plan against the projection.
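The bytes-scanned figures these tools return map directly to on-demand cost. A minimal sketch of that arithmetic, assuming the published $6.25/TiB on-demand rate and modeling BigQuery's 10 MiB per-table minimum as a flat per-query floor (verify current pricing for your region and edition):

```python
def estimate_query_cost_usd(total_bytes_processed: int,
                            price_per_tib: float = 6.25) -> float:
    """Estimate on-demand query cost from bytes scanned.

    Assumes the $6.25/TiB on-demand rate; BigQuery applies a 10 MiB
    minimum per table referenced, simplified here to a per-query floor.
    """
    MIB = 1024 ** 2
    TIB = 1024 ** 4
    billed_bytes = max(total_bytes_processed, 10 * MIB)
    return round(billed_bytes / TIB * price_per_tib, 4)

# A 30-day scan over a large events table might process ~250 GiB:
print(estimate_query_cost_usd(250 * 1024 ** 3))  # → 1.5259
```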
GoogleSQL execution via the official BigQuery MCP. The agent generates the query, executes it, and returns rows plus bytes-scanned cost.
// BigQuery MCP, called via the agent runtime
mcp.bigquery.query({
  project_id: "acme-prod",
  query: `
    SELECT
      TIMESTAMP_TRUNC(event_timestamp, DAY) AS day,
      acquisition_channel,
      COUNT(DISTINCT user_id) AS daily_active_users
    FROM \`acme-prod.events_v2.user_actions\`
    WHERE event_timestamp >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY day, acquisition_channel
    ORDER BY day DESC, daily_active_users DESC
  `,
  use_legacy_sql: false
})
// Returns result rows plus total_bytes_processed, the bytes-scanned cost basis.

MCP integration
BigQuery MCP server.
- Server: https://bigquery.googleapis.com/mcp
- Auth: OAuth 2.0 plus IAM (scope-based authorization, no API keys)
- Hosting: Google managed (Streamable HTTP), no separate infrastructure
- Schema discovery: list datasets, list tables, inspect columns and policy tags
- GoogleSQL execution with returned bytes-scanned cost so query economics are visible
- Time-series forecasting via BigQuery ML and Cortex auto-forecast
- Marketplace dataset access for shared and public datasets
- Pairs with Mixpanel, dbt, and Hex MCPs for unified data-agent workflows
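In an MCP client that follows the common `mcpServers` configuration convention, registering the server is a short config fragment; exact field names and the OAuth handshake vary by client, so treat this shape as illustrative:

```json
{
  "mcpServers": {
    "bigquery": {
      "url": "https://bigquery.googleapis.com/mcp"
    }
  }
}
```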
Visual demonstration
What this looks like in practice.
acme-prod
  events_v2
    user_actions (1.2B rows)
      - event_timestamp: timestamp
      - user_id: string
      - event_name: string
      - acquisition_channel: string, nullable
      - session_id: string
      - event_properties: json, nullable
    sessions (184M rows)
      - session_id: string
      - user_id: string
      - started_at: timestamp
      - duration_s: integer

SELECT
  TIMESTAMP_TRUNC(event_timestamp, DAY) AS day,
  acquisition_channel,
  COUNT(DISTINCT user_id) AS daily_active_users
FROM `acme-prod.events_v2.user_actions`
WHERE event_timestamp >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY day, acquisition_channel
ORDER BY day DESC, daily_active_users DESC
LIMIT 100
CLI alternative
bq command-line plus REST API for non-MCP workflows.
BigQuery exposes the bq CLI and a REST API as the canonical programmatic surfaces. For AI-driven analytics work the official MCP is the primary interface; the bq CLI fills in for batch jobs, scheduled queries, and any operational task the agent runtime is not the right place for.
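A sketch of the same DAU query through the bq CLI, assuming gcloud authentication is configured and the acme-prod project from the examples above:

```shell
# Daily active users for the last 30 days via the bq CLI.
bq query \
  --project_id=acme-prod \
  --use_legacy_sql=false \
  'SELECT DATE(event_timestamp) AS day,
          COUNT(DISTINCT user_id) AS daily_active_users
   FROM `acme-prod.events_v2.user_actions`
   WHERE event_timestamp >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
   GROUP BY day
   ORDER BY day DESC'

# Preview bytes scanned without running (or paying for) the query.
bq query --use_legacy_sql=false --dry_run \
  'SELECT COUNT(DISTINCT user_id) FROM `acme-prod.events_v2.user_actions`'
```

The `--dry_run` flag reports the bytes the query would process, which pairs with the cost visibility the MCP tools expose.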
Pairs with these skills
The PM data and analytics skill suite.
This integration pairs with the forthcoming product-analytics-setup and data-warehouse-experimentation skills. The skill landing pages and SKILL.md sources land in subsequent dispatches; cross-links will be added when the skill pages ship.