BigQuery is the tool that changes how an analyst works the moment their needs exceed what the Google Analytics 4 UI or a spreadsheet can handle. Serverless means you don't manage infrastructure: you run SQL, Google scales the compute for you, and you pay only for the data your query actually scans.
This guide explains what BigQuery is technically (not marketing-speak), how it works architecturally, how much it really costs, when it makes sense to use it, and when it's the wrong choice. Based on BigQuery deployments I've done for European e-commerce, SaaS, and publisher teams since 2022, when the GA4 BigQuery Export became free for everyone.
If you're at the stage of evaluating whether BigQuery makes sense for your setup, this article covers the basics. If you already know you want in and worry about costs, see the BigQuery cost optimization guide (publishing next Friday).
Key Takeaways
BigQuery is an analytical warehouse, not a transactional database. Use it for reporting, modeling, data science; don't use it for application backends with millions of single-record operations.
Pricing (2025): 5 USD per TB of data scanned + 0.02 USD per GB/month of active storage. The first 1 TB of queries and 10 GB of storage per month are covered by the free tier.
Killer use case for analysts: free GA4 export with full hit-level data (something the GA4 UI doesn't give you, and Universal Analytics required a paid 360 subscription for).
You don't need to set up servers, partitions, or backups. Google handles that. You write SQL.
Biggest risk is query cost. A single careless SELECT * on a large dataset can cost 50 EUR. Partitioning and clustering are your friends.
What BigQuery Actually Is
The technical version first, not the marketing one.
BigQuery is an analytical database (OLAP) built on a columnar engine. Unlike Postgres or MySQL, which optimize row reads (OLTP, "show me order 12345"), BigQuery optimizes column-wise aggregations ("sum revenue across all orders per marketing channel for the last 90 days"). This one difference drives everything else.
The second key trait is separation of storage and compute. Data sits on Colossus, Google's distributed storage layer, in Capacitor format (Google's proprietary columnar format). Compute, the query execution engine called Dremel, spins up on demand. You don't pay for clusters being "on"; you pay for actual processing.
Third trait: full SQL compliance (ANSI SQL 2011+) plus Google extensions (BigQuery ML for training models in SQL, geospatial functions, arrays, structs, JSON). If you know SQL, you know BigQuery. The real learning curve is cost optimization, not syntax.
How BigQuery Differs From a Regular Database
Five key differences that decide whether BigQuery fits your use case:
1. Scale vs single-query latency. BigQuery scales to petabytes, but even the fastest query takes on the order of 500 ms. Postgres returns SELECT * FROM orders WHERE id = 12345 in 5 ms. If you need sub-100ms responses per request, BigQuery is the wrong tool.
2. No indexes (in the traditional sense). Instead of indexes, you have partitioning (you split tables by date, usually) and clustering (you physically sort by 1-4 columns). Instead of "add an index on field X", you think "how do I physically arrange data so queries scan less".
3. No OLTP-style UPDATE/DELETE. You can update rows, but it's expensive (an UPDATE scans and rewrites entire partitions). The typical BigQuery pattern is append-only, plus materialized views or scheduled queries that rebuild aggregates.
4. Per-query pricing, not per-server. In Postgres, you pay for a machine 24/7. In BigQuery, you pay only when someone runs a query. If your dashboard fetches data 5 times a day, you pay for 5 scans. The rest of the time: just storage (cents).
5. Good practices enforced by pricing. Every SELECT * literally costs more. Every unfiltered partition scan walks the whole dataset. In Postgres, bad SQL breaks performance; in BigQuery, bad SQL breaks your invoices.
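The partitioning and clustering mentioned in point 2 are declared when you create the table. A minimal sketch (dataset, table, and column names are made up for illustration):

```sql
-- Hypothetical events table, partitioned by day and clustered by two columns.
CREATE TABLE my_dataset.events_demo (
  event_date DATE,
  event_name STRING,
  user_id STRING,
  revenue NUMERIC
)
PARTITION BY event_date
CLUSTER BY event_name, user_id;

-- This query scans only the last 7 daily partitions, not the whole table:
SELECT event_name, SUM(revenue) AS total_revenue
FROM my_dataset.events_demo
WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY event_name;
```

The date filter is what makes the partitioning pay off: without it, the same query would scan every partition and be billed for the whole table.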
When BigQuery Makes Sense
Four scenarios where BigQuery is the right choice.
1. GA4 Data Export (the free killer use case)
This is the main reason most marketing teams even meet BigQuery. GA4 has a built-in free export to BigQuery, available in every property under Admin > BigQuery Links.
Why this matters:
- Hit-level data, every event, every parameter, every user-scoped custom dimension, at full resolution. The GA4 UI always aggregates, limits you to 10-16 dimensions at once, and samples on large datasets.
- No reporting limits: no report quota caps and no cardinality limits collapsing high-cardinality custom dimensions into "(other)" rows.
- Full SQL access, join freely with CRM data, ad data, email data, anything you have in BigQuery.
- Historical data is yours, GA4 default retention is 2 months (max 14). In BigQuery you keep data as long as you want and run year-over-year analyses the GA4 UI physically can't.
For mid-market European e-commerce with 100,000 sessions/day, exporting GA4 to BigQuery typically costs 5-15 EUR/month in storage plus query costs. For equivalent functionality in Universal Analytics 360, Google charged 150,000 EUR/year.
2. Custom Dashboards Beyond Workspace/Looker Studio Native Connectors
If you build dashboards in Looker Studio, Power BI, or your own tool, and the native GA4 connector limits you (sampling, cardinality, missing specific dimensions), BigQuery is the intermediate layer. You connect to BigQuery, create custom views, join with other sources, and the dashboard serves ready-made data.
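In practice the intermediate layer is often just a view the BI tool reads. A hedged sketch (the project, dataset, and view names are assumptions; for heavy dashboards you'd pair this with a scheduled query that materializes the result instead of recomputing it on every refresh):

```sql
-- A reporting view that Looker Studio / Power BI connects to,
-- so dashboards read pre-shaped data instead of raw events.
CREATE OR REPLACE VIEW reporting.daily_kpis AS
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS day,
  COUNT(DISTINCT user_pseudo_id) AS users,
  COUNTIF(event_name = 'purchase') AS purchases
FROM `my-project.analytics_123456789.events_*`
GROUP BY day;
```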
3. Cross-Platform Attribution and Customer Analytics
You have GA4, Google Ads, Meta Ads, CRM data in HubSpot, orders in Shopify. Each system has its own UI and dashboards. BigQuery is where you join them by user_id or email, and only then can you see the real customer journey, from first touch to retention.
4. Ad-Hoc SQL for Analysts Who Know SQL
If your analytics team knows SQL and gets frustrated by UI limits, BigQuery is their playground. No waiting for a data engineer to build a pipeline. A query written in 30 seconds, an answer in 2 seconds, the next iteration.
Wondering if BigQuery fits your GA4 setup or reporting needs? Get in touch for a scoped assessment, a review of your data, business questions, budget, and a realistic estimate of whether BigQuery pays off in 6-12 months.
When BigQuery Is NOT the Right Tool
Four scenarios where BigQuery is the wrong choice and you should use something else.
1. Transactional Application Backend
If you're building an app that runs a SELECT * FROM users WHERE id = 42 query on every user click, BigQuery will kill you. Single-query latency starts at 500 ms, and cost scales linearly with traffic. For this you need PostgreSQL, MySQL, Cloud Spanner, Firestore, anything OLTP-optimized.
2. Very Small Datasets (Under 10 GB)
If everything fits in a single Excel file or a small Postgres database, BigQuery is overkill. Fixed costs don't exist (free tier covers all), but operational overhead (configuration, monitoring, team training) isn't zero. For small datasets, Google Sheets, Metabase on Postgres, or Retool are simpler.
3. Real-Time Requirement (Sub-1-Second Response)
BigQuery is fast for its class (aggregations on petabytes in seconds), but it's not a real-time tool. If a dashboard needs to refresh every second, you need streaming analytics (Kafka + Flink, Apache Pinot, BigQuery BI Engine as an add-on). Most business dashboards don't need real-time, but if yours does, know it early.
4. Operational Budget Under 50 EUR/Month Without GA4 Export
Without the GA4 Export case, BigQuery often doesn't reach break-even for very small businesses. If you run 20 queries a month on a small dataset, the free tier covers everything, but then why BigQuery at all? Matomo plus your own Postgres costs less administratively.
BigQuery Pricing in Practice
Two main cost components (plus a few side ones):
Query Processing Costs
On-demand pricing: 5 USD per 1 TB of data scanned. This is the default. Each query scans specific columns of specific partitions, and Google bills the bytes it actually read.
Practical example: the GA4 Export table for a mid-size e-commerce with 3 months of data is around 5-15 GB. Scanning the whole table with a bad query (SELECT *) costs 5 × 15/1000 = 0.075 USD. Sounds cheap, but multiply by 100 badly-written queries a day and you have 225 USD/month.
Capacity-based alternative: you pay a flat subscription for "slots" (compute units). Makes sense only above 2,000-3,000 USD/mo in on-demand.
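You can audit what on-demand queries actually cost from BigQuery's own job metadata. A sketch using the INFORMATION_SCHEMA.JOBS_BY_PROJECT view (the `region-eu` qualifier assumes an EU location; the cost is approximated at this article's 5 USD/TB rate):

```sql
-- Top 10 most expensive queries in the last 7 days, by bytes billed.
SELECT
  user_email,
  LEFT(query, 80) AS query_start,
  total_bytes_billed / 1e12 * 5 AS approx_cost_usd
FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
ORDER BY approx_cost_usd DESC
LIMIT 10;
```

Running this weekly is a cheap way to catch the "100 badly-written queries a day" scenario before the invoice does.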
Storage Costs
Active storage: 0.02 USD per GB/month (data modified in the last 90 days).
Long-term storage: 0.01 USD per GB/month (data unchanged for 90+ days).
100 GB of GA4 data costs 2 USD/month. 2022 data nobody touches drops to 1 USD/month.
Streaming Inserts
0.01 USD per 200 MB for insertion via the streaming API. For most setups this is negligible, but a team with a large real-time pipeline will notice.
Free Tier (Worth Knowing)
Every month:
- 1 TB of queries free
- 10 GB of active storage free
- Unlimited query cache hits (a repeated query from cache doesn't count against the limit)
For a small business testing BigQuery, this practically means 0 EUR per month for the first few months.
A Mini-Story: From Workspace to BigQuery
When Pawel, analytics lead at a Polish e-commerce, tried to answer "what's the LTV of a customer acquired from Google Ads vs Meta Ads over 12 months?" in March 2025, the GA4 UI couldn't give it to him: explorations capped his window at 90 days, cardinality limits blocked his breakdowns, and attempts to pull data via the API kept hitting sampling warnings.
We turned on GA4 BigQuery Export (5 minutes of work), loaded 14 months of historical data, and wrote one SQL query, 40 lines, joining events_* tables with his Shopify customer table (already in BQ via a Fivetran connector). The answer came back in 4 seconds.
Total cost of the whole experiment: 0.42 USD. Time: one afternoon. Business insight: Meta Ads customer LTV was 23% lower than Google Ads, which completely rewrote his Q2 budget allocation.
This is exactly the use case BigQuery exists for.
GA4 BigQuery Export, Killer Use Case Step by Step
1. Enable Export
In GA4: Admin > BigQuery Links > Link. You need a Google Cloud project with the BigQuery API enabled. Pick the location (important: the EU multi-region keeps European data in EU data centers, which is easier to defend under GDPR than a US region) and the frequency (streaming or daily batch).
2. Data Structure
Each day is a separate table: events_YYYYMMDD. Plus events_intraday_YYYYMMDD if you picked streaming. Each row is one GA4 event with full context (user_pseudo_id, session, device, location, all custom parameters).
The schema is nested, events have arrays of parameters, and parameters have structs with different value types. Practice: use UNNEST liberally.
3. First Useful Query
"Top 10 pages by unique users in the last 7 days":
SELECT
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
  AND event_name = 'page_view'
GROUP BY page
ORDER BY users DESC
LIMIT 10
The key element is the _TABLE_SUFFIX filter: without it, the query scans every events_* table and costs explode. Note also the backticks around the table path; they're required when the project ID contains a hyphen.
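Once that pattern is in place, variants are cheap to write. For example, daily purchase revenue over the same window, a sketch assuming the standard GA4 export schema (events_* tables with an ecommerce.purchase_revenue field; the project and dataset names are placeholders):

```sql
-- Daily purchase revenue for the last 7 days from the GA4 export.
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS day,
  SUM(ecommerce.purchase_revenue) AS revenue
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
  AND event_name = 'purchase'
GROUP BY day
ORDER BY day;
```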
4. Tools Worth Using
- BigQuery Studio (web UI directly in Google Cloud Console), default start, built-in query editor with schema autocomplete
- BigQuery Console + dbt, for teams already working in dbt
- Dataform (Google native), alternative to dbt, built into GCP
- Looker Studio, free dashboards directly from BQ tables, see the Looker Studio guide
- Jupyter / Colab + google-cloud-bigquery library, data science workflows
Second Mini-Story: When BigQuery Was the Wrong Choice
Honest counterweight. Marek, CTO at a Polish B2C startup, deployed BigQuery as his main product database in 2024. Logic: "Google Cloud, scales, no management". Six months later, the business was running, but every user action in the web app triggered 2-3 BigQuery queries, and each query carried 200-400 ms of latency plus fees.
His BigQuery bill in June 2024: 2,800 EUR. A Cloud SQL (Postgres) bill that could handle the same app: 120 EUR. The roughly 23× difference came from using BigQuery as an operational database when it's an analytical warehouse.
We migrated critical operations to Cloud SQL in three weeks. BigQuery stayed for analytics only (GA4 Export + management dashboards). The bill dropped to 180 EUR/month total.
Lesson: BigQuery is an analytical warehouse, not a transactional database. Using it for anything else ends in either bad performance, expensive invoices, or both.
Planning a BigQuery deployment and not sure you're doing it right? See my data engineering services, architecture audits, GA4 → BigQuery migrations, custom pipelines, and cost optimization.
How to Start With BigQuery (in 30 Minutes)
Step 1: Google Cloud Project
Go to console.cloud.google.com and create a project. Creating a project is free; a credit card is only needed to activate full billing, and new accounts also get 300 USD of credit for the first 90 days.
Step 2: Enable BigQuery API
In Google Cloud Console: APIs & Services > Library > BigQuery API > Enable. Takes 30 seconds.
Step 3: First Dataset
In BigQuery Studio: Explorer (left) > your project > Create dataset. Name, region (pick EU), done.
Step 4: Load Your First Data
Two options:
- If you have a CSV file, Create table > Upload > point to the file > BigQuery auto-detects schema
- If you have GA4, enable Export (Admin > BigQuery Links)
- For testing, BigQuery has public datasets (bigquery-public-data) with examples: stack_overflow, london_bicycles, google_trends
Step 5: First Query
In BigQuery Studio: Compose new query, type SQL, Run. The result appears in a table. The editor shows the estimated bytes to be processed in the bar at the top; that's where you verify cost before hitting Run.
Step 6: Connect to Looker Studio or Your BI
In Looker Studio: Add data > BigQuery > pick project > pick dataset > dashboard ready.
Frequently Asked Questions
How much does BigQuery cost?
The pricing has two components: query processing (5 USD per TB of data scanned) and storage (0.02 USD per GB of active storage per month, 0.01 USD for long-term storage). The first 1 TB of queries and 10 GB of storage per month are covered by the free tier. A typical GA4 Export setup for a mid-market e-commerce costs 10-50 USD/month total.
Is BigQuery free for GA4?
The GA4 BigQuery Export is free for all GA4 accounts since 2022. You pay for storage (0.02 USD/GB/month) and query processing, but the export itself costs nothing. In Universal Analytics, BigQuery Export required paid GA 360.
Is BigQuery GDPR-compliant?
Yes, if you pick an EU region (EU multi-region or a specific region like europe-west1) and have a DPA with Google Cloud. Data sits in EU data centers. For the full context on GDPR and analytics data, see the GA4 and GDPR guide.
What's the difference between BigQuery and Snowflake?
Both are analytical data warehouses with columnar storage. Main differences: BigQuery is GCP-native (tight integration with GA4, Google Ads, Vertex AI), Snowflake runs on AWS/Azure/GCP (multi-cloud). BigQuery has simpler pricing (per query), Snowflake per-warehouse per-second. For GA4 analytics in Europe, BigQuery wins on integration. For multi-cloud enterprise, Snowflake is more flexible.
Do I need to know SQL to use BigQuery?
Yes, for serious work. BigQuery is a SQL-first tool. If you don't know SQL, you can use Looker Studio connected to BigQuery to browse data comfortably without SQL, but the full power of BigQuery (custom analytics, ad-hoc queries) requires SQL. The investment in learning ANSI SQL (1-2 weeks for someone comfortable with Excel) pays back fast.
How long can I keep data in BigQuery?
As long as you want. No retention limits (unlike GA4, max 14 months). Data unchanged for 90+ days automatically moves to long-term storage (50% cheaper). For historical year-over-year analytics, this is a meaningful advantage over the native GA4 UI.
Conclusion
BigQuery is Google's serverless analytical warehouse. Use it for reporting, ad-hoc SQL, modeling, and above all the free GA4 export. Don't use it as a transactional application database, don't use it for sub-second request response, and don't use it if you have 10 GB of data and no SQL team.
For most mid-market European businesses with GA4 Export, BigQuery is a real jump in analytics quality at a cost of 10-50 EUR per month. Start with free tier, enable GA4 Export, write your first queries, and after a month you'll see whether it makes sense to escalate the investment.
Want to deploy BigQuery for your GA4 setup or martech stack and don't know where to start? Get in touch for a scoped assessment, a review of your data, business questions, and budget, plus a concrete deployment plan. No generics, just scope and numbers.
Want to deploy BigQuery for your analytics stack?
I design and build BigQuery pipelines, GA4 Export setups, and custom data models for European analytics teams. Scoped work, transparent pricing.
See my services. Need help? Get in touch.
Have a question about your analytics setup? Fill out the form, I usually reply within 24 hours.