Connecting Firebase Analytics to BigQuery for Advanced Analysis
The Firebase Analytics Console provides useful dashboards, but custom analysis requires raw data access. BigQuery integration exports every Analytics event with full parameter detail into BigQuery, where you can run SQL queries for funnel analysis, cohort retention, custom dimensions, and data joins with other sources. This tutorial covers enabling the export, understanding the table schema, writing practical queries, and keeping BigQuery costs under control.
Prerequisites
- A Firebase project on the Blaze (pay-as-you-go) plan
- Firebase Analytics initialized and collecting events
- A Google Cloud project linked to your Firebase project
- Basic knowledge of SQL queries
Step-by-step guide
Enable the BigQuery link in Firebase
Open the Firebase Console, go to Project Settings (gear icon), and click the Integrations tab. Find BigQuery and click Link. Select which data you want to export — at minimum, enable Google Analytics. Choose whether to include advertising identifiers. Firebase creates a BigQuery dataset named analytics_{project_number} and begins daily exports. The first export appears within 24 hours.
Expected result: A BigQuery dataset is created and daily Analytics event exports begin within 24 hours.
Understand the events_ table schema
Firebase creates one table per day with the naming pattern events_YYYYMMDD. Each row represents a single event. Key columns include event_name (the event type), event_timestamp (microseconds since epoch), user_pseudo_id (anonymous user ID), event_params (REPEATED RECORD of key-value pairs), user_properties (REPEATED RECORD), device (struct with category, os, browser), and geo (struct with country, region, city).
```sql
-- View the schema of an events table
SELECT column_name, data_type
FROM `your_project.analytics_123456789.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'events_20260327'
ORDER BY ordinal_position;

-- Key columns:
-- event_name       STRING   (e.g., 'page_view', 'purchase')
-- event_timestamp  INTEGER  (microseconds since epoch)
-- user_pseudo_id   STRING   (anonymous user identifier)
-- event_params     RECORD[] (key-value parameters)
-- user_properties  RECORD[] (user attributes)
-- device           RECORD   (category, os_version, browser)
-- geo              RECORD   (country, region, city)
-- traffic_source   RECORD   (source, medium, campaign)
```

Expected result: You understand the table structure and can identify which columns to query for your analysis.
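Because event_timestamp is stored as microseconds since the Unix epoch, off-by-a-factor-of-1000 mistakes are common when converting it outside SQL. A minimal Python sketch of the conversion (the sample timestamp is illustrative):

```python
from datetime import datetime, timezone

def micros_to_datetime(event_timestamp: int) -> datetime:
    """Convert a Firebase event_timestamp (microseconds since epoch) to a UTC datetime."""
    return datetime.fromtimestamp(event_timestamp / 1_000_000, tz=timezone.utc)

# Example: 1_735_689_600_000_000 microseconds is 2025-01-01T00:00:00 UTC
ts = micros_to_datetime(1_735_689_600_000_000)
print(ts.isoformat())  # 2025-01-01T00:00:00+00:00
```

This mirrors what `TIMESTAMP_MICROS(event_timestamp)` does in BigQuery SQL.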
Query event counts and extract parameters
Event parameters are stored as a REPEATED RECORD, which means you need to UNNEST them to access specific values. Use a subquery or CROSS JOIN UNNEST to extract parameter values by their key. This is the most common pattern when working with Firebase Analytics in BigQuery.
```sql
-- Count events by type for the last 7 days
SELECT
  event_name,
  COUNT(*) AS event_count
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
    FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
    AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_name
ORDER BY event_count DESC
LIMIT 20;

-- Extract a specific parameter value from events
SELECT
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_url,
  COUNT(*) AS views
FROM `your_project.analytics_123456789.events_*`
WHERE event_name = 'page_view'
  AND _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY event_name, page_url
ORDER BY views DESC
LIMIT 10;
```

Expected result: A table showing top events by count and top pages by page views for the specified date range.
Build a simple conversion funnel
Funnel analysis shows how many users complete a sequence of steps. Count distinct users at each step to see where drop-off occurs. This example tracks users from page_view to sign_up to purchase over the last 30 days.
```sql
WITH events AS (
  SELECT
    user_pseudo_id,
    event_name,
    event_timestamp
  FROM `your_project.analytics_123456789.events_*`
  WHERE _TABLE_SUFFIX BETWEEN
      FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY))
      AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
    AND event_name IN ('page_view', 'sign_up', 'purchase')
)
SELECT
  'Step 1: Page View' AS step,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM events WHERE event_name = 'page_view'
UNION ALL
SELECT
  'Step 2: Sign Up',
  COUNT(DISTINCT user_pseudo_id)
FROM events WHERE event_name = 'sign_up'
UNION ALL
SELECT
  'Step 3: Purchase',
  COUNT(DISTINCT user_pseudo_id)
FROM events WHERE event_name = 'purchase'
ORDER BY step;
```

Expected result: A three-row table showing the number of unique users at each funnel step, revealing the drop-off rate between steps.
Analyze user retention by cohort
Cohort retention shows what percentage of users return after their first visit. Group users by their first_visit date (cohort) and check which days they returned. This is one of the most valuable analyses for measuring product stickiness.
```sql
WITH first_visits AS (
  SELECT
    user_pseudo_id,
    DATE(TIMESTAMP_MICROS(MIN(event_timestamp))) AS cohort_date
  FROM `your_project.analytics_123456789.events_*`
  WHERE _TABLE_SUFFIX BETWEEN '20260201' AND '20260328'
    AND event_name = 'first_visit'
  GROUP BY user_pseudo_id
),
return_visits AS (
  SELECT DISTINCT
    e.user_pseudo_id,
    DATE(TIMESTAMP_MICROS(e.event_timestamp)) AS visit_date
  FROM `your_project.analytics_123456789.events_*` e
  WHERE _TABLE_SUFFIX BETWEEN '20260201' AND '20260328'
    AND e.event_name = 'session_start'
)
SELECT
  fv.cohort_date,
  COUNT(DISTINCT fv.user_pseudo_id) AS cohort_size,
  COUNT(DISTINCT CASE
    WHEN DATE_DIFF(rv.visit_date, fv.cohort_date, DAY) = 1
    THEN rv.user_pseudo_id END) AS day_1_retained,
  COUNT(DISTINCT CASE
    WHEN DATE_DIFF(rv.visit_date, fv.cohort_date, DAY) = 7
    THEN rv.user_pseudo_id END) AS day_7_retained
FROM first_visits fv
LEFT JOIN return_visits rv USING (user_pseudo_id)
GROUP BY fv.cohort_date
ORDER BY fv.cohort_date;
```

Expected result: A table showing each daily cohort, its size, and how many users returned on day 1 and day 7.
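To turn the raw retained counts from the cohort query into retention rates, divide each retained count by the cohort size. A small Python sketch (the sample numbers are made up):

```python
def retention_rate(cohort_size: int, retained: int) -> float:
    """Return retention as a percentage, guarding against empty cohorts."""
    if cohort_size == 0:
        return 0.0
    return round(100 * retained / cohort_size, 1)

# Hypothetical cohort: 200 new users, 48 returned on day 1, 22 on day 7
print(retention_rate(200, 48))  # 24.0
print(retention_rate(200, 22))  # 11.0
```

The same division can of course be done directly in SQL with `SAFE_DIVIDE` if you prefer to keep the whole analysis in BigQuery.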
Manage BigQuery costs
BigQuery charges per byte scanned: the first 1 TB of query processing per month is free, and beyond that on-demand pricing applies (roughly $6.25 per TB at the time of writing — check current rates). Firebase Analytics tables can grow large quickly. Always filter by _TABLE_SUFFIX to limit the date range, select only the columns you need, and use the BigQuery dry-run feature to preview the bytes scanned before running expensive queries.
```sql
-- Check table sizes to estimate storage and scan costs
SELECT
  table_id,
  ROUND(size_bytes / (1024 * 1024 * 1024), 2) AS size_gb,
  row_count
FROM `your_project.analytics_123456789.__TABLES__`
ORDER BY size_bytes DESC
LIMIT 10;

-- Estimate query cost before executing:
-- In the BigQuery Console, the query validator shows
-- "This query will process X MB when run" before you execute.
-- Or use the bq CLI: bq query --dry_run 'SELECT ...'
```

Expected result: You can estimate and control BigQuery costs by filtering date ranges and previewing query sizes before execution.
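A dry run reports the number of bytes the query would scan; converting that to an approximate on-demand cost is simple arithmetic. A sketch, assuming the roughly $6.25/TB on-demand rate mentioned above (verify current pricing for your region):

```python
def estimate_query_cost_usd(bytes_scanned: int, price_per_tb: float = 6.25) -> float:
    """Estimate on-demand query cost from bytes scanned (e.g., a dry-run result)."""
    tb = bytes_scanned / (1024 ** 4)  # binary terabytes (TiB), as BigQuery bills
    return round(tb * price_per_tb, 4)

# A query scanning 250 GB:
print(estimate_query_cost_usd(250 * 1024 ** 3))  # 1.5259
```

Remember the first 1 TB scanned each month is free, so this is an upper bound on what a single query adds to your bill.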
Complete working example
```sql
-- Firebase Analytics BigQuery Queries
-- Replace 'your_project.analytics_123456789' with your dataset

-- 1. Top events in the last 7 days
SELECT
  event_name,
  COUNT(*) AS event_count,
  COUNT(DISTINCT user_pseudo_id) AS unique_users
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
    FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
    AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_name
ORDER BY event_count DESC
LIMIT 20;

-- 2. Extract page_view URLs with counts
SELECT
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_url,
  COUNT(*) AS views,
  COUNT(DISTINCT user_pseudo_id) AS unique_visitors
FROM `your_project.analytics_123456789.events_*`
WHERE event_name = 'page_view'
  AND _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY page_url
ORDER BY views DESC
LIMIT 20;

-- 3. Daily active users over the last 30 days
SELECT
  DATE(TIMESTAMP_MICROS(event_timestamp)) AS event_date,
  COUNT(DISTINCT user_pseudo_id) AS daily_active_users
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
    FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY))
    AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_date
ORDER BY event_date;

-- 4. Users by country
SELECT
  geo.country AS country,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
    FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
    AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY country
ORDER BY users DESC
LIMIT 10;
```

Common mistakes when using Firebase Analytics with BigQuery
Mistake: Querying events_* without a _TABLE_SUFFIX filter.
Why it's a problem: The query scans all historical data, which can incur substantial costs.
How to avoid: Always add a WHERE _TABLE_SUFFIX BETWEEN clause to limit the date range. Use dry-run to estimate costs before executing.
Mistake: Trying to access event parameters directly as columns.
Why it's a problem: Event parameters are stored as REPEATED RECORDs, not flat columns, so such queries fail or return nothing.
How to avoid: Use (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'param_name') to extract values.
Mistake: Expecting data in BigQuery immediately after enabling the link.
Why it's a problem: The first daily export appears only within 24 hours after linking, and historical data is not backfilled.
How to avoid: Enable the link as early as possible — only new events from the link date onward are exported.
Mistake: Trying to enable the BigQuery link without being on the Blaze plan.
Why it's a problem: BigQuery integration requires the Firebase Blaze plan, so the link cannot be created.
How to avoid: Upgrade in the Firebase Console under the billing section before linking.
Best practices
- Always filter by _TABLE_SUFFIX when using the wildcard events_* table to control query costs
- Select only the columns you need instead of SELECT * to reduce bytes scanned
- Use BigQuery dry-run to estimate query cost before executing expensive analyses
- Start with daily export only — enable streaming export only if you need near-real-time data
- Create saved queries or views in BigQuery for reports you run frequently
- Use UNNEST with a subquery to extract specific event parameters cleanly
- Schedule recurring queries with BigQuery Scheduled Queries to automate daily reports
- Monitor your BigQuery usage in the Google Cloud Console to stay within the free tier
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I have Firebase Analytics data exported to BigQuery. Write SQL queries for: top events by count, page views by URL, a 3-step conversion funnel (page_view > sign_up > purchase), and daily active users over the last 30 days. Use the Firebase Analytics BigQuery schema with event_params UNNEST.
Write BigQuery SQL queries for Firebase Analytics data. Include event counts, page view extraction using UNNEST on event_params, a conversion funnel with 3 steps, and a cohort retention analysis. Use _TABLE_SUFFIX for date filtering and optimize for cost.
Frequently asked questions
Does the Firebase BigQuery integration cost money?
The Firebase side is free on the Blaze plan. BigQuery charges for storage (first 10 GB/month free) and queries (first 1 TB/month free). For most moderate-traffic apps, the free tier is sufficient.
Can I backfill historical Analytics data into BigQuery?
No. Only events from the date you enable the BigQuery link onward are exported. Enable the link as early as possible to avoid missing data.
What is the difference between daily and streaming export?
Daily export creates one complete table per day (events_YYYYMMDD) with a 24-hour delay. Streaming export writes events to an intraday table (events_intraday_YYYYMMDD) in near-real-time but costs more in BigQuery storage.
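With streaming export enabled, today's events accumulate in the intraday table until the finalized daily table replaces it. A hedged sketch of querying today's data (the dataset name is a placeholder, as elsewhere in this tutorial):

```sql
-- Today's event counts from the intraday table (requires streaming export)
SELECT
  event_name,
  COUNT(*) AS event_count
FROM `your_project.analytics_123456789.events_intraday_*`
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_name
ORDER BY event_count DESC;
```

Note that intraday data is provisional — counts may differ slightly from the finalized daily table.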
How do I access event parameter values in BigQuery?
Event parameters are stored as REPEATED RECORDs. Use a subquery with UNNEST: (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'your_param_key').
Can I join Firebase Analytics data with other BigQuery datasets?
Yes. One of the biggest advantages of BigQuery is joining Analytics data with CRM data, advertising data, or backend databases for cross-source analysis.
How long is Analytics data retained in BigQuery?
Data is retained indefinitely in BigQuery unless you set a table expiration policy. You can set partition expiration to automatically delete data older than a specified number of days.
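Because the daily export produces date-sharded tables (events_YYYYMMDD) rather than a single partitioned table, a dataset-level default table expiration is the usual way to age out old exports. A sketch using BigQuery DDL (dataset name is a placeholder; this only affects tables created after the change):

```sql
-- Expire newly created export tables 400 days after creation
ALTER SCHEMA `your_project.analytics_123456789`
SET OPTIONS (default_table_expiration_days = 400);
```

Existing tables keep their current expiration; update them individually if you also want older shards to expire.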
Can RapidDev help set up BigQuery analytics and build custom dashboards?
Yes. RapidDev can configure the BigQuery integration, write optimized SQL queries for your specific metrics, and build dashboards with Looker Studio or custom visualizations.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation