RapidDev - Software Development Agency

How to Use Firebase Analytics with BigQuery

What you'll learn

  • How to link Firebase Analytics to BigQuery and enable daily data export
  • How to understand the BigQuery events_ table schema and query structure
  • How to write SQL queries for custom event analysis, funnels, and user retention
  • How to manage BigQuery costs and optimize queries
Level: Intermediate · Read time: 8 min · Hands-on time: 15-20 min · Requirements: Firebase Blaze plan, BigQuery free tier (1 TB queries/month) · Published: March 2026 · Author: RapidDev Engineering Team
TL;DR

To use Firebase Analytics with BigQuery, enable the BigQuery link in your Firebase project settings, which creates a daily export of raw event data into BigQuery tables. Each day generates an events_ table containing every event with its parameters, user properties, and device info. You can query this data with standard SQL to build custom reports, funnels, and cohort analyses that go far beyond the Firebase Console dashboards.

Connecting Firebase Analytics to BigQuery for Advanced Analysis

The Firebase Analytics Console provides useful dashboards, but custom analysis requires raw data access. BigQuery integration exports every Analytics event with full parameter detail into BigQuery, where you can run SQL queries for funnel analysis, cohort retention, custom dimensions, and data joins with other sources. This tutorial covers enabling the export, understanding the table schema, writing practical queries, and keeping BigQuery costs under control.

Prerequisites

  • A Firebase project on the Blaze (pay-as-you-go) plan
  • Firebase Analytics initialized and collecting events
  • A Google Cloud project linked to your Firebase project
  • Basic knowledge of SQL queries

Step-by-step guide

1

Enable the BigQuery link in Firebase

Open the Firebase Console, go to Project Settings (gear icon), and click the Integrations tab. Find BigQuery and click Link. Select which data you want to export — at minimum, enable Google Analytics. Choose whether to include advertising identifiers. Firebase creates a BigQuery dataset named analytics_{project_number} and begins daily exports. The first export appears within 24 hours.

Expected result: A BigQuery dataset is created and daily Analytics event exports begin within 24 hours.
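The dataset and daily table names follow a predictable pattern, which is handy for a sanity check once the first export lands. A minimal Python sketch of the naming convention (the project number 123456789 is a placeholder):

```python
from datetime import date

def analytics_dataset(project_number: int) -> str:
    """Firebase names the export dataset after the project number."""
    return f"analytics_{project_number}"

def daily_table(day: date) -> str:
    """Daily exports land in one table per day: events_YYYYMMDD."""
    return f"events_{day.strftime('%Y%m%d')}"

print(analytics_dataset(123456789))    # analytics_123456789
print(daily_table(date(2026, 3, 27)))  # events_20260327
```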

2

Understand the events_ table schema

Firebase creates one table per day with the naming pattern events_YYYYMMDD. Each row represents a single event. Key columns include event_name (the event type), event_timestamp (microseconds since epoch), user_pseudo_id (anonymous user ID), event_params (REPEATED RECORD of key-value pairs), user_properties (REPEATED RECORD), device (struct with category, os, browser), and geo (struct with country, region, city).

sql
-- View the schema of an events table
SELECT column_name, data_type
FROM `your_project.analytics_123456789.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'events_20260327'
ORDER BY ordinal_position;

-- Key columns:
-- event_name STRING (e.g., 'page_view', 'purchase')
-- event_timestamp INTEGER (microseconds since epoch)
-- user_pseudo_id STRING (anonymous user identifier)
-- event_params REPEATED RECORD (key-value parameters)
-- user_properties REPEATED RECORD (user attributes)
-- device RECORD (category, operating_system, web_info.browser)
-- geo RECORD (country, region, city)
-- traffic_source RECORD (source, medium, campaign)

Expected result: You understand the table structure and can identify which columns to query for your analysis.
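One detail that trips people up: event_timestamp is microseconds since the Unix epoch, not milliseconds or seconds. A small Python sketch of the conversion you would apply client-side (BigQuery's TIMESTAMP_MICROS function does the same thing server-side):

```python
from datetime import datetime, timezone

def event_time(event_timestamp_micros: int) -> datetime:
    """Convert Firebase's microsecond epoch value to a UTC datetime."""
    return datetime.fromtimestamp(event_timestamp_micros / 1_000_000, tz=timezone.utc)

ts = 1_743_033_600_000_000  # example microsecond timestamp from an export row
print(event_time(ts).isoformat())  # 2025-03-27T00:00:00+00:00
```

Dividing by 1,000 instead of 1,000,000 (treating the value as milliseconds) is a common source of dates thousands of years in the future.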

3

Query event counts and extract parameters

Event parameters are stored as a REPEATED RECORD, which means you need to UNNEST them to access specific values. Use a subquery or CROSS JOIN UNNEST to extract parameter values by their key. This is the most common pattern when working with Firebase Analytics in BigQuery.

sql
-- Count events by type for the last 7 days
SELECT
  event_name,
  COUNT(*) AS event_count
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
  FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_name
ORDER BY event_count DESC
LIMIT 20;

-- Extract a specific parameter value from events
SELECT
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_url,
  COUNT(*) AS views
FROM `your_project.analytics_123456789.events_*`
WHERE event_name = 'page_view'
  AND _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY event_name, page_url
ORDER BY views DESC
LIMIT 10;

Expected result: A table showing top events by count and top pages by page views for the specified date range.
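The UNNEST pattern is easier to internalize with a plain-Python analogue: event_params behaves like a list of {key, value} records attached to each event, and the subquery picks the value whose key matches. A sketch using a simplified event dict that mirrors the export schema:

```python
def param_value(event: dict, key: str):
    """Mimic (SELECT value.string_value FROM UNNEST(event_params) WHERE key = ...)."""
    for p in event.get("event_params", []):
        if p["key"] == key:
            return p["value"].get("string_value")
    return None  # like BigQuery, yield NULL when no key matches

event = {
    "event_name": "page_view",
    "event_params": [
        {"key": "page_location", "value": {"string_value": "https://example.com/pricing"}},
        {"key": "engagement_time_msec", "value": {"int_value": 1200}},
    ],
}
print(param_value(event, "page_location"))  # https://example.com/pricing
```

Note that each parameter value lives under a typed field (string_value, int_value, float_value, double_value), so you must pick the field that matches how the parameter was logged.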

4

Build a simple conversion funnel

Funnel analysis shows how many users complete a sequence of steps. Count distinct users at each step to see where drop-off occurs. This example tracks users from page_view to sign_up to purchase over the last 30 days.

sql
WITH events AS (
  SELECT
    user_pseudo_id,
    event_name,
    event_timestamp
  FROM `your_project.analytics_123456789.events_*`
  WHERE _TABLE_SUFFIX BETWEEN
    FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY))
    AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
    AND event_name IN ('page_view', 'sign_up', 'purchase')
)
SELECT
  'Step 1: Page View' AS step,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM events WHERE event_name = 'page_view'
UNION ALL
SELECT
  'Step 2: Sign Up',
  COUNT(DISTINCT user_pseudo_id)
FROM events WHERE event_name = 'sign_up'
UNION ALL
SELECT
  'Step 3: Purchase',
  COUNT(DISTINCT user_pseudo_id)
FROM events WHERE event_name = 'purchase'
ORDER BY step;

Expected result: A three-row table showing the number of unique users at each funnel step, revealing the drop-off rate between steps.
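Once the query returns the three user counts, the drop-off math is simple arithmetic. A small helper, using hypothetical counts, that turns the funnel rows into step-over-step conversion rates:

```python
def conversion_rates(steps: list[tuple[str, int]]) -> list[tuple[str, float]]:
    """Percentage of users retained from each funnel step to the next."""
    rates = []
    for (prev_name, prev_users), (name, users) in zip(steps, steps[1:]):
        rates.append((f"{prev_name} -> {name}", round(100 * users / prev_users, 1)))
    return rates

# Hypothetical output of the funnel query above
funnel = [("page_view", 10_000), ("sign_up", 1_200), ("purchase", 300)]
print(conversion_rates(funnel))
# [('page_view -> sign_up', 12.0), ('sign_up -> purchase', 25.0)]
```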

5

Analyze user retention by cohort

Cohort retention shows what percentage of users return after their first visit. Group users by their first_visit date (cohort) and check which days they returned. This is one of the most valuable analyses for measuring product stickiness.

sql
WITH first_visits AS (
  SELECT
    user_pseudo_id,
    DATE(TIMESTAMP_MICROS(MIN(event_timestamp))) AS cohort_date
  FROM `your_project.analytics_123456789.events_*`
  WHERE _TABLE_SUFFIX BETWEEN '20260201' AND '20260328'
    AND event_name = 'first_visit'
  GROUP BY user_pseudo_id
),
return_visits AS (
  SELECT DISTINCT
    e.user_pseudo_id,
    DATE(TIMESTAMP_MICROS(e.event_timestamp)) AS visit_date
  FROM `your_project.analytics_123456789.events_*` e
  WHERE e._TABLE_SUFFIX BETWEEN '20260201' AND '20260328'
    AND e.event_name = 'session_start'
)
SELECT
  fv.cohort_date,
  COUNT(DISTINCT fv.user_pseudo_id) AS cohort_size,
  COUNT(DISTINCT CASE
    WHEN DATE_DIFF(rv.visit_date, fv.cohort_date, DAY) = 1
    THEN rv.user_pseudo_id END) AS day_1_retained,
  COUNT(DISTINCT CASE
    WHEN DATE_DIFF(rv.visit_date, fv.cohort_date, DAY) = 7
    THEN rv.user_pseudo_id END) AS day_7_retained
FROM first_visits fv
LEFT JOIN return_visits rv USING (user_pseudo_id)
GROUP BY fv.cohort_date
ORDER BY fv.cohort_date;

Expected result: A table showing each daily cohort, its size, and how many users returned on day 1 and day 7.
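The query returns raw counts, but retention is usually reported as a percentage of cohort size. A sketch of that final step, with made-up numbers standing in for one row of the query's output:

```python
def retention_pct(cohort_size: int, retained: int) -> float:
    """Retained users as a percentage of the cohort, one decimal place."""
    return round(100 * retained / cohort_size, 1) if cohort_size else 0.0

# Hypothetical row from the cohort query above
cohort_size, day_1, day_7 = 500, 180, 95
print(retention_pct(cohort_size, day_1))  # 36.0
print(retention_pct(cohort_size, day_7))  # 19.0
```

Guarding against a zero cohort size matters in practice: days with no first_visit events would otherwise raise a division error in downstream reporting code.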

6

Manage BigQuery costs

BigQuery's on-demand model charges per byte scanned (the first 1 TiB of queries per month is free; beyond that, roughly $6.25 per TiB at current on-demand rates). Firebase Analytics tables can grow large quickly. Always filter by _TABLE_SUFFIX to limit the date range, select only the columns you need, and use BigQuery's dry-run feature to preview bytes scanned before running expensive queries.

sql
-- Check table sizes to estimate costs
SELECT
  table_id,
  ROUND(size_bytes / (1024 * 1024 * 1024), 2) AS size_gb,
  row_count
FROM `your_project.analytics_123456789.__TABLES__`
ORDER BY size_bytes DESC
LIMIT 10;

-- Estimate query cost before executing:
-- The BigQuery console editor shows "This query will process X when run"
-- Or use the bq CLI: bq query --dry_run --use_legacy_sql=false "SELECT ..."

Expected result: You can estimate and control BigQuery costs by filtering date ranges and previewing query sizes before execution.
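To turn a dry-run byte count into a dollar figure, the arithmetic can be sketched as follows. The $6.25-per-TiB rate and the 1 TiB free tier are assumptions based on current on-demand pricing; check the BigQuery pricing page for your region before relying on the numbers:

```python
def query_cost_usd(bytes_processed: int, free_tib_remaining: float = 1.0,
                   price_per_tib: float = 6.25) -> float:
    """Estimate on-demand cost for a query, net of remaining free-tier quota."""
    tib = bytes_processed / 2**40          # bytes -> TiB
    billable = max(0.0, tib - free_tib_remaining)
    return round(billable * price_per_tib, 2)

# A dry run reporting 250 GiB scanned, with the monthly free tier used up
print(query_cost_usd(250 * 2**30, free_tib_remaining=0.0))  # 1.53
```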

Complete working example

bigquery-analytics-queries.sql
-- Firebase Analytics BigQuery Queries
-- Replace 'your_project.analytics_123456789' with your dataset

-- 1. Top events in the last 7 days
SELECT
  event_name,
  COUNT(*) AS event_count,
  COUNT(DISTINCT user_pseudo_id) AS unique_users
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
  FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_name
ORDER BY event_count DESC
LIMIT 20;

-- 2. Extract page_view URLs with counts
SELECT
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_url,
  COUNT(*) AS views,
  COUNT(DISTINCT user_pseudo_id) AS unique_visitors
FROM `your_project.analytics_123456789.events_*`
WHERE event_name = 'page_view'
  AND _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY page_url
ORDER BY views DESC
LIMIT 20;

-- 3. Daily active users over the last 30 days
SELECT
  DATE(TIMESTAMP_MICROS(event_timestamp)) AS event_date,
  COUNT(DISTINCT user_pseudo_id) AS daily_active_users
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
  FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_date
ORDER BY event_date;

-- 4. Users by country
SELECT
  geo.country AS country,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `your_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
  FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY country
ORDER BY users DESC
LIMIT 10;

Common mistakes when using Firebase Analytics with BigQuery

Mistake: Querying events_* without a _TABLE_SUFFIX filter, scanning all historical data and incurring high costs.

How to avoid: Always add a WHERE _TABLE_SUFFIX BETWEEN clause to limit the date range. Use dry-run to estimate costs before executing.

Mistake: Trying to access event parameters directly as columns instead of using UNNEST.

How to avoid: Event parameters are stored as REPEATED RECORDs. Use (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'param_name') to extract values.

Mistake: Expecting data in BigQuery immediately after enabling the link.

How to avoid: The first daily export appears within 24 hours after linking. Historical data is not backfilled; only new events from the link date onward are exported.

Mistake: Trying to enable the BigQuery link while not on the Blaze plan.

How to avoid: BigQuery integration requires the Firebase Blaze plan. Upgrade in the Firebase Console under the billing section before linking.

Best practices

  • Always filter by _TABLE_SUFFIX when using the wildcard events_* table to control query costs
  • Select only the columns you need instead of SELECT * to reduce bytes scanned
  • Use BigQuery dry-run to estimate query cost before executing expensive analyses
  • Start with daily export only — enable streaming export only if you need near-real-time data
  • Create saved queries or views in BigQuery for reports you run frequently
  • Use UNNEST with a subquery to extract specific event parameters cleanly
  • Schedule recurring queries with BigQuery Scheduled Queries to automate daily reports
  • Monitor your BigQuery usage in the Google Cloud Console to stay within the free tier

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I have Firebase Analytics data exported to BigQuery. Write SQL queries for: top events by count, page views by URL, a 3-step conversion funnel (page_view > sign_up > purchase), and daily active users over the last 30 days. Use the Firebase Analytics BigQuery schema with event_params UNNEST.

Firebase Prompt

Write BigQuery SQL queries for Firebase Analytics data. Include event counts, page view extraction using UNNEST on event_params, a conversion funnel with 3 steps, and a cohort retention analysis. Use _TABLE_SUFFIX for date filtering and optimize for cost.

Frequently asked questions

Does the Firebase BigQuery integration cost money?

The Firebase side is free on the Blaze plan. BigQuery charges for storage (first 10 GB/month free) and queries (first 1 TB/month free). For most moderate-traffic apps, the free tier is sufficient.

Can I backfill historical Analytics data into BigQuery?

No. Only events from the date you enable the BigQuery link onward are exported. Enable the link as early as possible to avoid missing data.

What is the difference between daily and streaming export?

Daily export creates one complete table per day (events_YYYYMMDD) with a 24-hour delay. Streaming export writes events to an intraday table (events_intraday_YYYYMMDD) in near-real-time but costs more in BigQuery storage.

How do I access event parameter values in BigQuery?

Event parameters are stored as REPEATED RECORDs. Use a subquery with UNNEST: (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'your_param_key').

Can I join Firebase Analytics data with other BigQuery datasets?

Yes. One of the biggest advantages of BigQuery is joining Analytics data with CRM data, advertising data, or backend databases for cross-source analysis.

How long is Analytics data retained in BigQuery?

Data is retained indefinitely in BigQuery unless you set a table expiration policy. You can set partition expiration to automatically delete data older than a specified number of days.

Can RapidDev help set up BigQuery analytics and build custom dashboards?

Yes. RapidDev can configure the BigQuery integration, write optimized SQL queries for your specific metrics, and build dashboards with Looker Studio or custom visualizations.
