Migrating from Firebase to Supabase involves exporting your Firestore data to JSON, mapping NoSQL document structures to relational PostgreSQL tables, converting Firebase Auth users via the Supabase migration tool, replacing security rules with Row Level Security policies, and swapping client SDK calls. Plan for 2-4 weeks for a medium-sized app. The biggest challenge is restructuring denormalized NoSQL data into normalized relational tables with proper foreign keys.
Migrating from Firebase to Supabase Step by Step
Firebase and Supabase solve similar problems with fundamentally different architectures. Firebase uses NoSQL (Firestore) with security rules, while Supabase uses PostgreSQL with Row Level Security. This tutorial provides a systematic migration path: audit your Firebase usage, export data, design relational schemas, migrate authentication, convert security rules to RLS policies, and update your client code. It covers the key decisions and trade-offs at each stage.
Prerequisites
- An existing Firebase project with Firestore data and Auth users
- A new Supabase project created at supabase.com
- Node.js installed for running migration scripts
- firebase-admin SDK for server-side data export
- Basic understanding of SQL and PostgreSQL
Step-by-step guide
Audit your Firebase usage and plan the migration
Before writing any code, document every Firebase service you use: Firestore collections, Auth providers, Storage buckets, Cloud Functions, and Hosting. For each Firestore collection, note the document structure, relationships (subcollections, reference fields), and query patterns. Map each Firebase service to its Supabase equivalent: Firestore to PostgreSQL tables, Firebase Auth to Supabase Auth, Cloud Storage to Supabase Storage, Cloud Functions to Edge Functions, and Hosting to Vercel/Netlify.
```text
// Firebase service → Supabase equivalent
// Firestore collections → PostgreSQL tables
// Firestore subcollections → Foreign key relationships
// Security rules → Row Level Security (RLS)
// Firebase Auth → Supabase Auth (GoTrue)
// Cloud Storage → Supabase Storage
// Cloud Functions → Edge Functions (Deno)
// Firebase Hosting → Vercel, Netlify, or Supabase
// Realtime Database → Supabase Realtime (Postgres Changes)
// Cloud Messaging (FCM) → Third-party push service
```
Expected result: A complete inventory of Firebase services in use and their Supabase counterparts, forming the migration plan.
Export Firestore data to JSON
Use the Firebase Admin SDK to export each collection to JSON files. This script reads every document in a collection (including subcollections) and writes the data to disk. For large collections, export in batches to avoid memory issues. Keep the document IDs in the export so you can preserve them as primary keys or reference columns in PostgreSQL.
```typescript
import * as admin from 'firebase-admin'
import * as fs from 'fs'

admin.initializeApp({ credential: admin.credential.applicationDefault() })
const db = admin.firestore()

async function exportCollection(collectionName: string) {
  const snapshot = await db.collection(collectionName).get()
  const data = snapshot.docs.map((doc) => ({
    _id: doc.id,
    ...doc.data(),
  }))
  fs.writeFileSync(
    `./export/${collectionName}.json`,
    JSON.stringify(data, null, 2)
  )
  console.log(`Exported ${data.length} docs from ${collectionName}`)
}

// Export each collection
await exportCollection('users')
await exportCollection('posts')
await exportCollection('comments')
```
Expected result: JSON files in the ./export directory containing all documents from each Firestore collection.
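For collections too large to load with a single get(), wrap the read in a cursor-pagination loop so only one page sits in memory at a time. A minimal sketch of the loop, decoupled from the Admin SDK for clarity: fetchPage here is a hypothetical wrapper around a Firestore query built with orderBy(FieldPath.documentId()).startAfter(cursor).limit(pageSize).

```typescript
// Generic cursor-pagination loop: processes one page at a time.
// fetchPage stands in for a Firestore query using
// orderBy(FieldPath.documentId()).startAfter(cursor).limit(pageSize).
type Page<T> = { docs: T[]; nextCursor?: string }

async function exportAllPages<T>(
  fetchPage: (cursor?: string) => Promise<Page<T>>,
  onPage: (docs: T[]) => void
): Promise<number> {
  let cursor: string | undefined
  let total = 0
  do {
    const page = await fetchPage(cursor)
    if (page.docs.length > 0) onPage(page.docs)
    total += page.docs.length
    cursor = page.nextCursor
  } while (cursor)
  return total
}

// Usage with an in-memory stub standing in for Firestore:
const docs = Array.from({ length: 5 }, (_, i) => ({ _id: `doc${i}` }))
const fakeFetch = async (cursor?: string) => {
  const start = cursor ? Number(cursor) : 0
  return {
    docs: docs.slice(start, start + 2),
    nextCursor: start + 2 < docs.length ? String(start + 2) : undefined,
  }
}

const pageSizes: number[] = []
const total = await exportAllPages(fakeFetch, (d) => pageSizes.push(d.length))
console.log(total, pageSizes) // 5 [ 2, 2, 1 ]
```

In the real script, onPage would append each page to the export file (NDJSON, one object per line, keeps the write incremental too).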
Design relational tables and create the PostgreSQL schema
Translate your Firestore document structures into PostgreSQL tables. Embedded objects become separate tables with foreign keys. Arrays of references become junction tables. Denormalized copies become single-source tables with joins. Use Supabase's SQL editor or a migration file to create the schema. Enable Row Level Security on every table from the start.
```sql
-- Create tables in Supabase SQL Editor
create table public.users (
  id uuid primary key default gen_random_uuid(),
  firebase_uid text unique, -- keep old Firebase UID for reference
  email text unique not null,
  display_name text,
  avatar_url text,
  created_at timestamptz default now()
);

create table public.posts (
  id uuid primary key default gen_random_uuid(),
  firebase_uid text unique, -- keep old Firestore doc ID for traceability
  author_id uuid references public.users(id) on delete cascade,
  title text not null,
  body text,
  tags text[] default '{}', -- Firestore array → PostgreSQL array
  created_at timestamptz default now(),
  updated_at timestamptz default now()
);

create table public.comments (
  id uuid primary key default gen_random_uuid(),
  firebase_uid text unique, -- keep old Firestore doc ID for traceability
  post_id uuid references public.posts(id) on delete cascade,
  author_id uuid references public.users(id) on delete cascade,
  content text not null,
  created_at timestamptz default now()
);

-- Enable RLS on all tables
alter table public.users enable row level security;
alter table public.posts enable row level security;
alter table public.comments enable row level security;
```
Expected result: PostgreSQL tables with proper foreign keys, data types, and RLS enabled, ready to receive imported data.
Import data into Supabase with a migration script
Write a Node.js script that reads the exported JSON files, transforms the data to match your new relational schema, and inserts it into Supabase using the JavaScript client or direct PostgreSQL connection. Handle data transformations like converting Firestore document references to foreign key UUIDs, flattening embedded objects, and converting timestamps.
```typescript
import { createClient } from '@supabase/supabase-js'
import * as fs from 'fs'

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY! // Use service role to bypass RLS
)

async function importUsers() {
  const raw = JSON.parse(fs.readFileSync('./export/users.json', 'utf8'))

  const users = raw.map((doc: any) => ({
    firebase_uid: doc._id,
    email: doc.email,
    display_name: doc.displayName || null,
    avatar_url: doc.photoURL || null,
    created_at: doc.createdAt || new Date().toISOString(),
  }))

  const { error } = await supabase.from('users').insert(users)
  if (error) console.error('Import error:', error)
  else console.log(`Imported ${users.length} users`)
}

await importUsers()
```
Expected result: All Firestore data is now in PostgreSQL tables with correct types and foreign key relationships.
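Reference fields need one extra pass: once users are imported with their firebase_uid preserved, build a lookup from old Firebase UID to new UUID and use it to fill foreign key columns. A minimal sketch with pure helpers; the row shapes and the authorId field name are assumptions carried over from the earlier examples:

```typescript
// Map old Firebase UIDs to the new PostgreSQL UUIDs, then rewrite
// Firestore reference fields (e.g. authorId) into foreign keys.
type UserRow = { id: string; firebase_uid: string }

function buildUidMap(users: UserRow[]): Map<string, string> {
  return new Map(users.map((u) => [u.firebase_uid, u.id]))
}

function resolveAuthor<T extends { authorId?: string }>(
  doc: T,
  uidMap: Map<string, string>
): T & { author_id: string | null } {
  // Unknown or missing UIDs fall back to null rather than a broken FK
  return { ...doc, author_id: uidMap.get(doc.authorId ?? '') ?? null }
}

// Usage: users as returned by
//   supabase.from('users').select('id, firebase_uid')
const users: UserRow[] = [
  { id: '11111111-1111-1111-1111-111111111111', firebase_uid: 'fb_abc' },
]
const uidMap = buildUidMap(users)
const post = resolveAuthor({ title: 'Hello', authorId: 'fb_abc' }, uidMap)
console.log(post.author_id) // 11111111-1111-1111-1111-111111111111
```

Run this transform on the exported posts and comments before inserting them, so every author_id and post_id lands as a valid UUID.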
Migrate Firebase Auth users to Supabase Auth
Export Firebase Auth users using the Admin SDK's listUsers method, then import them into Supabase. For email/password users, you cannot export password hashes from Firebase, so users will need to reset their passwords on first login. For OAuth users (Google, GitHub), configure the same providers in Supabase and users can sign in normally since OAuth tokens are provider-managed.
```typescript
// Export Firebase Auth users
const allUsers: any[] = []

const listAllUsers = async (nextPageToken?: string) => {
  const result = await admin.auth().listUsers(1000, nextPageToken)
  for (const user of result.users) {
    allUsers.push({
      firebase_uid: user.uid,
      email: user.email,
      display_name: user.displayName,
      phone: user.phoneNumber,
      providers: user.providerData.map((p) => p.providerId),
      created_at: user.metadata.creationTime,
    })
  }

  if (result.pageToken) {
    await listAllUsers(result.pageToken)
  }
}

await listAllUsers()

// Write once at the end — appending a JSON array per page would
// produce concatenated arrays, which is not valid JSON
fs.writeFileSync('./export/auth-users.json', JSON.stringify(allUsers, null, 2))

// Then in Supabase, configure the same OAuth providers
// in Authentication > Providers (Google, GitHub, etc.)
// Email/password users will need to use "Forgot Password"
// on their first Supabase login
```
Expected result: Auth user data is exported and OAuth providers are configured in Supabase. Email/password users have a password reset path.
Convert Firebase security rules to Supabase RLS policies
Firebase security rules and Supabase RLS serve the same purpose — restricting data access — but the syntax is completely different. Firebase rules use a declarative DSL with request.auth, while Supabase uses SQL policies with auth.uid(). Translate each rule into a PostgreSQL policy. The key difference is that RLS policies are actual database-level enforcement, not application-level.
```sql
-- Firebase rule: allow read if request.auth != null
-- Supabase equivalent:
create policy "Authenticated users can read posts"
  on public.posts for select
  using (auth.uid() is not null);

-- Firebase rule: allow write if request.auth.uid == resource.data.authorId
-- Supabase equivalent:
create policy "Users can update own posts"
  on public.posts for update
  using (auth.uid() = author_id)
  with check (auth.uid() = author_id);

-- Firebase rule: allow create if request.resource.data.title.size() > 0
-- Supabase equivalent (use a check constraint instead):
alter table public.posts
  add constraint posts_title_not_empty
  check (length(title) > 0);

-- Firebase rule: allow delete if request.auth.uid == resource.data.authorId
create policy "Users can delete own posts"
  on public.posts for delete
  using (auth.uid() = author_id);
```
Expected result: Every Firebase security rule has a corresponding RLS policy in Supabase, enforced at the database level.
Update client-side code to use the Supabase SDK
Replace Firebase client SDK calls with Supabase equivalents. The patterns are similar but the syntax differs. Firestore's collection/doc/getDoc becomes supabase.from().select(). Firebase Auth's signInWithPopup becomes supabase.auth.signInWithOAuth(). Real-time listeners switch from onSnapshot to supabase.channel().on('postgres_changes').
```typescript
// BEFORE: Firebase
import { collection, getDocs, query, where } from 'firebase/firestore'
const q = query(collection(db, 'posts'), where('authorId', '==', userId))
const snapshot = await getDocs(q)
const posts = snapshot.docs.map(d => ({ id: d.id, ...d.data() }))

// AFTER: Supabase
import { createClient } from '@supabase/supabase-js'
const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY)
const { data: posts } = await supabase
  .from('posts')
  .select('*')
  .eq('author_id', userId)

// BEFORE: Firebase Auth
import { signInWithPopup, GoogleAuthProvider } from 'firebase/auth'
await signInWithPopup(auth, new GoogleAuthProvider())

// AFTER: Supabase Auth
await supabase.auth.signInWithOAuth({ provider: 'google' })

// BEFORE: Firebase real-time
onSnapshot(query(...), (snap) => { /* update */ })

// AFTER: Supabase real-time
supabase.channel('posts').on(
  'postgres_changes',
  { event: '*', schema: 'public', table: 'posts' },
  (payload) => { /* update */ }
).subscribe()
```
Expected result: All Firebase SDK calls are replaced with Supabase equivalents and the app functions the same way.
Complete working example
```typescript
import * as admin from 'firebase-admin'
import { createClient } from '@supabase/supabase-js'

admin.initializeApp({ credential: admin.credential.applicationDefault() })
const firestore = admin.firestore()

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
)

async function exportAndImport(collectionName: string, transform: (doc: any) => any) {
  console.log(`Migrating ${collectionName}...`)
  const snapshot = await firestore.collection(collectionName).get()
  const rows = snapshot.docs.map((doc) => transform({ _id: doc.id, ...doc.data() }))

  // Insert in batches of 500
  for (let i = 0; i < rows.length; i += 500) {
    const batch = rows.slice(i, i + 500)
    const { error } = await supabase.from(collectionName).insert(batch)
    if (error) {
      console.error(`Error at batch ${i}:`, error.message)
    }
  }
  console.log(`Migrated ${rows.length} rows from ${collectionName}`)
}

await exportAndImport('users', (doc) => ({
  firebase_uid: doc._id,
  email: doc.email,
  display_name: doc.displayName ?? null,
  created_at: doc.createdAt?._seconds
    ? new Date(doc.createdAt._seconds * 1000).toISOString()
    : new Date().toISOString(),
}))

await exportAndImport('posts', (doc) => ({
  firebase_uid: doc._id,
  title: doc.title,
  body: doc.body ?? '',
  tags: doc.tags ?? [],
  // author_id is left unset here; fill it in a second pass by joining
  // the exported authorId values against users.firebase_uid
  created_at: doc.createdAt?._seconds
    ? new Date(doc.createdAt._seconds * 1000).toISOString()
    : new Date().toISOString(),
}))

console.log('Migration complete')
```
Common mistakes when migrating from Firebase to Supabase
Mistake: Replicating the exact NoSQL document structure in PostgreSQL with JSON columns
How to avoid: Take the opportunity to normalize your data into proper tables with foreign keys. Reserve JSON columns for truly unstructured or variable-schema data.
Mistake: Forgetting to enable RLS on new tables, leaving data publicly accessible
How to avoid: Run ALTER TABLE ... ENABLE ROW LEVEL SECURITY on every table immediately after creation. Supabase tables with RLS enabled but no policies deny all access by default.
Mistake: Using the Supabase anon key in migration scripts, which is blocked by RLS
How to avoid: Use SUPABASE_SERVICE_ROLE_KEY for server-side migration scripts; this key bypasses RLS. Never expose it in client code.
Mistake: Assuming Firebase password hashes can be imported into Supabase
How to avoid: Firebase does not export password hashes in a format Supabase can use. Plan for email/password users to reset their passwords on first login.
Best practices
- Run the migration in a staging Supabase project first before touching production data
- Keep the Firebase project running in parallel during migration for rollback safety
- Preserve Firebase document IDs in a column so you can trace data lineage and debug issues
- Normalize denormalized Firestore data into proper relational tables with foreign keys
- Enable RLS on every table and create policies before importing any data
- Batch large imports into chunks of 500 rows to avoid timeout errors
- Send password reset emails to email/password users on migration day
- Test all RLS policies with the Supabase SQL editor before going live
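The 500-row batching practice above can be sketched as a small helper; the size is the same illustrative limit used in the migration script:

```typescript
// Split rows into fixed-size chunks so each insert stays under
// request-size and timeout limits.
function chunk<T>(rows: T[], size = 500): T[][] {
  const out: T[][] = []
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size))
  }
  return out
}

// Usage: insert each chunk sequentially, e.g.
//   for (const batch of chunk(rows)) await supabase.from('posts').insert(batch)
console.log(chunk([1, 2, 3, 4, 5], 2)) // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```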
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I am migrating a Firebase app to Supabase. The app has Firestore collections for users, posts, and comments, plus Firebase Auth with Google and email/password sign-in. Give me a complete migration plan including data export, PostgreSQL schema design, RLS policies, and client SDK changes.
Convert this Firebase Firestore + Auth app to use Supabase. Replace all firebase/firestore imports with @supabase/supabase-js queries, convert security rules to RLS policies, and change signInWithPopup(GoogleAuthProvider) to supabase.auth.signInWithOAuth({ provider: 'google' }).
Frequently asked questions
How long does a Firebase to Supabase migration typically take?
For a small app with a few collections and basic auth, expect 1-2 weeks. For a medium app with complex queries, subcollections, and multiple auth providers, plan for 2-4 weeks. Large apps with Cloud Functions and extensive real-time features may take 4-8 weeks.
Can I migrate incrementally instead of all at once?
Yes. A common strategy is to migrate auth first, then move collections one at a time while running both backends in parallel. Use feature flags to switch between Firebase and Supabase on a per-feature basis.
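A per-feature switch like this can be as simple as a flag-keyed lookup in the data-access layer. A sketch under the assumption that each backend exposes the same fetcher signature; the flag values and the stub fetchers are illustrative:

```typescript
// Route each feature to its backend via a flag map, so collections
// can move to Supabase one at a time while both backends run.
type Backend = 'firebase' | 'supabase'
type Fetcher = (userId: string) => Promise<unknown[]>

const flags: Record<string, Backend> = {
  posts: 'supabase',    // already migrated
  comments: 'firebase', // not yet
}

function pickFetcher(
  feature: string,
  impls: Record<Backend, Fetcher>
): Fetcher {
  // Unknown features default to the legacy backend
  return impls[flags[feature] ?? 'firebase']
}

// Usage with stub implementations standing in for real SDK calls:
const fetchPostsFromFirebase: Fetcher = async () => ['from-firebase']
const fetchPostsFromSupabase: Fetcher = async () => ['from-supabase']

const getPosts = pickFetcher('posts', {
  firebase: fetchPostsFromFirebase,
  supabase: fetchPostsFromSupabase,
})
console.log(await getPosts('user-1')) // [ 'from-supabase' ]
```

Flipping a feature over is then a one-line flag change, and rolling back is equally cheap.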
Will my real-time listeners work the same in Supabase?
Supabase supports real-time updates via Postgres Changes, which work similarly to Firestore onSnapshot. The main difference is that Supabase real-time is opt-in per table and uses a different event format. The latency is comparable.
What about Cloud Functions — does Supabase have an equivalent?
Supabase Edge Functions (Deno-based) replace Cloud Functions for HTTP endpoints and webhooks. For database triggers, use PostgreSQL triggers and functions instead of Firestore triggers. Scheduled functions can be replaced with pg_cron.
Is vendor lock-in better or worse with Supabase compared to Firebase?
Supabase has significantly less vendor lock-in because it is built on PostgreSQL, an open standard. You can self-host Supabase or migrate to any PostgreSQL provider. Firebase uses proprietary NoSQL schemas and APIs that are harder to move away from.
Can RapidDev help with a Firebase to Supabase migration?
Yes. RapidDev can plan and execute the full migration including schema design, data transfer, auth migration, RLS policy creation, and client SDK updates while keeping your app running with zero downtime.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation