Firestore batch writes let you perform multiple set, update, and delete operations atomically using writeBatch(). All operations in a batch either succeed or fail together, making them ideal for maintaining data consistency. Each batch supports up to 500 operations. Create a batch with writeBatch(db), add operations with batch.set(), batch.update(), and batch.delete(), then execute with batch.commit().
Performing Batch Writes in Firestore
Firestore batch writes execute multiple write operations as a single atomic unit. If any operation in the batch fails, none of them are applied. This tutorial covers creating batch writes with writeBatch(), combining set, update, and delete operations, splitting large datasets into batches of 500, comparing batches to transactions, and writing security rules that allow batch operations.
Prerequisites
- A Firebase project with Firestore enabled
- Firebase JS SDK v9+ installed
- Basic understanding of Firestore documents and collections
- Firestore security rules that allow write operations
Step-by-step guide
Create a basic batch write
Import writeBatch from firebase/firestore and create a batch instance. Add operations to the batch using batch.set() for creating or overwriting documents, batch.update() for partial updates, and batch.delete() for removing documents. None of the operations execute until you call batch.commit(), which sends all operations to Firestore as a single atomic unit.
```typescript
import {
  getFirestore, writeBatch, doc, collection, serverTimestamp
} from 'firebase/firestore';

const db = getFirestore();

async function createBatchWrite() {
  const batch = writeBatch(db);

  // Set (create or overwrite) a document
  const userRef = doc(collection(db, 'users'));
  batch.set(userRef, {
    name: 'Alice',
    email: 'alice@example.com',
    createdAt: serverTimestamp()
  });

  // Update an existing document
  const statsRef = doc(db, 'stats', 'userCount');
  batch.update(statsRef, {
    count: 101,
    lastUpdated: serverTimestamp()
  });

  // Delete a document
  const oldRef = doc(db, 'tempData', 'expired-session');
  batch.delete(oldRef);

  // Commit all operations atomically
  await batch.commit();
  console.log('Batch write completed');
}
```

Expected result: All three operations (create, update, delete) are applied atomically. Either all succeed or none are applied.
Batch-insert multiple documents
Use batch writes to insert multiple documents in a single operation. This is much more efficient than individual addDoc calls because it reduces the number of network round trips. Loop through your data array, add each document to the batch with batch.set(), and commit once. For auto-generated IDs, use doc(collection(db, 'collectionName')) to create a reference with a generated ID.
```typescript
import {
  getFirestore, writeBatch, doc, collection, serverTimestamp
} from 'firebase/firestore';

const db = getFirestore();

interface Product {
  name: string;
  price: number;
  category: string;
}

async function batchInsertProducts(products: Product[]) {
  const batch = writeBatch(db);

  for (const product of products) {
    const docRef = doc(collection(db, 'products'));
    batch.set(docRef, {
      ...product,
      createdAt: serverTimestamp()
    });
  }

  await batch.commit();
  console.log(`Inserted ${products.length} products`);
}

// Usage
await batchInsertProducts([
  { name: 'Widget A', price: 29.99, category: 'tools' },
  { name: 'Widget B', price: 49.99, category: 'tools' },
  { name: 'Gadget C', price: 19.99, category: 'electronics' }
]);
```

Expected result: All products are inserted as new documents with auto-generated IDs in a single atomic operation.
Handle the 500-operation limit
Firestore batch writes are limited to 500 operations per batch. For larger datasets, split your operations into chunks of 500 and commit each chunk separately. Note that while each individual batch is atomic, the overall operation across multiple batches is not. If the second batch fails, the first batch's changes are already applied. For critical operations requiring full atomicity, consider using transactions instead.
```typescript
import {
  getFirestore, writeBatch, doc, collection, serverTimestamp
} from 'firebase/firestore';

const db = getFirestore();
const BATCH_LIMIT = 500;

async function batchInsertLargeDataset(items: any[]) {
  let batch = writeBatch(db);
  let operationCount = 0;
  let totalCommitted = 0;

  for (const item of items) {
    const docRef = doc(collection(db, 'items'));
    batch.set(docRef, {
      ...item,
      createdAt: serverTimestamp()
    });
    operationCount++;

    if (operationCount >= BATCH_LIMIT) {
      await batch.commit();
      totalCommitted += operationCount;
      console.log(`Committed ${totalCommitted} of ${items.length}`);
      batch = writeBatch(db);
      operationCount = 0;
    }
  }

  // Commit remaining operations
  if (operationCount > 0) {
    await batch.commit();
    totalCommitted += operationCount;
  }

  console.log(`Total committed: ${totalCommitted} items`);
}
```

Expected result: All items are inserted across multiple batch commits, with each batch containing up to 500 operations.
Use batch writes for denormalized data updates
In Firestore, you often denormalize data for query efficiency. When the source data changes, use a batch write to update all denormalized copies atomically. For example, when a user updates their display name, update the name in both the user document and all of their post documents in a single batch.
```typescript
import {
  getFirestore, writeBatch, doc, collection,
  query, where, getDocs
} from 'firebase/firestore';

const db = getFirestore();

async function updateUserNameEverywhere(
  userId: string,
  newName: string
) {
  const batch = writeBatch(db);

  // Update the user document
  const userRef = doc(db, 'users', userId);
  batch.update(userRef, { displayName: newName });

  // Find all posts by this user
  const postsQuery = query(
    collection(db, 'posts'),
    where('authorId', '==', userId)
  );
  const postsSnapshot = await getDocs(postsQuery);

  // Check we won't exceed the 500-operation limit
  if (postsSnapshot.size + 1 > 500) {
    throw new Error('Too many posts to update in a single batch');
  }

  for (const postDoc of postsSnapshot.docs) {
    batch.update(postDoc.ref, { authorName: newName });
  }

  await batch.commit();
  console.log(`Updated name in ${postsSnapshot.size + 1} documents`);
}
```

Expected result: The user's display name is updated in the user document and all associated posts atomically.
Write security rules that support batch operations
Firestore security rules evaluate each document operation in a batch independently. Every set, update, and delete in the batch must pass the security rules for the respective document path. There is no special rule syntax for batches. Ensure your rules allow the operations you are performing and validate the data being written.
```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /products/{productId} {
      allow read: if request.auth != null;
      allow create: if request.auth != null
        && request.resource.data.name is string
        && request.resource.data.price is number
        && request.resource.data.price > 0;
      allow update: if request.auth != null
        && request.resource.data.name is string;
      allow delete: if request.auth != null
        && request.auth.token.admin == true;
    }

    match /users/{userId} {
      allow read: if request.auth != null;
      allow write: if request.auth != null
        && request.auth.uid == userId;
    }

    match /posts/{postId} {
      allow read: if request.auth != null;
      allow update: if request.auth != null
        && request.auth.uid == resource.data.authorId;
    }
  }
}
```

Expected result: Security rules allow the individual operations that make up your batch writes while validating data integrity.
Complete working example
```typescript
import {
  getFirestore,
  writeBatch,
  doc,
  collection,
  query,
  where,
  getDocs,
  serverTimestamp
} from 'firebase/firestore';
import { initializeApp } from 'firebase/app';

const app = initializeApp({
  apiKey: 'YOUR_API_KEY',
  authDomain: 'YOUR_PROJECT.firebaseapp.com',
  projectId: 'YOUR_PROJECT_ID'
});

const db = getFirestore(app);
const BATCH_LIMIT = 500;

// Insert multiple documents with auto-generated IDs
export async function batchInsert<T extends Record<string, any>>(
  collectionName: string,
  items: T[]
): Promise<number> {
  let batch = writeBatch(db);
  let count = 0;
  let total = 0;

  for (const item of items) {
    const ref = doc(collection(db, collectionName));
    batch.set(ref, { ...item, createdAt: serverTimestamp() });
    count++;

    if (count >= BATCH_LIMIT) {
      await batch.commit();
      total += count;
      batch = writeBatch(db);
      count = 0;
    }
  }

  if (count > 0) {
    await batch.commit();
    total += count;
  }

  return total;
}

// Update denormalized data across collections
export async function updateDenormalizedField(
  sourceCollection: string,
  sourceId: string,
  targetCollection: string,
  foreignKey: string,
  fieldName: string,
  newValue: any
): Promise<number> {
  const batch = writeBatch(db);

  // Update source document
  batch.update(doc(db, sourceCollection, sourceId), {
    [fieldName]: newValue
  });

  // Find and update target documents
  const q = query(
    collection(db, targetCollection),
    where(foreignKey, '==', sourceId)
  );
  const snapshot = await getDocs(q);

  if (snapshot.size + 1 > BATCH_LIMIT) {
    throw new Error(
      `Too many documents (${snapshot.size + 1}) for a single batch`
    );
  }

  snapshot.docs.forEach(d => {
    batch.update(d.ref, { [fieldName]: newValue });
  });

  await batch.commit();
  return snapshot.size + 1;
}

// Delete multiple documents by query
export async function batchDelete(
  collectionName: string,
  field: string,
  value: any
): Promise<number> {
  const q = query(
    collection(db, collectionName),
    where(field, '==', value)
  );
  const snapshot = await getDocs(q);

  let batch = writeBatch(db);
  let count = 0;
  let total = 0;

  for (const d of snapshot.docs) {
    batch.delete(d.ref);
    count++;

    if (count >= BATCH_LIMIT) {
      await batch.commit();
      total += count;
      batch = writeBatch(db);
      count = 0;
    }
  }

  if (count > 0) {
    await batch.commit();
    total += count;
  }

  return total;
}
```

Common mistakes with batch writes in Firestore
Mistake: Exceeding the 500-operation limit, causing the batch commit to fail
How to avoid: Track the operation count and commit the batch once it reaches 500 operations, then create a new batch for the remaining items. Each set, update, and delete counts as one operation.
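One way to stay under the limit is to pre-chunk the items before opening any batches. Below is a plain TypeScript sketch of that splitting logic (the `chunk` helper and `BATCH_LIMIT` constant are illustrative, not part of the Firebase SDK); each resulting group would then be written with its own writeBatch()/commit() pair.

```typescript
// Illustrative helper: split an array into groups of at most `size`
// items, so each group fits in one writeBatch() commit.
// Not part of the Firebase SDK.
const BATCH_LIMIT = 500;

function chunk<T>(items: T[], size: number = BATCH_LIMIT): T[][] {
  const groups: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    groups.push(items.slice(i, i + size));
  }
  return groups;
}

// 1,200 items -> three groups of 500 + 500 + 200
const groups = chunk(Array.from({ length: 1200 }, (_, i) => i));
console.log(groups.length, groups[2].length); // 3 200
```

Keeping the chunking separate from the Firestore calls also makes the splitting logic trivial to unit-test without touching the network.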
Mistake: Assuming that multiple batches committed sequentially are atomic
How to avoid: Each batch is independently atomic, but multiple batches are not. If the second batch fails, the first batch's changes persist. For full atomicity across more than 500 operations, use transactions (also limited to 500) or redesign your data model.
Mistake: Using batch writes when a transaction is needed for read-then-write operations
How to avoid: Batch writes do not support reads. If you need to read a document's current value before deciding what to write, use runTransaction() instead. Batches are for blind writes only.
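To see why blind writes are not enough here, consider a minimal in-memory stand-in (the `FakeStore` class and `decrementStock` function below are invented for illustration; the real Firestore equivalent is runTransaction() from 'firebase/firestore'). The key point is that the new value depends on the value just read, which a batch has no way to do.

```typescript
// Invented in-memory stand-in for a document store, used only to
// illustrate the read-then-write pattern that batches cannot express.
class FakeStore {
  private docs = new Map<string, { stock: number }>();

  set(id: string, data: { stock: number }) { this.docs.set(id, data); }
  get(id: string) { return this.docs.get(id); }
}

// Read-then-write: the write depends on the value we just read,
// so a blind batch.update() cannot do this safely.
function decrementStock(store: FakeStore, id: string, qty: number): boolean {
  const current = store.get(id);                  // the "read" step
  if (!current || current.stock < qty) return false;
  store.set(id, { stock: current.stock - qty });  // conditional write
  return true;
}

const store = new FakeStore();
store.set('sku-1', { stock: 3 });
console.log(decrementStock(store, 'sku-1', 2)); // true, stock now 1
console.log(decrementStock(store, 'sku-1', 2)); // false, only 1 left
```

In Firestore, the same read, check, and conditional write would go inside a runTransaction() callback, which also retries automatically if another client changes the document mid-flight.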
Best practices
- Use batch writes for multiple blind writes that need to be atomic (no reads required)
- Split large datasets into chunks of 500 operations per batch commit
- Use transactions instead of batches when you need to read documents before writing
- Include serverTimestamp() in batch operations for consistent timestamp values
- Test batch operations against your security rules before deploying to production
- Use batch writes for denormalized data updates to keep copies consistent
- For server-side batch operations on very large datasets, use the Admin SDK, which bypasses security rules
- Monitor Firestore write quotas when running large batch operations (20,000 writes/day on Spark)
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
Show me how to use Firestore batch writes to insert multiple documents, handle the 500-operation limit for large datasets, and update denormalized data across collections atomically. Use the Firebase modular SDK v9 with TypeScript.
Write a TypeScript Firestore utility module using the modular SDK v9 that provides reusable functions for batch inserts with auto-chunking at 500 operations, batch deletes by query, and denormalized field updates across collections. Include serverTimestamp and security rules.
Frequently asked questions
What is the maximum number of operations in a Firestore batch write?
500 operations per batch. Each set, update, and delete counts as one operation. If you need more, split your operations into multiple batches of 500 and commit them sequentially.
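As a quick sanity check, the number of sequential commits required is just a ceiling division. The `commitsNeeded` helper below is illustrative, not an SDK function:

```typescript
// Illustrative: how many batch commits N write operations need
// under the 500-operations-per-batch limit.
function commitsNeeded(operationCount: number, limit: number = 500): number {
  return Math.ceil(operationCount / limit);
}

console.log(commitsNeeded(1200)); // 3
console.log(commitsNeeded(500));  // 1
```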
Are batch writes atomic?
Yes. All operations in a single batch either succeed or fail together. If any operation fails (for example, due to security rules), none of the changes are applied. However, multiple sequential batches are not atomic with each other.
What is the difference between batch writes and transactions?
Batch writes are for multiple blind writes (no reads needed) and are atomic. Transactions support reads before writes with automatic retry on contention. Use batches when you do not need to read existing data; use transactions when you need read-then-write consistency.
Do batch writes work offline?
Yes. Batch writes are queued locally when the device is offline and committed to Firestore when connectivity is restored. Transactions do not work offline because they require server communication for read consistency.
Can I use batch writes across different collections?
Yes. A single batch can include operations on documents in different collections. For example, you can set a document in users, update a document in stats, and delete a document in sessions all in one batch.
Do security rules apply to each operation in a batch?
Yes. Firestore evaluates security rules independently for each operation in the batch. If any single operation fails the rules check, the entire batch is rejected.
Can RapidDev help optimize batch operations in my Firebase app?
Yes. RapidDev can design efficient batch operation patterns for your data model, including chunked imports, denormalized data sync, and server-side batch processing with the Admin SDK for large-scale operations.