RapidDev - Software Development Agency

How to Batch Write in Firestore

What you'll learn

  • How to create and commit a Firestore batch write
  • How to mix set, update, and delete operations in a single batch
  • How to handle the 500-operation limit for large batches
  • When to use batch writes versus transactions
Intermediate · 9 min read · 10-15 min · Firebase JS SDK v9+, Cloud Firestore, all plans · March 2026 · RapidDev Engineering Team
TL;DR

Firestore batch writes let you perform multiple set, update, and delete operations atomically using writeBatch(). All operations in a batch either succeed or fail together, making them ideal for maintaining data consistency. Each batch supports up to 500 operations. Create a batch with writeBatch(db), add operations with batch.set(), batch.update(), and batch.delete(), then execute with batch.commit().

Performing Batch Writes in Firestore

Firestore batch writes execute multiple write operations as a single atomic unit. If any operation in the batch fails, none of them are applied. This tutorial covers creating batch writes with writeBatch(), combining set, update, and delete operations, splitting large datasets into batches of 500, comparing batches to transactions, and writing security rules that allow batch operations.

Prerequisites

  • A Firebase project with Firestore enabled
  • Firebase JS SDK v9+ installed
  • Basic understanding of Firestore documents and collections
  • Firestore security rules that allow write operations

Step-by-step guide

Step 1: Create a basic batch write

Import writeBatch from firebase/firestore and create a batch instance. Add operations to the batch using batch.set() for creating or overwriting documents, batch.update() for partial updates, and batch.delete() for removing documents. None of the operations execute until you call batch.commit(), which sends all operations to Firestore as a single atomic unit.

typescript
import {
  getFirestore, writeBatch, doc, collection, serverTimestamp
} from 'firebase/firestore';

const db = getFirestore();

async function createBatchWrite() {
  const batch = writeBatch(db);

  // Set (create or overwrite) a document
  const userRef = doc(collection(db, 'users'));
  batch.set(userRef, {
    name: 'Alice',
    email: 'alice@example.com',
    createdAt: serverTimestamp()
  });

  // Update an existing document
  const statsRef = doc(db, 'stats', 'userCount');
  batch.update(statsRef, {
    count: 101,
    lastUpdated: serverTimestamp()
  });

  // Delete a document
  const oldRef = doc(db, 'tempData', 'expired-session');
  batch.delete(oldRef);

  // Commit all operations atomically
  await batch.commit();
  console.log('Batch write completed');
}

Expected result: All three operations (create, update, delete) are applied atomically. Either all succeed or none are applied.
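Because commit() is all-or-nothing, error handling collapses to a single try/catch around the commit call. The sketch below reuses the stats/userCount document from the example above; FirestoreError and its code field are part of the SDK's public error type:

```typescript
import { getFirestore, writeBatch, doc } from 'firebase/firestore';
import type { FirestoreError } from 'firebase/firestore';

const db = getFirestore();

async function commitWithLogging() {
  const batch = writeBatch(db);
  batch.update(doc(db, 'stats', 'userCount'), { count: 101 });

  try {
    await batch.commit();
  } catch (err) {
    // On failure (e.g. 'permission-denied'), no operation was applied,
    // so the caller can safely retry or surface the error as a unit.
    const code = (err as FirestoreError).code;
    console.error(`Batch rejected (${code}); no changes were written`);
    throw err;
  }
}
```

There is no per-operation result to inspect: either every write in the batch landed, or the commit threw and none did.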

Step 2: Batch-insert multiple documents

Use batch writes to insert multiple documents in a single operation. This is much more efficient than individual addDoc calls because it reduces the number of network round trips. Loop through your data array, add each document to the batch with batch.set(), and commit once. For auto-generated IDs, use doc(collection(db, 'collectionName')) to create a reference with a generated ID.

typescript
import {
  getFirestore, writeBatch, doc, collection, serverTimestamp
} from 'firebase/firestore';

const db = getFirestore();

interface Product {
  name: string;
  price: number;
  category: string;
}

async function batchInsertProducts(products: Product[]) {
  const batch = writeBatch(db);

  for (const product of products) {
    const docRef = doc(collection(db, 'products'));
    batch.set(docRef, {
      ...product,
      createdAt: serverTimestamp()
    });
  }

  await batch.commit();
  console.log(`Inserted ${products.length} products`);
}

// Usage
await batchInsertProducts([
  { name: 'Widget A', price: 29.99, category: 'tools' },
  { name: 'Widget B', price: 49.99, category: 'tools' },
  { name: 'Gadget C', price: 19.99, category: 'electronics' }
]);

Expected result: All products are inserted as new documents with auto-generated IDs in a single atomic operation.

Step 3: Handle the 500-operation limit

Firestore batch writes are limited to 500 operations per batch. For larger datasets, split your operations into chunks of 500 and commit each chunk separately. Note that while each individual batch is atomic, the overall operation across multiple batches is not. If the second batch fails, the first batch's changes are already applied. For critical operations requiring full atomicity, consider using transactions instead.

typescript
import {
  getFirestore, writeBatch, doc, collection, serverTimestamp
} from 'firebase/firestore';

const db = getFirestore();
const BATCH_LIMIT = 500;

async function batchInsertLargeDataset(items: any[]) {
  let batch = writeBatch(db);
  let operationCount = 0;
  let totalCommitted = 0;

  for (const item of items) {
    const docRef = doc(collection(db, 'items'));
    batch.set(docRef, {
      ...item,
      createdAt: serverTimestamp()
    });
    operationCount++;

    if (operationCount >= BATCH_LIMIT) {
      await batch.commit();
      totalCommitted += operationCount;
      console.log(`Committed ${totalCommitted} of ${items.length}`);
      batch = writeBatch(db);
      operationCount = 0;
    }
  }

  // Commit remaining operations
  if (operationCount > 0) {
    await batch.commit();
    totalCommitted += operationCount;
  }

  console.log(`Total committed: ${totalCommitted} items`);
}

Expected result: All items are inserted across multiple batch commits, with each batch containing up to 500 operations.
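The counter bookkeeping above can alternatively be factored into a small pure helper that pre-computes the chunks before any Firestore call; `chunkArray` is a hypothetical name, not part of the Firebase SDK:

```typescript
// Hypothetical helper: split an array into chunks of at most `size`
// elements, so each chunk maps onto one batch of <= 500 operations.
function chunkArray<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Sketch of committing one batch per chunk (same shape as the loop above):
// for (const chunk of chunkArray(items, 500)) {
//   const batch = writeBatch(db);
//   chunk.forEach(item => batch.set(doc(collection(db, 'items')), item));
//   await batch.commit();
// }
```

Pre-chunking keeps the Firestore loop trivial and makes the chunk logic testable on its own.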

Step 4: Use batch writes for denormalized data updates

In Firestore, you often denormalize data for query efficiency. When the source data changes, use a batch write to update all denormalized copies atomically. For example, when a user updates their display name, update the name in both the user document and all of their post documents in a single batch.

typescript
import {
  getFirestore, writeBatch, doc, collection,
  query, where, getDocs
} from 'firebase/firestore';

const db = getFirestore();

async function updateUserNameEverywhere(
  userId: string,
  newName: string
) {
  const batch = writeBatch(db);

  // Update the user document
  const userRef = doc(db, 'users', userId);
  batch.update(userRef, { displayName: newName });

  // Find all posts by this user
  const postsQuery = query(
    collection(db, 'posts'),
    where('authorId', '==', userId)
  );
  const postsSnapshot = await getDocs(postsQuery);

  // The user document plus every post must fit in one 500-operation batch
  if (postsSnapshot.size + 1 > 500) {
    throw new Error('Too many posts to update in a single batch');
  }

  for (const postDoc of postsSnapshot.docs) {
    batch.update(postDoc.ref, { authorName: newName });
  }

  await batch.commit();
  console.log(`Updated name in ${postsSnapshot.size + 1} documents`);
}

Expected result: The user's display name is updated in the user document and all associated posts atomically.

Step 5: Write security rules that support batch operations

Firestore security rules evaluate each document operation in a batch independently. Every set, update, and delete in the batch must pass the security rules for the respective document path. There is no special rule syntax for batches. Ensure your rules allow the operations you are performing and validate the data being written.

firestore.rules
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /products/{productId} {
      allow read: if request.auth != null;
      allow create: if request.auth != null
        && request.resource.data.name is string
        && request.resource.data.price is number
        && request.resource.data.price > 0;
      allow update: if request.auth != null
        && request.resource.data.name is string;
      allow delete: if request.auth != null
        && request.auth.token.admin == true;
    }

    match /users/{userId} {
      allow read: if request.auth != null;
      allow write: if request.auth != null
        && request.auth.uid == userId;
    }

    match /posts/{postId} {
      allow read: if request.auth != null;
      allow update: if request.auth != null
        && request.auth.uid == resource.data.authorId;
    }
  }
}

Expected result: Security rules allow the individual operations that make up your batch writes while validating data integrity.

Complete working example

firestore-batch-writes.ts
import {
  getFirestore,
  writeBatch,
  doc,
  collection,
  query,
  where,
  getDocs,
  serverTimestamp
} from 'firebase/firestore';
import { initializeApp } from 'firebase/app';

const app = initializeApp({
  apiKey: 'YOUR_API_KEY',
  authDomain: 'YOUR_PROJECT.firebaseapp.com',
  projectId: 'YOUR_PROJECT_ID'
});

const db = getFirestore(app);
const BATCH_LIMIT = 500;

// Insert multiple documents with auto-generated IDs
export async function batchInsert<T extends Record<string, any>>(
  collectionName: string,
  items: T[]
): Promise<number> {
  let batch = writeBatch(db);
  let count = 0;
  let total = 0;

  for (const item of items) {
    const ref = doc(collection(db, collectionName));
    batch.set(ref, { ...item, createdAt: serverTimestamp() });
    count++;

    if (count >= BATCH_LIMIT) {
      await batch.commit();
      total += count;
      batch = writeBatch(db);
      count = 0;
    }
  }

  if (count > 0) {
    await batch.commit();
    total += count;
  }

  return total;
}

// Update denormalized data across collections
export async function updateDenormalizedField(
  sourceCollection: string,
  sourceId: string,
  targetCollection: string,
  foreignKey: string,
  fieldName: string,
  newValue: any
): Promise<number> {
  const batch = writeBatch(db);

  // Update source document
  batch.update(doc(db, sourceCollection, sourceId), {
    [fieldName]: newValue
  });

  // Find and update target documents
  const q = query(
    collection(db, targetCollection),
    where(foreignKey, '==', sourceId)
  );
  const snapshot = await getDocs(q);

  if (snapshot.size + 1 > BATCH_LIMIT) {
    throw new Error(
      `Too many documents (${snapshot.size + 1}) for a single batch`
    );
  }

  snapshot.docs.forEach(d => {
    batch.update(d.ref, { [fieldName]: newValue });
  });

  await batch.commit();
  return snapshot.size + 1;
}

// Delete multiple documents by query
export async function batchDelete(
  collectionName: string,
  field: string,
  value: any
): Promise<number> {
  const q = query(
    collection(db, collectionName),
    where(field, '==', value)
  );
  const snapshot = await getDocs(q);

  let batch = writeBatch(db);
  let count = 0;
  let total = 0;

  for (const d of snapshot.docs) {
    batch.delete(d.ref);
    count++;

    if (count >= BATCH_LIMIT) {
      await batch.commit();
      total += count;
      batch = writeBatch(db);
      count = 0;
    }
  }

  if (count > 0) {
    await batch.commit();
    total += count;
  }

  return total;
}

Common mistakes when batch writing in Firestore

The mistake: Exceeding the 500-operation limit, which causes the batch commit to fail

How to avoid: Track the operation count and commit the batch once it reaches 500 operations, then create a new batch for the remaining items. Each set, update, and delete counts as one operation.

The mistake: Assuming that multiple batches committed sequentially are atomic

How to avoid: Each batch is independently atomic, but multiple batches are not. If the second batch fails, the first batch's changes persist. For full atomicity across more than 500 operations, use transactions (also limited to 500 writes) or redesign your data model.

The mistake: Using batch writes when a transaction is needed for read-then-write operations

How to avoid: Batch writes do not support reads. If you need to read a document's current value before deciding what to write, use runTransaction() instead. Batches are for blind writes only.
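For contrast, here is roughly what the read-then-write case looks like with runTransaction(), sketched against the stats/userCount counter from the earlier examples (it needs a live Firestore project to run):

```typescript
import { getFirestore, runTransaction, doc } from 'firebase/firestore';

const db = getFirestore();

// A batch cannot express "read the current count, then write count + 1";
// a transaction can, and it retries automatically on contention.
async function incrementUserCount() {
  const statsRef = doc(db, 'stats', 'userCount');
  await runTransaction(db, async (transaction) => {
    const snapshot = await transaction.get(statsRef);
    const current = snapshot.exists() ? (snapshot.data()!.count ?? 0) : 0;
    transaction.set(statsRef, { count: current + 1 }, { merge: true });
  });
}
```

All reads in a transaction must happen before its writes; beyond that, the write API mirrors the batch API.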

Best practices

  • Use batch writes for multiple blind writes that need to be atomic (no reads required)
  • Split large datasets into chunks of 500 operations per batch commit
  • Use transactions instead of batches when you need to read documents before writing
  • Include serverTimestamp() in batch operations for consistent timestamp values
  • Test batch operations against your security rules before deploying to production
  • Use batch writes for denormalized data updates to keep copies consistent
  • For server-side batch operations on very large datasets, use the Admin SDK which bypasses security rules
  • Monitor Firestore write quotas when running large batch operations (20,000 writes/day on Spark)
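For the server-side case in the last bullets, a sketch using the Admin SDK's equivalent batch API (assumes firebase-admin is installed and credentials are available via the environment; the collection name is illustrative):

```typescript
import { initializeApp, applicationDefault } from 'firebase-admin/app';
import { getFirestore } from 'firebase-admin/firestore';

initializeApp({ credential: applicationDefault() });
const db = getFirestore();

// Admin SDK batches have the same 500-operation limit and atomicity,
// but security rules are not evaluated for Admin SDK writes.
async function adminBatchInsert(items: Record<string, unknown>[]) {
  let batch = db.batch();
  let count = 0;

  for (const item of items) {
    batch.set(db.collection('items').doc(), item); // auto-generated ID
    count++;
    if (count >= 500) {
      await batch.commit();
      batch = db.batch();
      count = 0;
    }
  }
  if (count > 0) await batch.commit();
}
```

For very large imports, the Admin SDK also offers db.bulkWriter(), which handles throttling for you and is not bound by the 500-operation batch cap, at the cost of per-batch atomicity.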

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

Show me how to use Firestore batch writes to insert multiple documents, handle the 500-operation limit for large datasets, and update denormalized data across collections atomically. Use the Firebase modular SDK v9 with TypeScript.

Firebase Prompt

Write a TypeScript Firestore utility module using the modular SDK v9 that provides reusable functions for batch inserts with auto-chunking at 500 operations, batch deletes by query, and denormalized field updates across collections. Include serverTimestamp and security rules.

Frequently asked questions

What is the maximum number of operations in a Firestore batch write?

500 operations per batch. Each set, update, and delete counts as one operation. If you need more, split your operations into multiple batches of 500 and commit them sequentially.

Are batch writes atomic?

Yes. All operations in a single batch either succeed or fail together. If any operation fails (for example, due to security rules), none of the changes are applied. However, multiple sequential batches are not atomic with each other.

What is the difference between batch writes and transactions?

Batch writes are for multiple blind writes (no reads needed) and are atomic. Transactions support reads before writes with automatic retry on contention. Use batches when you do not need to read existing data; use transactions when you need read-then-write consistency.

Do batch writes work offline?

Yes. Batch writes are queued locally when the device is offline and committed to Firestore when connectivity is restored. Transactions do not work offline because they require server communication for read consistency.

Can I use batch writes across different collections?

Yes. A single batch can include operations on documents in different collections. For example, you can set a document in users, update a document in stats, and delete a document in sessions all in one batch.

Do security rules apply to each operation in a batch?

Yes. Firestore evaluates security rules independently for each operation in the batch. If any single operation fails the rules check, the entire batch is rejected.

Can RapidDev help optimize batch operations in my Firebase app?

Yes. RapidDev can design efficient batch operation patterns for your data model, including chunked imports, denormalized data sync, and server-side batch processing with the Admin SDK for large-scale operations.
