RapidDev - Software Development Agency

How to Reduce Cold Start Time in Firebase Cloud Functions

Firebase Cloud Functions v2 cold starts range from 1 to 12 seconds depending on dependencies and configuration. The most effective fixes are setting minInstances to keep warm instances ready, using preferRest: true for the Firestore Admin SDK to avoid gRPC overhead, lazy-loading heavy dependencies inside the function handler instead of at the top level, and increasing memory allocation to speed up initialization.

What you'll learn

  • Why cold starts happen and what affects their duration in Firebase Cloud Functions
  • How to use minInstances to keep warm function instances ready
  • How to reduce initialization time with preferRest, lazy loading, and code splitting
  • How to measure cold start duration and monitor warm vs cold invocations
Beginner · 10 min read · 15-20 min · Firebase Cloud Functions v2 (2nd gen), firebase-functions v4+, firebase-admin v12+, Node.js 18/20/22 · March 2026 · RapidDev Engineering Team

Eliminating Cold Start Delays in Firebase Cloud Functions

Cold starts occur when Firebase spins up a new function instance to handle a request — the container must initialize, load your code, and establish connections before processing begins. V2 functions built on Cloud Run typically cold start in 1-3 seconds for lightweight functions and 3-12 seconds for functions that use firebase-admin with Firestore (due to gRPC overhead). This tutorial covers every practical technique to reduce or eliminate cold start latency.

Prerequisites

  • A Firebase project on the Blaze plan with Cloud Functions deployed
  • Firebase CLI v12+ installed and logged in
  • Functions written using v2 imports (firebase-functions/v2)
  • Basic understanding of how Cloud Functions execute on Cloud Run

Step-by-step guide

1. Set minInstances to keep warm instances running

The most effective cold start fix is setting minInstances to 1 or more. This keeps at least one container running at all times, so the first request is always warm. Warm instances respond in around 50ms instead of seconds. The cost is approximately $3-8 per month per warm instance depending on memory allocation. Use this for user-facing functions where latency matters.

typescript
import { onRequest, onCall } from "firebase-functions/v2/https";

// Keep 1 warm instance — eliminates cold starts for most traffic
export const api = onRequest(
  {
    minInstances: 1,
    maxInstances: 100,
    memory: "512MiB",
    region: "us-central1",
  },
  async (req, res) => {
    res.json({ status: "ok", cold: false });
  }
);

// For callable functions
export const processData = onCall(
  {
    minInstances: 1,
    memory: "1GiB",
    concurrency: 10,
  },
  async (request) => {
    return { result: "processed" };
  }
);

Expected result: At least one instance stays warm at all times. The first request after idle time responds in milliseconds instead of seconds.

2. Use preferRest to avoid gRPC initialization overhead

The firebase-admin SDK's Firestore client uses gRPC by default, which adds 2-5 seconds to cold start time due to protocol initialization and native module loading. Setting preferRest: true tells the Admin SDK to use the REST API instead, which initializes much faster. The tradeoff is slightly higher latency per Firestore operation (a few milliseconds), but the cold start improvement is dramatic.

typescript
import { initializeApp } from "firebase-admin/app";
import { getFirestore } from "firebase-admin/firestore";

// Initialize with preferRest to avoid gRPC cold start overhead
const app = initializeApp();
const db = getFirestore(app);
db.settings({ preferRest: true });

// This single change can reduce cold starts by 2-5 seconds
// for functions that use Firestore via the Admin SDK

Expected result: Cold start time for Firestore-using functions drops from 5-12 seconds to 1-3 seconds.

3. Lazy-load heavy dependencies inside the function handler

Top-level imports are loaded when the container starts, before any request is processed. Heavy libraries like image processors, PDF generators, or AI SDKs add seconds to cold start time. Move these imports inside the function handler so they are only loaded when the function is actually called. Global-scope code should be limited to lightweight initialization.

typescript
import { onRequest } from "firebase-functions/v2/https";

// BAD: Heavy import at top level — loaded on every cold start
// import sharp from "sharp";
// import PDFDocument from "pdfkit";

export const generateThumbnail = onRequest(
  { memory: "1GiB" },
  async (req, res) => {
    // GOOD: Lazy-load heavy dependencies only when needed
    const sharp = await import("sharp");
    const image = await sharp.default(req.body)
      .resize(200, 200)
      .toBuffer();
    res.setHeader("Content-Type", "image/png");
    res.send(image);
  }
);

export const healthCheck = onRequest(async (req, res) => {
  // This function does NOT load sharp because it is lazy-loaded
  // in generateThumbnail, not at the top level
  res.json({ status: "ok" });
});

Expected result: Functions that do not need heavy libraries cold-start faster because they skip loading unused dependencies.

4. Split functions into separate files or entry points

When all functions share a single index.ts file, every cold start loads every function's dependencies. Split functions into separate files and re-export them from a lightweight index.ts, keeping heavy imports inside each module (or lazy-loaded inside handlers) so each function's container only loads what it needs. Firebase supports splitting functions across multiple files in the functions/src directory.

typescript
// functions/src/index.ts — Lightweight entry point
export { api } from "./api";
export { processImage } from "./images";
export { onUserCreated } from "./auth";

// functions/src/api.ts — Only loads API dependencies
import { initializeApp } from "firebase-admin/app";
import { getFirestore } from "firebase-admin/firestore";
import { onRequest } from "firebase-functions/v2/https";

initializeApp(); // required before getFirestore()
const db = getFirestore();
db.settings({ preferRest: true });

export const api = onRequest(
  { minInstances: 1, memory: "512MiB" },
  async (req, res) => {
    const data = await db.collection("items").limit(20).get();
    res.json(data.docs.map((d) => d.data()));
  }
);

// functions/src/images.ts — Only loads image dependencies
import { onRequest } from "firebase-functions/v2/https";

export const processImage = onRequest(
  { memory: "2GiB" },
  async (req, res) => {
    const sharp = await import("sharp");
    // Image processing...
    res.json({ status: "done" });
  }
);

Expected result: Each function's cold start only loads its own dependencies, reducing initialization time for lightweight functions.

5. Increase memory to speed up CPU-bound initialization

Higher memory allocations come with proportionally more CPU in Cloud Run. More CPU means faster module loading and initialization during cold starts. A function with 256 MiB gets a fraction of a vCPU, while 1 GiB gets a full vCPU. If your cold start is dominated by JavaScript parsing and module loading (not network), increasing memory can cut the time significantly.

typescript
import { onRequest } from "firebase-functions/v2/https";

// 256 MiB (default) — cold start ~4s for moderate dependencies
export const slowStart = onRequest(
  { memory: "256MiB" },
  async (req, res) => {
    res.json({ memory: "256MiB" });
  }
);

// 1 GiB — cold start ~2s for the same code
export const fastStart = onRequest(
  { memory: "1GiB" },
  async (req, res) => {
    res.json({ memory: "1GiB" });
  }
);

// 2 GiB — cold start ~1.5s with 2 vCPUs
export const fastestStart = onRequest(
  { memory: "2GiB", cpu: 2 },
  async (req, res) => {
    res.json({ memory: "2GiB", cpu: 2 });
  }
);

Expected result: Functions with higher memory allocation initialize faster due to increased CPU availability during cold start.

6. Measure cold start duration in your function logs

Add timing instrumentation to your functions to measure the actual cold start duration. Log the difference between when the module loads (top-level code) and when the first request arrives. Check Cloud Logging for these measurements and compare cold vs warm response times. This data helps you prioritize which optimization techniques to apply.

typescript
import { onRequest } from "firebase-functions/v2/https";
import { logger } from "firebase-functions";

// Record module load time (runs once during cold start)
const moduleLoadTime = Date.now();
let isFirstInvocation = true;

export const measuredFunction = onRequest(
  { memory: "512MiB", minInstances: 0 },
  async (req, res) => {
    const requestStart = Date.now();
    // Capture coldness before the flag flips, so the completion
    // log reports the cold invocation correctly
    const wasCold = isFirstInvocation;

    if (wasCold) {
      const coldStartMs = requestStart - moduleLoadTime;
      logger.info("Cold start detected", {
        coldStartMs,
        memory: "512MiB",
      });
      isFirstInvocation = false;
    }

    // Your function logic here
    const processingTime = Date.now() - requestStart;

    logger.info("Request completed", {
      processingMs: processingTime,
      warm: !wasCold,
    });

    res.json({
      processingMs: processingTime,
    });
  }
);

Expected result: You can see exact cold start durations in your logs and track improvements as you apply optimizations.

Complete working example

functions/src/index.ts
// Firebase Cloud Functions — Cold Start Optimization
import { initializeApp } from "firebase-admin/app";
import { getFirestore } from "firebase-admin/firestore";
import { onRequest, onCall, HttpsError } from "firebase-functions/v2/https";
import { logger } from "firebase-functions";

// 1. Initialize Admin SDK with preferRest (saves 2-5s cold start)
const app = initializeApp();
const db = getFirestore(app);
db.settings({ preferRest: true });

// Cold start measurement
const moduleLoadTime = Date.now();
let coldStartLogged = false;

function logColdStart(functionName: string) {
  if (!coldStartLogged) {
    const duration = Date.now() - moduleLoadTime;
    logger.info(`Cold start: ${functionName}`, { coldStartMs: duration });
    coldStartLogged = true;
  }
}

// 2. API endpoint — minInstances keeps it warm
export const api = onRequest(
  {
    minInstances: 1,
    maxInstances: 50,
    memory: "512MiB",
    region: "us-central1",
    concurrency: 20,
  },
  async (req, res) => {
    logColdStart("api");
    const items = await db.collection("items").limit(20).get();
    res.json(items.docs.map((d) => ({ id: d.id, ...d.data() })));
  }
);

// 3. Callable with concurrency — shares warm instance
export const processData = onCall(
  {
    minInstances: 1,
    memory: "1GiB",
    concurrency: 10,
    timeoutSeconds: 120,
  },
  async (request) => {
    logColdStart("processData");
    if (!request.auth) {
      throw new HttpsError("unauthenticated", "Login required");
    }
    return { processed: true };
  }
);

// 4. Image processing — lazy-loads sharp
export const resizeImage = onRequest(
  { memory: "2GiB", timeoutSeconds: 60 },
  async (req, res) => {
    logColdStart("resizeImage");
    const sharp = await import("sharp");
    const buffer = await sharp
      .default(req.body)
      .resize(400, 400)
      .jpeg({ quality: 80 })
      .toBuffer();
    res.setHeader("Content-Type", "image/jpeg");
    res.send(buffer);
  }
);

// 5. Lightweight health check — no minInstances needed
export const health = onRequest(async (req, res) => {
  res.json({ status: "ok", timestamp: Date.now() });
});

Common mistakes when reducing Cold Start Time in Firebase Cloud Functions

Mistake: Importing every dependency at the top level of index.ts, causing all functions to pay the initialization cost of every library.

How to avoid: Use dynamic import() for heavy libraries inside the function handler. Only import lightweight essentials (firebase-admin, firebase-functions) at the top level.

Mistake: Setting minInstances on every function, leading to high idle costs for rarely-used background triggers.

How to avoid: Only use minInstances on user-facing functions where latency matters (API endpoints, auth flows). Leave background triggers and scheduled functions at minInstances: 0.

Mistake: Using the default gRPC Firestore client in the Admin SDK without preferRest, adding 2-5 seconds to every cold start.

How to avoid: Add db.settings({ preferRest: true }) after initializing Firestore in your function code. This is the single highest-impact change for most functions.
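The selective-minInstances rule above can be captured in a small helper. This is a sketch of our own (the optionsFor helper is hypothetical, not part of any Firebase API): it returns an options object you would spread into the v2 function config.

```typescript
// Hypothetical helper (not a Firebase API): pick runtime options
// based on whether a function is user-facing or a background trigger.
interface RuntimeOptions {
  minInstances: number;
  memory: "256MiB" | "512MiB";
}

function optionsFor(userFacing: boolean): RuntimeOptions {
  return userFacing
    ? { minInstances: 1, memory: "512MiB" }  // latency-sensitive: keep one instance warm
    : { minInstances: 0, memory: "256MiB" }; // background work: cold starts are acceptable
}

// Usage sketch: spread into the v2 options object, e.g.
// onRequest({ ...optionsFor(true), region: "us-central1" }, handler)
```

Centralizing the decision in one place makes it harder to accidentally add an idle-cost warm instance to a rarely-triggered background function.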

Best practices

  • Set minInstances: 1 on latency-sensitive functions like API endpoints and callable functions
  • Use preferRest: true for the Admin SDK Firestore client to eliminate gRPC cold start overhead
  • Lazy-load heavy dependencies with dynamic import() inside the function handler
  • Split functions into separate files to prevent unnecessary module loading
  • Increase memory allocation to get more CPU for faster initialization during cold starts
  • Set concurrency higher than 1 so one warm instance handles multiple requests simultaneously
  • Log cold start duration with timestamps to measure optimization impact
  • Keep the number of npm dependencies minimal — every dependency adds to initialization time

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

My Firebase Cloud Function v2 takes 8 seconds to cold start. It uses firebase-admin with Firestore and imports the sharp image processing library. Show me how to reduce the cold start to under 2 seconds using minInstances, preferRest, lazy loading, increased memory, and cold start measurement logging.

Firebase Prompt

Optimize my Firebase Cloud Functions for cold start performance. Set up the Admin SDK with preferRest: true, add minInstances: 1 to my API function, lazy-load sharp only in the image processing function, and add cold start timing logs. Use v2 function options with 512 MiB memory for the API and 2 GiB for image processing.

Frequently asked questions

What causes cold starts in Firebase Cloud Functions?

Cold starts happen when Firebase creates a new container instance to handle a request. The container must download your code, initialize the Node.js runtime, load all imported modules, and establish connections (like Firestore gRPC). This process takes 1-12 seconds depending on your dependencies and memory allocation.

How much does minInstances cost?

A warm instance with 256 MiB memory costs approximately $3-4 per month. With 512 MiB, it is about $6-8 per month. The cost scales linearly with memory. This is usually far less than the business impact of slow response times.
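The figures above can be sanity-checked with simple arithmetic: monthly cost is allocation × billed seconds × per-second rate. The sketch below uses placeholder rates (the rate arguments are assumptions, not actual Cloud Run prices; check current pricing for your region):

```typescript
// Rough monthly cost of one always-warm (min) instance.
// The rate parameters are ILLUSTRATIVE placeholders, not real prices.
const SECONDS_PER_MONTH = 30 * 24 * 3600; // 2,592,000

function estimateWarmInstanceCost(
  memoryGiB: number,           // e.g. 0.5 for a 512 MiB instance
  vCpu: number,                // vCPUs allocated while idle
  memRatePerGiBSecond: number, // placeholder memory rate ($/GiB-second)
  cpuRatePerVCpuSecond: number // placeholder CPU rate ($/vCPU-second)
): number {
  const memCost = memoryGiB * SECONDS_PER_MONTH * memRatePerGiBSecond;
  const cpuCost = vCpu * SECONDS_PER_MONTH * cpuRatePerVCpuSecond;
  return memCost + cpuCost;
}
```

Plugging in your region's idle-instance rates gives the monthly floor cost of minInstances: 1, and shows why the cost scales linearly with memory as noted above.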

Does preferRest reduce performance for Firestore operations?

REST mode adds a few milliseconds of latency per Firestore operation compared to gRPC. For most functions, this is negligible. The tradeoff is worth it: 2-5 seconds faster cold starts in exchange for slightly slower individual reads and writes.

Can I eliminate cold starts entirely?

Setting minInstances to 1 or more ensures at least one instance is always warm. However, traffic spikes that exceed your warm capacity will still trigger cold starts for new instances. Set maxInstances and concurrency appropriately to handle expected peak traffic.
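To size warm capacity for the peak-traffic caveat above, Little's law gives a quick estimate: average in-flight requests ≈ arrival rate × average duration, and each warm instance absorbs up to concurrency of them. A sketch (the function name is ours, not a Firebase API):

```typescript
// Estimate how many warm (min) instances cover a given peak load.
function warmInstancesNeeded(
  peakRps: number,           // expected peak requests per second
  avgLatencySeconds: number, // average request duration in seconds
  concurrency: number        // v2 concurrency setting per instance
): number {
  // Little's law: average in-flight requests = arrival rate × duration
  const inFlight = peakRps * avgLatencySeconds;
  return Math.max(1, Math.ceil(inFlight / concurrency));
}

// e.g. 100 req/s at 200 ms average latency with concurrency 20:
// 100 × 0.2 = 20 in-flight requests → 1 warm instance suffices
```

Add headroom for bursts: requests beyond warm capacity still trigger cold starts while new instances spin up, so set minInstances slightly above this estimate and maxInstances well above expected peak.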

Do v2 functions have faster cold starts than v1?

V2 functions (Cloud Run) generally have similar cold start times to v1 but offer concurrency, which means one warm instance can handle multiple requests. This effectively reduces the number of cold starts under load.

Does the Node.js version affect cold start time?

Node.js 20 and 22 generally initialize slightly faster than 18 due to runtime optimizations. Always use the latest supported LTS version. Currently supported versions are 18, 20, and 22.

Can RapidDev help optimize my Firebase Cloud Functions performance?

Yes. RapidDev can profile your functions, identify cold start bottlenecks, implement minInstances and preferRest optimizations, restructure your code for lazy loading, and set up monitoring dashboards for ongoing performance tracking.
