How to Integrate FlutterFlow with AWS S3 and Google Cloud Storage

What you'll learn

  • How to set up a Cloud Function that uploads files to AWS S3 using the AWS SDK
  • How to generate pre-signed download URLs from S3 or GCS for secure file access
  • How to use FlutterFlow's file upload button to send files to S3 via your Cloud Function
  • When to choose S3 or GCS over Firebase Storage and how to configure bucket permissions
Beginner · 11 min read · 60-90 min build time · FlutterFlow Free+ (Cloud Functions required for credential management) · March 2026 · RapidDev Engineering Team
TL;DR

Connect FlutterFlow to AWS S3 or Google Cloud Storage by creating Cloud Functions that generate pre-signed URLs. For uploads: FlutterFlow sends a file to a Cloud Function, or the Cloud Function generates a pre-signed upload URL and FlutterFlow uploads directly to S3/GCS using that URL. For downloads: the Cloud Function generates a time-limited signed URL that FlutterFlow loads in an Image widget or launches in the browser. Never put AWS credentials or GCS service account keys in your FlutterFlow app.

Store files in AWS S3 or Google Cloud Storage from your FlutterFlow app

Firebase Storage is the default file storage for FlutterFlow apps, but there are good reasons to use AWS S3 or Google Cloud Storage instead: your team already uses AWS infrastructure, S3 costs less at high volume, you need CDN distribution via CloudFront or Cloud CDN, or compliance requirements specify a particular cloud vendor. FlutterFlow connects to either service via Cloud Functions — the function handles authentication with the storage provider so AWS access keys or GCS service account credentials never appear in the mobile app. The app sends files to the Cloud Function, and the function handles the actual storage API call.

Prerequisites

  • An AWS account with an S3 bucket created, or a Google Cloud project with Cloud Storage bucket created
  • AWS IAM user with S3 access (access key + secret key), or a GCS service account with Storage Object Admin role
  • A Firebase project with Cloud Functions enabled (Blaze billing plan required for external API calls)
  • Basic understanding of FlutterFlow's API Manager and the FlutterFlowUploadButton widget

Step-by-step guide

1

Configure AWS S3 bucket permissions and CORS

In the AWS Console, go to S3 → your bucket → Permissions. Set the Bucket Policy to allow your Cloud Function's service account to put and get objects. Block Public Access should remain ON — files should be accessed only via pre-signed URLs, not public URLs. Under CORS Configuration, add a CORS rule that allows your app's origin (or * for development): [{"AllowedHeaders":["*"],"AllowedMethods":["PUT","GET","HEAD"],"AllowedOrigins":["*"],"ExposeHeaders":["ETag"]}]. This CORS config is required if FlutterFlow uploads directly to S3 using a pre-signed URL (without going through the Cloud Function). Create an IAM user with a policy that allows s3:PutObject, s3:GetObject, s3:DeleteObject on your bucket ARN only (not s3:* on all resources). Save the access key ID and secret — you will need them in the Cloud Function environment.

Expected result: The S3 bucket has CORS configured, block public access is on, and an IAM user with limited S3 permissions is created with access keys ready.
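The least-privilege IAM policy described in this step can be sketched as a plain object before pasting it into the IAM console's JSON editor. This is a sketch, not the only valid form; the bucket name your-bucket is a placeholder you must replace:

```javascript
// Sketch of the least-privilege IAM policy from the step above.
// 'your-bucket' is a placeholder -- substitute your real bucket name.
const bucketArn = 'arn:aws:s3:::your-bucket';

const iamPolicy = {
  Version: '2012-10-17',
  Statement: [
    {
      Effect: 'Allow',
      // Only the three object operations the Cloud Function needs
      Action: ['s3:PutObject', 's3:GetObject', 's3:DeleteObject'],
      // Scope to objects inside this one bucket, not s3:* on all resources
      Resource: `${bucketArn}/*`,
    },
  ],
};

// Paste the serialized document into the IAM console's JSON policy editor
console.log(JSON.stringify(iamPolicy, null, 2));
```

Scoping Resource to `${bucketArn}/*` (objects) rather than the bucket ARN itself is deliberate: PutObject/GetObject/DeleteObject act on objects, so the object-level ARN is all this user needs.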

2

Create the uploadToS3 Cloud Function

In your Firebase Cloud Functions project, install the AWS SDK: npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner. Create a function uploadToS3 that accepts a multipart form request with the file data, or alternatively generates a pre-signed upload URL for direct device-to-S3 upload. For the proxy approach (recommended for small files under 10MB): import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'; const client = new S3Client({ region: 'us-east-1', credentials: { accessKeyId: process.env.AWS_ACCESS_KEY_ID, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY }}); const command = new PutObjectCommand({ Bucket: 'your-bucket', Key: `uploads/${userId}/${filename}`, Body: fileBuffer, ContentType: contentType }); await client.send(command); const fileUrl = `https://your-bucket.s3.amazonaws.com/uploads/${userId}/${filename}`;. Store AWS credentials in Cloud Function environment variables — never hardcode them.

Expected result: The uploadToS3 Cloud Function deploys and successfully uploads a test file to your S3 bucket when called with file data.
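One detail the step above glosses over: the filename comes from the client, so it is worth sanitizing before it becomes part of the object key. A minimal sketch, assuming a helper of our own naming (buildObjectKey is not part of the AWS SDK):

```javascript
// Hypothetical helper for building the S3 object key used in step 2.
// Sanitizes the client-supplied filename so it cannot inject path
// separators or odd characters into the uploads/<userId>/ key layout.
function buildObjectKey(userId, filename, now = Date.now()) {
  // Keep letters, digits, dots, dashes, underscores; replace everything else
  const safeName = String(filename).replace(/[^a-zA-Z0-9._-]/g, '_');
  return `uploads/${userId}/${now}-${safeName}`;
}

// Example (fixed timestamp for illustration):
console.log(buildObjectKey('user123', 'my photo?.png', 1700000000000));
// -> uploads/user123/1700000000000-my_photo_.png
```

Prefixing the timestamp also prevents two uploads with the same filename from overwriting each other.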

3

Generate pre-signed download URLs from S3

Files stored in S3 with public access blocked are not accessible via their direct S3 URLs. To display them in FlutterFlow, generate a pre-signed URL with a time limit. Create a Cloud Function getSignedUrl: import { GetObjectCommand } from '@aws-sdk/client-s3'; import { getSignedUrl } from '@aws-sdk/s3-request-presigner'; const command = new GetObjectCommand({ Bucket: 'your-bucket', Key: req.body.s3Key }); const signedUrl = await getSignedUrl(client, command, { expiresIn: 3600 });. Return the signedUrl to FlutterFlow. In FlutterFlow, call this Cloud Function API to get a signed URL, then bind it to an Image widget's network URL or use a Launch URL action for file downloads. Store the S3 key (not the signed URL) in Firestore — signed URLs expire, but the S3 key is permanent.

Expected result: The getSignedUrl Cloud Function returns a time-limited URL that displays the S3 file correctly in a FlutterFlow Image widget.
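Because signed URLs expire, the client typically caches one and requests a fresh URL when the cached one is close to expiry. A hedged sketch of that check (function and parameter names are ours, not a FlutterFlow or AWS API):

```javascript
// Hypothetical client-side helper: decide whether a cached signed URL
// should be regenerated before binding it to an Image widget.
// issuedAtMs: when the URL was fetched; expiresInSec: the expiresIn value
// the Cloud Function used (3600 in step 3); marginSec: refresh a little
// early so the widget never loads a URL that dies mid-request.
function needsRefresh(issuedAtMs, expiresInSec, nowMs = Date.now(), marginSec = 60) {
  const expiryMs = issuedAtMs + expiresInSec * 1000;
  return nowMs >= expiryMs - marginSec * 1000;
}

// A URL issued 59 minutes ago with a 1-hour lifetime is inside the
// 60-second safety margin, so it should be refreshed:
console.log(needsRefresh(0, 3600, 59 * 60 * 1000)); // -> true
```

Storing issuedAtMs alongside the cached URL in Page State is enough to drive this check on page load.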

4

Integrate with FlutterFlow's file upload button

In FlutterFlow, add a FlutterFlowUploadButton widget to your page. In the widget settings, set the Upload Type to Custom (instead of Firebase). This stores the selected file path locally rather than uploading to Firebase Storage. After the user selects a file, the button's On Upload action triggers with the file path. Add a Backend Call action in the Action Flow that calls your uploadToS3 Cloud Function — you will need to create a Custom Action that reads the file bytes from the local path and includes them in the HTTP request body as multipart form data. Store the returned S3 key and file URL in a Page State variable, then save both to Firestore in the next action (create a document with s3Key, fileUrl, filename, userId, uploadedAt).

Expected result: Users can select and upload files from their device directly to S3, with the file URL and S3 key saved to Firestore for retrieval.
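For the direct-upload variant, the Custom Action PUTs the file bytes straight to the pre-signed URL. FlutterFlow Custom Actions are written in Dart; this Node-style sketch shows the same request shape (function names are ours), under the assumption that the Content-Type sent matches the one the URL was signed with:

```javascript
// Sketch of the direct-to-S3 upload using the pre-signed PUT URL
// returned by the Cloud Function. A pre-signed PUT needs no auth
// header -- the signature lives in the URL query string -- but the
// Content-Type must match what the URL was signed with.
function buildPutOptions(fileBytes, contentType) {
  return {
    method: 'PUT',
    headers: { 'Content-Type': contentType },
    body: fileBytes,
  };
}

async function uploadToSignedUrl(uploadUrl, fileBytes, contentType) {
  const res = await fetch(uploadUrl, buildPutOptions(fileBytes, contentType));
  if (!res.ok) throw new Error(`S3 upload failed: ${res.status}`);
}
```

The Dart equivalent in the Custom Action uses the same method, header, and body; only the HTTP client differs.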

5

Use Google Cloud Storage instead of AWS S3

The pattern for Google Cloud Storage (GCS) is nearly identical to S3, with a different SDK and bucket configuration. In your Cloud Function: import { Storage } from '@google-cloud/storage'; const storage = new Storage(); const bucket = storage.bucket('your-gcs-bucket');. For upload: await bucket.file(`uploads/${userId}/${filename}`).save(fileBuffer, { contentType });. For signed URL generation: const [url] = await bucket.file(filePath).getSignedUrl({ action: 'read', expires: Date.now() + 3600000 });. If your Cloud Functions run in the same Google Cloud project as your GCS bucket, you do not need separate service account credentials — the function's default service account can access GCS in the same project once you grant it the Storage Object Admin role in IAM. This is why GCS is often simpler than S3 for Firebase-based projects.

Expected result: The GCS Cloud Function uploads and generates signed URLs from your Google Cloud Storage bucket identically to the S3 version.

Complete working example

s3_storage_cloud_function.js
// Cloud Function: s3Storage
// Handles upload and signed URL generation for AWS S3
// Install: npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
const functions = require('firebase-functions');
const { S3Client, PutObjectCommand, GetObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const s3Client = new S3Client({
  region: process.env.AWS_REGION || 'us-east-1',
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  },
});

const BUCKET = process.env.S3_BUCKET_NAME;

exports.s3Storage = functions.https.onRequest(async (req, res) => {
  res.set('Access-Control-Allow-Origin', '*');
  if (req.method === 'OPTIONS') return res.status(204).send('');

  const { action, s3Key, contentType, userId, filename } = req.body;

  if (action === 'getUploadUrl') {
    // Generate a pre-signed URL for direct upload from the device
    const key = `uploads/${userId}/${Date.now()}-${filename}`;
    const command = new PutObjectCommand({
      Bucket: BUCKET,
      Key: key,
      ContentType: contentType || 'application/octet-stream',
    });
    const uploadUrl = await getSignedUrl(s3Client, command, { expiresIn: 300 }); // 5 min
    return res.json({ uploadUrl, s3Key: key });
  }

  if (action === 'getDownloadUrl') {
    if (!s3Key) return res.status(400).json({ error: 's3Key required' });
    const command = new GetObjectCommand({ Bucket: BUCKET, Key: s3Key });
    const downloadUrl = await getSignedUrl(s3Client, command, { expiresIn: 3600 }); // 1 hour
    return res.json({ downloadUrl });
  }

  return res.status(400).json({ error: 'action must be getUploadUrl or getDownloadUrl' });
});

Common mistakes

The mistake: Putting AWS access keys or GCS service account JSON in the FlutterFlow app or Firestore

How to avoid: Store all cloud storage credentials exclusively in Cloud Function environment variables. The Cloud Function acts as an authenticated proxy — the app never has direct storage credentials. For temporary write access, generate pre-signed URLs with short expiry (5 minutes) from the Cloud Function.

The mistake: Making the entire S3 bucket publicly accessible to simplify file loading

How to avoid: Keep S3 Block Public Access ON for all buckets storing user data. Use pre-signed URLs for all file access — they expire automatically and can be scoped to specific objects. Public access is only appropriate for truly public assets like app icons or marketing images in a CDN bucket.

The mistake: Uploading large files through the Cloud Function instead of directly to S3

How to avoid: For files over 5MB, use the pre-signed URL pattern: the Cloud Function generates a pre-signed PUT URL with 5-minute expiry, the FlutterFlow Custom Action uploads the file directly from the device to S3 using an HTTP PUT request to that URL. The Cloud Function is only involved in generating the URL, not transferring the file data.

Best practices

  • Use lifecycle policies in S3 to automatically delete temporary files and move infrequently accessed files to cheaper storage tiers — reduces storage costs without manual cleanup
  • Store only the S3 key (the file path within the bucket) in Firestore, not the pre-signed URL — keys are permanent; pre-signed URLs expire and become useless as stored values
  • Set up S3 server-side encryption (SSE-S3 or SSE-KMS) for buckets storing sensitive user data — one checkbox in S3 bucket settings enables encryption at rest
  • Add file size validation in the Cloud Function before accepting uploads — reject files over your maximum size with a clear error message to prevent abuse
  • Use CloudFront in front of your S3 bucket for CDN distribution if your users are global — signed CloudFront URLs work similarly to S3 pre-signed URLs but deliver files from edge locations worldwide
  • Tag uploaded S3 objects with metadata (userId, uploadedAt, fileType) to enable cost allocation reporting and simpler lifecycle policy targeting by user or content type
  • Test signed URL generation and expiry explicitly — generate a URL, wait for it to expire, and verify it returns a 403 — before deploying to production

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I am building a FlutterFlow app that stores user-uploaded files in AWS S3. Write a Node.js Firebase Cloud Function that: (1) accepts a request with action 'getUploadUrl' and returns a pre-signed S3 PUT URL valid for 5 minutes along with the generated S3 key, (2) accepts a request with action 'getDownloadUrl' and an s3Key parameter and returns a pre-signed S3 GET URL valid for 1 hour. Use AWS SDK v3 (@aws-sdk/client-s3 and @aws-sdk/s3-request-presigner), read credentials from environment variables, and include proper error handling.

FlutterFlow Prompt

Add a file upload feature to my FlutterFlow page. When the user taps the Upload button, call my s3Storage Cloud Function API to get a pre-signed upload URL. Then use a Custom Action to upload the selected file directly to that URL via HTTP PUT. After upload, store the S3 key and filename in Firestore under the current user's documents collection.

Frequently asked questions

When should I use AWS S3 or GCS instead of Firebase Storage?

Use Firebase Storage (the default) if your app is already on Firebase and you want the simplest setup. Switch to S3 if: your team's infrastructure is on AWS, you need CloudFront CDN distribution, you want granular IAM permissions per user, or compliance requires specific AWS certifications. Switch to GCS if: you already use Google Cloud services beyond Firebase, you want integrated billing with Google Cloud, or you need features like Object Versioning or multi-regional buckets. Firebase Storage IS Google Cloud Storage behind the scenes — so migrating from Firebase Storage to GCS is seamless.

How do I display private S3 images in a FlutterFlow Image widget?

Generate a pre-signed URL from your Cloud Function using getSignedUrl with a 1-hour expiry. Call the Cloud Function's getDownloadUrl action from FlutterFlow and store the returned URL in Page State. Bind the Image widget's Network URL to this Page State variable. The Image widget loads the file from S3 using the pre-signed URL. Regenerate the URL on page load to ensure it has not expired.

Can I set up a CDN in front of my S3 bucket for FlutterFlow?

Yes — configure CloudFront (AWS) or Cloud CDN (Google Cloud) in front of your storage bucket. For public assets (app static images, default avatars), CloudFront serves them from edge locations worldwide for faster load times. For private user files, use CloudFront signed cookies or signed URLs, which work similarly to S3 pre-signed URLs but are generated by CloudFront's key pair system. Your Cloud Function generates the CloudFront signed URL instead of the S3 signed URL.

What is the cost comparison between S3, GCS, and Firebase Storage?

For small apps (under 5GB storage, under 1TB/month transfer): Firebase Storage free tier covers typical usage with no cost. For apps with significant storage or transfer: S3 costs approximately $0.023/GB storage + $0.09/GB transfer (US regions). GCS costs approximately $0.020/GB storage + $0.08/GB transfer (US multi-region). Firebase Storage uses GCS pricing after the Blaze plan free tier. At scale (10TB+/month transfer), CloudFront + S3 with reserved capacity pricing is typically cheaper than Firebase Storage.
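Those per-GB figures can be folded into a quick back-of-the-envelope estimator. The rates below are the approximate numbers quoted above, not official pricing — check the providers' own calculators before budgeting:

```javascript
// Rough monthly cost estimator using the approximate per-GB rates
// quoted above (US regions; illustrative only, not official pricing).
const RATES = {
  s3:  { storagePerGb: 0.023, transferPerGb: 0.09 },
  gcs: { storagePerGb: 0.020, transferPerGb: 0.08 },
};

function estimateMonthlyCost(provider, storedGb, transferredGb) {
  const r = RATES[provider];
  return storedGb * r.storagePerGb + transferredGb * r.transferPerGb;
}

// 100 GB stored + 500 GB transferred per month:
console.log(estimateMonthlyCost('s3', 100, 500).toFixed(2));  // -> 47.30
console.log(estimateMonthlyCost('gcs', 100, 500).toFixed(2)); // -> 42.00
```

As the numbers suggest, transfer (egress) dominates storage cost for most file-serving apps, which is why a CDN in front of the bucket pays off at scale.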

How do I delete files from S3 when a Firestore document is deleted?

Cloud Functions can be triggered by Firestore document deletions. Create a Firestore-triggered function on document delete (with DeleteObjectCommand also imported from @aws-sdk/client-s3): exports.onDocumentDelete = functions.firestore.document('collection/{docId}').onDelete(async (snap) => { const s3Key = snap.data().s3Key; if (s3Key) { const command = new DeleteObjectCommand({ Bucket: BUCKET, Key: s3Key }); await s3Client.send(command); } });. This automatically cleans up the S3 file when the Firestore record is deleted, preventing orphaned files from accumulating in S3.

Can RapidDev help set up S3 storage for a production FlutterFlow app?

Yes. Production S3 integrations often require additional setup beyond the basics: bucket versioning for file history, lifecycle policies for cost optimization, CloudFront CDN configuration, multi-part upload for large files, and virus scanning via Lambda triggers. RapidDev has configured S3-backed FlutterFlow apps for media, legal document, and healthcare file storage use cases. Reach out if your project needs production-grade cloud storage configuration.
