
How to Secure Supabase Storage Files


What you'll learn

  • How to create private storage buckets in Supabase
  • How to write RLS policies on storage.objects for upload and download access
  • How to use signed URLs for secure temporary file sharing
  • How to implement user-scoped folder patterns for file isolation
Intermediate · 8 min read · 15-20 min to complete · Supabase (all plans), @supabase/supabase-js v2+ · March 2026 · RapidDev Engineering Team
TL;DR

Secure Supabase Storage files by using private buckets, writing RLS policies on the storage.objects table, and generating signed URLs for temporary access. Private buckets deny all unauthenticated access by default. Add row-level security policies that scope file access to the owning user by matching the file path to auth.uid(), and use createSignedUrl() to share time-limited download links.

Securing Files in Supabase Storage with Private Buckets, RLS, and Signed URLs

Supabase Storage uses the same Row Level Security system as your database tables, applied to the storage.objects table. This tutorial shows you how to create private buckets, write RLS policies that restrict file access to authenticated users or specific file owners, and generate signed URLs for time-limited sharing. You will build a secure file upload system where each user can only access their own files.

Prerequisites

  • A Supabase project with authentication configured
  • Access to the SQL Editor in the Supabase Dashboard
  • @supabase/supabase-js v2 installed in your project
  • Basic understanding of Row Level Security concepts

Step-by-step guide

1

Create a private storage bucket

In the Supabase Dashboard, go to Storage and click New Bucket. Name it documents and leave the Public bucket toggle OFF. A private bucket denies all unauthenticated access by default. Unlike public buckets where anyone with the URL can download files, private buckets require both authentication and an RLS policy to allow any operation. You can also create the bucket via SQL or the JS client.

sql
-- Create a private bucket via SQL
insert into storage.buckets (id, name, public)
values ('documents', 'documents', false);

typescript
// Or via the JS client (server-side only, requires service role key)
const { data, error } = await supabase.storage.createBucket('documents', {
  public: false,
  fileSizeLimit: 10485760 // 10 MB
});

Expected result: A private bucket named 'documents' appears in the Storage section of the Dashboard with the public access indicator set to off.
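If you created the bucket via the Dashboard and want to confirm its state from SQL, a quick check (a sketch; run it in the SQL Editor) might be:

```sql
-- Confirm the bucket exists and is private
select id, public from storage.buckets where id = 'documents';
-- Expect: public = false
```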

2

Write RLS policies for user-scoped file access

Storage files are stored in the storage.objects table, and you write RLS policies on this table just like any other. The key pattern is to use a user-scoped folder structure where each user's files are stored under a path prefixed with their user ID. The storage.foldername() function extracts folder segments from the file path, and you compare the first segment to auth.uid() to ensure users can only access their own files.

sql
-- Users can upload files to their own folder
create policy "Users upload own files"
on storage.objects for insert
to authenticated
with check (
  bucket_id = 'documents'
  and (select auth.uid())::text = (storage.foldername(name))[1]
);

-- Users can read their own files
create policy "Users read own files"
on storage.objects for select
to authenticated
using (
  bucket_id = 'documents'
  and (select auth.uid())::text = (storage.foldername(name))[1]
);

-- Users can delete their own files
create policy "Users delete own files"
on storage.objects for delete
to authenticated
using (
  bucket_id = 'documents'
  and (select auth.uid())::text = (storage.foldername(name))[1]
);

-- Users can update (overwrite) their own files
create policy "Users update own files"
on storage.objects for update
to authenticated
using (
  bucket_id = 'documents'
  and (select auth.uid())::text = (storage.foldername(name))[1]
)
with check (
  bucket_id = 'documents'
  and (select auth.uid())::text = (storage.foldername(name))[1]
);

Expected result: RLS policies are active on storage.objects. Authenticated users can only upload, read, update, and delete files under their own user ID folder.
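To see how the path check works, you can evaluate storage.foldername() directly in the SQL Editor. Assuming a path like the one below, it returns the folder segments without the file name, so the first array element is the user ID prefix:

```sql
-- storage.foldername() splits the path and drops the file name
select storage.foldername('a1b2c3/reports/q1.pdf');
-- Expect: {a1b2c3,reports}
```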

3

Upload files to the user-scoped folder from the frontend

When uploading from the client, construct the file path using the authenticated user's ID as the first folder segment. The Supabase JS client automatically includes the user's JWT in the request, which the RLS policy checks against the folder path. Use the upsert option if you want to allow overwriting existing files.

typescript
import { createClient } from '@supabase/supabase-js'

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

async function uploadFile(file: File) {
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) throw new Error('Not authenticated')

  const filePath = `${user.id}/${file.name}`

  const { data, error } = await supabase.storage
    .from('documents')
    .upload(filePath, file, {
      cacheControl: '3600',
      upsert: false
    })

  if (error) throw error
  return data
}

Expected result: The file is uploaded to documents/{user_id}/filename.ext. Any attempt to upload to another user's folder is blocked by the RLS policy.

4

Generate signed URLs for temporary file access

For private buckets, getPublicUrl() still returns a URL, but requests to it fail because the bucket is not publicly accessible. Instead, use createSignedUrl() to generate a time-limited URL that grants temporary access to a specific file. Signed URLs are ideal for sharing files with external users, rendering private images in the browser, or creating download links that expire. The expiry time is in seconds.

typescript
// Generate a signed URL valid for 1 hour (3600 seconds)
const { data, error } = await supabase.storage
  .from('documents')
  .createSignedUrl(`${user.id}/report.pdf`, 3600)

if (data) {
  console.log('Download link:', data.signedUrl)
}

// Generate signed URLs for multiple files at once
const { data: urls, error: urlError } = await supabase.storage
  .from('documents')
  .createSignedUrls([
    `${user.id}/report.pdf`,
    `${user.id}/invoice.pdf`
  ], 3600)

Expected result: A signed URL is returned that grants temporary access to the file. After the expiry time, the URL stops working and returns a 400 error.

5

List files in a user's folder with proper RLS

Use the list method to show users their uploaded files. Because RLS is active, the list operation only returns files the user's policy allows them to see. Pass the user's ID as the path prefix to scope the listing to their folder. You can add pagination with limit and offset options for large file collections.

typescript
async function listUserFiles() {
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) throw new Error('Not authenticated')

  const { data: files, error } = await supabase.storage
    .from('documents')
    .list(user.id, {
      limit: 100,
      offset: 0,
      sortBy: { column: 'created_at', order: 'desc' }
    })

  if (error) throw error
  return files
}

Expected result: An array of file objects is returned, showing only the files in the authenticated user's folder. Other users' files are invisible.

Complete working example

secure-storage.ts
import { createClient } from '@supabase/supabase-js'

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

// Upload a file to the authenticated user's private folder
export async function uploadFile(file: File) {
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) throw new Error('Not authenticated')

  const filePath = `${user.id}/${file.name}`
  const { data, error } = await supabase.storage
    .from('documents')
    .upload(filePath, file, { cacheControl: '3600', upsert: false })

  if (error) throw error
  return data
}

// List all files in the authenticated user's folder
export async function listUserFiles() {
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) throw new Error('Not authenticated')

  const { data, error } = await supabase.storage
    .from('documents')
    .list(user.id, { limit: 100, sortBy: { column: 'created_at', order: 'desc' } })

  if (error) throw error
  return data
}

// Generate a signed URL for temporary file access
export async function getSignedUrl(fileName: string, expiresIn = 3600) {
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) throw new Error('Not authenticated')

  const { data, error } = await supabase.storage
    .from('documents')
    .createSignedUrl(`${user.id}/${fileName}`, expiresIn)

  if (error) throw error
  return data.signedUrl
}

// Delete a file from the authenticated user's folder
export async function deleteFile(fileName: string) {
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) throw new Error('Not authenticated')

  const { data, error } = await supabase.storage
    .from('documents')
    .remove([`${user.id}/${fileName}`])

  if (error) throw error
  return data
}

Common mistakes when securing Supabase Storage files

Mistake: Using a public bucket when files should be restricted to authenticated users.

How to avoid: Create the bucket with public: false. Public buckets allow anyone with the URL to download files, bypassing all access controls.

Mistake: Uploading files without the user ID as the first folder segment, causing the RLS policy to block the operation.

How to avoid: Always construct the file path as {user.id}/{filename}. The RLS policy checks (storage.foldername(name))[1] against auth.uid().

Mistake: Using getPublicUrl() on a private bucket and getting 400 errors.

How to avoid: Private buckets do not support public URLs. Use createSignedUrl() with an expiry time instead.

Mistake: Forgetting to add a SELECT RLS policy on storage.objects, causing list and download operations to return empty results.

How to avoid: Add a SELECT policy alongside your INSERT policy. Without it, users can upload but cannot see or download their own files.

Best practices

  • Always use private buckets for user-uploaded content and sensitive documents
  • Scope file paths with the user's ID as the first folder segment for easy RLS policy enforcement
  • Add RLS policies for all four operations: SELECT, INSERT, UPDATE, and DELETE on storage.objects
  • Use createSignedUrl() with the shortest practical expiry time for sharing private files
  • Set fileSizeLimit on the bucket to prevent excessively large uploads at the storage level
  • Verify the user with getUser() before any storage operation instead of relying on getSession()
  • Add cacheControl headers when uploading to improve CDN performance for frequently accessed files
  • Never expose the SUPABASE_SERVICE_ROLE_KEY to the client — it bypasses all storage RLS policies
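To illustrate the user-scoped path convention from the list above, a small helper can sanitize file names before building the path. This function is hypothetical (not part of supabase-js), and the sanitization rule is an example only; adjust it to your naming requirements:

```typescript
// Hypothetical helper: builds the user-scoped path the RLS policies expect.
function buildUserFilePath(userId: string, fileName: string): string {
  // Replace anything outside a conservative character set so the
  // file name cannot introduce extra path segments.
  const safeName = fileName.replace(/[^a-zA-Z0-9._-]/g, '_')
  return `${userId}/${safeName}`
}
```

You would then call buildUserFilePath(user.id, file.name) instead of interpolating file.name directly into the upload path.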

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I need to secure file uploads in Supabase Storage so each user can only access their own files. Show me how to create a private bucket, write RLS policies on storage.objects using the user ID folder pattern, and generate signed URLs for temporary file sharing.

Supabase Prompt

Set up a private Supabase Storage bucket called 'documents' with RLS policies on storage.objects that scope all operations to the authenticated user's folder path. Include TypeScript functions for upload, list, signed URL generation, and delete.

Frequently asked questions

What is the difference between a public and private bucket in Supabase?

A public bucket allows anyone with the file URL to download it without authentication. A private bucket requires both authentication and a passing RLS policy on storage.objects before any operation is allowed.

Can I make some files in a private bucket publicly accessible?

Not directly. A bucket is either public or private. To share specific files from a private bucket, generate signed URLs with createSignedUrl(). These URLs work for anyone but expire after the specified time.

How long can a signed URL last?

Signed URLs can last up to 7 days (604800 seconds). Set the shortest expiry that meets your needs for security. Common values are 300 seconds for image display and 86400 seconds for email download links.

Why do my storage uploads return a 403 error?

A 403 error means the RLS policy on storage.objects is blocking the operation. Check that you have an INSERT policy, the bucket_id matches, and the file path starts with the user's ID. Also verify the user is authenticated.
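When debugging a 403, it can also help to inspect which policies actually exist on storage.objects. Assuming you have SQL Editor access, a query against the built-in pg_policies view lists them:

```sql
-- List all RLS policies defined on storage.objects
select policyname, cmd, roles
from pg_policies
where schemaname = 'storage' and tablename = 'objects';
```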

Does the service role key bypass storage RLS policies?

Yes. The SUPABASE_SERVICE_ROLE_KEY bypasses all RLS policies including those on storage.objects. Never use it in client-side code. It is intended for server-side admin operations only.

Can I restrict upload file types in Supabase Storage?

Yes, in two ways. Set the allowedMimeTypes option when creating or updating the bucket to reject other content types at the storage level, and validate file types in your client code before uploading for faster user feedback.
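As a sketch of the client-side check (the allow-list below is hypothetical; adjust it to the types your application accepts):

```typescript
// Hypothetical allow-list of accepted upload types.
const ALLOWED_MIME_TYPES = new Set(['application/pdf', 'image/png', 'image/jpeg'])

function isAllowedMimeType(mimeType: string): boolean {
  return ALLOWED_MIME_TYPES.has(mimeType.toLowerCase())
}
```

You would check isAllowedMimeType(file.type) before calling supabase.storage.upload() and surface an error to the user if it fails.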

Can RapidDev help configure secure file storage for my Supabase project?

Yes. RapidDev can design your storage architecture, write RLS policies for complex access patterns, and implement secure upload and download flows tailored to your application.
