
How to Build Privacy Tools with Replit

Build a GDPR/CCPA compliance toolkit in Replit using Express and PostgreSQL in 1-2 hours. You'll create cookie consent recording, data export requests, account deletion flows, and a privacy policy acceptance system. Replit Auth handles user identity, and a Scheduled Deployment processes export jobs in the background.

What you'll build

  • Cookie consent banner with accept/reject/customize options that records granular consent per category
  • Data export request system that gathers all user data across tables and delivers a JSON download file
  • Account deletion flow with confirmation token email and cascading data purge PostgreSQL function
  • Privacy policy versioning with per-user acceptance tracking and upgrade notices
  • Admin dashboard showing consent statistics, pending deletion requests, and export request log
  • Express REST API with routes for consent recording, policy acceptance, and compliance management
Intermediate · 14 min read · 1-2 hours · Replit Free · April 2026 · RapidDev Engineering Team

What you're building

A GDPR/CCPA compliance toolkit gives your app the data privacy infrastructure regulators require: a way to record what users consented to, let them export their data, request deletion, and stay informed about privacy policy updates. Without these tools, a single user complaint can trigger fines and reputational damage — especially if you operate in the EU or California.

Replit Agent builds this as an Express backend with five tables and a dozen routes. You describe the schema and features in a single prompt, and the Agent generates the database structure, API endpoints, and a React frontend with consent banner and settings page. Drizzle ORM handles type-safe queries, and Drizzle Studio lets you inspect consent records and deletion requests without writing SQL.

The architecture has two deployment parts: the main Express app on Autoscale handles real-time requests (consent recording, policy acceptance), and a Scheduled Deployment runs every hour to process queued data export requests — gathering user data from all tables, writing a JSON file, and marking the request ready for download. PostgreSQL's built-in transaction support ensures the deletion cascade either completes fully or rolls back cleanly.

Final result

A privacy compliance backend with consent recording, data export, account deletion, and policy versioning — plus an admin dashboard and React frontend with cookie banner and user privacy settings.

Tech stack

  • Replit: IDE & Hosting
  • Express: Backend Framework
  • PostgreSQL: Database
  • Drizzle ORM: Database ORM
  • Replit Auth: Auth
  • SendGrid: Transactional Email

Prerequisites

  • A Replit account (Free tier is sufficient)
  • Basic understanding of what cookies and user data mean for your app
  • A SendGrid account (free tier) for deletion confirmation emails
  • Knowledge of which database tables in your app store user data (needed for the export and deletion steps)

Build steps

1

Scaffold the project with Replit Agent

Open Replit, create a new App, and type the following prompt into the Agent. This generates the full Express + PostgreSQL project with all five compliance tables and the core API structure in one shot.

prompt.txt
// Paste this into Replit Agent to scaffold the privacy tools app
// Build a GDPR/CCPA privacy compliance toolkit with Express and PostgreSQL using Drizzle ORM.
//
// Create these tables:
// 1. consent_records: id serial primary key, user_id text, session_id text, ip_address text,
//    consent_type text (enum: analytics/marketing/functional/all), granted boolean not null,
//    source text (enum: banner/settings/signup), created_at timestamp default now()
// 2. data_export_requests: id serial primary key, user_id text not null, status text default 'pending'
//    (enum: pending/processing/ready/expired/failed), file_url text, requested_at timestamp default now(),
//    completed_at timestamp, expires_at timestamp
// 3. deletion_requests: id serial primary key, user_id text not null, status text default 'pending'
//    (enum: pending/confirmed/processing/completed), reason text, confirmation_token text unique,
//    confirmed_at timestamp, completed_at timestamp, requested_at timestamp default now()
// 4. privacy_policies: id serial primary key, version text unique not null, content text (markdown),
//    effective_date date not null, is_current boolean default false, created_at timestamp default now()
// 5. policy_acceptances: id serial primary key, user_id text not null, policy_version text not null,
//    accepted_at timestamp default now(), unique(user_id, policy_version)
//
// Add Express routes:
// POST /api/consent, GET /api/consent/:userId
// POST /api/data-export/request, GET /api/data-export/:id/status, GET /api/data-export/:id/download
// POST /api/deletion/request, POST /api/deletion/confirm/:token
// GET /api/privacy-policy, POST /api/privacy-policy/accept
// GET /api/admin/consent-stats, GET /api/admin/deletion-requests
//
// Use Replit Auth for user identity. Bind server to 0.0.0.0 port 3000.

Pro tip: After Agent finishes, open Drizzle Studio (Database tab in the sidebar) to verify all five tables were created with the correct columns and types before building on top of them.

Expected result: A running Express app with all five tables in PostgreSQL and API routes responding. The terminal shows 'listening on 3000'.

2

Add the consent recording and retrieval routes

The consent API is the most frequently called part — every page load may check consent status. This route stores the user's choice and returns their current preferences.

server/routes/consent.js
const express = require('express');
const { db } = require('../db');
const { consentRecords } = require('../schema');
const { eq, desc } = require('drizzle-orm');

const router = express.Router();

// Record a new consent choice
router.post('/api/consent', express.json(), async (req, res) => {
  const { consentType, granted, source } = req.body;
  const userId = req.user?.id || null;
  const sessionId = req.session?.id || null;

  if (!['analytics', 'marketing', 'functional', 'all'].includes(consentType)) {
    return res.status(400).json({ error: 'Invalid consent_type' });
  }

  try {
    const record = await db.insert(consentRecords).values({
      userId,
      sessionId,
      ipAddress: req.ip,
      consentType,
      granted: Boolean(granted),
      source: source || 'banner',
    }).returning();

    return res.json({ recorded: true, id: record[0].id });
  } catch (err) {
    console.error('[consent] insert error:', err);
    return res.status(500).json({ error: 'Failed to record consent' });
  }
});

// Get current consent status for a user
router.get('/api/consent/:userId', async (req, res) => {
  const rows = await db.select()
    .from(consentRecords)
    .where(eq(consentRecords.userId, req.params.userId))
    .orderBy(desc(consentRecords.createdAt))
    .limit(10);

  // Keep only the most recent decision per type. Check key presence,
  // not truthiness: a recent `granted: false` must not be overwritten
  // by an older `granted: true` record.
  const status = {};
  for (const row of rows) {
    if (!(row.consentType in status)) {
      status[row.consentType] = row.granted;
    }
  }
  return res.json({ userId: req.params.userId, consent: status });
});

module.exports = router;

Pro tip: Always record consent even for anonymous users — use the session_id as the identifier. When the user logs in later, link the session to their user_id by updating earlier consent records.

Expected result: POST /api/consent with {consentType: 'analytics', granted: true, source: 'banner'} returns {recorded: true, id: 1}.
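The per-type reduction in the GET route is worth isolating, because it is easy to get wrong. A minimal standalone sketch (the helper name `latestConsentPerType` is illustrative, not part of the generated app) shows why key presence, not truthiness, must decide whether a row is skipped:

```javascript
// Reduce consent rows (sorted newest-first) to the most recent
// decision per consent type. Checking `in` rather than truthiness
// ensures a recent `granted: false` is not overwritten by an older
// `granted: true` row for the same type.
function latestConsentPerType(rows) {
  const status = {};
  for (const row of rows) {
    if (!(row.consentType in status)) {
      status[row.consentType] = row.granted;
    }
  }
  return status;
}

// Example: the user rejected analytics after first accepting it.
const rows = [
  { consentType: 'analytics', granted: false }, // newest
  { consentType: 'marketing', granted: true },
  { consentType: 'analytics', granted: true },  // older, ignored
];
console.log(latestConsentPerType(rows));
// → { analytics: false, marketing: true }
```

With a truthiness check (`!status[row.consentType]`), the stored `false` would itself count as "missing" and the older `true` row would win, silently re-enabling analytics the user just rejected.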

3

Build the data export request and processing system

GDPR requires you to provide users a copy of all their data within 30 days. This step creates the request endpoint that queues an export job, and a background processor that gathers all user data into a downloadable JSON file.

server/routes/dataExport.js
// server/routes/dataExport.js — request and download endpoints
const express = require('express');
const { db } = require('../db');
const { dataExportRequests } = require('../schema');
const { eq, desc } = require('drizzle-orm');
const { withDbRetry } = require('../lib/retryDb');

const router = express.Router();

router.post('/api/data-export/request', async (req, res) => {
  if (!req.user) return res.status(401).json({ error: 'Login required' });

  // Fetch the most recent request; ascending order would check the
  // oldest request instead and let spam through
  const existing = await db.select().from(dataExportRequests)
    .where(eq(dataExportRequests.userId, req.user.id))
    .orderBy(desc(dataExportRequests.requestedAt))
    .limit(1);

  // Prevent spam — one pending request at a time
  if (existing[0]?.status === 'pending' || existing[0]?.status === 'processing') {
    return res.status(429).json({ error: 'Export already in progress', requestId: existing[0].id });
  }

  const record = await withDbRetry(() =>
    db.insert(dataExportRequests).values({
      userId: req.user.id,
      status: 'pending',
    }).returning()
  );

  return res.json({ requestId: record[0].id, message: 'Export queued — ready in ~5 minutes' });
});

router.get('/api/data-export/:id/status', async (req, res) => {
  const row = await db.select().from(dataExportRequests)
    .where(eq(dataExportRequests.id, parseInt(req.params.id)))
    .limit(1);

  if (!row[0]) return res.status(404).json({ error: 'Not found' });
  if (row[0].userId !== req.user?.id) return res.status(403).json({ error: 'Forbidden' });

  return res.json({ status: row[0].status, fileUrl: row[0].fileUrl, expiresAt: row[0].expiresAt });
});

module.exports = router;

Pro tip: The background export processor runs as a Scheduled Deployment. Ask Agent: 'Create a server/jobs/processExports.js script that queries pending data_export_requests, gathers all user data from every table, writes it to a JSON file in /tmp, uploads it to a public URL, and updates the request status to ready with an expires_at 7 days from now.'

Expected result: POST /api/data-export/request returns a requestId. GET /api/data-export/:id/status shows 'pending' immediately and 'ready' after the background job runs.
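The shape of the file the background job writes is worth pinning down before you prompt Agent for the processor. A sketch of the payload-assembly step, with an illustrative helper name (`buildExportPayload`) and field names that are assumptions rather than part of the generated code:

```javascript
// Given the rows gathered from each table for one user, wrap them
// with export metadata so the downloaded JSON is self-describing.
function buildExportPayload(userId, tableRows, now = new Date()) {
  return {
    exportVersion: 1,
    userId,
    generatedAt: now.toISOString(),
    // One key per source table, e.g. { consent_records: [...], ... }
    data: tableRows,
    recordCount: Object.values(tableRows)
      .reduce((sum, rows) => sum + rows.length, 0),
  };
}

const payload = buildExportPayload('user-42', {
  consent_records: [{ consentType: 'analytics', granted: true }],
  policy_acceptances: [],
});
console.log(payload.recordCount); // → 1
```

Keeping this assembly step pure (no database or filesystem access) makes it trivial to unit-test, while the Scheduled Deployment script handles the querying, file writing, and status update around it.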

4

Add account deletion with confirmation and cascading purge

Deletion is the most critical GDPR right. This flow sends a confirmation email with a unique token, then on confirmation runs a PostgreSQL function that deletes or anonymizes all user data in the correct order.

prompt.txt
// Ask Agent to add the deletion flow with this prompt:
// Add two routes to server/routes/deletion.js:
//
// POST /api/deletion/request:
// 1. Require Replit Auth (req.user must exist)
// 2. Generate a crypto.randomUUID() as confirmation_token
// 3. Insert into deletion_requests with status='pending' and the token
// 4. Send an email via SendGrid (API key from process.env.SENDGRID_API_KEY in Secrets)
//    with subject 'Confirm your account deletion' and a link:
//    https://${req.get('host')}/api/deletion/confirm/${token}
// 5. Return { message: 'Confirmation email sent' }
//
// POST /api/deletion/confirm/:token:
// 1. Find the deletion_request by confirmation_token where status='pending'
// 2. If not found, return 404
// 3. Update status to 'processing', set confirmed_at = now()
// 4. Run a PostgreSQL function 'purge_user_data(user_id)' via db.execute()
//    that deletes from: policy_acceptances, consent_records, data_export_requests,
//    deletion_requests last (self-reference) — all in order respecting foreign keys
// 5. Update deletion_request status to 'completed'
// 6. Return { message: 'Account deleted' }
//
// Also create the purge_user_data PostgreSQL function in a migration file.

Pro tip: Add SENDGRID_API_KEY to Replit Secrets via the lock icon in the sidebar. After adding it, click Stop → Run to reload the environment so the new key is visible to the running server.

Expected result: POST /api/deletion/request sends a confirmation email. Following the link calls /api/deletion/confirm/:token and returns {message: 'Account deleted'} after all rows are purged.

5

Build the privacy policy versioning and consent banner

Privacy policies change over time — every user must accept the current version. This step adds the policy management routes and the React cookie consent banner that shows on first visit.

prompt.txt
// Ask Agent to build the privacy policy system and React frontend with this prompt:
// Add to the Express app:
// 1. GET /api/privacy-policy — returns the current policy (is_current=true)
// 2. POST /api/privacy-policy/accept — inserts into policy_acceptances for the logged-in user
//    if they haven't already accepted this version (check unique constraint)
// 3. GET /api/admin/consent-stats — returns aggregate counts: total consent records,
//    breakdown by consent_type and granted (true/false), grouped using SQL COUNT + GROUP BY
// 4. GET /api/admin/deletion-requests — lists all deletion_requests with status != 'completed'
//
// Build a React frontend with:
// a) A CookieBanner component (fixed bottom bar) showing 'We use cookies' text,
//    Accept All, Reject All, and Customize buttons. On choice, POST /api/consent.
//    Store the choice in localStorage so the banner doesn't show again.
// b) A PrivacySettings page with toggle switches per consent category
//    (analytics, marketing, functional), a 'Request My Data' button (POST /api/data-export/request),
//    and a 'Delete My Account' button (POST /api/deletion/request).
// c) A PolicyAcceptanceModal that checks GET /api/privacy-policy on login,
//    compares version to what's stored in localStorage, and shows the modal if not accepted.
// d) An AdminPrivacy page with consent stats pie chart and deletion requests table.

Expected result: The app shows a cookie consent banner on first visit. The Privacy Settings page has working toggle switches. Clicking 'Request My Data' queues an export.

Complete code

server/routes/consent.js
const express = require('express');
const { db } = require('../db');
const { consentRecords, policyAcceptances, privacyPolicies } = require('../schema');
const { eq, desc, and, count } = require('drizzle-orm');
const { withDbRetry } = require('../lib/retryDb');

const router = express.Router();

const VALID_CONSENT_TYPES = ['analytics', 'marketing', 'functional', 'all'];

router.post('/api/consent', express.json(), async (req, res) => {
  const { consentType, granted, source } = req.body;
  if (!VALID_CONSENT_TYPES.includes(consentType)) {
    return res.status(400).json({ error: 'Invalid consent_type' });
  }
  try {
    const record = await withDbRetry(() =>
      db.insert(consentRecords).values({
        userId: req.user?.id || null,
        sessionId: req.headers['x-session-id'] || null,
        ipAddress: req.ip,
        consentType,
        granted: Boolean(granted),
        source: source || 'banner',
      }).returning()
    );
    return res.json({ recorded: true, id: record[0].id });
  } catch (err) {
    console.error('[consent] error:', err.message);
    return res.status(500).json({ error: 'Failed to record consent' });
  }
});

router.get('/api/consent/:userId', async (req, res) => {
  const rows = await db.select()
    .from(consentRecords)
    .where(eq(consentRecords.userId, req.params.userId))
    .orderBy(desc(consentRecords.createdAt))
    .limit(20);
  const status = {};
  for (const row of rows) {
    if (!(row.consentType in status)) status[row.consentType] = row.granted;
  }
  return res.json({ userId: req.params.userId, consent: status });
});

router.post('/api/privacy-policy/accept', express.json(), async (req, res) => {
  if (!req.user) return res.status(401).json({ error: 'Login required' });
  const [policy] = await db.select().from(privacyPolicies)
    .where(eq(privacyPolicies.isCurrent, true)).limit(1);
  if (!policy) return res.status(404).json({ error: 'No current policy' });
  await db.insert(policyAcceptances)
    .values({ userId: req.user.id, policyVersion: policy.version })
    .onConflictDoNothing();
  return res.json({ accepted: true, version: policy.version });
});

// Aggregate consent counts per type and decision (COUNT + GROUP BY)
router.get('/api/admin/consent-stats', async (req, res) => {
  const rows = await db.select({
    consentType: consentRecords.consentType,
    granted: consentRecords.granted,
    total: count(),
  })
    .from(consentRecords)
    .groupBy(consentRecords.consentType, consentRecords.granted);
  return res.json({ stats: rows });
});

module.exports = router;

Customization ideas

Consent version history

Track every consent change over time so you can prove what a user agreed to on any given date — essential for regulatory audits. Query consent_records filtered by user_id and date range.

Granular cookie categories

Add sub-categories under 'marketing' such as 'retargeting' and 'email marketing', each with its own toggle. Store as separate consent_type values and update the banner's Customize panel accordingly.

Automated policy update notifications

When an admin publishes a new privacy policy version, trigger a notification to all users who have accepted previous versions — prompting them to review and re-accept the updated terms.

Data export in multiple formats

Support CSV and PDF exports in addition to JSON. The background processor generates all three formats, stores them as separate file_url entries, and lets users choose their preferred format at download time.

Common pitfalls

Pitfall: Logging consent client-side only (in localStorage/cookies)

How to avoid: Always POST to /api/consent for every consent choice. Use localStorage only as a cache to avoid re-showing the banner — never as the source of truth.
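The cache-not-source-of-truth rule can be captured in a small guard the banner runs before rendering. A sketch with a hypothetical cache shape (`recordedAt` timestamp written after the last successful POST /api/consent; the 6-month re-consent window is a common convention, not a legal requirement):

```javascript
// Decide whether to show the cookie banner based only on the local
// cache. The server-side consent_records table stays the source of
// truth; this guard merely suppresses the banner between visits.
function shouldShowBanner(cached, now = Date.now(),
                          maxAgeMs = 180 * 24 * 3600 * 1000) {
  // No cached choice, or a malformed one: show the banner.
  if (!cached || typeof cached.recordedAt !== 'number') return true;
  // Cached choice is stale: re-prompt the user.
  return now - cached.recordedAt > maxAgeMs;
}

console.log(shouldShowBanner(null));                     // → true
console.log(shouldShowBanner({ recordedAt: Date.now() })); // → false
```

Because the guard never decides what the user consented to, clearing localStorage only causes a harmless re-prompt; the authoritative record in consent_records is untouched.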

Pitfall: Forgetting to add Deployment Secrets after adding Workspace Secrets

How to avoid: After testing locally, open the Publish pane, go to Secrets, and add SENDGRID_API_KEY again with the same value. Redeploy after adding.

Pitfall: Running the data export query synchronously in the request handler

How to avoid: Use the queue pattern: return a requestId immediately, process the export in a background Scheduled Deployment, and let users poll /api/data-export/:id/status.

Pitfall: Deleting the deletion_requests row before it's fully processed

How to avoid: Keep deletion_requests rows forever (status='completed'). For GDPR compliance you can anonymize the user_id in the record after completion, but never hard-delete it.

Best practices

  • Store consent server-side in consent_records for every choice — localStorage is only a UX cache to avoid re-showing banners.
  • Use Drizzle ORM's onConflictDoNothing() for policy_acceptances inserts — the unique constraint prevents double-recording if the user clicks accept twice.
  • Wrap the deletion cascade in a PostgreSQL function run as a single transaction — if any table delete fails, the entire operation rolls back and the user's account is preserved.
  • Set an expires_at on data export files (7 days is common) and clean up old files with a Scheduled Deployment to avoid storage creep.
  • Add SENDGRID_API_KEY to both Workspace Secrets and Deployment Secrets — they are completely separate in Replit and the deployed app won't see workspace keys.
  • Test your deletion flow with a real test account before going to production — create a user, add data across all tables, run deletion, then verify every table is clean.
  • Use Replit Agent to generate the purge_user_data PostgreSQL function — describe every table that has a user_id column and Agent will generate the correct delete order respecting foreign key constraints.

AI prompts to try

Copy these prompts to build this project faster.

ChatGPT Prompt

I'm building a GDPR compliance toolkit with Express and PostgreSQL using Drizzle ORM. I have five tables: consent_records, data_export_requests, deletion_requests, privacy_policies, and policy_acceptances. Help me write a PostgreSQL function called purge_user_data(p_user_id text) that deletes all rows for a given user from these tables in the correct foreign key order, wrapped in a BEGIN/EXCEPTION/END block so it rolls back on any error.
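One possible shape of the function that prompt asks for, covering only the five toolkit tables (your app will have more, so extend the DELETE list with every table that carries a user_id column). This sketch follows the pitfall above and anonymizes the deletion_requests row rather than deleting it; because plpgsql runs inside the caller's transaction, any error aborts and rolls back the whole purge even without an explicit EXCEPTION handler:

```sql
-- Sketch only: extend with every application table that stores user data.
CREATE OR REPLACE FUNCTION purge_user_data(p_user_id text)
RETURNS void AS $$
BEGIN
  DELETE FROM policy_acceptances   WHERE user_id = p_user_id;
  DELETE FROM consent_records      WHERE user_id = p_user_id;
  DELETE FROM data_export_requests WHERE user_id = p_user_id;
  -- Keep the deletion_requests audit trail; anonymize instead of delete.
  UPDATE deletion_requests SET user_id = 'deleted'
   WHERE user_id = p_user_id;
END;
$$ LANGUAGE plpgsql;
```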

Build Prompt

Extend the privacy tools app with a consent audit report. Build a GET /api/admin/consent-audit route that accepts start_date and end_date query params and returns: total consent events, breakdown by consent_type (analytics/marketing/functional/all), breakdown by source (banner/settings/signup), daily trend (count per day over the range), and top 5 users by consent change frequency. Display this in a React admin page with a date range picker, four stat cards at the top, a stacked bar chart for the daily trend using recharts, and a data table for the per-user breakdown.

Frequently asked questions

Does this toolkit make my app fully GDPR compliant?

It gives you the technical infrastructure GDPR requires: consent recording, data access (export), data erasure (deletion), and policy versioning. Full compliance also requires legal steps like appointing a Data Protection Officer if required and updating your terms of service — the toolkit covers the technical side.

What happens if the deletion job fails halfway through?

The purge_user_data PostgreSQL function runs inside a transaction. If any DELETE fails (e.g., a foreign key constraint you missed), the entire transaction rolls back and the user's data is preserved. The deletion_request status stays at 'processing' — you can re-run the function after fixing the issue.

Can I use this on Replit Free tier?

Yes. The main app runs on Replit Free's Autoscale deployment. The Scheduled Deployment for export processing also runs on Free tier. The only paid component is SendGrid (which has a free tier of 100 emails/day — plenty for deletion confirmation emails).

How do I handle users who haven't logged in (anonymous sessions)?

Record their session_id from a UUID cookie in the consent_records table with user_id left null. If they later create an account, a merge endpoint can update all consent_records with that session_id to set the newly linked user_id.

What's the difference between Workspace Secrets and Deployment Secrets in Replit?

Workspace Secrets are only available while you're working in the editor. Deployment Secrets are injected into the deployed app. They are completely separate — add SENDGRID_API_KEY in both locations to make deletion emails work both locally and in production.

How long do data exports stay available for download?

Set expires_at to 7 days after the export completes (a common GDPR practice). Run a Scheduled Deployment daily that finds expired exports (expires_at < now()), deletes the file, and sets status to 'expired'. The user can request a fresh export at any time.

Can RapidDev help build a custom privacy compliance system for my app?

Yes. RapidDev has built compliance toolkits for 600+ apps and can extend this foundation with multi-tenant consent management, automated regulatory reporting, and custom deletion workflows for your specific data model. Free consultation available.

Should I deploy on Autoscale or Reserved VM?

Autoscale works for this use case. The main consent and policy routes are stateless and handle cold starts well. The export processing job runs as a separate Scheduled Deployment so cold starts on the main app don't affect background processing.

RapidDev

Talk to an Expert

Our team has built 600+ apps. Get personalized help with your project.

Book a free consultation
