RapidDev - Software Development Agency

How to Build an Integration Hub with Replit

Build a Zapier-style integration hub in Replit in 2-4 hours. Connect multiple external APIs, transform data with field mapping rules, and pipe events between services via dynamic webhook endpoints. Credentials are AES-256 encrypted in PostgreSQL. Deploy on Reserved VM to keep webhook receivers always-on. Uses Express, PostgreSQL with Drizzle ORM, and Node.js crypto.

What you'll build

  • Connections table storing AES-256-encrypted API credentials for multiple external services
  • Integrations with configurable JSON field transformation rules between source and destination
  • Dynamic webhook endpoint router that matches incoming paths to integration configs
  • HMAC webhook signature verification to authenticate incoming webhook calls
  • Run history logging with per-execution status, records processed, and error messages
  • Test/dry-run endpoint that validates transformation logic against sample data without executing
Advanced · 14 min read · 2-4 hours · Replit Free · April 2026 · RapidDev Engineering Team
What you're building

Zapier charges $20-100/month for a few thousand task runs, and it runs on their servers. Building your own integration hub lets you run unlimited automations, customize the transformation logic in code, and keep sensitive API credentials in your own encrypted database — not a third-party SaaS.

Replit Agent generates the entire backend: connections, integrations, runs, and webhook_endpoints tables. The security model stores all external API keys encrypted with AES-256-GCM using a master key from Replit Secrets. Credentials are decrypted only at execution time, never stored in plaintext, and never returned in API responses.

The dynamic webhook receiver is the core feature: incoming POST requests to /api/webhooks/:path are matched to a webhook_endpoints row, HMAC-verified, then the integration executes — fetching from the source API, applying the field transformation config, and posting to the destination. Deploy on Reserved VM because the webhook receiver must be always-on; a cold start on an Autoscale instance could cause missed events from source services.

Final result

A working integration hub that connects external APIs, transforms data with field mapping rules, handles incoming webhooks, and logs every execution — all running on Reserved VM with encrypted credential storage.

Tech stack

Replit — IDE & Hosting
Express — Backend Framework
PostgreSQL — Database
Drizzle ORM — Database ORM
Replit Auth — Auth
Node.js crypto — Credential Encryption

Prerequisites

  • A Replit account (Replit Core for Reserved VM deployment — required for always-on webhook reception)
  • At least two external API accounts to connect (e.g., Slack, GitHub, a webhook testing tool like webhook.site)
  • Basic understanding of what APIs and webhooks are
  • API keys or tokens for the services you want to connect

Build steps

1

Generate the schema and encryption foundation with Replit Agent

The encryption module is the first thing to build. Every external API credential must be encrypted before storage and decrypted only at execution time. Build this correctly before writing any integration logic.

prompt.txt
// Prompt to type into Replit Agent:
// Build an integration hub with Express and PostgreSQL using Drizzle ORM.
// Create these tables in shared/schema.ts:
// - connections: id serial pk, user_id text not null, service_name text not null,
//   credentials_encrypted text not null, status text default 'active', created_at timestamp
// - integrations: id serial pk, user_id text not null, name text not null,
//   source_connection_id integer references connections,
//   destination_connection_id integer references connections,
//   transform_config jsonb (field mapping rules), is_active boolean default true,
//   created_at timestamp
// - integration_runs: id serial pk, integration_id integer references integrations not null,
//   status text not null (running/success/failed),
//   records_processed integer default 0, error_message text,
//   started_at timestamp default now(), completed_at timestamp
// - webhook_endpoints: id serial pk, integration_id integer references integrations not null,
//   path text unique not null, secret text not null, created_at timestamp
// Create a server/lib/crypto.js module with:
//   encrypt(plaintext): uses Node.js crypto.createCipheriv with AES-256-GCM
//     returns: {iv, authTag, ciphertext} encoded as a single string
//   decrypt(encrypted): reverses the above
// The encryption key comes from process.env.ENCRYPTION_KEY (32 bytes, stored in Secrets)
// Set up Replit Auth. Bind server to 0.0.0.0.

Pro tip: Generate a secure ENCRYPTION_KEY by running: node -e "console.log(require('crypto').randomBytes(32).toString('hex'))" in the Replit Shell. Add the output to Replit Secrets as ENCRYPTION_KEY.

Expected result: Agent creates the schema and the crypto module. Test encryption by importing server/lib/crypto.js in the Shell: const {encrypt,decrypt}=require('./server/lib/crypto'); const e=encrypt('test'); console.log(decrypt(e))

2

Build the connections and integrations management routes

Connections store encrypted API credentials. When creating a connection, the credentials are encrypted immediately before database insertion. When listing connections, credentials are redacted — only the ID, service name, status, and creation date are returned.

server/routes/connections.js
const express = require('express');
const { encrypt, decrypt } = require('../lib/crypto');
const { db } = require('../db');
const { connections, integrations } = require('../../shared/schema');
const { eq, and } = require('drizzle-orm');

const router = express.Router();

// POST /api/connections — store encrypted credentials
router.post('/api/connections', async (req, res) => {
  if (!req.user) return res.status(401).json({ error: 'Auth required' });
  const { serviceName, credentials } = req.body;

  if (!credentials || typeof credentials !== 'object') {
    return res.status(400).json({ error: 'credentials must be an object' });
  }

  const credentialsEncrypted = encrypt(JSON.stringify(credentials));

  const [conn] = await db.insert(connections).values({
    userId: req.user.id,
    serviceName,
    credentialsEncrypted
  }).returning();

  // Never return the encrypted credentials in the response
  res.json({ id: conn.id, serviceName: conn.serviceName, status: conn.status, createdAt: conn.createdAt });
});

// GET /api/connections — list without credentials
router.get('/api/connections', async (req, res) => {
  if (!req.user) return res.status(401).json({ error: 'Auth required' });
  const conns = await db.select({
    id: connections.id,
    serviceName: connections.serviceName,
    status: connections.status,
    createdAt: connections.createdAt
  }).from(connections).where(eq(connections.userId, req.user.id));
  res.json(conns);
});

// POST /api/integrations — create integration with transform config
router.post('/api/integrations', async (req, res) => {
  if (!req.user) return res.status(401).json({ error: 'Auth required' });
  const { name, sourceConnectionId, destinationConnectionId, transformConfig } = req.body;

  // Verify both connections belong to the requesting user
  const [src, dst] = await Promise.all([
    db.query.connections.findFirst({ where: and(eq(connections.id, Number(sourceConnectionId)), eq(connections.userId, req.user.id)) }),
    db.query.connections.findFirst({ where: and(eq(connections.id, Number(destinationConnectionId)), eq(connections.userId, req.user.id)) })
  ]);
  if (!src || !dst) return res.status(404).json({ error: 'Connection not found' });

  const [integration] = await db.insert(integrations).values({
    userId: req.user.id, name, sourceConnectionId: src.id,
    destinationConnectionId: dst.id, transformConfig
  }).returning();

  res.json(integration);
});

module.exports = router;

Pro tip: The credentials object structure depends on the service. For Slack it might be {webhookUrl: '...'}. For GitHub it might be {accessToken: '...'}. Store whatever the execution engine needs to call the API — the structure is flexible JSONB.

Expected result: POST /api/connections with {serviceName: 'slack', credentials: {webhookUrl: 'https://hooks.slack.com/...'}} returns {id, serviceName, status} without exposing the webhook URL. The encrypted string is in the database.

3

Build the dynamic webhook receiver and execution engine

The webhook receiver is the most critical route. It matches incoming paths to webhook_endpoints rows, verifies the HMAC signature, then executes the integration — fetching source data, transforming it, and sending to the destination.

server/routes/webhooks.js
const express = require('express');
const crypto = require('crypto');
const { decrypt } = require('../lib/crypto');
const { db } = require('../db');
const { webhookEndpoints, integrations, connections, integrationRuns } = require('../../shared/schema');
const { eq } = require('drizzle-orm');

const router = express.Router();

// Dynamic webhook receiver
router.post('/api/webhooks/:path', async (req, res) => {
  const { path } = req.params;

  const endpoint = await db.query.webhookEndpoints.findFirst({
    where: eq(webhookEndpoints.path, path)
  });

  if (!endpoint) return res.status(404).json({ error: 'Webhook endpoint not found' });

  // Verify HMAC signature. Note: signing JSON.stringify(req.body) assumes the
  // re-serialized body matches the sender's raw bytes; for strict verification,
  // capture the raw request body (e.g. express.json's verify callback) and sign that.
  const signature = req.headers['x-hub-signature-256'] || req.headers['x-signature'];
  if (signature) {
    const expectedSig = 'sha256=' + crypto.createHmac('sha256', endpoint.secret)
      .update(JSON.stringify(req.body)).digest('hex');
    // Constant-time comparison prevents timing attacks on the signature check
    const sigBuf = Buffer.from(signature);
    const expBuf = Buffer.from(expectedSig);
    if (sigBuf.length !== expBuf.length || !crypto.timingSafeEqual(sigBuf, expBuf)) {
      return res.status(401).json({ error: 'Invalid signature' });
    }
  }

  // Acknowledge immediately — process async
  res.json({ received: true });

  // Execute integration asynchronously
  executeIntegration(endpoint.integrationId, req.body).catch(err => {
    console.error('Integration execution error:', err.message);
  });
});

async function executeIntegration(integrationId, triggerData) {
  const [run] = await db.insert(integrationRuns).values({
    integrationId, status: 'running'
  }).returning();

  try {
    const integration = await db.query.integrations.findFirst({
      where: eq(integrations.id, integrationId)
    });

    const srcConn = await db.query.connections.findFirst({ where: eq(connections.id, integration.sourceConnectionId) });
    const dstConn = await db.query.connections.findFirst({ where: eq(connections.id, integration.destinationConnectionId) });

    const srcCredentials = JSON.parse(decrypt(srcConn.credentialsEncrypted));
    const dstCredentials = JSON.parse(decrypt(dstConn.credentialsEncrypted));

    // Apply field transformation
    const transformedData = applyTransform(triggerData, integration.transformConfig);

    // Execute destination call
    const response = await fetch(dstCredentials.webhookUrl || dstCredentials.apiEndpoint, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(dstCredentials.apiKey ? { 'Authorization': `Bearer ${dstCredentials.apiKey}` } : {})
      },
      body: JSON.stringify(transformedData)
    });

    if (!response.ok) throw new Error(`Destination responded with ${response.status}`);

    await db.update(integrationRuns).set({
      status: 'success', recordsProcessed: 1, completedAt: new Date()
    }).where(eq(integrationRuns.id, run.id));

  } catch (err) {
    await db.update(integrationRuns).set({
      status: 'failed', errorMessage: err.message, completedAt: new Date()
    }).where(eq(integrationRuns.id, run.id));
  }
}

function applyTransform(data, config) {
  if (!config || !config.fieldMappings) return data;
  const result = {};
  for (const mapping of config.fieldMappings) {
    // Dot notation allows nested access, e.g. 'data.user.email'
    const value = mapping.sourceField.split('.').reduce((obj, key) => obj?.[key], data);
    if (value !== undefined) {
      result[mapping.destinationField] = mapping.transform ? applyFieldTransform(value, mapping.transform) : value;
    }
  }
  return result;
}

function applyFieldTransform(value, transform) {
  if (transform === 'uppercase') return String(value).toUpperCase();
  if (transform === 'lowercase') return String(value).toLowerCase();
  if (transform === 'toString') return String(value);
  if (transform === 'toNumber') return Number(value);
  return value;
}

module.exports = router;

Pro tip: Returning 200 immediately before async execution is critical. External services (GitHub, Stripe, etc.) retry webhooks if they don't get a quick response. By acknowledging first and processing after, you prevent false retries.

Expected result: POST /api/webhooks/your-path returns {received: true} instantly. The integration executes in the background and a row appears in integration_runs with status='success' or 'failed'.

4

Add webhook endpoint management and test runner

Users need to create webhook endpoints for their integrations and test them before going live. The test endpoint runs the transformation logic against sample data without calling external APIs.

prompt.txt
// Prompt to type into Replit Agent:
// Add these routes to server/routes/integrations.js:
//
// POST /api/integrations/:id/webhook — create webhook endpoint for an integration
//   Auto-generate a unique path: crypto.randomBytes(8).toString('hex')
//   Auto-generate a webhook secret: crypto.randomBytes(16).toString('hex')
//   INSERT INTO webhook_endpoints (integration_id, path, secret)
//   Return: {path, secret, fullUrl: `${process.env.APP_URL}/api/webhooks/${path}`}
//
// POST /api/integrations/:id/test — dry-run with sample data
//   Body: {sampleData: {...}}
//   Load the integration's transform_config
//   Apply applyTransform(sampleData, transformConfig)
//   Return: {input: sampleData, output: transformedData, mappingsApplied: count}
//   Does NOT call any external APIs — just shows the transformation result
//
// GET /api/integrations/:id/runs — execution history
//   Return last 50 runs ordered by started_at DESC
//   Include: status, records_processed, error_message, started_at, completed_at,
//   duration_ms (completed_at - started_at in milliseconds)
//
// PATCH /api/integrations/:id/toggle — enable/disable integration
//   Toggle is_active boolean
//   Return {isActive: newValue}
//
// GET /api/integrations/:id/runs/:runId — single run detail
//   Return full run details including any error_message

Expected result: POST /api/integrations/1/webhook returns {path: 'a3b4c5d6...', secret: '...', fullUrl: 'https://your-app.repl.co/api/webhooks/a3b4c5d6...'}. Register this URL in your source service's webhook settings.

5

Build the React frontend and deploy on Reserved VM

The frontend is a connection manager and integration builder. Deploy on Reserved VM — not Autoscale — because the webhook receiver must be always-on to avoid missing incoming events.

prompt.txt
// Prompt to type into Replit Agent:
// Build the React frontend at client/src/pages/:
//
// 1. ConnectionsPage:
//    - List of existing connections showing service name, status badge, created date
//    - 'Add Connection' button opens a modal:
//      Service name input, JSON credentials editor (textarea with format hint),
//      Submit → POST /api/connections
//    - Delete connection button (if no active integrations use it)
//
// 2. IntegrationsPage:
//    - List of integrations: name, source service → destination service, active/inactive badge
//    - 'New Integration' button opens a wizard:
//      Step 1: Select source connection, destination connection
//      Step 2: Build field mapping rules (add row: source field → destination field, optional transform)
//      Step 3: Name the integration, create webhook endpoint
//      Show the webhook URL to copy into the source service
//    - Toggle active/inactive switch per integration
//    - Test button → calls /api/integrations/:id/test with sample JSON
//
// 3. RunsPage for each integration:
//    - Data table: started_at, status (success/failed with colors), records_processed,
//      duration, error_message (truncated with expand button)
//    - Refresh every 30 seconds
//
// Then deploy:
// 1. Add ENCRYPTION_KEY, SESSION_SECRET, APP_URL to Replit Secrets
// 2. Ensure server binds to 0.0.0.0
// 3. Deploy → Reserved VM (NOT Autoscale — webhook receiver must be always-on)

Pro tip: After deploying to Reserved VM, copy your deployment URL and set it as APP_URL in Replit Secrets. The webhook endpoint creation route uses this to return the full webhook URL that users register in their source services.

Expected result: The integration hub is live on Reserved VM. Create a connection for a Slack incoming webhook URL, create an integration that forwards GitHub webhook events to Slack with field mapping, and register the Replit webhook URL in GitHub's webhook settings.

Complete code

server/lib/crypto.js
const crypto = require('crypto');

const ALGORITHM = 'aes-256-gcm';
const KEY_HEX = process.env.ENCRYPTION_KEY;

if (!KEY_HEX || KEY_HEX.length !== 64) {
  throw new Error('ENCRYPTION_KEY must be a 64-character hex string (32 bytes). Generate with: node -e "console.log(require(\'crypto\').randomBytes(32).toString(\'hex\'))"');
}

const KEY = Buffer.from(KEY_HEX, 'hex');

function encrypt(plaintext) {
  const iv = crypto.randomBytes(16);
  const cipher = crypto.createCipheriv(ALGORITHM, KEY, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  const authTag = cipher.getAuthTag();
  return JSON.stringify({
    iv: iv.toString('hex'),
    authTag: authTag.toString('hex'),
    ciphertext: ciphertext.toString('hex')
  });
}

function decrypt(encrypted) {
  const { iv, authTag, ciphertext } = JSON.parse(encrypted);
  const decipher = crypto.createDecipheriv(ALGORITHM, KEY, Buffer.from(iv, 'hex'));
  decipher.setAuthTag(Buffer.from(authTag, 'hex'));
  const decrypted = Buffer.concat([
    decipher.update(Buffer.from(ciphertext, 'hex')),
    decipher.final()
  ]);
  return decrypted.toString('utf8');
}

module.exports = { encrypt, decrypt };

Customization ideas

Retry queue for failed runs

Add a retry_count and max_retries column to integration_runs. When a run fails, if retry_count < max_retries, insert a pending_retries row. A Scheduled Deployment checks pending retries and re-executes the integration with the original trigger data stored in a trigger_data JSONB column.
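The retry decision can be sketched as a pure function. The exponential backoff schedule and the `runAt` field are assumptions for illustration; `retryCount`, `maxRetries`, and the `triggerData` JSONB column come from the design above.

```javascript
// Decide whether a failed run should be retried and when.
// Returns a row for the pending_retries table, or null if no retry is due.
function planRetry(run, now = new Date()) {
  if (run.status !== 'failed') return null;
  if (run.retryCount >= run.maxRetries) return null;
  // Exponential backoff: 1 min, 2 min, 4 min, ... capped at 1 hour (assumed schedule)
  const delayMs = Math.min(60000 * 2 ** run.retryCount, 3600000);
  return {
    integrationId: run.integrationId,
    triggerData: run.triggerData,       // original payload from the trigger_data column
    retryCount: run.retryCount + 1,
    runAt: new Date(now.getTime() + delayMs)
  };
}
```

A Scheduled Deployment would query rows whose `runAt` has passed and call `executeIntegration` with the stored trigger data.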

Scheduled integrations (polling source APIs)

Add a schedule_cron column to integrations for integrations that don't use webhooks. A Scheduled Deployment reads enabled scheduled integrations, fetches from the source API using stored credentials, and sends to the destination. This enables polling-based automation for APIs that don't support webhooks.
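The scheduler's core is a due-check run on each tick. For simplicity this sketch assumes an `intervalMinutes` column rather than the full cron expression mentioned above; a library such as node-cron would handle real `schedule_cron` values.

```javascript
// Is this polling integration due for another run?
function isDue(integration, now = Date.now()) {
  if (!integration.isActive) return false;
  if (!integration.lastRunAt) return true;  // never run yet — run immediately
  const elapsedMs = now - new Date(integration.lastRunAt).getTime();
  return elapsedMs >= integration.intervalMinutes * 60000;
}
```

The Scheduled Deployment would load active integrations, filter with `isDue`, and execute each one the same way the webhook path does.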

Integration templates

Add a templates table with pre-built transform configs for common patterns (GitHub issue to Slack, Stripe payment to Notion, etc.). Users pick a template and only need to fill in their connection credentials. Reduces setup time from 20 minutes to 2.
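A template is essentially a canned `transform_config` plus a name; the shape below is illustrative, not a fixed schema. Instantiating one only requires the user's connection IDs:

```javascript
// Hypothetical template row: a pre-built transform config the user clones.
const githubToSlackTemplate = {
  name: 'GitHub issue → Slack',
  transformConfig: {
    fieldMappings: [
      { sourceField: 'issue.title', destinationField: 'text' },
      { sourceField: 'issue.user.login', destinationField: 'username', transform: 'lowercase' }
    ]
  }
};

// Build the integrations insert payload from a template + the user's connections.
function instantiateTemplate(template, sourceConnectionId, destinationConnectionId) {
  return {
    name: template.name,
    sourceConnectionId,
    destinationConnectionId,
    transformConfig: template.transformConfig
  };
}
```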

Common pitfalls

Pitfall: Storing API credentials in plaintext in the database

How to avoid: Always encrypt credentials with AES-256-GCM before storage as shown in server/lib/crypto.js. The ENCRYPTION_KEY stays in Replit Secrets — never in the database.

Pitfall: Processing webhook payloads synchronously before returning 200

How to avoid: Return {received: true} immediately, then execute the integration asynchronously as shown in the webhook receiver route. Log the run ID for tracking.

Pitfall: Deploying the webhook receiver on Autoscale

How to avoid: Deploy on Reserved VM ($10-20/month on Replit). The server is always running. For Autoscale fans: add a keep-alive ping endpoint and call it from an external cron service every 5 minutes.

Pitfall: Returning credentials in GET /api/connections responses

How to avoid: The GET /api/connections SELECT statement explicitly lists only non-credential columns: id, serviceName, status, createdAt. Never SELECT * on the connections table in API responses.

Best practices

  • Store all external API credentials encrypted with AES-256-GCM. Decrypt only at execution time in the execution engine, never in list/detail API responses.
  • Return 200 immediately from webhook receivers and process asynchronously. Log the run for tracking. External services retry on slow responses, causing duplicate executions.
  • Deploy on Reserved VM — webhook receivers must be always-on. Autoscale's cold starts cause missed events from external services.
  • Generate a unique HMAC secret per webhook endpoint and verify signatures on every incoming request. This prevents spoofed webhook calls from unauthorized sources.
  • Use Drizzle Studio (database icon in sidebar) to inspect integration_runs during development. You can see error messages and execution durations without building a full UI.
  • Document Replit's dynamic outbound IPs: tell users to authenticate external APIs with API keys rather than IP allowlisting, since Replit's outbound IP addresses change.
  • Test field transformation logic using the /api/integrations/:id/test endpoint before activating webhooks. This prevents sending malformed data to destination APIs.

AI prompts to try

Copy these prompts to build this project faster.

ChatGPT Prompt

I'm building an integration hub with Node.js and PostgreSQL. I store external API credentials encrypted with AES-256-GCM in a connections table. I need a field transformation engine that takes an incoming webhook payload (arbitrary JSON) and a transform_config (an array of fieldMappings: [{sourceField: 'data.user.email', destinationField: 'email', transform: 'lowercase'}]) and produces a transformed output object. The sourceField uses dot notation for nested access. Help me write the applyTransform(data, config) function and a unit test verifying that {data: {user: {email: 'JOHN@EXAMPLE.COM'}}} with the above mapping produces {email: 'john@example.com'}.

Build Prompt

Add a visual field mapper UI to the integration hub. Build a React component FieldMapper that shows source fields (detected from a sample JSON payload the user pastes) as a list on the left, destination fields as inputs on the right, with drag-to-connect arrows or simple dropdown selectors. Each mapping row has: source field path, destination field name, optional transform (uppercase/lowercase/toString/toNumber). The resulting mappings array is stored as transform_config.fieldMappings in the integrations table.

Frequently asked questions

How do I connect to a service that uses OAuth instead of API keys?

OAuth tokens are just strings once obtained. Use the OAuth flow to get an access_token and optionally refresh_token, then store both encrypted in the credentials JSONB. The execution engine reads and uses them like any API key. Add a token refresh helper that calls the OAuth token endpoint when the access_token expires.
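The expiry check before each execution might look like this. The `expiresAt` (epoch ms), `accessToken`, and `refreshToken` field names are assumptions about how the credentials JSONB is laid out, not part of the schema above:

```javascript
// Should we call the OAuth token endpoint before using this connection?
// Refresh slightly early (skewMs) so the token never expires mid-request.
function needsRefresh(credentials, now = Date.now(), skewMs = 60000) {
  if (!credentials.refreshToken || !credentials.expiresAt) return false;
  return now >= credentials.expiresAt - skewMs;
}
```

When `needsRefresh` returns true, the execution engine would POST the `refreshToken` to the provider's token endpoint, then re-encrypt and store the updated credentials.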

Can multiple users share connections?

In the current design, each connection belongs to one user (user_id column). For shared team connections, add a team_id column to connections and let multiple users in the same team query connections WHERE team_id = req.user.teamId. Add a teams table and team_members table to manage membership.

Why do I need Reserved VM instead of Autoscale?

Webhook receivers must be always-on. When a GitHub push event fires a webhook to your URL, the request arrives at the exact moment of the event. If Autoscale is spinning up from zero (3-10 second cold start), the webhook delivery times out and the source service retries — potentially causing duplicate runs.

How do I handle high-volume webhooks (hundreds per second)?

The current single-threaded Express handler processes one webhook at a time. For high-volume scenarios, add a queue table (pending_executions: integration_id, trigger_data JSONB, created_at). The webhook receiver inserts into the queue and returns. A separate worker process (pm2 cluster mode on Reserved VM) drains the queue with concurrency.
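The drain pattern can be sketched in-memory; in production the queue is the `pending_executions` table and the handler is `executeIntegration`:

```javascript
// Drain a job queue with bounded concurrency. `queue` is an array of
// {integrationId, triggerData} jobs; `handler` does the real work.
async function drainQueue(queue, handler, concurrency = 5) {
  let processed = 0;
  const workers = Array.from({ length: concurrency }, async () => {
    while (queue.length > 0) {
      const job = queue.shift();  // single-threaded JS: shift is race-free here
      await handler(job.integrationId, job.triggerData);
      processed++;
    }
  });
  await Promise.all(workers);
  return processed;
}
```

A table-backed version would instead `SELECT ... FOR UPDATE SKIP LOCKED` (or delete-and-return) so multiple worker processes never grab the same row.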

What happens to the encryption key if I lose it?

All encrypted credentials become permanently unrecoverable. The ENCRYPTION_KEY is the master key — treat it like a password manager's master password. Back it up securely. Never rotate it without re-encrypting all credentials first using the old key to decrypt and the new key to encrypt.

Do I need Replit Core for this project?

Yes, for Reserved VM deployment which is required for reliable webhook reception. Replit Core is $25/month. The free plan supports Autoscale, but Autoscale cold starts cause missed webhook events from external services.

Can RapidDev help build a custom integration platform?

Yes. RapidDev has built 600+ apps and can add features like OAuth connection flows, retry queues, multi-step integrations with conditional logic, and a visual workflow builder. Book a free consultation at rapidevelopers.com.

How do I test a new integration before enabling it for real traffic?

Use the POST /api/integrations/:id/test endpoint with a sampleData payload. It runs the field transformation and returns the transformed output without calling any external APIs. Once you're happy with the output structure, enable the integration and register the webhook URL in your source service.
