To migrate n8n to a new server, export all workflows and credentials from the old instance, copy the .n8n directory or database, set the same N8N_ENCRYPTION_KEY on the new server, and import everything. The encryption key is critical — without it, all saved credentials become unreadable on the new instance.
Moving Your n8n Instance to a New Server Without Data Loss
Migrating n8n involves three things: your workflows, your credentials, and your configuration. Workflows can be exported as JSON files through the UI or CLI. Credentials are encrypted in the database using N8N_ENCRYPTION_KEY, so you must carry that key to the new server. This tutorial walks through the complete migration process, from backing up the old server to verifying everything works on the new one.
Prerequisites
- SSH or terminal access to both the old and new servers
- The N8N_ENCRYPTION_KEY from the old server
- n8n installed on the new server (same version or newer)
- Access to the n8n database (SQLite file or PostgreSQL connection)
Step-by-step guide
Find and save your N8N_ENCRYPTION_KEY from the old server
The encryption key is the most critical piece of the migration. Without it, all saved credentials (API keys, OAuth tokens, database passwords) are permanently lost. Find the key in your environment variables, Docker Compose file, systemd service file, or .env file. If you never set one explicitly, n8n auto-generated one and stored it in the .n8n directory. Check the file at ~/.n8n/config by looking for the encryptionKey field. Save this key somewhere secure — you will need it on the new server.
```shell
# Check environment variable
echo $N8N_ENCRYPTION_KEY

# Check Docker Compose
grep N8N_ENCRYPTION_KEY docker-compose.yml

# Check auto-generated key (if no env var was set)
cat ~/.n8n/config | grep encryptionKey
```

Expected result: You have the N8N_ENCRYPTION_KEY string saved securely for use on the new server.
Export all workflows from the old n8n instance
You can export workflows through the n8n UI or CLI. In the UI, go to each workflow, click the three-dot menu, and select Export. To export all workflows at once, use the n8n CLI. Run the export command to save all workflows as JSON files to a directory. These JSON files contain the complete workflow structure, node configurations, and connections. They do not contain credentials — those are stored separately in the database.
```shell
# Export all workflows to a directory
# (--backup implies --all --pretty --separate, writing one JSON file per workflow)
n8n export:workflow --backup --output=./n8n-backup/workflows/

# Export all credentials (encrypted with N8N_ENCRYPTION_KEY)
n8n export:credentials --backup --output=./n8n-backup/credentials/

# If using Docker
docker exec -it n8n n8n export:workflow --backup --output=/tmp/workflows/
docker cp n8n:/tmp/workflows/ ./n8n-backup/workflows/
```

Expected result: All workflows and credentials are exported as JSON files in the backup directory.
Copy the database to the new server
If you are using SQLite (the default), the database is a single file at ~/.n8n/database.sqlite. Copy this file to the new server's .n8n directory. If you are using PostgreSQL, create a database dump using pg_dump and restore it on the new server's PostgreSQL instance. The database contains workflows, credentials, execution history, and settings. Copying it directly is the fastest way to migrate everything at once, but it requires the same N8N_ENCRYPTION_KEY on the new server.
```shell
# SQLite migration
scp ~/.n8n/database.sqlite user@new-server:~/.n8n/database.sqlite

# PostgreSQL migration
pg_dump -h localhost -U n8n -d n8n > n8n_backup.sql
scp n8n_backup.sql user@new-server:~/

# On the new server
psql -h localhost -U n8n -d n8n < ~/n8n_backup.sql
```

Expected result: The complete n8n database is available on the new server.
Set up the new server with the same encryption key and configuration
On the new server, install n8n and configure it with the same N8N_ENCRYPTION_KEY from the old server. Set the key as an environment variable in your Docker Compose file, systemd service, or .env file. Also update the WEBHOOK_URL to the new server's domain. If you changed database types (e.g., SQLite to PostgreSQL), set the DB_TYPE and related connection variables. Start n8n and verify that all workflows appear in the editor and all credentials are accessible.
```yaml
# Docker Compose on new server
environment:
  - N8N_ENCRYPTION_KEY=your-key-from-old-server
  - WEBHOOK_URL=https://n8n.new-domain.com/
  - DB_TYPE=postgresdb
  - DB_POSTGRESDB_HOST=postgres
  - DB_POSTGRESDB_PORT=5432
  - DB_POSTGRESDB_DATABASE=n8n
  - DB_POSTGRESDB_USER=n8n
  - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
  - GENERIC_TIMEZONE=America/New_York
```

Expected result: n8n starts on the new server with all workflows visible and credentials decrypted correctly.
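If you exported workflows and credentials as JSON rather than copying the database, import them on the new server with the CLI. A minimal sketch, assuming the backup archive was extracted to a hypothetical ~/n8n-migration-backup directory and N8N_ENCRYPTION_KEY is already set in the environment:

```shell
# Hypothetical backup location -- adjust to wherever you extracted the archive.
BACKUP_DIR="$HOME/n8n-migration-backup"

# Guard so the sketch degrades gracefully where the n8n CLI is absent.
if command -v n8n >/dev/null 2>&1; then
  # --separate reads a directory of individual JSON files,
  # matching an export made with --backup or --separate.
  n8n import:workflow --separate --input="$BACKUP_DIR/workflows/"
  # Credentials only decrypt if N8N_ENCRYPTION_KEY matches the old server.
  n8n import:credentials --separate --input="$BACKUP_DIR/credentials/"
else
  echo "n8n CLI not found; run this on the new server"
fi
```

Run the import before activating any workflows, so you can test credentials while triggers are still off.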
Verify the migration and activate workflows
After starting n8n on the new server, verify the migration by checking three things. First, open several workflows in the editor and confirm all nodes and connections are intact. Second, go to Settings > Credentials and test each credential by clicking the Test button. Third, manually execute a workflow that uses credentials (like an API call) to confirm it works end to end. Once verified, activate the workflows you need running. Deactivate or delete the workflows on the old server to prevent duplicate triggers.
Expected result: All workflows run successfully on the new server. The old server's workflows are deactivated.
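Part of this verification can be scripted. A sketch, assuming the hypothetical domain n8n.new-domain.com; /healthz is n8n's built-in health endpoint, so an HTTP 200 there confirms the process is up before you start clicking through workflows:

```shell
# Hypothetical new-server URL -- replace with your own domain.
N8N_URL="https://n8n.new-domain.com"

# Probe the health endpoint; -s silences progress, -f fails on HTTP errors.
if curl -sf "$N8N_URL/healthz" >/dev/null 2>&1; then
  STATUS="healthy"
else
  STATUS="unreachable -- check DNS, TLS, and that n8n is running"
fi
echo "n8n at $N8N_URL is $STATUS"
```

A healthy endpoint does not prove credentials decrypt correctly, so still run the manual credential tests described above.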
Complete working example
```bash
#!/bin/bash
# n8n Migration Script
# Run on the OLD server to create a complete backup

set -e

BACKUP_DIR="./n8n-migration-$(date +%Y%m%d)"
NEW_SERVER="user@new-server.example.com"

echo "Creating backup directory..."
mkdir -p "$BACKUP_DIR/workflows"
mkdir -p "$BACKUP_DIR/credentials"

# Stop n8n to prevent writes during backup
echo "Stopping n8n..."
systemctl stop n8n 2>/dev/null || docker stop n8n 2>/dev/null || pm2 stop n8n 2>/dev/null || true

# Export workflows (--backup writes one JSON file per workflow)
echo "Exporting workflows..."
n8n export:workflow --backup --output="$BACKUP_DIR/workflows/"

# Export credentials
echo "Exporting credentials..."
n8n export:credentials --backup --output="$BACKUP_DIR/credentials/"

# Copy database
echo "Copying database..."
if [ -f ~/.n8n/database.sqlite ]; then
  cp ~/.n8n/database.sqlite "$BACKUP_DIR/database.sqlite"
  echo "SQLite database copied."
else
  echo "No SQLite database found. Using PostgreSQL dump."
  pg_dump -h localhost -U n8n -d n8n > "$BACKUP_DIR/n8n_dump.sql"
fi

# Copy config (contains encryption key if auto-generated)
cp ~/.n8n/config "$BACKUP_DIR/config" 2>/dev/null || true

# Create archive
echo "Creating archive..."
tar -czf "$BACKUP_DIR.tar.gz" "$BACKUP_DIR"

# Transfer to new server
echo "Transferring to new server..."
scp "$BACKUP_DIR.tar.gz" "$NEW_SERVER:~/"

echo "Migration backup complete."
echo "On the new server, run:"
echo "  tar -xzf $BACKUP_DIR.tar.gz"
echo "  Set N8N_ENCRYPTION_KEY in your environment"
echo "  Import workflows: n8n import:workflow --separate --input=workflows/"
echo "  Import credentials: n8n import:credentials --separate --input=credentials/"
```

Common mistakes when migrating n8n to a new server
Mistake: Forgetting to transfer the N8N_ENCRYPTION_KEY, making all credentials unreadable.
How to avoid: Find the key in environment variables, Docker Compose, or ~/.n8n/config before starting the migration.
Mistake: Running workflows on both old and new servers simultaneously.
How to avoid: Deactivate all workflows on the old server before activating them on the new one.
Mistake: Not updating the WEBHOOK_URL, causing webhook triggers to fail.
How to avoid: Set WEBHOOK_URL to the new server's domain in your environment variables before starting n8n.
Mistake: Copying the SQLite database while n8n is still writing to it.
How to avoid: Stop n8n before copying database.sqlite to prevent file corruption.
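The first and last of these mistakes can be caught with a quick preflight check on the old server before anything is copied. A sketch using n8n's default paths (Docker setups will need the container equivalents):

```shell
# Warn if no encryption key is visible in the obvious places.
if [ -z "$N8N_ENCRYPTION_KEY" ] && ! grep -q encryptionKey "$HOME/.n8n/config" 2>/dev/null; then
  echo "WARNING: no N8N_ENCRYPTION_KEY found -- locate it before migrating" >&2
fi

# Warn if n8n still appears to be running: copying database.sqlite
# from under a live process risks a corrupt copy.
if pgrep -x n8n >/dev/null 2>&1; then
  echo "WARNING: n8n is still running -- stop it before copying database.sqlite" >&2
else
  echo "OK: n8n is not running; safe to copy the SQLite database"
fi
```

The check is advisory only; it prints warnings rather than aborting, so it is safe to run repeatedly.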
Best practices
- Always save the N8N_ENCRYPTION_KEY before starting the migration — losing it means losing all credentials
- Stop n8n on the old server before copying the SQLite database to prevent corruption
- Export workflows as JSON files even when copying the database directly, as a secondary backup
- Test every credential on the new server before activating workflows
- Update WEBHOOK_URL to the new domain before activating webhook-triggered workflows
- Deactivate workflows on the old server immediately after verifying the new server works
- Update OAuth callback URLs in external services to point to the new server's domain
- Keep the old server running but inactive for a week as a rollback option
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I need to migrate my self-hosted n8n instance from one server to another. I have 50 workflows and 15 credentials. Walk me through the complete process including database migration, encryption key transfer, and DNS changes.
Help me write a bash script that exports all workflows and credentials from my current n8n Docker container, transfers them to a new server, and imports them with the same N8N_ENCRYPTION_KEY.
Frequently asked questions
What happens if I lose my N8N_ENCRYPTION_KEY?
All credentials stored in the database become permanently unreadable. You will need to re-enter every API key, OAuth token, and database password manually on the new server. Workflows themselves are not affected — only credentials are encrypted.
Can I migrate from SQLite to PostgreSQL during the migration?
Yes, but it requires extra steps. Export workflows and credentials as JSON using the CLI, set up PostgreSQL on the new server, configure DB_TYPE and the connection variables, start n8n once so it creates the schema, then import the JSON files. You cannot directly convert the SQLite file to PostgreSQL.
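Condensed into command form, that path looks roughly like this. A sketch, assuming the n8n CLI is available on both servers; the PostgreSQL connection values are illustrative:

```shell
# --- On the OLD server (SQLite) ---
if command -v n8n >/dev/null 2>&1; then
  # --backup implies --all --pretty --separate (one JSON file per item)
  n8n export:workflow --backup --output=./migration/workflows/
  n8n export:credentials --backup --output=./migration/credentials/
fi

# --- On the NEW server (PostgreSQL) ---
# Illustrative connection settings; use your real host, database, and password.
export DB_TYPE=postgresdb
export DB_POSTGRESDB_HOST=localhost
export DB_POSTGRESDB_DATABASE=n8n
export DB_POSTGRESDB_USER=n8n

# Start n8n once so it creates the PostgreSQL schema, then import:
if command -v n8n >/dev/null 2>&1; then
  n8n import:workflow --separate --input=./migration/workflows/
  n8n import:credentials --separate --input=./migration/credentials/
fi
```

The same N8N_ENCRYPTION_KEY must be set before the import, or the imported credentials will not decrypt.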
Do I need to reinstall community nodes on the new server?
Yes. Community nodes are installed in the .n8n/nodes directory. Copy this directory to the new server, or reinstall them using the n8n editor's community nodes feature under Settings > Community Nodes.
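Copying the directory looks like this; user@new-server is a placeholder host. Reinstalling through the UI is the safer option when the Node.js major version differs between servers, since compiled dependencies may need a rebuild:

```shell
# n8n's default community-nodes location; placeholder remote host.
NODES_DIR="$HOME/.n8n/nodes"
REMOTE="user@new-server"

if [ -d "$NODES_DIR" ]; then
  # -r copies the directory tree, including each node's node_modules
  scp -r "$NODES_DIR" "$REMOTE:~/.n8n/" || echo "transfer failed; reinstall via Settings > Community Nodes"
else
  echo "no community nodes directory found at $NODES_DIR"
fi
```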
Will my execution history be migrated?
If you copy the database directly (SQLite file or PostgreSQL dump), execution history is included. If you use the JSON export/import method, only workflows and credentials are transferred — execution history is not.
Can I migrate an n8n Cloud instance to self-hosted?
You can export workflows from n8n Cloud as JSON files and import them into a self-hosted instance. However, credentials must be re-entered manually because n8n Cloud uses its own encryption key that is not accessible to users.
Can RapidDev handle the migration of a production n8n instance?
Yes, RapidDev's engineering team can manage the complete migration process, including zero-downtime cutover, database migration from SQLite to PostgreSQL, DNS changes, and post-migration verification of all workflows and credentials.