#!/bin/bash
# ==============================================================================
# N8N SQLITE TO POSTGRESQL MIGRATION SCRIPT
# ==============================================================================
# Version: 1.0.0
# Author: Community Contribution
# License: MIT
# Repository: https://github.com/your-repo/n8n-sqlite-to-postgres
#
# DESCRIPTION:
# This script automates the migration of n8n data from SQLite to PostgreSQL.
# It exports all necessary tables from SQLite, disables foreign key constraints
# in PostgreSQL, imports the data, and re-enables constraints.
#
# REQUIREMENTS:
# - sqlite3 CLI tool (auto-installed via apt-get, yum, or brew if missing)
# - Access to the PostgreSQL database (via Docker or a direct connection)
# - The n8n SQLite database file (usually named database.sqlite)
# - PostgreSQL database with the n8n schema already created (run n8n once with PG)
#
# IMPORTANT: Before running this script:
# 1. Stop n8n if it's running
# 2. Backup your SQLite database
# 3. Configure n8n to use PostgreSQL and start it ONCE to create the schema
# 4. Stop n8n again
# 5. Run this migration script
# 6. Start n8n with PostgreSQL configuration
#
# ==============================================================================
# USAGE
# ==============================================================================
#
# For Docker PostgreSQL:
# ./migrate.sh <sqlite_file> <pg_user> <pg_password> [database_name]
#
# Examples:
# ./migrate.sh database.sqlite postgres mypassword n8n
# ./migrate.sh /path/to/database.sqlite postgres mypassword
#
# ==============================================================================
# CONFIGURATION OPTIONS (Environment Variables)
# ==============================================================================
#
# CONTAINER_POSTGRES: Docker container name running PostgreSQL
#   Default: "postgres"
#   Example: export CONTAINER_POSTGRES="my-postgres-container"
#
# USE_DOCKER: Set to "false" to connect directly to PostgreSQL without Docker
#   Default: "true"
#   Example: export USE_DOCKER="false"
#
# PG_HOST: PostgreSQL host (only used when USE_DOCKER="false")
#   Default: "localhost"
#   Example: export PG_HOST="192.168.1.100"
#
# PG_PORT: PostgreSQL port (only used when USE_DOCKER="false")
#   Default: "5432"
#   Example: export PG_PORT="5433"
#
# ==============================================================================
# ADAPTING THIS SCRIPT FOR DIFFERENT ENVIRONMENTS
# ==============================================================================
#
# FOR AI ASSISTANTS AND DEVELOPERS:
#
# This script was designed to be easily adaptable. Here are the key areas
# that may need modification for different setups:
#
# 1. DOCKER VS DIRECT CONNECTION:
# - The script uses Docker by default (USE_DOCKER="true")
# - Set USE_DOCKER="false" to use direct psql connection
# - When USE_DOCKER="false", configure PG_HOST and PG_PORT
#
# 2. TABLES TO MIGRATE:
# - The TABLES array (see the TABLES TO MIGRATE section below) contains tables in dependency order
# - Add/remove tables based on your n8n version
# - Order matters! Parent tables must come before child tables
#
# 3. FOREIGN KEY HANDLING:
# - Uses "ALTER TABLE ... DISABLE TRIGGER ALL" to bypass FK checks
# - This is PostgreSQL-specific and persists across sessions until
#   explicitly re-enabled (a session-scoped alternative is sketched
#   after this list)
# - For other databases, different approaches may be needed
#
# 4. CSV IMPORT:
# - Uses PostgreSQL's \copy command
# - Column names are automatically quoted for case-sensitivity
# - Handles empty tables gracefully
#
# 5. COMMON ISSUES AND SOLUTIONS:
# - "relation does not exist": Run n8n with PG first to create schema
# - "permission denied": Check PostgreSQL user permissions
# - "connection refused": Verify host, port, and firewall settings
# - "foreign key violation": Script handles this, but check table order
#
# 6. DIFFERENT N8N VERSIONS:
# - Table structure may vary between n8n versions
# - Check your SQLite tables with: sqlite3 database.sqlite ".tables"
# - Compare with PostgreSQL: psql -c "\dt"
# - Add missing tables to the TABLES array
#
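# As a companion to item 3 above, here is a minimal sketch of the
# session-scoped alternative (NOT used by this script). It assumes the
# connecting role may change session_replication_role, which generally
# requires superuser privileges:
#
#   PGPASSWORD="$PG_PASS" psql -h "$PG_HOST" -U "$PG_USER" -d "$DB_NAME" <<'SQL'
#   SET session_replication_role = replica;   -- FK triggers skipped in this session only
#   -- run the \copy imports here, within the same session
#   SET session_replication_role = DEFAULT;   -- restore normal enforcement
#   SQL
#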
# ==============================================================================
# TABLES MIGRATED (in dependency order)
# ==============================================================================
#
# Core tables:
# - user : User accounts
# - project : Projects/workspaces
# - folder : Folder structure for organizing workflows
# - credentials_entity : Encrypted credentials
# - workflow_entity : Workflow definitions (nodes, connections, etc.)
# - tag_entity : Tags for organizing workflows
#
# Relationship tables:
# - project_relation : User-project relationships
# - shared_credentials : Credential sharing permissions
# - shared_workflow : Workflow sharing permissions
# - webhook_entity : Webhook configurations
# - workflows_tags : Workflow-tag associations
#
# Optional tables:
# - variables : Environment variables
# - execution_entity : Execution history (can be large!)
#
# ==============================================================================
# WHAT THIS SCRIPT DOES NOT MIGRATE
# ==============================================================================
#
# - execution_data: Large binary data, usually not needed
# - workflow_history: Version history, can be recreated
# - insights_*: Analytics data, will be regenerated
# - migrations: Database migration tracking, auto-managed by n8n
#
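# If you are unsure whether the skipped tables matter for your instance,
# a quick row-count check against the SQLite file can inform the decision
# (table names vary by n8n version; adjust as needed):
#
#   sqlite3 database.sqlite "SELECT count(*) FROM execution_data;"
#   sqlite3 database.sqlite "SELECT count(*) FROM workflow_history;"
#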
# ==============================================================================
# Color codes for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color
# ==============================================================================
# CONFIGURATION - Modify these variables for your environment
# ==============================================================================
# Docker container name (only used if USE_DOCKER="true")
CONTAINER_POSTGRES="${CONTAINER_POSTGRES:-postgres}"
# Set to "false" to use direct PostgreSQL connection instead of Docker
USE_DOCKER="${USE_DOCKER:-true}"
# PostgreSQL connection settings (only used if USE_DOCKER="false")
PG_HOST="${PG_HOST:-localhost}"
PG_PORT="${PG_PORT:-5432}"
# Temporary directory for CSV files
TEMP_DIR="n8n_migration_temp"
# ==============================================================================
# FUNCTIONS
# ==============================================================================
print_header() {
    echo -e "\n${BLUE}############################################################${NC}"
    echo -e "${BLUE}# $1${NC}"
    echo -e "${BLUE}############################################################${NC}"
}
print_status() { echo -e "${GREEN}$1${NC}"; }
print_warning() { echo -e "${YELLOW}$1${NC}"; }
print_error() { echo -e "${RED}✖ ERROR: $1${NC}"; exit 1; }
# Execute PostgreSQL command - adapts to Docker or direct connection
run_psql() {
    if [ "$USE_DOCKER" = "true" ]; then
        docker exec -e PGPASSWORD="$PG_PASS" "$CONTAINER_POSTGRES" psql -U "$PG_USER" -d "$DB_NAME" -c "$1" 2>&1
    else
        PGPASSWORD="$PG_PASS" psql -h "$PG_HOST" -p "$PG_PORT" -U "$PG_USER" -d "$DB_NAME" -c "$1" 2>&1
    fi
}
# Same as run_psql, but discards all output (for bulk statements run in loops)
run_psql_quiet() {
    if [ "$USE_DOCKER" = "true" ]; then
        docker exec -e PGPASSWORD="$PG_PASS" "$CONTAINER_POSTGRES" psql -U "$PG_USER" -d "$DB_NAME" -c "$1" > /dev/null 2>&1
    else
        PGPASSWORD="$PG_PASS" psql -h "$PG_HOST" -p "$PG_PORT" -U "$PG_USER" -d "$DB_NAME" -c "$1" > /dev/null 2>&1
    fi
}
# Runs a query in tuples-only mode (-t) and returns the trimmed scalar result
run_psql_value() {
    if [ "$USE_DOCKER" = "true" ]; then
        docker exec -e PGPASSWORD="$PG_PASS" "$CONTAINER_POSTGRES" psql -U "$PG_USER" -d "$DB_NAME" -t -c "$1" 2>/dev/null | xargs
    else
        PGPASSWORD="$PG_PASS" psql -h "$PG_HOST" -p "$PG_PORT" -U "$PG_USER" -d "$DB_NAME" -t -c "$1" 2>/dev/null | xargs
    fi
}
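# Example usage of the helpers above (illustrative):
#   run_psql "SELECT version();"                              # full psql output
#   ROWS=$(run_psql_value "SELECT count(*) FROM \"user\";")   # trimmed scalar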
# Makes the exported CSV files reachable by psql's \copy
copy_files_to_postgres() {
    if [ "$USE_DOCKER" = "true" ]; then
        docker cp "$TEMP_DIR/." "$CONTAINER_POSTGRES:/tmp/"
    else
        # \copy reads files on the client side, so the local CSVs are used as-is
        print_warning "Direct connection mode: CSV files stay in ./$TEMP_DIR/"
        print_warning "psql's \\copy will read them locally, even for a remote server"
    fi
}
# Returns the path \copy should read for a given table's CSV
get_csv_path() {
    local table=$1
    if [ "$USE_DOCKER" = "true" ]; then
        echo "/tmp/$table.csv"
    else
        # For direct connections, use an absolute local path
        echo "$(pwd)/$TEMP_DIR/$table.csv"
    fi
}
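# Note: psql's \copy (used in STEP 3 below) streams the file from wherever
# psql itself runs, unlike server-side COPY. In Docker mode psql runs inside
# the container, hence the docker cp into /tmp; in direct mode a local
# absolute path works even against a remote PostgreSQL server.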
# ==============================================================================
# ARGUMENT VALIDATION
# ==============================================================================
if [ "$#" -lt 3 ]; then
    echo "N8N SQLite to PostgreSQL Migration Script"
    echo ""
    echo "Usage: $0 <sqlite_file> <pg_user> <pg_password> [database_name]"
    echo ""
    echo "Arguments:"
    echo "  sqlite_file      Path to the n8n SQLite database file"
    echo "  pg_user          PostgreSQL username"
    echo "  pg_password      PostgreSQL password"
    echo "  database_name    PostgreSQL database name (default: n8n)"
    echo ""
    echo "Environment Variables:"
    echo "  USE_DOCKER=false          Use direct psql instead of Docker"
    echo "  PG_HOST=localhost         PostgreSQL host (when USE_DOCKER=false)"
    echo "  PG_PORT=5432              PostgreSQL port (when USE_DOCKER=false)"
    echo "  CONTAINER_POSTGRES=name   Docker container name (when USE_DOCKER=true)"
    echo ""
    echo "Examples:"
    echo "  # Using Docker (default)"
    echo "  $0 database.sqlite postgres mypassword n8n"
    echo ""
    echo "  # Direct PostgreSQL connection"
    echo "  USE_DOCKER=false PG_HOST=localhost $0 database.sqlite postgres mypassword n8n"
    exit 1
fi
SQLITE_FILE=$1
PG_USER=$2
PG_PASS=$3
DB_NAME=${4:-n8n}
# Validate SQLite file exists
if [ ! -f "$SQLITE_FILE" ]; then
    print_error "SQLite file not found: $SQLITE_FILE"
fi
# Check for sqlite3
if ! command -v sqlite3 &> /dev/null; then
    echo "Installing sqlite3..."
    if command -v apt-get &> /dev/null; then
        sudo apt-get update -qq && sudo apt-get install -y sqlite3 > /dev/null
    elif command -v yum &> /dev/null; then
        sudo yum install -y sqlite > /dev/null
    elif command -v brew &> /dev/null; then
        brew install sqlite > /dev/null
    else
        print_error "sqlite3 is required. Please install it manually."
    fi
fi
# Validate Docker connection if using Docker
if [ "$USE_DOCKER" = "true" ]; then
    if ! docker exec "$CONTAINER_POSTGRES" echo "ok" > /dev/null 2>&1; then
        print_error "Cannot connect to Docker container: $CONTAINER_POSTGRES"
    fi
fi
# ==============================================================================
# TABLES TO MIGRATE (ORDER MATTERS - parent tables first!)
# ==============================================================================
# Add or remove tables based on your n8n version.
# Tables are imported in this order to respect foreign key dependencies.
# ==============================================================================
TABLES=(
    "user"
    "project"
    "folder"
    "credentials_entity"
    "workflow_entity"
    "tag_entity"
    "project_relation"
    "shared_credentials"
    "shared_workflow"
    "webhook_entity"
    "workflows_tags"
    "variables"
    "execution_entity"
)
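# Illustrative one-liner to spot tables present in SQLite but absent from the
# array above (review the output before editing TABLES; comm needs sorted input):
#
#   printf '%s\n' "${TABLES[@]}" | sort > /tmp/migrated.txt
#   sqlite3 "$SQLITE_FILE" "SELECT name FROM sqlite_master WHERE type='table';" | sort > /tmp/sqlite.txt
#   comm -23 /tmp/sqlite.txt /tmp/migrated.txt
#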
mkdir -p "$TEMP_DIR"
# ==============================================================================
# STEP 1: EXPORT FROM SQLITE
# ==============================================================================
print_header "STEP 1: EXPORTING FROM SQLITE"
echo -e "${BLUE}Available tables in SQLite:${NC}"
sqlite3 "$SQLITE_FILE" "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name;" | while read -r t; do
    COUNT=$(sqlite3 "$SQLITE_FILE" "SELECT count(*) FROM \"$t\";")
    echo "  - $t ($COUNT records)"
done
echo ""
EXPORTED_TABLES=()
for TABLE in "${TABLES[@]}"; do
    EXISTS=$(sqlite3 "$SQLITE_FILE" "SELECT name FROM sqlite_master WHERE type='table' AND name='$TABLE';")
    if [ -z "$EXISTS" ]; then
        echo -e "Table ${YELLOW}$TABLE${NC} does not exist in SQLite, skipping..."
        continue
    fi
    echo -n "Exporting $TABLE... "
    sqlite3 -header -csv "$SQLITE_FILE" "SELECT * FROM \"$TABLE\";" > "$TEMP_DIR/$TABLE.csv"
    ROWS=$(sqlite3 "$SQLITE_FILE" "SELECT count(*) FROM \"$TABLE\";")
    echo -e "${GREEN}OK ($ROWS records)${NC}"
    EXPORTED_TABLES+=("$TABLE")
done
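# Optional sanity check before importing (file name is illustrative):
#   head -n 3 "$TEMP_DIR/workflow_entity.csv"   # header row plus first records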
# ==============================================================================
# STEP 2: PREPARE POSTGRESQL
# ==============================================================================
print_header "STEP 2: PREPARING POSTGRESQL"
copy_files_to_postgres
# Get list of all tables in PostgreSQL
print_status "Getting PostgreSQL table list..."
PG_TABLES=$(run_psql_value "SELECT string_agg(tablename, ' ') FROM pg_tables WHERE schemaname = 'public';")
# ==============================================================================
# DISABLE ALL FOREIGN KEYS (CRITICAL!)
# ==============================================================================
# This uses ALTER TABLE DISABLE TRIGGER ALL which disables FK checks
# and persists across sessions (unlike SET session_replication_role)
# ==============================================================================
print_status "Disabling ALL foreign keys and triggers..."
for TABLE in $PG_TABLES; do
    run_psql_quiet "ALTER TABLE \"$TABLE\" DISABLE TRIGGER ALL;"
done
echo -e "${GREEN}✓ Foreign keys disabled on all tables${NC}"
# Truncate tables
print_status "Clearing existing data..."
TRUNCATE_LIST=""
for TABLE in "${EXPORTED_TABLES[@]}"; do
    EXISTS_PG=$(run_psql_value "SELECT EXISTS (SELECT FROM information_schema.tables WHERE table_schema = 'public' AND table_name = '$TABLE');")
    if [ "$EXISTS_PG" = "t" ]; then
        TRUNCATE_LIST+="\"$TABLE\", "
    fi
done
if [ -n "$TRUNCATE_LIST" ]; then
    TRUNCATE_LIST="${TRUNCATE_LIST%, }"
    run_psql "TRUNCATE TABLE $TRUNCATE_LIST CASCADE;" 2>&1 | grep -v "^NOTICE:"
fi
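# Note: CASCADE also empties tables that reference the truncated ones via
# foreign keys, which is intentional here since everything is re-imported.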
# ==============================================================================
# STEP 3: IMPORT DATA
# ==============================================================================
print_header "STEP 3: IMPORTING DATA"
IMPORT_ERRORS=()
IMPORT_SUCCESS=()
for TABLE in "${EXPORTED_TABLES[@]}"; do
    CSV_FILE="$TEMP_DIR/$TABLE.csv"
    if [ ! -f "$CSV_FILE" ]; then
        continue
    fi
    echo -n "Importing $TABLE... "
    RAW_HEADER=$(head -n 1 "$CSV_FILE")
    LINE_COUNT=$(wc -l < "$CSV_FILE" | xargs)
    if [ "$LINE_COUNT" -le 1 ]; then
        echo -e "${YELLOW}Skipped (no data)${NC}"
        continue
    fi
    if [ -z "$RAW_HEADER" ]; then
        echo -e "${YELLOW}Skipped (empty)${NC}"
        continue
    fi
    # Quote column names for PostgreSQL (preserves camelCase identifiers)
    QUOTED_COLUMNS=$(echo "$RAW_HEADER" | sed 's/,/","/g; s/^/"/; s/$/"/')
    EXISTS_PG=$(run_psql_value "SELECT EXISTS (SELECT FROM information_schema.tables WHERE table_schema = 'public' AND table_name = '$TABLE');")
    if [ "$EXISTS_PG" != "t" ]; then
        echo -e "${YELLOW}Table does not exist in PostgreSQL${NC}"
        continue
    fi
    CSV_PATH=$(get_csv_path "$TABLE")
    COMMAND="\copy \"$TABLE\" ($QUOTED_COLUMNS) FROM '$CSV_PATH' DELIMITER ',' CSV HEADER;"
    if [ "$USE_DOCKER" = "true" ]; then
        OUTPUT=$(docker exec -e PGPASSWORD="$PG_PASS" "$CONTAINER_POSTGRES" psql -U "$PG_USER" -d "$DB_NAME" -c "$COMMAND" 2>&1)
    else
        OUTPUT=$(PGPASSWORD="$PG_PASS" psql -h "$PG_HOST" -p "$PG_PORT" -U "$PG_USER" -d "$DB_NAME" -c "$COMMAND" 2>&1)
    fi
    if [[ $OUTPUT == *"COPY"* ]]; then
        # grep -P is GNU-only; fall back to a portable pipeline (e.g. on macOS)
        COPIED=$(echo "$OUTPUT" | grep -oP 'COPY \K[0-9]+' 2>/dev/null || echo "$OUTPUT" | grep -oE 'COPY [0-9]+' | grep -oE '[0-9]+')
        echo -e "${GREEN}Success ($COPIED records)${NC}"
        IMPORT_SUCCESS+=("$TABLE")
    else
        echo -e "${RED}Failed!${NC}"
        echo -e "${YELLOW}  Error: $(echo "$OUTPUT" | head -n 2)${NC}"
        IMPORT_ERRORS+=("$TABLE: $OUTPUT")
    fi
done
# ==============================================================================
# STEP 4: RE-ENABLE CONSTRAINTS
# ==============================================================================
print_header "STEP 4: RE-ENABLING CONSTRAINTS"
print_status "Re-enabling foreign keys and triggers..."
for TABLE in $PG_TABLES; do
    run_psql_quiet "ALTER TABLE \"$TABLE\" ENABLE TRIGGER ALL;"
done
echo -e "${GREEN}✓ Foreign keys re-enabled${NC}"
# Fix sequences (auto-increment IDs)
print_status "Fixing ID sequences..."
run_psql_quiet "
DO \$\$
DECLARE
    seq_name TEXT;
    table_name TEXT;
    col_name TEXT;
    max_val BIGINT;
BEGIN
    FOR seq_name, table_name, col_name IN
        SELECT s.relname, t.relname, a.attname
        FROM pg_class s
        JOIN pg_depend d ON d.objid = s.oid
        JOIN pg_class t ON d.refobjid = t.oid
        JOIN pg_attribute a ON a.attrelid = t.oid AND a.attnum = d.refobjsubid
        WHERE s.relkind = 'S'
    LOOP
        EXECUTE format('SELECT COALESCE(MAX(%I), 0) FROM %I', col_name, table_name) INTO max_val;
        IF max_val > 0 THEN
            EXECUTE format('SELECT setval(%L, %s)', seq_name, max_val);
        END IF;
    END LOOP;
END\$\$;
"
# ==============================================================================
# STEP 5: VERIFICATION
# ==============================================================================
print_header "STEP 5: VERIFICATION"
print_status "Comparing row counts between SQLite and PostgreSQL..."
# ==============================================================================
# FINAL REPORT
# ==============================================================================
print_header "FINAL REPORT"
printf "%-25s | %-10s | %-10s | %-10s\n" "TABLE" "SQLITE" "POSTGRES" "STATUS"
echo "--------------------------------------------------------------"
TOTAL_SQLITE=0
TOTAL_PG=0
ERRORS=0
for TABLE in "${EXPORTED_TABLES[@]}"; do
    C1=$(sqlite3 "$SQLITE_FILE" "SELECT count(*) FROM \"$TABLE\";" 2>/dev/null || echo "0")
    EXISTS_PG=$(run_psql_value "SELECT EXISTS (SELECT FROM information_schema.tables WHERE table_schema = 'public' AND table_name = '$TABLE');")
    if [ "$EXISTS_PG" = "t" ]; then
        C2=$(run_psql_value "SELECT count(*) FROM \"$TABLE\";")
    else
        C2="N/A"
    fi
    TOTAL_SQLITE=$((TOTAL_SQLITE + C1))
    if [ "$C2" = "N/A" ]; then
        STATUS="${YELLOW}SKIP${NC}"
    elif [ "$C1" -eq "$C2" ]; then
        STATUS="${GREEN}OK${NC}"
        TOTAL_PG=$((TOTAL_PG + C2))
    else
        STATUS="${RED}ERROR${NC}"
        TOTAL_PG=$((TOTAL_PG + C2))
        ERRORS=$((ERRORS + 1))
    fi
    printf "%-25s | %-10s | %-10s | %b\n" "$TABLE" "$C1" "$C2" "$STATUS"
done
echo "--------------------------------------------------------------"
printf "%-25s | %-10s | %-10s |\n" "TOTAL" "$TOTAL_SQLITE" "$TOTAL_PG"
if [ ${#IMPORT_ERRORS[@]} -gt 0 ]; then
    echo ""
    print_warning "Errors during import:"
    for err in "${IMPORT_ERRORS[@]}"; do
        echo -e "  ${RED}- $err${NC}"
    done
fi
# Cleanup
rm -rf "$TEMP_DIR"
echo ""
if [ $ERRORS -eq 0 ] && [ ${#IMPORT_ERRORS[@]} -eq 0 ]; then
    echo -e "${GREEN}✓ Migration completed successfully!${NC}"
else
    echo -e "${YELLOW}⚠ Migration completed with $ERRORS error(s). Check the report above.${NC}"
fi
echo ""
echo -e "${BLUE}Next steps:${NC}"
echo " 1. Start n8n with PostgreSQL configuration"
echo " 2. Verify workflows appear correctly"
echo " 3. Test credentials"
echo " 4. Remove default user created by n8n if necessary"
echo ""
echo -e "${BLUE}PostgreSQL environment variables for n8n:${NC}"
echo " DB_TYPE=postgresdb"
echo " DB_POSTGRESDB_DATABASE=$DB_NAME"
echo " DB_POSTGRESDB_HOST=<your_host>"
echo " DB_POSTGRESDB_PORT=5432"
echo " DB_POSTGRESDB_USER=$PG_USER"
echo " DB_POSTGRESDB_PASSWORD=<your_password>"
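# Illustrative example of passing these variables to n8n in Docker (image tag,
# container name, host name, and port mapping are assumptions; adjust to your setup):
#
#   docker run -d --name n8n -p 5678:5678 \
#     -e DB_TYPE=postgresdb \
#     -e DB_POSTGRESDB_DATABASE="$DB_NAME" \
#     -e DB_POSTGRESDB_HOST=postgres \
#     -e DB_POSTGRESDB_PORT=5432 \
#     -e DB_POSTGRESDB_USER="$PG_USER" \
#     -e DB_POSTGRESDB_PASSWORD=<your_password> \
#     n8nio/n8n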
mesaque commented on Feb 5, 2026

Tested with N8N version 1.123.18