@arimendelow
Last active June 7, 2025 22:06
Script for migrating an app using Prisma from a Postgres database to D1 (for Redwood GraphQL -> Redwood SDK)
#!/bin/bash
# This script migrates a PostgreSQL database to a Cloudflare D1 database.
# Put this script in the ./scripts directory of your project.
# To run it, CD into the ./scripts directory and run:
# ./psql-to-d1.sh
# If the file is not executable, you can make it executable with:
# chmod +x psql-to-d1.sh
# Define your D1_DATABASE_NAME and POSTGRES_CONN_STRING environment variables in a .env.scripts file.
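# For reference, a minimal .env.scripts might look like the following
# (values are illustrative placeholders, not real credentials):
#   D1_DATABASE_NAME="my-app-db"
#   POSTGRES_CONN_STRING="postgresql://user:password@localhost:5432/my_app"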
ENV_FILE=".env.scripts"
echo "Checking environment variables in $ENV_FILE..."
# Check if .env.scripts exists
if [ ! -f "$ENV_FILE" ]; then
  echo "Error: $ENV_FILE not found."
  exit 1
fi
# Load variables
set -o allexport
source "$ENV_FILE"
set +o allexport
# Check if variables are set
if [ -z "$D1_DATABASE_NAME" ]; then
  echo "Error: D1_DATABASE_NAME not set in $ENV_FILE."
  exit 1
fi
if [ -z "$POSTGRES_CONN_STRING" ]; then
  echo "Error: POSTGRES_CONN_STRING not set in $ENV_FILE."
  exit 1
fi
START_TIME=$(date +%s)
echo "Starting migration process..."
# Define an array of tables to exclude. I only needed to exclude the Prisma migrations table, but you may also need to exclude e.g. Drizzle migrations or Cloudflare KV tables.
EXCLUDE_TABLES=("public._prisma_migrations")
# Assign the provided arguments to variables
POSTGRES_DUMP_FILE="pgdb.dump.sql"
CONVERTED_TO_SQLITE="converted_to_sqlite.sql"
# Reset the output files every time the script is run
rm -f "$POSTGRES_DUMP_FILE" "$CONVERTED_TO_SQLITE"
touch "$CONVERTED_TO_SQLITE"
# -f so this doesn't fail if the directory doesn't exist yet
rm -rf ../.wrangler
echo "Deleted directory .wrangler to ensure a clean start."
# Check for pg_dump, sed, sqlite3
for cmd in pg_dump sed sqlite3; do
  if ! command -v "$cmd" &> /dev/null; then
    echo "Error: $cmd is not installed." >&2
    exit 1
  fi
done
# Build the exclude-table options
EXCLUDE_TABLES_OPTIONS=""
for TABLE in "${EXCLUDE_TABLES[@]}"; do
  EXCLUDE_TABLES_OPTIONS+="--exclude-table=$TABLE --exclude-table-data=$TABLE "
done
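# With the default EXCLUDE_TABLES above, this expands to:
#   --exclude-table=public._prisma_migrations --exclude-table-data=public._prisma_migrations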
echo "Creating PostgreSQL dump file '$POSTGRES_DUMP_FILE' and dumping existing data into it..."
# Use `-n public` because we only want the public schema
pg_dump -n public --data-only --attribute-inserts $EXCLUDE_TABLES_OPTIONS "$POSTGRES_CONN_STRING" > "$POSTGRES_DUMP_FILE"
# Before we can convert the PSQL dump to SQLite3 compatible SQL, we need to have Prisma generate the commands for creating all the tables and such.
# Convert the PostgreSQL dump file to SQL statements compatible with SQLite3
# You will likely need to adjust this sed command to suit your specific PostgreSQL dump file format.
# TODO handle enums
# (NOTE: DO NOT wrap the SQL statements with BEGIN and COMMIT transactions: https://developers.cloudflare.com/d1/best-practices/import-export-data/#convert-sqlite-database-files)
echo "Converting PostgreSQL dump file to SQLite3 compatible SQL..."
sed -E \
-e 's/\\\\:/\:/g' \
-e 's/\\\\//g' \
-e 's/\\\\;/;/g' \
-e '/^SET /d' \
-e '/setval/d' \
-e "s/'true'/1/g" \
-e "s/'false'/0/g" \
-e 's/public\.//' \
-e '/^[[:space:]]*SELECT/d' \
-e "s/'([0-9]{4}-[0-9]{2}-[0-9]{2}) ([0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]+)\+[0-9]{2}'/'\1T\2Z'/g" \
"$POSTGRES_DUMP_FILE" > "$CONVERTED_TO_SQLITE"
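# As an illustration (table and values below are made up), a dump line like:
#   INSERT INTO public."User" VALUES (1, 'true', '2024-01-02 03:04:05.678901+00');
# comes out of the pipeline above as:
#   INSERT INTO "User" VALUES (1, 1, '2024-01-02T03:04:05.678901Z');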
# Disable FK checks during the import.
# PRAGMA foreign_keys = OFF; is specifically required remotely.
# Note: `sed -i ''` is BSD/macOS syntax; on GNU sed (Linux), use `sed -i` with no ''.
sed -i '' '1i\
PRAGMA foreign_keys = OFF;\
PRAGMA defer_foreign_keys = on;\
' "$CONVERTED_TO_SQLITE"
# Re-enable deferred foreign keys at the end of the file.
# This is only needed locally.
echo "PRAGMA defer_foreign_keys = off;" >> "$CONVERTED_TO_SQLITE"
echo "Conversion to SQLite3 compatible SQL completed."
# Check if the D1 database exists
if npx wrangler d1 list | grep -q "$D1_DATABASE_NAME"; then
  DB_EXISTS=true
else
  DB_EXISTS=false
fi
# If the D1 database does not exist, create it
if [[ "$DB_EXISTS" == false ]]; then
  echo "D1 database '$D1_DATABASE_NAME' does not exist. Creating it now..."
  npx wrangler d1 create "$D1_DATABASE_NAME"
  echo "D1 database '$D1_DATABASE_NAME' created successfully."
  # Print the message in bold red
  echo -e "\033[1;31m\033[1m\033[48;5;15m
======================================================================
Before continuing, update wrangler config file with the DB binding.
======================================================================
\033[0m"
  # Wait for user confirmation before continuing
  read -p "Update your wrangler config file with the DB binding, then press [Enter] to continue..."
else
  echo "D1 database '$D1_DATABASE_NAME' already exists."
  # Check if the D1 database has any tables/data
  echo "Checking if D1 database '$D1_DATABASE_NAME' has any tables..."
  TABLE_NAMES=$(npx wrangler d1 execute "$D1_DATABASE_NAME" --remote --command='PRAGMA table_list' \
    | grep -o '"name": *"[^"]*"' | grep -o '"[^"]*"$' | tr -d '"' \
    | grep -Ev '(_cf_KV|sqlite_schema|sqlite_sequence|d1_migrations|sqlite_temp_schema)')
  TABLE_COUNT=$(echo "$TABLE_NAMES" | grep -c .)
  if [[ "$TABLE_COUNT" -gt 0 ]]; then
    echo -e "\033[1;33mWARNING: D1 database '$D1_DATABASE_NAME' already contains tables ($TABLE_COUNT found).\033[0m"
    read -p $'\033[1;33mWill need to delete and recreate your DB. Ok to proceed? › (Y/n): \033[0m' USER_CHOICE
    if [[ -z "$USER_CHOICE" ]]; then
      USER_CHOICE="y"
    fi
    USER_CHOICE=$(echo "$USER_CHOICE" | tr '[:upper:]' '[:lower:]') # convert to lowercase
    if [[ "$USER_CHOICE" == "n" ]]; then
      echo "Aborting migration as requested by user."
      exit 0
    elif [[ "$USER_CHOICE" == "y" ]]; then
      echo "Resetting D1 database: $D1_DATABASE_NAME"
      npx wrangler d1 delete "$D1_DATABASE_NAME" -y
      npx wrangler d1 create "$D1_DATABASE_NAME"
      # Print the message in bold red
      echo -e "\033[1;31m\033[1m\033[48;5;15m
===========================================================================
Before continuing, update wrangler config file with the new database ID.
===========================================================================
\033[0m"
      # Wait for user confirmation before continuing
      read -p "Update your wrangler config file with the new database ID, then press [Enter] to continue..."
    fi
  fi
fi
# Before we can run the import, we need to make sure that the D1 database tables are created.
echo "Creating and then running an init Prisma migration to create tables in D1 database..."
# Ensure the ../migrations directory exists and is empty (do not delete the directory itself, as that breaks migrate:dev)
mkdir -p ../migrations
find ../migrations -mindepth 1 -delete
echo "Ensured no existing migrations in ../migrations"
# Here, the DB should be totally clean, so we can run the Prisma migrations to create the tables.
# Even though there's not yet any migration to apply, I found this to be the most reliable way to create the local D1 database file.
echo y | pnpm --dir .. run migrate:dev
pnpm --dir .. run migrate:new init --no-apply
echo "Generated new migration file in ../migrations"
# Check if the new migration file is empty
LATEST_MIGRATION_FILE="../migrations/0001_init.sql"
if [[ ! -s "$LATEST_MIGRATION_FILE" ]]; then
  echo -e "\033[1;31mERROR: The migration file is empty or missing: $LATEST_MIGRATION_FILE\033[0m"
  echo "Please ensure your Prisma schema generates a valid migration before continuing."
  read -p "Press [Enter] to exit (or hit Ctrl-C)."
  exit 1
  # Optionally, could loop here to re-check, but for now just wait for input and exit.
else
  echo "Verified: Migration file ($LATEST_MIGRATION_FILE) is not empty."
fi
# Apply the migration to create the tables in D1
echo y | pnpm --dir .. run migrate:dev
echo y | pnpm --dir .. run migrate:prd
echo "Applied migration to create tables in both dev and prod in D1"
# Import the SQL statements into D1 using wrangler
echo "Importing SQL statements into D1 database..."
npx wrangler d1 execute "$D1_DATABASE_NAME" --remote --file "$CONVERTED_TO_SQLITE" -y
npx wrangler d1 execute "$D1_DATABASE_NAME" --local --file "$CONVERTED_TO_SQLITE" -y
END_TIME=$(date +%s)
DURATION=$((END_TIME - START_TIME))
echo "Import completed successfully in $((DURATION / 60)) minutes and $((DURATION % 60)) seconds."
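If you want to sanity-check the timestamp rewrite in the sed pipeline above before running the full migration, you can run that one expression against a made-up sample line (the table name and values here are illustrative):

```shell
echo "INSERT INTO t VALUES ('2024-01-02 03:04:05.678901+00');" \
  | sed -E "s/'([0-9]{4}-[0-9]{2}-[0-9]{2}) ([0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]+)\+[0-9]{2}'/'\1T\2Z'/g"
# Expected: INSERT INTO t VALUES ('2024-01-02T03:04:05.678901Z');
```

This rewrites Postgres `timestamp with time zone` literals into the ISO-8601 `...T...Z` form that SQLite/D1 date functions understand.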
@hirefrank

Nice work! Btw I'd love to fold this into my script. Any chance I could convince you to own a pr? If not, no worries and I'll get to it!

@arimendelow

ooooo will take a look @hirefrank!!!

@hirefrank

thx

@arimendelow

Made the following updates to this script:

  • Load the config variables from .env.scripts to avoid checking them into version control
    • The D1 DB name is not sensitive, but I wanted to keep it together with the pg conn string
  • Check for an empty migration file.
