
@vr-greycube
Last active May 9, 2025 08:08
Script to import a mysqldump by splitting it into individual tables
#!/bin/bash
# Name of the gzipped backup file (passed as command-line argument)
if [ -z "$1" ]; then
    echo "Error: Backup file name is required as the first argument."
    echo "Usage: $0 <backup_file.sql.gz>"
    exit 1
fi
backup_file="$1"
# Directory to restore the dump
restore_dir="restore-dump"
# List of tables to exclude (remove insert statements)
excluded_tables=(
    "tabAdditional Salary"
    # Add more table names here, one per line, inside the quotes
    "another_table_to_exclude"
    "yet_another_table"
)
# Create the restore directory if it doesn't exist
if [ ! -d "$restore_dir" ]; then
    mkdir "$restore_dir"
    echo "Created directory: $restore_dir"
fi
# Extract the gzipped backup file into the restore directory
filename=$(basename "$backup_file" .gz) # Extract filename without .gz
if gunzip -c "$backup_file" > "$restore_dir/$filename"; then
    echo "Extracted '$backup_file' to '$restore_dir/$filename'"
else
    echo "Error: Failed to extract '$backup_file'. Exiting."
    exit 1
fi
# Navigate to the restore directory
cd "$restore_dir" || {
echo "Failed to change directory to '$restore_dir'. Exiting."
exit 1
fi
# Split the extracted SQL file
csplit -z -f dump_part_ -b "%03d.sql" "$filename" "/^DROP TABLE IF EXISTS/" "{*}"
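# csplit writes dump_part_000.sql (the dump header before the first table)
# and then one dump_part_NNN.sql per table, each beginning at its
# DROP TABLE IF EXISTS line; -z suppresses any empty output files.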
MAX_FILE_SIZE=$((300 * 1024 * 1024)) # 300MB in bytes
for f in dump_part_*.sql; do
    file_size=$(stat -c "%s" "$f")
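    # The first line of each part is its DROP TABLE statement; sed captures
    # the backtick-quoted table name (empty for dump_part_000.sql, which
    # holds only the dump header).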
    table=$(head -n 1 "$f" | sed -n "s/^DROP TABLE IF EXISTS \`\(.*\)\`;/\1/p")
    echo "Processing file: $f"
    if [ -n "$table" ]; then
        echo "Table name: $table"
    else
        echo "Could not extract table name."
    fi
    skip_import=0 # Initialize skip_import flag
    # Check if the table is in the excluded list
    for excluded_table in "${excluded_tables[@]}"; do
        if [ "$table" = "$excluded_table" ]; then
            echo "Excluding table '$table' from import."
            skip_import=1
            # Remove the insert statements (the LOCK TABLES ... UNLOCK TABLES
            # block) from the file; GNU sed takes -i with no suffix argument,
            # matching the GNU stat used above.
            sed -i "/LOCK TABLES \`$excluded_table\` WRITE;/,/UNLOCK TABLES;/d" "$f"
            echo "Removed insert statements for table '$excluded_table' from file '$f'"
            # Recalculate file size after removing the insert statements
            file_size=$(stat -c "%s" "$f")
            break # Exit the inner loop
        fi
    done
    if (( file_size > MAX_FILE_SIZE )); then
        echo "Skipping '$f' as it exceeds $((MAX_FILE_SIZE / (1024 * 1024)))MB ($((file_size / (1024 * 1024)))MB)"
        if [ -n "$table" ]; then
            new_filename="${table}.skipped.sql"
            mv "$f" "$new_filename"
            echo "Renamed skipped file '$f' to '$new_filename'"
        else
            echo "Skipped file '$f' was not renamed."
        fi
        continue # Skip to the next file
    fi
    if [ $skip_import -eq 1 ]; then
        echo "Skipping import for '$f' because its table is excluded."
        continue
    fi
    echo "Importing $f ($((file_size / (1024 * 1024)))MB)"
    # Replace root/Password/dbname with your MySQL credentials and database
    mysql -u root -pPassword dbname < "$f"
done
@vr-greycube

This script can be used when a large .sql.gz dump causes problems during a single-pass restore.
Tables whose INSERT statements should be skipped can be listed in excluded_tables.
Split files larger than the configured size limit are renamed to <table>.skipped.sql, so they can be imported separately later.
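
A minimal usage sketch, assuming the script is saved as split-restore.sh and that tabVersion turned out to be an oversized table (both names are placeholders, not from the gist):

chmod +x split-restore.sh
./split-restore.sh database-backup.sql.gz

# Later, import a table that was renamed for exceeding the size limit;
# the renamed file sits inside the restore directory the script created.
mysql -u root -p dbname < restore-dump/tabVersion.skipped.sql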
