
Back Up Immich Data to Google Cloud Storage
A complete guide to setting up automated backups of your Immich server data to Google Cloud Storage, with database dumps, file synchronization, and automated cleanup handled by a cron job.
Prerequisites
Google Cloud CLI Setup
1. Install Google Cloud CLI (gcloud) if not already installed:
sudo apt update
sudo apt install apt-transport-https ca-certificates gnupg curl
echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] https://packages.cloud.google.com/apt cloud-sdk main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo gpg --dearmor -o /usr/share/keyrings/cloud.google.gpg
sudo apt update && sudo apt install google-cloud-sdk
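You can confirm the installation succeeded with (version output will vary):
gcloud version
gsutil version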
2. Configure Google Cloud CLI with your credentials:
gcloud init
Follow the prompts to log in to your Google account, select your GCP project, and set the default region (e.g., asia-south1 for Mumbai).
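If you prefer to apply these settings non-interactively (for example on a headless server), the same configuration can be set directly with gcloud config; the project ID below is a placeholder for your own:
gcloud config set project your-gcp-project-id
gcloud config set compute/region asia-south1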
3. Authenticate with a Service Account (recommended for automation):
• Create a service account in the GCP Console under IAM & Admin > Service Accounts.
• Assign the role Storage Admin (roles/storage.admin) to the service account.
• Generate and download a JSON key file for the service account.
• Set the key file for authentication:
gcloud auth activate-service-account --key-file=/path/to/service-account-key.json
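To confirm the service account is now the active credential and can reach Cloud Storage, a quick check like the following should succeed (listing buckets assumes a default project is configured):
gcloud auth list
gsutil ls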
GCS Bucket Setup
1. Create a GCS Bucket via the Google Cloud Console or CLI:
gsutil mb -l asia-south1 gs://immich-backups-aditya
Replace immich-backups-aditya with a globally unique bucket name and asia-south1 with your preferred region.
2. Enable Versioning to protect against accidental overwrites:
gsutil versioning set on gs://immich-backups-aditya
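You can verify the bucket's location, default storage class, and versioning status with:
gsutil ls -L -b gs://immich-backups-aditya
gsutil versioning get gs://immich-backups-aditya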
Automate Backups with a Cron Job
1. Create a Backup Script
Create a script to handle the backup process:
nano /home/ubuntu/immich-app/backup_to_gcs.sh
Add the following content:
#!/bin/bash
set -o pipefail  # fail a pipeline if any command in it fails, so gsutil errors are not hidden by tee

# Configuration
GCS_BUCKET="gs://immich-backups-aditya"
DB_BACKUP_DIR="Daily_Database_Backup"
DATA_DIR="/mnt/immich_data/upload" # Adjust if your data is elsewhere
BACKUP_DATE=$(date +%Y%m%d)
LOG_FILE="/var/log/immich_backup.log"
DIRECTORIES=("backups" "library" "encoded-video" "thumbs" "upload" "profile")
# Function to log messages
log() {
echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}
# Ensure log file exists and is writable
sudo touch "$LOG_FILE"
sudo chown $(whoami):$(whoami) "$LOG_FILE"
sudo chmod 644 "$LOG_FILE"
# Verify gsutil authentication
if ! gsutil ls "$GCS_BUCKET" >/dev/null 2>&1; then
log "ERROR: Cannot access GCS bucket $GCS_BUCKET. Check authentication."
exit 1
fi
log "GCS bucket $GCS_BUCKET is accessible"
# Stream database backup directly to GCS
log "Starting database backup"
if ! docker exec -t immich_postgres pg_dumpall --clean --if-exists --username=postgres | gzip | gsutil -h "x-goog-storage-class:COLDLINE" cp - "$GCS_BUCKET/$DB_BACKUP_DIR/immich_db_$BACKUP_DATE.sql.gz"; then
log "ERROR: Database backup failed"
exit 1
fi
log "Database backup completed: immich_db_$BACKUP_DATE.sql.gz"
# Sync application data directories directly to GCS
for dir in "${DIRECTORIES[@]}"; do
    log "Checking directory: $DATA_DIR/$dir"
    if [ ! -d "$DATA_DIR/$dir" ]; then
        log "WARNING: Directory $DATA_DIR/$dir does not exist or is not accessible, skipping"
        continue
    fi
    # Skip directories that contain no files at any depth (also covers hidden files)
    if [ -z "$(find "$DATA_DIR/$dir" -type f -print -quit)" ]; then
        log "WARNING: Directory $DATA_DIR/$dir contains no files, skipping"
        continue
    fi
    log "Syncing directory: $dir"
    # Mirror the directory to GCS: -m parallelizes transfers, -d removes remote objects deleted locally
    if ! gsutil -m -h "x-goog-storage-class:COLDLINE" rsync -r -d "$DATA_DIR/$dir/" "$GCS_BUCKET/$dir/" 2>&1 | tee -a "$LOG_FILE"; then
        log "ERROR: Sync of $dir failed"
        exit 1
    fi
    log "Sync of $dir completed"
done
# Keep only the latest 20 database backups
log "Checking for old database backups to delete"
# List files, filter for immich_db_*.sql.gz, sort newest first, skip latest 20, delete the rest
backup_files=$(gsutil ls "$GCS_BUCKET/$DB_BACKUP_DIR/immich_db_*.sql.gz" 2>/dev/null | sort -r)
if [ -n "$backup_files" ]; then
echo "$backup_files" | tail -n +21 | while read -r file; do
if [ -n "$file" ]; then
if gsutil rm "$file" 2>&1 | tee -a "$LOG_FILE"; then
log "Deleted old backup: $file"
else
log "ERROR: Failed to delete: $file"
fi
fi
done
else
log "No database backups found for cleanup"
fi
log "Cleanup of old backups completed"
log "Backup process finished successfully"
2. Test the Script
bash /home/ubuntu/immich-app/backup_to_gcs.sh
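After the test run, it is worth confirming that a database dump landed in the bucket and that the log shows no errors, for example:
gsutil ls -l gs://immich-backups-aditya/Daily_Database_Backup/
tail -n 20 /var/log/immich_backup.log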
3. Make the Script Executable
chmod +x /home/ubuntu/immich-app/backup_to_gcs.sh
4. Schedule the Daily Backup
Schedule the script to run daily at 3 AM:
crontab -e
Add the following line:
0 3 * * * /home/ubuntu/immich-app/backup_to_gcs.sh
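You can confirm the entry with crontab -l. The script already writes to /var/log/immich_backup.log, but if you also want cron's own stdout/stderr captured (for example, failures that occur before the script's logging starts), a variant like the following can be used instead; the extra log path is only a suggestion:
0 3 * * * /home/ubuntu/immich-app/backup_to_gcs.sh >> /home/ubuntu/immich-app/backup_cron.log 2>&1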
Notes
• Storage Class: The script uses the Coldline storage class for cost efficiency, suited to infrequently accessed data. You can change it to Standard or Nearline based on your needs. See Google Cloud Storage Pricing for details.
• Permissions: Ensure the service account has the necessary permissions (storage.objects.create, storage.objects.delete, storage.objects.list) to manage objects in the bucket.
• Logging: The script logs operations to /var/log/immich_backup.log. Ensure the log file exists and is writable by the user running the script:
sudo mkdir -p /var/log
sudo touch /var/log/immich_backup.log
sudo chown ubuntu:ubuntu /var/log/immich_backup.log
• Versioning: Enabling versioning ensures you can recover from accidental deletions or overwrites. Use gsutil ls -a to view object versions if needed (see the example after these notes).
• Testing: Test the script manually first (bash /home/ubuntu/immich-app/backup_to_gcs.sh) to ensure it works as expected.
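As a sketch of how versioning helps with recovery: every object version has a generation number, and gsutil can address a specific generation directly. The file name and generation value below are illustrative:
gsutil ls -a gs://immich-backups-aditya/Daily_Database_Backup/immich_db_20250101.sql.gz
gsutil cp "gs://immich-backups-aditya/Daily_Database_Backup/immich_db_20250101.sql.gz#1700000000000000" ./restored_dump.sql.gz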
For more details on Google Cloud Storage, refer to the Google Cloud Storage Documentation.