Immich Server S3 Backup Guide

Complete instructions to back up an Immich server's data (database and assets) to Amazon S3. This guide covers manual backup steps, automation with cron jobs, and S3 lifecycle management for cost optimization.

Prerequisites

AWS CLI Setup:

Install AWS CLI if not already installed

sudo apt update
sudo apt install awscli

Configure AWS CLI with your credentials

aws configure

Enter your AWS Access Key ID, Secret Access Key, region (e.g., ap-south-1), and output format (e.g., json).
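
Before proceeding, it can help to confirm the credentials actually work by querying the caller identity:

```shell
# Prints the account ID, user ID, and ARN of the configured credentials;
# an error here means the keys or region are misconfigured.
aws sts get-caller-identity
```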

S3 Bucket Setup:

Create an S3 bucket via AWS Management Console or CLI

aws s3 mb s3://immich-backups-aditya --region ap-south-1

Enable versioning to protect against accidental overwrites and deletions

aws s3api put-bucket-versioning --bucket immich-backups-aditya --versioning-configuration Status=Enabled
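
You can confirm the setting took effect by reading it back:

```shell
# Output should include "Status": "Enabled" once versioning is active.
aws s3api get-bucket-versioning --bucket immich-backups-aditya
```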

Automate Backups with a Cron Job

1. Create a Backup Script:

nano /home/ubuntu/immich-app/backup_to_s3.sh

Add the following content:

#!/bin/bash
set -o pipefail  # fail the backup pipeline if pg_dumpall or gzip fails, not just the final aws command

# Configuration
S3_BUCKET="s3://immich-backups-aditya"
DB_BACKUP_DIR="Daily_Database_Backup"
BACKUP_DATE=$(date +%Y%m%d)
LOG_FILE="/var/log/immich_backup.log"  # must be writable by the user the cron job runs as

# Function to log messages
log() {
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" >> "$LOG_FILE"
}

# Stream database backup directly to S3
log "Starting database backup"
if ! docker exec -t immich_postgres pg_dumpall --clean --if-exists --username=postgres | gzip | aws s3 cp - "$S3_BUCKET/$DB_BACKUP_DIR/immich_db_${BACKUP_DATE}.sql.gz" --storage-class STANDARD_IA; then
  log "ERROR: Database backup failed"
  exit 1
fi
log "Database backup completed: immich_db_${BACKUP_DATE}.sql.gz"

# Sync application data directories directly to S3
for dir in backups library encoded-video thumbs upload profile; do
  log "Syncing directory: $dir"
  if ! aws s3 sync "/mnt/immich_data/upload/$dir" "$S3_BUCKET/$dir/" --storage-class STANDARD_IA; then
    log "ERROR: Sync of $dir failed"
    exit 1
  fi
  log "Sync of $dir completed"
done

# Keep only the latest 20 database backups
log "Checking for old database backups to delete"
# List files, filter for immich_db_*.sql.gz, sort newest first, skip latest 20, delete the rest
aws s3 ls "$S3_BUCKET/$DB_BACKUP_DIR/" | grep "immich_db_.*\.sql\.gz" | sort -r | tail -n +21 | while read -r line; do
  file=$(echo "$line" | awk '{print $4}')
  if aws s3 rm "$S3_BUCKET/$DB_BACKUP_DIR/$file"; then
    log "Deleted old backup: $file"
  else
    log "ERROR: Failed to delete: $file"
  fi
done
log "Cleanup of old backups completed"
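
The retention pipeline at the end of the script can be sanity-checked locally, without touching S3, by feeding it sample `aws s3 ls` output (the dates and sizes below are made up):

```shell
# Generate 25 fake listing lines in `aws s3 ls` format (date, time, size, key),
# then apply the same keep-the-newest-20 logic as the script.
for d in $(seq -w 1 25); do
  echo "2024-01-$d 03:00:00    1024 immich_db_202401$d.sql.gz"
done | sort -r | tail -n +21 | awk '{print $4}'
# Prints the 5 oldest keys (immich_db_20240105.sql.gz down to immich_db_20240101.sql.gz),
# i.e. exactly what the script would delete.
```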

2. Make the Script Executable:

chmod +x /home/ubuntu/immich-app/backup_to_s3.sh

3. Schedule Daily Backup:

Schedule the script to run daily at 3 AM:

crontab -e

Add the following line:

0 3 * * * /home/ubuntu/immich-app/backup_to_s3.sh
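
Cron discards script output by default; redirecting it to a file makes failures easier to diagnose (the log path below is an example, any location writable by the cron user works):

```shell
# Crontab entry: run daily at 3 AM and append all output to a log.
0 3 * * * /home/ubuntu/immich-app/backup_to_s3.sh >> /home/ubuntu/immich-app/backup_cron.log 2>&1
```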

Clean Up Old Backups (Optional)

Set an S3 lifecycle policy to manage older backups. Scope the rule to the Daily_Database_Backup/ prefix: the synced asset directories always mirror the live data, so expiring objects under them would delete current backups.

Go to S3 > immich-backups-aditya > Management > Lifecycle rules > Create rule

Example: Transition to DEEP_ARCHIVE after 30 days, expire after 365 days
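
The same rule can be applied from the CLI. This sketch assumes the rule covers only the database-dump prefix; the rule ID and file name are arbitrary:

```shell
# Write the lifecycle rule to a local JSON file.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-db-dumps",
      "Status": "Enabled",
      "Filter": { "Prefix": "Daily_Database_Backup/" },
      "Transitions": [
        { "Days": 30, "StorageClass": "DEEP_ARCHIVE" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF

# Apply it to the bucket.
aws s3api put-bucket-lifecycle-configuration --bucket immich-backups-aditya --lifecycle-configuration file://lifecycle.json
```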

Important Notes

Regularly verify backups by checking the S3 bucket contents
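
A quick verification might look like this (the dump file name is an example; both commands require configured credentials):

```shell
# Show the five most recent database dumps.
aws s3 ls s3://immich-backups-aditya/Daily_Database_Backup/ | sort | tail -n 5

# Stream one dump back and verify the gzip archive is intact without saving it.
aws s3 cp s3://immich-backups-aditya/Daily_Database_Backup/immich_db_20240101.sql.gz - | gunzip -t
```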

Ensure AWS CLI credentials have appropriate permissions for S3 operations

Monitor the backup script logs for errors if automated via cron

Adjust the lifecycle policy based on your retention needs to optimize costs

This setup ensures your Immich server data is securely backed up to S3 with optional automation and cost-effective storage management.