title: Setting Up Kopia
description:
published: true
date: 2026-02-11T19:51:39.461Z
tags:
editor: markdown
dateCreated: 2026-01-23T22:14:17.009Z

Kopia Backup System Documentation

Overview

This system implements a two-tier backup strategy:

  1. Primary Repository (/srv/vault/kopia_repository) - Full backups of all clients
  2. Vault Repository (/srv/vault/backup) - Targeted critical data backups, replicated offsite via ZFS send/receive

The Vault repository sits on its own ZFS dataset to enable clean replication to offsite Pi systems.


Architecture

Clients (docker2, Cindy's desktop, etc.)
    ↓
    ├─→ Primary Backup → /srv/vault/kopia_repository (all data)
    └─→ Vault Backup → /srv/vault/backup (critical data only)
                            ↓
                    ZFS Send/Receive
                            ↓
                    ┌───────┴───────┐
                    ↓               ↓
                Pi Vault 1      Pi Vault 2
                (offsite)       (offsite)

Initial Setup on ZNAS

Prerequisites

  • Docker installed on ZNAS
  • ZFS pool available

1. Create ZFS Datasets

# Primary repository dataset (if not already created)
zfs create -o mountpoint=/srv/vault zpool/vault
zfs create zpool/vault/kopia_repository

# Vault repository dataset (for offsite replication)
zfs create zpool/vault/backup
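
Optionally (a sketch, not part of the original setup), set lightweight compression on the parent dataset. Kopia stores its blocks already compressed and encrypted, so heavier ZFS compression gains little, but lz4 is cheap enough to leave on:

```shell
# Optional: lz4 is nearly free; heavier algorithms won't help much since
# Kopia content is already compressed and encrypted before it hits disk.
zfs set compression=lz4 zpool/vault
```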

2. Install Kopia Server (Docker)

services:
  kopia:
    image: kopia/kopia:latest
    container_name: kopia
    hostname: kopia
    restart: unless-stopped
    user: "1964:1964"
    ports:
      - 51515:51515
    environment:
      PUID: 1964
      PGID: 1964
      TZ: America/Chicago
      KOPIA_PASSWORD: F@lcon13
      KOPIA_SERVER_USERNAME: admin
      KOPIA_SERVER_PASSWORD: F@lcon13
    command: 
      - server
      - start
      #- --tls-generate-cert
      - --tls-cert-file=/app/cert/my.cert
      - --tls-key-file=/app/cert/my.key
      - --address=0.0.0.0:51515
      - --server-username=admin
      - --server-password=F@lcon13
    volumes:
      - /DockerVol/kopia/config:/app/config
      - /DockerVol/kopia/cache:/app/cache
      - /DockerVol/kopia/cert:/app/cert
      - /srv/vault/kopia_repository:/repository
      - /srv/vault/backup:/app/vault  # vault repository (used by the repository commands below)
      - /DockerVol/kopia/logs:/app/logs
    networks:
      - netgrimoire
    deploy:
      placement:
        constraints:
          - node.hostname == znas
      labels:
        diun.enable: "true"
        homepage.group: "Backup"
        homepage.name: "Kopia"
        homepage.icon: "kopia.png"
        homepage.href: "https://kopia.netgrimoire.com"
        homepage.description: "Snapshot backup and deduplication"
        kuma.kopia.http.name: "Kopia Web"
        kuma.kopia.http.url: "http://kopia:51515"
        # Optional Caddy reverse proxy
        caddy: kopia.netgrimoire.com
        caddy.import: authentik
        caddy.reverse_proxy: "kopia.netgrimoire.com:51515"
      

networks:
  netgrimoire:
    external: true

Note: Server cert SHA256 fingerprint: 696a4999f594b5273a174fd7cab677d8dd1628f9b9d27e557daa87103ee064b2
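
Kopia expects that fingerprint as lowercase hex with no colons. It can be regenerated from the certificate file with standard openssl tooling; the sketch below demonstrates the pipeline on a throwaway self-signed cert (on ZNAS, point `-in` at /DockerVol/kopia/cert/my.cert instead):

```shell
# Generate a throwaway self-signed cert purely to demonstrate the pipeline.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo.key \
  -out /tmp/demo.crt -days 1 -subj "/CN=demo" 2>/dev/null

# Kopia wants the SHA256 fingerprint as lowercase hex, no colons:
openssl x509 -in /tmp/demo.crt -noout -fingerprint -sha256 \
  | cut -d= -f2 | tr -d ':' | tr '[:upper:]' '[:lower:]'
```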

3. Create Kopia Repositories

# Enter the container (container_name is "kopia" in the compose file)
docker exec -it kopia /bin/sh

# Create primary repository (if not already done via GUI)
# This was created via GUI at /srv/vault/kopia_repository

# Create vault repository for offsite backups
kopia repository create filesystem --path=/app/vault

# Exit container
exit

4. Create User Accounts

# Enter container (container_name is "kopia")
docker exec -it kopia /bin/sh

# Primary repository users
kopia server users add --ask-password admin@docker2
kopia server users add --ask-password cindy@DESKTOP-QLSVD8P
# Password for cindy: LucyDog123

# Vault repository users (for targeted backups)
kopia repository connect filesystem --path=/app/vault
kopia server users add --ask-password admin@docker2-vault
kopia server users add --ask-password cindy@DESKTOP-QLSVD8P-vault
# Use the same passwords or different ones, depending on your security requirements

# Exit container
exit

Client Configuration

Linux Client (docker2)

Primary Backup Setup

  1. Install Kopia

    # Download and install kopia .deb package
    wget https://github.com/kopia/kopia/releases/download/v0.XX.X/kopia_0.XX.X_amd64.deb
    sudo dpkg -i kopia_0.XX.X_amd64.deb
    
  2. Remove old repository (if exists)

    sudo kopia repository disconnect || true
    sudo rm -rf /root/.config/kopia
    
  3. Connect to primary repository

    sudo kopia repository connect server \
      --url=https://192.168.5.10:51515 \
      --override-username=admin@docker2 \
      --server-cert-fingerprint=696a4999f594b5273a174fd7cab677d8dd1628f9b9d27e557daa87103ee064b2
    
  4. Create initial snapshot

    sudo kopia snapshot create /DockerVol/
    
  5. Set up cron job for primary backups

    sudo crontab -e
    
    # Add this line (runs every 3 hours, on the hour)
    0 */3 * * * /usr/bin/kopia snapshot create /DockerVol >> /var/log/kopia-primary-cron.log 2>&1
    

Vault Backup Setup (Critical Data)

  1. Create secondary kopia config directory

    sudo mkdir -p /root/.config/kopia-vault
    
  2. Connect to vault repository

    sudo kopia --config-file=/root/.config/kopia-vault/repository.config \
      repository connect server \
      --url=https://192.168.5.10:51515 \
      --override-username=admin@docker2-vault \
      --server-cert-fingerprint=696a4999f594b5273a174fd7cab677d8dd1628f9b9d27e557daa87103ee064b2
    
  3. Create vault backup script

    sudo nano /usr/local/bin/kopia-vault-backup.sh
    

    Add this content:

    #!/bin/bash
    # Kopia Vault Backup Script
    # Backs up critical data to vault repository for offsite replication
    
    KOPIA_CONFIG="/root/.config/kopia-vault/repository.config"
    LOG_FILE="/var/log/kopia-vault-cron.log"
    
    # Add your critical directories here
    VAULT_DIRS=(
      "/DockerVol/critical-app1"
      "/DockerVol/critical-app2"
      "/home/admin/documents"
    )
    
    echo "=== Vault backup started at $(date) ===" >> "$LOG_FILE"
    
    for dir in "${VAULT_DIRS[@]}"; do
      if [ -d "$dir" ]; then
        echo "Backing up: $dir" >> "$LOG_FILE"
        /usr/bin/kopia --config-file="$KOPIA_CONFIG" snapshot create "$dir" >> "$LOG_FILE" 2>&1
      else
        echo "Directory not found: $dir" >> "$LOG_FILE"
      fi
    done
    
    echo "=== Vault backup completed at $(date) ===" >> "$LOG_FILE"
    echo "" >> "$LOG_FILE"
    
  4. Make script executable

    sudo chmod +x /usr/local/bin/kopia-vault-backup.sh
    
  5. Set up cron job for vault backups

    sudo crontab -e
    
    # Add this line (runs daily at 3 AM)
    0 3 * * * /usr/local/bin/kopia-vault-backup.sh
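
Typing the long --config-file flag for every vault operation gets tedious; a small wrapper function can be added to root's shell profile (a sketch — the name `kopia-vault` is invented here, and `KOPIA_BIN` is overridable purely so the wrapper can be dry-run tested):

```shell
# Wrapper so vault commands don't need the long --config-file flag.
# KOPIA_BIN defaults to the real binary; override it to dry-run.
KOPIA_BIN=${KOPIA_BIN:-/usr/bin/kopia}
kopia-vault() {
  "$KOPIA_BIN" --config-file=/root/.config/kopia-vault/repository.config "$@"
}
# e.g.: kopia-vault snapshot list
```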
    

Windows Client (Cindy's Desktop)

Primary Backup Setup

  1. Install Kopia

    # Using winget
    winget install kopia
    
  2. Connect to primary repository

    kopia repository connect server `
      --url=https://192.168.5.10:51515 `
      --override-username=cindy@DESKTOP-QLSVD8P `
      --server-cert-fingerprint=696a4999f594b5273a174fd7cab677d8dd1628f9b9d27e557daa87103ee064b2
    
  3. Create initial snapshot

    kopia snapshot create C:\Users\cindy
    
  4. Set exclusion policy

    kopia policy set `
      --global `
      --add-ignore "**\AppData\Local\Temp\**" `
      --add-ignore "**\AppData\Local\Packages\**"
    
  5. Create primary backup script

    # Create scripts folder
    New-Item -ItemType Directory -Force -Path C:\Scripts
    
    # Create backup script
    New-Item -ItemType File -Path C:\Scripts\kopia-primary-nightly.ps1
    

    Add this content to C:\Scripts\kopia-primary-nightly.ps1:

    # Kopia Primary Backup Script
    # Repository password
    $env:KOPIA_PASSWORD = "LucyDog123"
    
    # Run backup with logging
    kopia snapshot create C:\Users\cindy `
      --progress `
      | Tee-Object -FilePath C:\Logs\kopia-primary.log -Append
    
    # Log completion
    Add-Content -Path C:\Logs\kopia-primary.log -Value "Backup completed at $(Get-Date)"
    Add-Content -Path C:\Logs\kopia-primary.log -Value "---"
    
  6. Secure the script

    • Right-click C:\Scripts\kopia-primary-nightly.ps1 → Properties → Security
    • Ensure only Cindy's user account has read access
  7. Create scheduled task for primary backup

    • Press Win + R → type taskschd.msc
    • Click "Create Task" (not "Basic Task")

    General tab:

    • Name: Kopia Primary Nightly Backup
    • ✔ Run whether user is logged on or not
    • ✔ Run with highest privileges
    • Configure for: Windows 10/11

    Triggers tab:

    • New → Daily at 2:00 AM
    • ✔ Enabled

    Actions tab:

    • Program: powershell.exe
    • Arguments: -ExecutionPolicy Bypass -File C:\Scripts\kopia-primary-nightly.ps1
    • Start in: C:\Scripts

    Conditions tab:

    • ✔ Wake the computer to run this task
    • ✔ Start only if on AC power (recommended for laptops)

    Settings tab:

    • ✔ Allow task to be run on demand
    • ✔ Run task as soon as possible after scheduled start is missed
    • Stop the task if it runs longer than...

    Note: When Task Scheduler prompts for credentials while saving the task, a Windows Hello PIN will not work; enter the account password instead. For this task, that is the Microsoft account password: Harvey123=

Vault Backup Setup (Critical Data)

  1. Create vault config directory

    New-Item -ItemType Directory -Force -Path C:\Users\cindy\.config\kopia-vault
    
  2. Connect to vault repository

    kopia --config-file="C:\Users\cindy\.config\kopia-vault\repository.config" `
      repository connect server `
      --url=https://192.168.5.10:51515 `
      --override-username=cindy@DESKTOP-QLSVD8P-vault `
      --server-cert-fingerprint=696a4999f594b5273a174fd7cab677d8dd1628f9b9d27e557daa87103ee064b2
    
  3. Create vault backup script

    New-Item -ItemType File -Path C:\Scripts\kopia-vault-nightly.ps1
    

    Add this content to C:\Scripts\kopia-vault-nightly.ps1:

    # Kopia Vault Backup Script
    # Backs up critical data to vault repository for offsite replication
    
    $env:KOPIA_PASSWORD = "LucyDog123"
    $KOPIA_CONFIG = "C:\Users\cindy\.config\kopia-vault\repository.config"
    
    # Define critical directories to back up
    $VaultDirs = @(
      "C:\Users\cindy\Documents",
      "C:\Users\cindy\Pictures",
      "C:\Users\cindy\Desktop\Important"
    )
    
    # Log header
    Add-Content -Path C:\Logs\kopia-vault.log -Value "=== Vault backup started at $(Get-Date) ==="
    
    # Backup each directory
    foreach ($dir in $VaultDirs) {
      if (Test-Path $dir) {
        Add-Content -Path C:\Logs\kopia-vault.log -Value "Backing up: $dir"
        kopia --config-file="$KOPIA_CONFIG" snapshot create $dir `
          | Tee-Object -FilePath C:\Logs\kopia-vault.log -Append
      } else {
        Add-Content -Path C:\Logs\kopia-vault.log -Value "Directory not found: $dir"
      }
    }
    
    # Log completion
    Add-Content -Path C:\Logs\kopia-vault.log -Value "=== Vault backup completed at $(Get-Date) ==="
    Add-Content -Path C:\Logs\kopia-vault.log -Value ""
    
  4. Create log directory

    New-Item -ItemType Directory -Force -Path C:\Logs
    
  5. Create scheduled task for vault backup

    • Press Win + R → type taskschd.msc
    • Click "Create Task"

    General tab:

    • Name: Kopia Vault Nightly Backup
    • ✔ Run whether user is logged on or not
    • ✔ Run with highest privileges

    Triggers tab:

    • New → Daily at 3:00 AM (after primary backup)
    • ✔ Enabled

    Actions tab:

    • Program: powershell.exe
    • Arguments: -ExecutionPolicy Bypass -File C:\Scripts\kopia-vault-nightly.ps1
    • Start in: C:\Scripts

    Conditions/Settings: Same as primary backup task


ZFS Replication to Offsite Pi Vaults

Setup on ZNAS (Source)

  1. Create snapshot script

    sudo nano /usr/local/bin/vault-snapshot.sh
    

    Add this content:

    #!/bin/bash
    # Create ZFS snapshot of vault dataset for replication
    
    DATASET="zpool/vault/backup"
    SNAPSHOT_NAME="vault-$(date +%Y%m%d-%H%M%S)"
    
    # Create snapshot
    zfs snapshot "${DATASET}@${SNAPSHOT_NAME}"
    
    # Keep only last 7 days of snapshots on source
    zfs list -t snapshot -o name -s creation | grep "^${DATASET}@vault-" | head -n -7 | xargs -r -n 1 zfs destroy
    
    echo "Created snapshot: ${DATASET}@${SNAPSHOT_NAME}"
    
  2. Make executable

    sudo chmod +x /usr/local/bin/vault-snapshot.sh
    
  3. Schedule snapshot creation

    sudo crontab -e
    
    # Add this line (create snapshot daily at 4 AM, after vault backups complete)
    0 4 * * * /usr/local/bin/vault-snapshot.sh >> /var/log/vault-snapshot.log 2>&1
    
  4. Create replication script

    sudo nano /usr/local/bin/vault-replicate.sh
    

    Add this content:

    #!/bin/bash
    # Replicate vault dataset to offsite Pi systems
    
    DATASET="zpool/vault/backup"
    PI1_HOST="pi-vault-1.local"  # Update with actual hostname/IP
    PI2_HOST="pi-vault-2.local"  # Update with actual hostname/IP
    PI_USER="admin"
    REMOTE_DATASET="tank/vault-backup"  # Update with actual dataset on Pi
    
    # Get the latest snapshot
    LATEST_SNAP=$(zfs list -t snapshot -o name -s creation | grep "^${DATASET}@vault-" | tail -n 1)
    
    if [ -z "$LATEST_SNAP" ]; then
      echo "No snapshots found for replication"
      exit 1
    fi
    
    echo "Replicating snapshot: $LATEST_SNAP"
    
    # Function to replicate to a target
    replicate_to_target() {
      local TARGET_HOST=$1
      echo "=== Replicating to $TARGET_HOST ==="
    
      # Get the last snapshot on remote (if any)
      LAST_REMOTE=$(ssh ${PI_USER}@${TARGET_HOST} "zfs list -t snapshot -o name -s creation 2>/dev/null | grep '^${REMOTE_DATASET}@vault-' | tail -n 1" || echo "")
    
      if [ -z "$LAST_REMOTE" ]; then
        # Initial replication (full send)
        echo "Performing initial full replication to $TARGET_HOST"
        zfs send -c $LATEST_SNAP | ssh ${PI_USER}@${TARGET_HOST} "zfs receive -F ${REMOTE_DATASET}"
      else
        # Incremental replication
        echo "Performing incremental replication to $TARGET_HOST"
        LAST_SNAP_NAME=$(echo $LAST_REMOTE | cut -d'@' -f2)
        zfs send -c -i ${DATASET}@${LAST_SNAP_NAME} $LATEST_SNAP | ssh ${PI_USER}@${TARGET_HOST} "zfs receive -F ${REMOTE_DATASET}"
      fi
    
      # Clean up old snapshots on remote (keep last 30 days)
      ssh ${PI_USER}@${TARGET_HOST} "zfs list -t snapshot -o name -s creation | grep '^${REMOTE_DATASET}@vault-' | head -n -30 | xargs -r -n 1 zfs destroy"
    
      echo "Replication to $TARGET_HOST completed"
    }
    
    # Replicate to both Pi systems
    replicate_to_target $PI1_HOST
    replicate_to_target $PI2_HOST
    
    echo "All replications completed at $(date)"
    
  5. Make executable

    sudo chmod +x /usr/local/bin/vault-replicate.sh
    
  6. Set up SSH keys for passwordless replication

    # Generate SSH key if needed
    ssh-keygen -t ed25519 -C "znas-replication"
    
    # Copy to both Pi systems
    ssh-copy-id admin@pi-vault-1.local
    ssh-copy-id admin@pi-vault-2.local
    
  7. Schedule replication

    sudo crontab -e
    
    # Add this line (replicate daily at 5 AM, after snapshot creation)
    0 5 * * * /usr/local/bin/vault-replicate.sh >> /var/log/vault-replicate.log 2>&1
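
Neither script above is easy to rehearse against live ZFS, but the two fiddly pieces — the retention pipeline in vault-snapshot.sh and the full-vs-incremental decision in vault-replicate.sh — can be dry-run with plain shell (a sketch; `plan_send` is a helper name invented here, and it only prints the command it would run):

```shell
# 1) Retention: `zfs list -s creation` prints oldest first, so
#    `head -n -7` selects everything except the 7 newest for destruction.
printf 'zpool/vault/backup@vault-%d\n' 1 2 3 4 5 6 7 8 9 10 | head -n -7
# -> vault-1, vault-2 and vault-3 would be destroyed

# 2) Send planning: no remote snapshot -> full send; otherwise an
#    incremental from the remote's newest snapshot (the part after '@').
plan_send() {
  local last_remote=$1 latest=$2 dataset=$3
  if [ -z "$last_remote" ]; then
    echo "zfs send -c $latest"
  else
    echo "zfs send -c -i ${dataset}@${last_remote#*@} $latest"
  fi
}
plan_send "" "zpool/vault/backup@vault-2" "zpool/vault/backup"
# -> zfs send -c zpool/vault/backup@vault-2
plan_send "tank/vault-backup@vault-1" "zpool/vault/backup@vault-2" "zpool/vault/backup"
# -> zfs send -c -i zpool/vault/backup@vault-1 zpool/vault/backup@vault-2
```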
    

Setup on Pi Vault Systems (Targets)

Repeat these steps on both Pi Vault 1 and Pi Vault 2:

  1. Create ZFS pool on SSD (if not already done)

    # Assuming SSD is /dev/sda
    sudo zpool create tank /dev/sda
    
  2. Create dataset for receiving backups

    sudo zfs create tank/vault-backup
    
  3. Set appropriate permissions

    # Allow the replication user to receive snapshots
    sudo zfs allow admin receive,create,mount,destroy tank/vault-backup
    
  4. Verify replication (after first run)

    zfs list -t snapshot | grep vault-
    

Maintenance and Monitoring

Regular Health Checks

On Clients:

# Linux
sudo kopia snapshot list
sudo kopia snapshot verify --file-parallelism=8
sudo kopia repository status

# Windows (PowerShell)
kopia snapshot list
kopia snapshot verify --file-parallelism=8
kopia repository status

On ZNAS:

# Check ZFS health
zpool status

# Check vault snapshots
zfs list -t snapshot | grep "vault/backup"

# Check replication logs
tail -f /var/log/vault-replicate.log
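
Tailing the log shows activity, but a staleness check is easier to automate. A small helper (a sketch; `check_replication_age` is a name invented here, and it uses the log file's mtime as a proxy for the last successful run):

```shell
# Prints OK if the log file was modified within max_age seconds,
# STALE otherwise (including when the file is missing entirely).
check_replication_age() {
  local log=$1 max_age=$2
  local now mtime
  now=$(date +%s)
  mtime=$(stat -c %Y "$log" 2>/dev/null || echo 0)
  if [ $((now - mtime)) -gt "$max_age" ]; then
    echo "STALE"
  else
    echo "OK"
  fi
}
# e.g.: check_replication_age /var/log/vault-replicate.log $((26 * 3600))
```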

On Pi Vaults:

# Check received snapshots
zfs list -t snapshot | grep vault-backup

# Check available space
zfs list tank/vault-backup

Monthly Maintenance Tasks

  1. Verify vault backups are replicating

    # On ZNAS
    cat /var/log/vault-replicate.log | grep "completed"
    
    # On Pi systems
    zfs list -t snapshot -o name,creation | grep vault-backup | tail
    
  2. Test restore from vault repository

    # Connect to vault repo and verify a random snapshot
    kopia --config-file=/path/to/vault/config repository connect server --url=...
    kopia snapshot list
    kopia snapshot verify --file-parallelism=8
    
  3. Check disk space on all systems

  4. Review backup logs for errors
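
Reviewing logs is easier if they don't grow without bound; the Linux cron scripts append to their logs indefinitely. A logrotate drop-in keeps them in check (a sketch — the file name /etc/logrotate.d/kopia-backups and the retention counts are suggestions, not part of the original setup):

```
# /etc/logrotate.d/kopia-backups
/var/log/kopia-primary-cron.log /var/log/kopia-vault-cron.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
```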

Backup Policy Recommendations

Primary Repository:

  • Retention: 7 daily, 4 weekly, 6 monthly
  • Compression: enabled
  • All data from clients

Vault Repository:

  • Retention: 14 daily, 8 weekly, 12 monthly, 3 yearly
  • Compression: enabled
  • Only critical data for offsite protection

ZFS Snapshots:

  • Keep 7 days on ZNAS (source)
  • Keep 30 days on Pi vaults (targets)

Disaster Recovery Procedures

Scenario 1: Restore from Primary Repository

# Linux
sudo kopia snapshot list
sudo kopia snapshot restore <snapshot-id> /restore/location

# Windows
kopia snapshot list
kopia snapshot restore <snapshot-id> C:\restore\location

Scenario 2: Restore from Vault Repository (Offsite)

If ZNAS is unavailable, restore directly from Pi vault:

  1. On Pi vault:

    # Clone the latest snapshot (sort by creation time so tail picks the newest)
    LATEST=$(zfs list -t snapshot -o name -s creation | grep vault-backup | tail -n 1)
    zfs clone $LATEST tank/vault-backup-restore
    
  2. Access Kopia repository directly:

    kopia repository connect filesystem --path=/tank/vault-backup-restore
    kopia snapshot list
    kopia snapshot restore <snapshot-id> /restore/location
    
  3. Clean up after restore:

    zfs destroy tank/vault-backup-restore
    

Scenario 3: Complete System Rebuild

  1. Rebuild ZNAS and restore vault dataset from Pi
  2. Reinstall Kopia server in Docker
  3. Point server to restored vault repository
  4. Reconnect clients to primary and vault repositories
  5. Resume scheduled backups

Troubleshooting

Client can't connect to repository

# Check server is running
docker ps | grep kopia

# Check firewall
sudo ufw status | grep 51515

# Verify certificate fingerprint
# SERVER CERT SHA256: 696a4999f594b5273a174fd7cab677d8dd1628f9b9d27e557daa87103ee064b2

Vault replication failing

# Check SSH connectivity
ssh admin@pi-vault-1.local "echo Connected"

# Check ZFS pool health
zpool status

# Check remote dataset exists
ssh admin@pi-vault-1.local "zfs list tank/vault-backup"

# Manual test send
zfs send -n -v zpool/vault/backup@latest | ssh admin@pi-vault-1.local "cat > /dev/null"

Windows scheduled task not running

  • Check Task Scheduler → Task History
  • Verify PIN/password authentication (use password Harvey123= for task credential)
  • Check that computer is awake at scheduled time
  • Review power settings (prevent sleep, wake for tasks)
  • Check log files: C:\Logs\kopia-primary.log and C:\Logs\kopia-vault.log

Snapshot cleanup not working

# Manually clean old snapshots
zfs list -t snapshot -o name,used,creation | grep vault-backup

# Remove specific snapshot
zfs destroy zpool/vault/backup@vault-YYYYMMDD-HHMMSS

Security Notes

  1. Passwords in scripts: Current implementation stores passwords in plaintext in scripts. For production, consider:

    • Windows Credential Manager
    • Linux keyring or encrypted credential storage
    • Environment variables set at system level
  2. SSH keys: Replication uses SSH keys. Keep private keys secure and use passphrase protection where possible.

  3. Network security: Kopia server uses HTTPS with certificate validation. Ensure certificate fingerprint is verified on first connection.

  4. Physical security: Offsite Pi vaults should be stored in secure locations with different risk profiles (fire, flood, theft).


Quick Reference Commands

Kopia Client Commands

# List snapshots
kopia snapshot list

# Create snapshot
kopia snapshot create /path/to/backup

# Verify integrity
kopia snapshot verify --file-parallelism=8

# Check repository status
kopia repository status

# View policies
kopia policy list

# Mount snapshot (Linux)
kopia mount <snapshot-id> /mnt/snapshot

# Use alternate config (for vault repository)
kopia --config-file=/path/to/vault/repository.config snapshot list

ZFS Commands

# List snapshots
zfs list -t snapshot

# Create manual snapshot
zfs snapshot zpool/vault/backup@manual-$(date +%Y%m%d)

# Send full snapshot
zfs send zpool/vault/backup@snapshot | ssh user@host zfs receive tank/backup

# Send incremental
zfs send -i @old @new zpool/vault/backup | ssh user@host zfs receive tank/backup

# List replication progress
zpool status -v

# Check dataset size
zfs list -o space zpool/vault/backup

Appendix: System Specifications

ZNAS:

  • ZFS fileserver
  • Docker running Kopia server
  • IP: 192.168.5.10
  • Datasets:
    • /srv/vault/kopia_repository (zpool/vault/kopia_repository) - Primary repository
    • /srv/vault/backup (zpool/vault/backup) - Vault repository (replicated)

Clients:

  • docker2 (Linux) - Backs up /DockerVol/
    • Primary: Every 3 hours
    • Vault: Daily at 3 AM (critical directories only)
  • DESKTOP-QLSVD8P (Windows - Cindy's desktop) - Backs up C:\Users\cindy
    • Primary: Daily at 2 AM
    • Vault: Daily at 3 AM (Documents, Pictures, Important files)
    • Kopia password: LucyDog123
    • Task Scheduler credential: Harvey123=

Offsite Vaults:

  • Pi Vault 1 - Raspberry Pi with SSD (tank/vault-backup)
  • Pi Vault 2 - Raspberry Pi with SSD (tank/vault-backup)

Server Certificate:

  • SHA256: 696a4999f594b5273a174fd7cab677d8dd1628f9b9d27e557daa87103ee064b2

Workflow Summary

Daily Backup Flow

2:00 AM - Cindy's desktop primary backup runs
3:00 AM - docker2 vault backup runs
3:00 AM - Cindy's desktop vault backup runs
4:00 AM - ZNAS creates ZFS snapshot of vault dataset
5:00 AM - ZNAS replicates vault snapshot to both Pi systems
Every 3 hours - docker2 primary backup runs

What Gets Backed Up Where

Primary Repository (Full Backups):

  • docker2: /DockerVol/ (all Docker volumes)
  • Cindy: C:\Users\cindy (entire user profile, minus temp files)

Vault Repository (Critical Data for Offsite):

  • docker2: Selected critical Docker volumes
  • Cindy: Documents, Pictures, Important desktop files

Offsite (Via ZFS Send):

  • Entire vault repository (all clients' critical data)
  • Replicated to 2 separate Pi systems

Future Enhancements

Consider adding:

  • Email notifications on backup failures
  • Monitoring dashboard (Grafana/Prometheus)
  • Backup validation automation
  • Additional retention policies per client
  • Encrypted credentials storage
  • Remote monitoring of Pi vault systems
  • Automated restore testing
  • Bandwidth throttling for replication
  • Multiple ZFS snapshot retention policies

Change Log

  • 2026-02-11 - Initial comprehensive documentation created
    • Added two-tier backup strategy (primary + vault)
    • Added ZFS replication procedures for offsite backup
    • Added Pi vault setup instructions
    • Added disaster recovery procedures
    • Consolidated all client configurations
    • Added workflow diagrams and timing

Support and Feedback

For issues or improvements to this documentation, contact the system administrator.
