I see many posts asking about what other lemmings are hosting, but I’m curious about your backups.

I’m using duplicity myself, but I’m considering switching to borgbackup once 2.0 is stable. I’ve had some problems with duplicity: mainly, the initial sync took incredibly long, and once a few directories got corrupted (they could no longer be decrypted by gpg).

I run a daily incremental backup and send the encrypted diffs to a cloud storage box. I also use Syncthing to share some files between my phone and other devices, so those files get picked up by duplicity on each of those devices.
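A daily incremental run like this can be sketched with duplicity’s CLI; the GPG key ID, source path, and storage-box URL below are placeholders, not the actual setup:

```shell
# Hypothetical daily job: encrypted incremental backup to a cloud storage box.
# The key ID, paths, and SFTP target are made-up placeholders.
duplicity \
  --encrypt-key ABCD1234 \
  --full-if-older-than 30D \
  /home/user \
  sftp://user@storagebox.example.com/backups/home
```

`--full-if-older-than` forces a periodic full backup so the incremental chain doesn’t grow without bound, which also limits the damage if one diff in the chain gets corrupted.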

  • davad@lemmy.world

Restic, with resticprofile for scheduling and configuration. I do frequent backups to my NAS and have a second schedule that pushes to Backblaze B2.
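A hedged sketch of what such a resticprofile setup might look like on the command line (the profile names are invented; the real profiles would live in `profiles.toml`):

```shell
# Hypothetical profiles "nas" and "b2" defined in profiles.toml.
resticprofile --name nas backup   # frequent backup to the NAS repository
resticprofile --name b2 backup    # second schedule pushing to Backblaze B2
# resticprofile can also install these as scheduled jobs:
resticprofile --name nas schedule
```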

  • conrad82@lemmy.world

    I use syncthing to sync files between phone, pc and server.

The server runs Proxmox, with Proxmox Backup Server in a VM. A Raspberry Pi pulls the backups to a USB SSD, and also rclones them to Backblaze.

Syncthing is nice. I don’t back up my PC, as that is handled by the server. Reinstalling the PC requires almost no preparation; I just set up Syncthing again.
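The Raspberry Pi side of a pull-and-replicate setup like this might look roughly as follows (the remote name, bucket, and paths are placeholders; the B2 remote would be set up beforehand with `rclone config`):

```shell
# Hypothetical: replicate the pulled PBS datastore from the USB SSD to Backblaze B2.
# --checksum compares file hashes instead of relying on size and modification time.
rclone sync /mnt/usb-ssd/pbs-datastore b2:my-backup-bucket/pbs --checksum
```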

  • 💡dim@lemmy.world

All Nextcloud data gets mirrored with rsync to a second drive, so it’s in three places: the original source and twice on the server.
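A mirror like that is a one-liner with rsync; the paths below are placeholders for the actual Nextcloud data directory and second drive:

```shell
# Hypothetical mirror job. --archive preserves permissions and timestamps;
# --delete makes the copy a true mirror by removing files deleted at the source.
rsync --archive --delete /srv/nextcloud/data/ /mnt/second-drive/nextcloud-data/
```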

Databases are backed up nightly by Webmin to the second drive.

Then installations, databases, etc. are sent to Backblaze storage with Duplicati.

  • KitchenNo2246@lemmy.world

    I use borgbackup + zabbix for monitoring.

At home, all my files get backed up to rsync.net, since the price is lower for borg repos.

At work, I have a dedicated backup server running borgbackup that pulls backups from my servers and stores them locally as well as uploading them to rsync.net. The local backup means restoring is faster, unless of course that server dies.
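A pull-style borg job along those lines might look like this (the repo paths, host names, and retention policy are invented for illustration):

```shell
# Hypothetical: archive a pulled server tree into a local borg repo,
# then prune old archives. All paths and the retention policy are made up.
borg create --compression zstd \
    /srv/borg/webserver::'{now:%Y-%m-%d}' /mnt/pulled/webserver
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /srv/borg/webserver
# A second run targets the off-site repository on rsync.net:
borg create ssh://user@rsync.net/./webserver::'{now:%Y-%m-%d}' /mnt/pulled/webserver
```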

  • Amius@yiffit.net

    Holy crap. Duplicity is what I’ve been missing my entire life. Thank you for this.

  • cwiggs@lemmy.world

My important data is backed up via Synology DSM Hyper Backup to:

• A local external HDD attached via USB.
• Remote to Backblaze (costs about $1/month for ~100 GB of data).

I also have Proxmox Backup Server back up all the VMs/CTs every few hours to the same external HDD used above. These backups aren’t crucial; they would just be helpful for rebuilding if something went down.

  • local_taxi_fix@lemmy.world

For PCs: daily incremental backups to local storage, daily syncs to my main unRAID server, and weekly off-site copies to a Raspberry Pi with a large external HDD running at a family member’s place. The unRAID server itself has its config backed up, and all the local Docker stores also go to the off-site Pi. The most important stuff (pictures, recovery phrases, etc.) is further backed up to Google Drive.

  • rambos@lemmy.ml

Am I the only one using Kopia? :)

I’m quite new to self-hosting and backups. I went with Duplicati at first and it seemed perfect, but I heard bad stories about it, so now I use Kopia for daily backups to another drive and also to B2. Duplicati is still doing daily backups, but only of a few important folders, to Google Drive.

I’ve heard only good things about Kopia, yet no one has mentioned it.

    • manned_meatball@lemmy.ml

      there are dozens of us, dozens!

But seriously, it’s the best one I’ve seen for e2e-encrypted, incremental cloud backups. It checks all the boxes for what I want from a backup tool, and recovery is really easy too.
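For reference, a minimal Kopia setup along the lines described in this thread might look like this (the repo path, bucket, and credentials are placeholders):

```shell
# Hypothetical: one repository on a second drive, snapshots taken daily.
kopia repository create filesystem --path /mnt/second-drive/kopia-repo
kopia snapshot create /home/user/documents
# A second repository in Backblaze B2 (bucket and keys are made up):
kopia repository create b2 --bucket my-backups --key-id KEY_ID --key APP_KEY
```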

  • ipkpjersi@lemmy.one

I usually write my own rsync scripts for backups, since my OS installs are already pretty much automated with scripts as well.
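A minimal version of such a script might look like this (the source list and destination host are placeholders):

```shell
#!/bin/sh
# Hypothetical rsync backup script; the sources and DEST are made-up examples.
DEST="backupbox:/backups/$(hostname)"
for src in /etc /home /var/www; do
    # --relative recreates the full source path under DEST
    rsync --archive --delete --relative "$src" "$DEST" || echo "failed: $src" >&2
done
```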

  • hxhz@lemmy.world

I use a BackupPC instance hosted on an off-site server with a 1 TB drive. It connects through SSH to all my VMs and backs up /home and any other folders I may need. It handles full and incremental backups, deduplication, and compression.

  • tj@fedia.io

I have a central NAS server that hosts all my personal files and shares them (via SMB, SSH, Syncthing, and Jellyfin). It also pulls backups from all my local servers and cloud services (Google Drive, OneDrive, Dropbox, Evernote, mail, calendar and contacts, etc.). It runs ZFS RAID 1 and snapshots every 15 minutes. Every night it backs up important files with restic to Backblaze in a US region and Azure in an EU region.

I have a bootstrap procedure in place for a “clean room recovery”, assuming I lost access to all my devices: I only need to remember one tediously long encryption password for a small package containing everything needed to recover from scratch. It is tested every year during the Christmas holidays, including comparing every single backed-up and restored file with the original via md5/sha256 checksums.
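The yearly verification step can be sketched as a small shell function that checksums both trees and compares them (the directory layout is assumed, not taken from the comment above):

```shell
# Compare every file in two directory trees by SHA-256 checksum.
# Prints "all files match" when the restored tree is identical to the original.
verify_restore() {
    diff \
        <(cd "$1" && find . -type f -exec sha256sum {} + | sort) \
        <(cd "$2" && find . -type f -exec sha256sum {} + | sort) \
    && echo "all files match"
}
```

Usage: `verify_restore /mnt/original /mnt/restored`.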