I’m planning on setting up a NAS/home server (primarily storage, with some Jellyfin and Nextcloud and such mixed in), and since it’s primarily for data storage I’d like to follow the 3-2-1 backup rule: 3 copies, on 2 different media, with 1 offsite. Well, actually I’m more going for a 2-1, with 2 copies and one offsite, but that’s beside the point. Now I’m wondering how to do the offsite backup properly.

My main goal is an automatic system that does full system backups at a reasonable interval (I assume daily would be a bit much considering it’s gonna be a few TB worth of HDDs, which aren’t exactly fast, but maybe weekly?) and then keeps 2-3 of those backups offsite at once as a sort of version history, if possible.

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it’s uploaded?

I’d preferably upload the data to a provider I trust, but accidents happen, and since they don’t need to access the data, I’d prefer that they simply can’t, maliciously or otherwise. So what is a good way to encrypt the data before it leaves my system?
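
For concreteness, the kind of thing I imagine (using restic purely as a hypothetical example, since it encrypts everything client-side with a repository password before anything is uploaded; the Backblaze B2 bucket name and paths below are made up):

```bash
# Placeholder credentials for the provider and a passphrase file for restic.
export B2_ACCOUNT_ID="..."
export B2_ACCOUNT_KEY="..."
export RESTIC_PASSWORD_FILE=/root/.restic-pass

# One-time: create the encrypted repository at the provider.
restic -r b2:my-nas-backups:offsite init

# Each run encrypts and deduplicates locally, then uploads only ciphertext.
restic -r b2:my-nas-backups:offsite backup /tank
```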

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically at regular intervals? Maybe something that also handles the encryption part along the way?
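
Again as a hypothetical sketch (same made-up restic repository as above), a weekly cron entry could handle the encryption and the upload in one step, and the forget policy roughly matches the idea of keeping 2-3 backups offsite as a version history:

```
# /etc/cron.d/offsite-backup  (hypothetical; runs every Sunday at 02:00)
# The provider credentials would also need to be set here, alongside the passphrase file.
RESTIC_PASSWORD_FILE=/root/.restic-pass
0 2 * * 0  root  restic -r b2:my-nas-backups:offsite backup /tank && restic -r b2:my-nas-backups:offsite forget --keep-weekly 3 --prune
```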

Then there’s the offsite storage provider. Personally, I’d appreciate as many suggestions as possible, since there’s of course no one-size-fits-all, so if you’ve had good experiences with any, please do share their names. I’m basically just looking for network-attached drives: I send my data to them, leave it there, and trust that it stays there. And in case more drives in my system fail than RAID-Z can handle (so two), I’d like to be able to get the data back off there after I’ve replaced my drives. That’s all I really need from them.
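
For concreteness, the recovery direction I have in mind (again with the hypothetical restic setup from above) would be roughly:

```bash
# After replacing the failed drives and recreating the pool:
restic -r b2:my-nas-backups:offsite restore latest --target /tank
```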

For reference, this is gonna be my first NAS/server/anything of this sort. I realize it’s mostly a regular computer, and I’m familiar enough with Linux that I can handle the basic stuff, but the things you wouldn’t do with a normal computer are quite unfamiliar to me, so if any questions here seem dumb, I apologize. Thanks in advance for any information!

    • dave@lemmy.wtf · 9 hours ago

      Using a mesh VPN like Tailscale or Netbird would be another option as well. It would allow you to use proper backup software like restic or whatever, and with Tailscale on both devices, restic would still be able to find the Pi even if the other person moved to a new house. (Although a Pi with ethernet would be preferable, so all they’d have to do is plug it into their new network and everything would be good; if it’s a Pi Zero, someone would have to update the WiFi password.)
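
      A minimal sketch of that idea, assuming restic over SFTP and made-up hostnames/paths: with Tailscale’s MagicDNS the Pi keeps the same name wherever it ends up, so the backup job never needs to change.

      ```bash
      # "backup-pi" is the Pi's Tailscale MagicDNS name; the path is hypothetical.
      restic -r sftp:pi@backup-pi:/mnt/backup/restic init      # one-time setup
      restic -r sftp:pi@backup-pi:/mnt/backup/restic backup /tank
      ```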

      • huquad@lemmy.ml · 3 hours ago

        Funny you mention it. This is exactly what I do: I don’t use the relay servers for Syncthing, just my tailnet for device-to-device networking.

      • huquad@lemmy.ml · 12 hours ago

        My most critical data is only ~2-3 TB, including backups of all my documents and family photos, so I have a 4 TB SSD attached which the Pi also boots from. I have ~40 TB of other Linux ISOs with 2-drive redundancy but no backups; if I lose those, I can always redownload them.

    • Matriks404@lemmy.world · 16 hours ago

      Huh, that’s a pretty good idea. I already have a Raspberry Pi set up at home, and it wouldn’t be hard to duplicate in another location.

      • Appoxo@lemmy.dbzer0.com · 16 hours ago

        In theory you could set up a cron job with a Docker Compose file to fire up a container, sync, and shut it down once all endpoint jobs are synced. As it seemingly has an API, it should be possible (rough sketch below).
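
        A rough sketch of that (paths, API key, and device ID are placeholders; Syncthing’s REST endpoint /rest/db/completion reports per-device sync progress, so the script can wait on it instead of using a fixed sleep):

        ```bash
        #!/usr/bin/env bash
        # Run from cron: start the container, wait until the remote device
        # reports 100% completion, then shut everything down again.
        set -euo pipefail
        cd /opt/syncthing
        docker compose up -d

        until curl -s -H "X-API-Key: $SYNCTHING_API_KEY" \
              "http://localhost:8384/rest/db/completion?device=$REMOTE_DEVICE_ID" \
              | jq -e '.completion >= 100' >/dev/null; do
          sleep 60
        done

        docker compose down
        ```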

      • huquad@lemmy.ml · 1 day ago

        Agreed. I have it configured with a delay and with multiple file versions. I also have another Pi running rsnapshot (an rsync-based snapshot tool); see the excerpt below.
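
        For reference, a tiny excerpt of what a hypothetical rsnapshot setup could look like (paths and retention counts are made up; fields in rsnapshot.conf have to be tab-separated, and cron decides when each retain level actually runs):

        ```
        # /etc/rsnapshot.conf (excerpt)
        snapshot_root   /mnt/backup/rsnapshot/
        retain  weekly  4
        retain  monthly 6
        backup  /tank/  localhost/

        # /etc/cron.d/rsnapshot
        30 3 * * 0   root   /usr/bin/rsnapshot weekly
        0  4 1 * *   root   /usr/bin/rsnapshot monthly
        ```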

      • thejml@lemm.ee · 20 hours ago

        Have it sync the backup files from the “2” part. You can then copy them out of the Syncthing folder to a local one with a cron job to rotate them. That way you get the offsite sync, and you can keep copies out of the rotation as long as you want (sketch below).
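
        A hedged sketch of that rotation (all paths made up): cron copies the newest backup out of the Syncthing folder into a dated directory, then keeps only the last three.

        ```bash
        #!/usr/bin/env bash
        # Copy today's backup out of the syncthing folder, then prune old copies.
        set -euo pipefail
        SRC=/srv/syncthing/backups     # folder syncthing writes into
        DST=/srv/archive               # local copies outside the sync rotation

        mkdir -p "$DST/$(date +%F)"
        cp -a "$SRC"/. "$DST/$(date +%F)/"

        # Keep only the three newest dated directories (GNU head/xargs).
        ls -1d "$DST"/*/ | sort | head -n -3 | xargs -r rm -rf
        ```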