52 points
submitted 7 months ago by WbrJr@lemmy.ml to c/linux@lemmy.ml

Hi! A friend just recommended the backup tool that comes with Ubuntu. I took a look at it and was wondering what you guys include and exclude from your backups. I just installed the WireGuard VPN and put the config file in /etc/wireguard, where it belongs. I would have to include this folder as well if I want to keep my configs, and I guess many programs do the same. So how do you know what to include, so you can just revert to the last backup if something breaks or you get a new machine? Maybe that is a stupid question, but it has been going through my head for some time now. Thanks a lot!

top 30 comments
[-] atzanteol@sh.itjust.works 15 points 7 months ago

My philosophy is "anything I can't reproduce easily". This will vary depending on the machine and data. But it's been a good guide so far.

[-] Nia_The_Cat@beehaw.org 10 points 7 months ago* (last edited 7 months ago)

I back up the entirety of my /home directory except for cache, temp, trash (trash is stored at /home/$user/.local/share/Trash), Download folder, and a folder I named "NOBACKUP".

It backs up a lot of stuff I probably don't need, but I'd rather back up more than I need, than to be caught not backing up something that I did need.

edit: oh, I have a btrfs snapshot of /root too, but I don't think that's something the backup tool in Ubuntu can do since it defaults to ext4

[-] lemmyvore@feddit.nl 4 points 7 months ago

It's best not to overuse native filesystem snapshots. Someone else was saying they delete them daily, that's the right spirit.

Filesystem snapshots can't be separated from the filesystem they live on, and they're strictly incremental, effectively all-or-nothing, which is quite inconvenient.

They're good for those "oh fuck" moments when you've just deleted the wrong dir but that's about it.

[-] Nia_The_Cat@beehaw.org 1 points 7 months ago* (last edited 7 months ago)

That's a good point. My /home backup goes through borgbackup, which I keep for a bit longer (7 daily plus the last 2 weekly before it prunes them), and my /root btrfs snapshots were set to be kept for 7 days just out of habit. I'll probably dial that back to 2-3 days instead. I do intend them as rollbacks rather than actual backups, but I tend to be overly cautious for my own good sometimes.

I like to keep more than just the last day of snapshots as a minimum, in case something has been silently breaking my system that I didn't notice for a few days and that's too advanced for me to fix by hand.
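
For reference, that kind of retention policy maps directly onto borg's prune flags (the repository path and archive name pattern below are placeholders):

```shell
# Keep the last 7 daily and 2 weekly archives, drop everything older.
# Repository path and archive glob are placeholders for your setup.
borg prune --keep-daily 7 --keep-weekly 2 \
    --glob-archives 'home-*' /mnt/backup/borg-repo
```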

[-] lurch@sh.itjust.works 6 points 7 months ago* (last edited 7 months ago)

the most important thing is your user files. everything else just speeds up recovery.

you should keep a bootable recovery medium around, like an installer USB, so you don't have to bother your neighbours for one at 2 in the morning.

to restore faster, you either make disk images (can restore everything quickly in one go) or save partition layouts, configs and package selections, as well as everything you installed without package management. if you skip this second part, you have to sit through a reinstall and figure everything out again, and that sucks if you don't have time. like when you really need to open that document, but you forgot the name of the program you use to edit it etc.

if you use just one large file system, you can tar everything up using --one-file-system, so it skips stuff like the inside of mounted snap packages, which is also present in another place. on restore you then have to format, untar and install a boot loader. beware that EFI boot can be difficult to set up and uses another partition, so this is just for pros. however, it lets you use tar features like differential backups.
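
a minimal sketch of that tar approach (for a real backup you'd run as root with the source set to / and the archive on another disk; a small sample tree stands in here):

```shell
#!/bin/sh
# Sketch of the tar approach above, using a temp dir in place of /.
set -e
SRC=$(mktemp -d)                    # stand-in for /
mkdir -p "$SRC/etc/wireguard"
echo "demo config" > "$SRC/etc/wireguard/wg0.conf"

ARCHIVE="$(mktemp -u /tmp/rootfs-XXXXXX).tar.gz"
# --one-file-system: don't descend into other mounted filesystems
# (/proc, /sys, snap mounts...); -p preserves permissions.
tar --one-file-system -cpzf "$ARCHIVE" -C "$SRC" .

# Restore: format the new disk, mount it (here: a scratch dir), untar,
# then reinstall the boot loader (EFI lives on its own partition).
TARGET=$(mktemp -d)
tar -xpzf "$ARCHIVE" -C "$TARGET"
cat "$TARGET/etc/wireguard/wg0.conf"    # prints: demo config
```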

[-] avidamoeba@lemmy.ca 3 points 7 months ago

Does --one-file-system prevent the need for excluding /dev, /sys, etc.?

[-] lemmyvore@feddit.nl 3 points 7 months ago

Yes but be careful with that option because it depends how it's implemented by each tool. Some of them will not cross btrfs subvolumes for example.

[-] avidamoeba@lemmy.ca 1 points 7 months ago

Yeah, I can see some other funny cases if there are multiple partitions with separate filesystems on them. Just doing the regular tar/rsync with exclusions is likely safer, as it works in either case.

[-] ChojinDSL@discuss.tchncs.de 5 points 7 months ago

If you don't know or aren't sure, back up everything if you have the space. Once you've hit a couple of disaster scenarios, it will become apparent what stuff is really important.

Obviously, the stuff you can't recreate otherwise is most important. But apart from that, even the stuff you can recreate from other sources might be worth backing up because of time savings. E.g. faster to restore from backup than to recreate.

[-] kevincox@lemmy.ml 3 points 7 months ago

Yup. Step 1 is backup everything. Step 2 is maybe improve your reproducibility and then remove the things that can be reproduced from the backups.

[-] pbjamm@beehaw.org 1 points 7 months ago

Also, while it may be fairly easy to recreate the OS/application install from scratch, that is generally small potatoes storage-wise compared to your music/movies/photos etc., which you for sure want to back up.

[-] Brewchin@lemmy.world 1 points 7 months ago* (last edited 7 months ago)

Great advice. For me, it's the irreplaceable data first, and then stuff like configs and credentials/keys.

My borg-backup (to my NAS) config is "My Documents" type files, /etc stuff I'm likely to customise, and home stuff except the stuff like "*Cache", "*Storage", assets/icons/history/recent/blah. It's tedious to fine-tune, but I figure too much is infinitely better than too little.

If I want to be able to do an image-based restore, then I'd use a different tool. But life's too short for that.

[-] Asudox@lemmy.world 4 points 7 months ago

I generally back up the entire home folder and the configuration files.

[-] limelight79@lemm.ee 4 points 7 months ago

Data and configurations.

If you have the space, software is nice because it's easier to get the system going again, but the data (your files - music, documents, pictures) and system configuration files (/etc for example) are the most critical. If you have databases set up, learn about their dump commands and add that.

You don't have to use the same method for everything. My pictures are backed up to a second computer and to Amazon Glacier for $2/month (I'll have to pay to download them if I ever need to, but I'll gladly pay if I'm in that situation; those should only be needed after a major house fire or something like that). My weekly backups are my /home directories, /etc, /root, a database dump, and maybe one or two other important things.
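
The dump step can be as small as one line per database; which command depends on the engine (the database names and paths below are made up):

```shell
#!/bin/sh
# Server databases each have a dump command, e.g.:
#   pg_dump -U myuser mydb              | gzip > mydb-$(date +%F).sql.gz
#   mysqldump --single-transaction mydb | gzip > mydb-$(date +%F).sql.gz
# SQLite demo, runnable anywhere sqlite3 is installed:
set -e
command -v sqlite3 >/dev/null 2>&1 || exit 0
DB="$(mktemp -u).db"
sqlite3 "$DB" "CREATE TABLE notes(t TEXT); INSERT INTO notes VALUES('keep me');"
sqlite3 "$DB" .dump > "$DB.sql"    # consistent, restorable text dump
grep "keep me" "$DB.sql"
```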

[-] kevincox@lemmy.ml 3 points 7 months ago

Really configuration is best not backed up but created from some source of truth like a Git repo. But a backup can serve as a poor-man's version control.

[-] limelight79@lemm.ee 2 points 7 months ago* (last edited 7 months ago)

An interesting idea, but it might be overkill for a home setup.

[-] Penguincoder@beehaw.org 2 points 7 months ago

An OS can be restored. Back up your data, so /home for sure, and maybe any custom configs in /etc, like your WireGuard configs: anything you specifically edited or added in the /etc directory.

[-] everett@lemmy.ml 5 points 7 months ago

Skipping the OS backup is reasonable, but you probably want to at least save a package list. Add something like dpkg -l > ~/packages.txt to your backup script.
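
If you want that list to be restorable as well, dpkg can round-trip it (the file name is arbitrary):

```shell
#!/bin/sh
# Save the full package selection state (Debian/Ubuntu only).
if command -v dpkg >/dev/null 2>&1; then
    dpkg --get-selections > "$HOME/package-selections.txt"
fi
# On a fresh install, feed the list back and let apt install everything:
#   sudo dpkg --set-selections < "$HOME/package-selections.txt"
#   sudo apt-get dselect-upgrade
```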

[-] avidamoeba@lemmy.ca 2 points 7 months ago* (last edited 7 months ago)

If you want to be able to restore the machine completely, with everything installed and configured, then yes, you have to back up everything. There are generally two ways: file-level backup, where you'd use something like rsync or tar, and block-level, where you'd back up the whole partition/disk using something like dd or Clonezilla. The latter is the easiest to restore, but it's a bit of a pain to back up because the system generally has to be offline, booted from an alternative OS. The former is a bit more difficult to restore, but not by much, and it's much easier to back up: you can do it while the system is live. I'd probably try that first. Find documentation on backing up a complete root filesystem with rsync/tar and you're good to go. It's typically a single command which can be run on a schedule.

The built-in GUI backup tool is generally intended for your own user data. In order to back up other things it would have to run as root or be given capabilities, and that might get more complicated than using straight rsync/tar.

[-] lemmyvore@feddit.nl 2 points 7 months ago

You can use Borg for both things you mentioned. It stores deduplicated chunks, so it doesn't care whether you back up files or a block device.

Not sure why you'd have to be offline to do that though.

[-] avidamoeba@lemmy.ca 1 points 7 months ago* (last edited 7 months ago)

Because if you're not offline, something is writing to the filesystem and changing blocks while you're copying. If you're lucky, what you copied will merely be outdated. If you're less lucky it will cause fs inconsistency, which will be cleaned up by fsck. If you're even less lucky you'll end up with silently corrupted files, e.g. a text file with old parts mixed with new. And if you're unluckiest, you'll hit a vital metadata part of the fs and it won't be mountable anymore.

To clarify, the filesystem being block-copied has to be offline or mounted RO, not the whole OS. However if that's the root/home filesystem, then you can't unmount it while the OS is online.

If you don't want to deal with that you need a filesystem or volume manager that supports snapshots, then you can copy the snapshot. E.g. snapshot your root LVM vol, then block-copy the snapshot.
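
In LVM terms that's roughly the following (the VG/LV names, snapshot size and paths are placeholders):

```shell
# Freeze a point-in-time view of the root LV, block-copy the frozen
# snapshot, then drop it. Requires free extents in the volume group.
sudo lvcreate --size 5G --snapshot --name rootsnap /dev/vg0/root
sudo dd if=/dev/vg0/rootsnap of=/mnt/backup/root.img bs=4M status=progress
sudo lvremove -y /dev/vg0/rootsnap
```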

[-] WbrJr@lemmy.ml 1 points 7 months ago

Something I always wonder about: setting up Linux until everything runs without problems takes quite some time for me. I've been using Linux regularly for about a year and have had to set it up 4-5 times. It's almost always a pain, and I need to search online for a while until everything works. Does it get easier the more often it's done? Or do you create a setup script that sets everything up if you reinstall the system?

[-] avidamoeba@lemmy.ca 1 points 7 months ago* (last edited 7 months ago)

I use config-as-code for some stuff but in reality there are many manual steps that aren't covered. This is why I run an LVM mirror (RAID1) with two SSDs and I keep a full backup. The system hasn't been reinstalled in 10 years.

If you feel the way you do, you should probably just do a full disk backup with clonezilla or dd every X days and be done with it. If X is large, e.g. months, you should also run home dir backup more often. The Ubuntu built-in tool is great for that. Then when something dies, restore the whole OS from the clonezilla/dd backup, boot, then restore the most recent home dir backup, reboot, and you're back. Minimal effort.

[-] AnokLola@mastodon.social 0 points 7 months ago

@avidamoeba @WbrJr Just install a pre-configured distro like Mint or Fedora and stay away from Arch

[-] WbrJr@lemmy.ml 1 points 7 months ago

I started my journey with Fedora, but got annoyed by things like videos not working. Ubuntu works pretty well for me and I've had very few issues with it compared to Fedora. And that's what I seek in an OS.

[-] UnfortunateShort@lemmy.world 1 points 6 months ago

I auto-backup my entire /home, except for stuff I explicitly exclude and hidden files. I only explicitly include some of the latter, because I don't want to back up all the stuff programs put there without my knowledge.

Config files outside of /home I copy semi-manually to and from a dedicated dir in which I replicate exactly where they go in my actual FS. I have written shell functions that easily allow me to backup and restore stuff from there and it's synced to my cloud storage.
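
Those shell helpers can be sketched with GNU cp's --parents, which recreates the source path under the staging dir (the staging path and file list below are made up):

```shell
#!/bin/sh
# Stage selected system configs into a dir mirroring their real paths,
# so restoring is just copying in the other direction.
set -e
STAGE="${STAGE:-$HOME/sysconfigs}"   # this dir gets synced to the cloud
mkdir -p "$STAGE"
for f in /etc/hosts /etc/fstab; do   # files to track; extend as needed
    if [ -r "$f" ]; then
        cp --parents "$f" "$STAGE"   # lands in e.g. $STAGE/etc/hosts
    fi
done
# Restore a single file: sudo cp "$STAGE/etc/hosts" /etc/hosts
```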

[-] GadgeteerZA@fedia.io 1 points 7 months ago

@WbrJr@lemmy.ml I'm on Manjaro Linux but principles are the same. I have an SSD boot drive and a 4TB hard drive for /home data etc. I also have a second 4TB drive for backups:

  1. Timeshift app - does snapshots of OS to backup drive. I have 4x hourly snapshots, 2 daily ones, and one weekly one. This allows easy roll back from any updates or upgrades that went wrong.
  2. luckyBackup app - does a full rsync backup daily of /home data and configs. There are other rsync apps too, and you can opt for versioning if you have space, but usually I've been fine with recovering anything I deleted or overwrote by mistake. I do this more for hard drive failure. I also have one additional 1TB drive I keep in a safe; I connect it myself once a month or so for an offline backup.
[-] cmnybo@discuss.tchncs.de 1 points 7 months ago

I take a btrfs snapshot of my root partition daily so I can easily revert to an older version if I break something or get a bad update. There's nothing on my desktop or laptop root partition that can't be easily replaced, so I don't bother with any backups apart from the snapshots.
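
The daily snapshot itself is a one-liner (the /.snapshots directory name is my assumption):

```shell
# Read-only snapshot of the root subvolume, named by date; run daily
# from cron or a systemd timer.
sudo btrfs subvolume snapshot -r / /.snapshots/root-$(date +%F)
# Clean up old ones with: sudo btrfs subvolume delete /.snapshots/root-<date>
```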

On my server, I keep multiple backups of /etc/ since there is a lot of stuff in there that I manually set up.

If you just want to back up the configuration, you can back up the entire /etc/ directory; it will only take a few MB when compressed.

[-] Tick_Dracy@lemm.ee 0 points 7 months ago* (last edited 7 months ago)

Hijacking this topic: I use this software on Windows, which does incremental backups of the system (including the OS, alongside documents, downloads, etc.). It can also easily be restored by booting a custom image from a USB and restoring the image it created.

Is there anything like this with Linux?

[-] gigatexal@mastodon.social 1 points 7 months ago

@Tick_Dracy @WbrJr ZFS snapshots and boot environments could probably do this. Not sure about the usb thing though. @allanjude (tagging Allan so I don’t besmirch ZFS too much).

this post was submitted on 07 Apr 2024
52 points (100.0% liked)
