Simple Backups

Don't forget to make backups, folks! I strongly advise against using cloud backup services: you probably have no idea how and where your data is being stored. Besides, it is not that hard to do it yourself! As long as you are not running a server with >1000 users, you will probably do just fine backing up your home directory.

How to backup?

A simple backup archive of your home directory:

        $ tar --create --file BACKUP.tar /home/rob

This creates a so-called tarball of your folder. Data from your home directory is written to BACKUP.tar. If you are not familiar with Linux, this is similar to the famous Windows zip file, but without compression. You might have encountered .tar.gz files a few times. This is just a tarball, but compressed. We can add the option --auto-compress to compress: tar recognizes the file extension of the archive and compresses accordingly. Below, the first line creates a gzip-compressed tar.gz file, the second an xz-compressed tar.xz file (note that tar cannot write actual zip files):

        $ tar --create --auto-compress --file BACKUP.tar.gz /home/rob
        $ tar --create --auto-compress --file BACKUP.tar.xz /home/rob

The complete /home/rob directory is copied into this archive (tar strips the leading / from member names, so /home/rob is stored as home/rob). The easiest and safest way to view and extract files is an archiver application with a GUI. In your terminal, tar can show the contents of the archive with the --list option. Combined with grep we can filter for certain files:

        $ tar --list --file BACKUP.tar.gz | grep '\.doc'
        home/rob/Document.doc
        home/rob/Document1.doc
        home/rob/Document2.doc

So let's say you accidentally deleted Document2.doc, but luckily you made a backup the day before. You can extract and recover this file with:

        $ tar --keep-old-files --verbose --extract \
                --file BACKUP.tar.gz home/rob/Document2.doc

We add the option --keep-old-files so we don't accidentally overwrite the file in your home directory. To restore the whole archive, just leave out any file arguments after --extract. The long and short versions are shown below:

        $ tar --extract --verbose --file BACKUP.tar.gz
        $ tar -xvf BACKUP.tar.gz        # short version
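
Before you rely on an archive, you can also check it against the live files with tar's --compare (-d) option. A minimal sketch, using a throwaway directory instead of /home/rob:

```shell
#!/bin/bash
# sketch: verify an archive against the filesystem with --compare
# (a temporary directory stands in for /home/rob here)
demo=$(mktemp -d)
echo "hello" > "$demo/Document.doc"

# create the archive, then compare it against the files on disk;
# --compare exits with status 0 when nothing differs
tar --create --file "$demo/BACKUP.tar" -C "$demo" Document.doc
tar --compare --file "$demo/BACKUP.tar" -C "$demo" && echo "archive matches"

rm -rf "$demo"
```

If a file has changed since the backup was made, --compare reports the difference and exits non-zero, which makes it handy in scripts.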

Where to backup?

Good question. Some options:

  a. to another folder: if you screw up editing some files in your current working directory
  b. to another partition: if you screw up the filesystem on your current partition
  c. to another drive: if your current drive gets toasted, i.e. mechanical failure
  d. to another computer: if you accidentally burn your house down
  e. to many other computers, aka the cloud: if you want to make sure your data is backed up, but also stored forever, spread across many places around the world, and subject to whatever laws those countries set

Option a works like the command described above. Just make sure you don't write the backup into the folder you are backing up(!)

Options b and c are quite similar. It is common to write the backup to a mounted external drive; make sure it is mounted correctly. The following command writes a compressed tarball of my home directory to my external hard drive mounted at /media/externaldrive:

        $ tar --create --auto-compress \
        --file /media/externaldrive/BACKUP.tar.gz /home/rob

Option d. In this case we simply copy the archive to another computer with scp (secure copy over SSH).

        $ tar --create --auto-compress --file BACKUP.tar.gz /home/rob
        $ scp BACKUP.tar.gz rob@1.2.3.4:/var/backups/
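
scp should copy faithfully, but it never hurts to verify the transfer. One way, sketched here with local stand-in files (the temporary directory and file names are just for illustration), is to record a checksum next to the archive and check it on the receiving side:

```shell
#!/bin/bash
# sketch: record a checksum before copying, so the receiving side can verify
# (a temporary directory stands in for /home/rob and /var/backups here)
demo=$(mktemp -d)
mkdir "$demo/home"
echo "important" > "$demo/home/notes.txt"

tar --create --auto-compress --file "$demo/BACKUP.tar.gz" -C "$demo" home
# record the checksum with a relative name, so it can be checked anywhere
(cd "$demo" && sha256sum BACKUP.tar.gz > BACKUP.tar.gz.sha256)

# after scp'ing both files, the other machine can run the same check:
(cd "$demo" && sha256sum --check --status BACKUP.tar.gz.sha256) && echo "checksum OK"

rm -rf "$demo"
```

Run sha256sum --check from the directory containing both files; it exits non-zero if the archive was corrupted in transit.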

When to backup?

Another good question.

  a. never
  b. rarely (or manually)
  c. scheduled, cronjobs
  d. on change, inotify
  e. always

Option b is just manually typing in backup commands a few times a week. For option c you can write a simple shell script (e.g. backup.sh) and put it in /etc/cron.daily. For example, the following script writes a daily backup (named like BACKUP-2019-03-04.tar.gz) to a backup directory:

        #!/bin/bash
        # write full backups to external drive 
        BACKUP_OF_DIR=/home/rob
        WRITE_TO_DIR=/media/external
        DATE=$(date -I)

        # if the directory exists, create an archive
        if [ -d "$WRITE_TO_DIR" ]; then
                tar --create --auto-compress \
                        --file "$WRITE_TO_DIR/BACKUP-$DATE.tar.gz" "$BACKUP_OF_DIR"
        fi

        # remove made backups older than 90 days
        # TODO
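
The cleanup TODO could be sketched with find; the directory and the BACKUP-*.tar.gz pattern below are assumptions matching the naming used in the script:

```shell
#!/bin/bash
# sketch for the cleanup TODO above: delete dated archives older than 90 days
# (WRITE_TO_DIR and the file pattern are assumptions from the script's naming)
WRITE_TO_DIR=/media/external
if [ -d "$WRITE_TO_DIR" ]; then
        find "$WRITE_TO_DIR" -name 'BACKUP-*.tar.gz' -mtime +90 -delete
fi
```

Run it with -print instead of -delete first to see which files would match.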

Option d: TODO
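
One possible shape for option d, assuming the inotify-tools package (which provides inotifywait) is installed; it is wrapped in a function here so nothing runs until you call it:

```shell
#!/bin/bash
# sketch for option d: re-create the backup whenever the directory changes
# (assumes inotify-tools is installed; paths match the earlier examples)
watch_and_backup() {
        inotifywait --monitor --recursive \
                --event modify,create,delete /home/rob |
        while read -r directory event file; do
                tar --create --auto-compress \
                        --file /media/external/BACKUP.tar.gz /home/rob
        done
}
```

In practice you would want to rate-limit this: one full archive per write adds up quickly, which is part of why option d shades into option e.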

So, option e would be some kind of trojan horse on your computer or device, continuously scanning folders, eating resources and uploading to the cloud.

Final words

This article can be viewed as the basics of the basics. Of course, there are many other ways to do backups, and there are security and performance issues we haven't touched on. Whatever tools you use, good decisions about when and where to back up are equally or even more important.