Backup Your Data!
“Important data always has a backup. You can also invert this: if there is no backup for some data,…” (Stonki on email@example.com) What follows comes *with no warranties*. If you trash your data, it's your fault!
This is far from complete! It needs you to become something valuable.
Backing up data always boils down to copying files to a secure location, one which is (or at least should be) not harmed when the original data gets corrupted.
There are different levels of secure locations, as well as different levels of corruption. A RAID system is considered protection against hardware failures. It is not a substitute for backing up: if you accidentally rm -rf /, what happens to the mirror? It will faithfully show the void of the original. Happy the one who has a backup. It also has to be considered which milestones should be kept, and which backups should be overwritten by new ones. If a daily backup overwrites yesterday's, we are out of luck if the failure happens during the backup itself. If we only ever overwrite the backup made two days ago, we might not notice small corruption of the data and copy worthless data over the last good backup.
Full, incremental, going mental… In short: a full backup copies everything every time, while an incremental backup only copies what has changed since the previous run. Someone will have to fill in this blank properly.
Pizza is in the oven ;)
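To make the blank above a little less blank: GNU tar can do incremental backups via --listed-incremental, which records file state in a snapshot file and then archives only what changed. A minimal sketch; the directory and file names here are made up for the demo:

```shell
#!/bin/sh
# Demo of GNU tar incremental backups; throwaway data stands in for real paths.
WORK=`mktemp -d`; cd "$WORK"
mkdir data
echo one > data/a
# Level 0 (full) backup; the snapshot file "snap" records what was archived:
tar cfz full.tgz --listed-incremental=snap data
echo two > data/b
# Level 1 backup: only files changed since the snapshot end up in the archive.
tar cfz incr.tgz --listed-incremental=snap data
tar tfz incr.tgz      # lists data/ and data/b, but not the unchanged data/a
cd /; rm -r "$WORK"
```

To restore, you unpack the full archive first and then each incremental one in order.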
Simply copy the data to another directory. This directory can be on
* another disk
* another disk on another controller
* another computer (mounted via NFS, Samba or any other network filesystem)
Anything is possible. You should make sure that all files (including the hidden ones starting with a .) are copied.
laptop:/home/stw # cp -ax /important-data /save/storage/
should do the trick.
man cp is your friend.
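Before trusting the copy, it pays to check that everything (including the hidden files) actually arrived. A quick sketch using diff -r; temporary demo directories stand in here for /important-data and its copy:

```shell
#!/bin/sh
# Demo: verify that a copy is complete; temp dirs stand in for the real paths.
SRC=`mktemp -d`
echo data > "$SRC/file.txt"
touch "$SRC/.hidden"               # hidden files must make it into the copy too
DST=`mktemp -d`
cp -ax "$SRC/." "$DST/"            # the /. makes cp take the hidden files along
diff -r "$SRC" "$DST" && echo "backup matches"
rm -r "$SRC" "$DST"
```

diff -r exits non-zero and names the differing or missing files if anything went wrong.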
tar puts many files into one big archive file. With some options, it will also compress the data. That saves disk space, but makes the backup more prone to data loss if the archive gets corrupted.
laptop:/home/stw # tar cfz /save/storage/important-data.tgz /important-data
for gzip compression, or
laptop:/home/stw # tar cfj /save/storage/important-data.tar.bz2 /important-data
for bzip2 compression (better, but slower).
Store the archives away using any other method, like cp, scp, rsync or burning to CD/DVD.
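A backup you cannot restore is worthless, so test the archive once in a while. A sketch with throwaway demo data standing in for /important-data:

```shell
#!/bin/sh
# Demo: create, list and restore an archive; demo data replaces /important-data.
WORK=`mktemp -d`; cd "$WORK"
mkdir important-data
echo data > important-data/file.txt
tar cfz important-data.tgz important-data   # create, as shown above
tar tfz important-data.tgz                  # list the contents: a cheap sanity check
rm -r important-data                        # simulate the disaster
tar xfz important-data.tgz                  # restore from the archive
cat important-data/file.txt                 # the data is back
cd /; rm -r "$WORK"
```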
This is what I use for backing up my main data on a daily basis.
#!/bin/sh
# script: DailyBackup
FN=`date +%Y-%m-%d-%a-%Hh%M.tgz`
time tar cvfz /save/storage/Daily-$FN /important-data
# End of Script: DailyBackup
I have made this script the login shell of a dedicated backup user. That way, anybody (even on a Windows box) with no knowledge of Linux can trigger the backup by logging in via ssh as that user.
The archives have the date and time in their names and are not overwritten automatically. I need to weed out unnecessary backups once in a while before the disk runs full.
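The weeding can be automated with find. A sketch with a made-up retention of 30 days; the demo runs on a temporary directory, in real life you would point it at /save/storage (and always preview with -print before trusting -delete):

```shell
#!/bin/sh
# Demo: delete Daily-*.tgz archives older than 30 days (the retention period
# is an assumption, pick your own). A temp dir stands in for /save/storage.
BACKUP_DIR=`mktemp -d`
touch -d '40 days ago' "$BACKUP_DIR/Daily-2005-01-01-Sat-03h00.tgz"   # stale
touch "$BACKUP_DIR/Daily-fresh.tgz"                                   # recent
find "$BACKUP_DIR" -name 'Daily-*.tgz' -mtime +30 -print    # preview the victims
find "$BACKUP_DIR" -name 'Daily-*.tgz' -mtime +30 -delete   # then delete them
ls "$BACKUP_DIR"                       # only the fresh archive is left
rm -r "$BACKUP_DIR"
```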
Rsync synchronizes directories. It does not copy data needlessly: it determines which files have changed, and which parts of each file have changed, and only transfers the differences. Pretty neat algorithm. This makes rsync ideal for backups across media where time or bandwidth costs money (read: the Internet).
laptop:/home/stw # rsync -v --stats -e ssh --delete -caz /important-data user@backuphost:~/save/storage/
This copies the important data to another host. -e ssh uses the secure shell as transport, which makes it safe to use across the Internet for offsite backups.
--delete deletes all files in the destination directory which are not present (anymore) in the source directory (watch out whether you really want this).
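Since --delete can be destructive, rsync's -n (--dry-run) flag lets you preview what would happen without changing anything. A sketch on local demo directories; the real command would target user@backuphost as above:

```shell
#!/bin/sh
# Demo: dry-run rsync --delete on local temp dirs before doing it for real.
SRC=`mktemp -d`; DST=`mktemp -d`
echo keep > "$SRC/keep"
echo stale > "$DST/stale"             # exists only in the destination
rsync -n -av --delete "$SRC/" "$DST/" # dry run: reports "deleting stale"
ls "$DST"                             # stale is still there, nothing was touched
rsync -av --delete "$SRC/" "$DST/"    # the real run removes it
ls "$DST"                             # now only "keep" remains
rm -r "$SRC" "$DST"
```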
You guessed it:
man rsync is your friend.
Certainly, you can use any of the available frontends like K3b and the like to write the CD/DVD.