Today, we’re going to be covering the 3 most important rules for any Linux System Administrator: Backups, Backups, Backups.
That’s right: if you don’t have backups, you have failed in your duties. Nothing else you may have done to secure your system can replace the need for backups. Systems get cracked, hard drives fail, CPUs fail, RAM fails, passwords are forgotten, files get rm’d by mistake, patches break systems.
This article is about creating a simple backup script to back up your MySQL databases to an offsite location, complete with cron jobs and encryption.
Q: Aren’t automated backups complicated, and don’t I have to use costly proprietary software?
A: No, and no. Sure, you can take that route, and many environments do. Frankly, it all depends on your environment’s needs, but in my limited experience the way most companies go about their backups is all wrong.
Create a MySQL user with limited permissions
The first step is to create a user with limited privileges: essentially read-only access, and only on those databases which are required. The reasons for this should be obvious. In my case, it’s appropriate to give my backup user read access to only one database. We’ll also grant the ‘LOCK TABLES’ permission, since that is required to use the mysqldump command.
Authenticate to your MySQL instance (generally with something like mysql -u mydbuser -p). Now, let’s create the user and add the permissions.
You should now see a prompt that reads mysql>
CREATE USER user_name;
SET PASSWORD FOR user_name = PASSWORD('mypassword');
GRANT SELECT, LOCK TABLES ON database_name.* TO user_name;
Don’t forget the semicolon at the end of each line. If you do, you’ll just get a continuation prompt; go ahead and type a semicolon (;) by itself and hit return (I do this constantly).
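If you want to double-check the new account before moving on, you can confirm its grants and run a quick test dump from the shell (user_name and database_name are the same placeholders used above, and -p will prompt for the password you just set):

mysql -u user_name -p -e 'SHOW GRANTS;'
mysqldump -u user_name -p database_name > /dev/null

If either privilege is missing, mysqldump will error out rather than complete quietly.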
Bash Scripting Outline
Now that we have our user set up, let’s go over the basic flow of our script.
- Establish appropriate filenames
- Dump the contents of each db
- Compress the contents of each db
- Encrypt the contents so they are securely transmitted
- Transmit our backups offsite
- Clean up a few files
For compression, we’re going to use bzip2, which offers better compression than gzip.
For encryption, we’re going to use gpg with a simple passphrase. While gpg does compress files, it’s not to the level of bzip2; since we’re going to schedule our backups during off-peak hours, the extra CPU time shouldn’t be a huge issue. If CPU and RAM are thin in your environment, you can adjust your compression scheme as you see fit. Also, if your SQL dumps are extremely large, you may want to make adjustments as well.
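As a side note for very large dumps, one adjustment worth considering is streaming the dump straight through the compressor and into gpg, so the full uncompressed .sql file never has to sit on disk. This is only a sketch, reusing the example credentials, passphrase, and paths from the script further down; note that newer GnuPG releases may also require --batch (and possibly --pinentry-mode loopback) to accept a passphrase non-interactively:

mysqldump -u myuser --password='mypassword' mydb1 \
  | bzip2 \
  | gpg -c --passphrase 'MyEncryptionP@ZZphras3' \
  > /home/mybackupadmin/backups/mydb1_`date +%Y-%m-%d`.sql.bz2.gpg

The trade-off is that you no longer keep an unencrypted local copy, so weigh that against the retention approach used in the full script below.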
In our script, we’re creating file names composed of the current date and the database name. This script will not complete 100% successfully if the files already exist. If you are testing this script out, be sure to manually remove each file created by the script before running it again.
Please note, in the filename portion, `date…` does not have single quotes around it; those are back-ticks, which are above the Tab key on a US keyboard. They cause bash to run that portion as a command. This script is intended for daily backups; if you need more frequent backups, you can extend the date format to include the hours and minutes.
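For illustration, here is the difference between the daily format used in the script and a more granular one (the timestamps shown are only examples):

date +%Y-%m-%d        # e.g. 2014-03-18
date +%Y-%m-%d_%H%M   # e.g. 2014-03-18_0010, handy for more than one backup per day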
This script also requires an additional package which may not already be installed on your system: mutt. It’s a nice, lightweight terminal email program that allows you to attach files to your mail; the standard mail command does not allow file attachments.
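Installing it is a one-liner on most distributions (treat these as the common cases; package names can vary):

apt-get install mutt    # Debian/Ubuntu
yum install mutt        # RHEL/CentOS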
If you decide to copy and paste the script below, make sure your backups directory exists! I use /home/mybackupadmin/backups in this example, but you can use whatever you wish. I have created a user mybackupadmin on the system and stored the script in the /home/mybackupadmin directory as backupscript. The script should be owned by the mybackupadmin user, and the directory and all of its contents should be accessible only by that user. Make sure the script is executable by running chmod u+x /home/mybackupadmin/backupscript
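Pulling that together, here is a quick sketch of the one-time setup, assuming the mybackupadmin user already exists and the script has been saved as described:

mkdir -p /home/mybackupadmin/backups
chown -R mybackupadmin:mybackupadmin /home/mybackupadmin
chmod 700 /home/mybackupadmin /home/mybackupadmin/backups
chmod u+x /home/mybackupadmin/backupscript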
I left some lines commented out so you can see that backing up other databases on the same system only requires editing a couple of lines. For larger databases it may make more sense to back up each database in its own script, so they can be dumped in parallel; if you do that, edit the last portion of the script to change the *.gpg globs to something filename-specific.
Completed script:
#!/bin/bash

# Databases to back up (uncomment the extra lines to add more)
dbName1="mydb1"
#dbName2="mydb2"
#dbName3="mysql"

# Filenames built from the database name and today's date
fileName1=$dbName1"_`date +%Y-%m-%d`.sql"
#fileName2=$dbName2"_`date +%Y-%m-%d`.sql"
#fileName3=$dbName3"_`date +%Y-%m-%d`.sql"

backupDir="/home/mybackupadmin/backups"

# Dump each database
#mysqldump -u myuser --password='mypassword' --all-databases --single-transaction > newdump.filz
mysqldump -u myuser --password='mypassword' $dbName1 > $backupDir/$fileName1
#mysqldump -u myuser --password='mypassword' $dbName2 > $backupDir/$fileName2
#mysqldump -u myuser --password='mypassword' $dbName3 > $backupDir/$fileName3

# Compress the dumps
bzip2 $backupDir/$fileName1
#bzip2 $backupDir/$fileName2
#bzip2 $backupDir/$fileName3

# Encrypt the compressed dumps
gpg -c --passphrase 'MyEncryptionP@ZZphras3' $backupDir/$fileName1".bz2"
#gpg -c --passphrase 'MyEncryptionP@ZZphras3' $backupDir/$fileName2".bz2"
#gpg -c --passphrase 'MyEncryptionP@ZZphras3' $backupDir/$fileName3".bz2"

# Email the encrypted backups offsite, then remove the local encrypted copies
echo "backups completed" | mutt -s "Backups for `date +%D`" -a $backupDir/*.gpg -- myemail@mydomain.com
rm $backupDir/*.gpg

# Log the run
echo "Backups completed on `date +%D`" >> $backupDir/backuplog.log
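For completeness, here is a sketch of how you would restore from one of these backups. The filename is just an example, gpg will prompt for the passphrase, and the import must be done with an account that has write privileges (not the read-only backup user created earlier):

gpg -d mydb1_2014-03-18.sql.bz2.gpg | bunzip2 > mydb1_2014-03-18.sql
mysql -u root -p mydb1 < mydb1_2014-03-18.sql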
After the file is encrypted, it is emailed and removed from the system. The database dump will still exist in /home/mybackupadmin/backups in its compressed but unencrypted form. The reasons for this are: a) there is no point in keeping the backups encrypted on the local system, so we remove that extra file; and b) keeping the backups intact locally, at least for a time, saves us from having to re-upload recent backups, which saves valuable bandwidth if you’re on a limited web-hosting plan.
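Those unencrypted .bz2 dumps will accumulate, so you will eventually want to prune them. A minimal sketch with a 7-day retention window (the window itself is just an assumption; adjust it to your needs) that could be appended to the end of the script:

# delete local compressed dumps older than 7 days
find /home/mybackupadmin/backups -name '*.sql.bz2' -mtime +7 -delete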
Setting Up the Cron Job
If you are root, you can edit the mybackupadmin user’s cron file with the following command:
crontab -u mybackupadmin -e
Insert the following into the crontab:
10 0 * * * /home/mybackupadmin/backupscript
The format is minute, hour, day of month, month, day of week, then the command. The line above will run /home/mybackupadmin/backupscript every day at 12:10 AM.
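You can confirm the entry took effect with crontab -u mybackupadmin -l. And if you adopt the hour-and-minute filename tweak mentioned earlier, a more frequent schedule is just a change to the first two fields, for example:

10 */6 * * * /home/mybackupadmin/backupscript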