This article outlines a MySQL database backup script that lets you take automated, date-stamped, compressed backups of your MySQL databases daily (or more frequently). The script backs up all databases, stores the dumps in a directory (/data/db-backups), and logs its activity to a file under /logs/db-backup for auditing and review in case of any issues.

Ideally the script should be automated using cron so that a full backup of each database occurs at least once every day. This gives you some degree of recoverability in case of errors in your database, and makes it easier to migrate your database(s) to another system. Like all automated processes, once installed it will keep making backups indefinitely, so you will need to ensure that old backups are removed so that your disks do not fill up.
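The backup script itself does not prune old dumps. A minimal retention sketch is shown below; the script name prune-db-backups.sh and the 14-day window are assumptions, so adjust both to your environment and schedule it alongside the backup job.

```shell
# prune-db-backups.sh -- delete compressed dumps older than RETAIN_DAYS
# so the backup partition does not fill up. BACKUP_DIR and RETAIN_DAYS
# fall back to defaults but can be overridden from the environment.
BACKUP_DIR=${BACKUP_DIR:-/data/db-backups}
RETAIN_DAYS=${RETAIN_DAYS:-14}

# -mtime +N matches files last modified more than N whole days ago;
# -delete removes each match. Guard so a missing directory is not an error.
if [ -d "$BACKUP_DIR" ]; then
    find "$BACKUP_DIR" -name '*.sql.gz' -mtime +"$RETAIN_DAYS" -delete
fi
```

Because only files matching *.sql.gz are touched, the log files under /logs/db-backup are left alone.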

The Script

The script is a simple, clean BASH shell script tested on the CentOS Linux distribution, but it is capable of running on other Linux variants with little to no modification.

  1. Cut and paste the following text into an editor and make changes as needed to enhance or reduce the functionality.
#!/bin/bash
# mysql-backup.sh
# Use mysqldump to dump each database and compress it on the fly to a
# mounted partition.

BACKUP_DIR=/data/db-backups
LF_DIR=/logs/db-backup
FILE_DATE=`date '+%Y-%m-%d'`
LF="$LF_DIR/db-backup-$FILE_DATE.log"
DBLIST=/tmp/db-list.$$
SERIAL="`date +%Y%m%d-%H%M%S`"
H=`hostname`

mkdir -p $BACKUP_DIR
chmod 777 $BACKUP_DIR
mkdir -p $LF_DIR
chmod 777 $LF_DIR
touch $LF
chmod 664 $LF

# Log Functions
function LogStart {
    echo "====== Log Start =========" >> $LF
    echo "Time: `date`" >> $LF
    echo " " >> $LF
}

function LogEnd {
    echo " " >> $LF
    echo "Time: `date`" >> $LF
    echo "====== Log End   =========" >> $LF
}

# Write the list of databases on this server to $DBLIST, one per line,
# by stripping the table borders and header from mysqlshow's output.
function GetDBList {
    echo "Calling GetDBList()" >> $LF
    mysqlshow | grep "|" | tr -d ' ' | tr -d '|' | egrep -v Databases > $DBLIST
}

# Dump the database named in $DB, then compress the dump.
function DoBackup {
    echo "Calling DoBackup()" >> $LF

    DBFILE="$BACKUP_DIR/db-${DB}-${SERIAL}.sql"
    echo "Host [$H]" >> $LF
    echo "DB File [$DBFILE]" >> $LF

    # If a dump with this name already exists, move it out of the way.
    if [ -e $DBFILE ]; then
        mv $DBFILE $DBFILE.`date '+%M%S'`
    fi

    echo "Dumping ${DB}" >> $LF
    mysqldump -B ${DB} --add-drop-database --add-drop-table >> ${DBFILE}
    echo "Zipping up file!" >> $LF
    gzip ${DBFILE}
    echo "Done!" >> $LF
}

#                     MAIN Code Start

LogStart
GetDBList

while read DB
do
    echo "Backing up: $DB"
    DoBackup
done < $DBLIST

rm -f $DBLIST
echo "All backups Completed" >> $LF
LogEnd
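You can try the GetDBList parsing pipeline offline before running the full script. The sketch below feeds canned mysqlshow output through the same grep/tr/egrep chain; the table layout in the here-document is an assumption based on typical mysqlshow output, and parse_dblist is a hypothetical name used only for this demonstration.

```shell
# Offline demonstration of the GetDBList pipeline: keep only table rows
# (lines containing "|"), strip spaces and pipe characters, then drop the
# "Databases" header, leaving one database name per line.
parse_dblist() {
    grep "|" | tr -d ' ' | tr -d '|' | egrep -v Databases
}

parse_dblist <<'EOF'
+--------------------+
|     Databases      |
+--------------------+
| information_schema |
| mysql              |
+--------------------+
EOF
```

Against a live server, `mysqlshow | parse_dblist` produces the same list the script writes to $DBLIST.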

Testing the Backup

  1. To verify that the backup script is working, first execute it from the directory where you saved it:
    [root@yourserver]# bash mysql-backup.sh
  2. Then perform a directory listing of the /data/db-backups directory created by the script. You should see some files, each representing a compressed backup of one of your databases:
    [root@yourserver]# ls -l /data/db-backups
    -rw-r--r-- 1 root root    510 Mar  5 15:21 db-information_schema-20140305-152103.sql.gz
    -rw-r--r-- 1 root root    526 Mar  5 15:22 db-information_schema-20140305-152229.sql.gz
    -rw-r--r-- 1 root root 142122 Mar  5 15:21 db-mysql-20140305-152103.sql.gz
    -rw-r--r-- 1 root root 142283 Mar  5 15:22 db-mysql-20140305-152229.sql.gz
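Before relying on a dump, it is worth sanity-checking it. The helper below (check_dump is a hypothetical name, not part of the backup script) verifies that the gzip archive is intact and previews the first few lines of SQL without writing an uncompressed copy to disk.

```shell
# check_dump FILE -- verify a compressed dump and preview its contents.
# gzip -t tests archive integrity; zcat streams the SQL so nothing is
# decompressed onto disk.
check_dump() {
    gzip -t "$1" || { echo "CORRUPT: $1" >&2; return 1; }
    zcat "$1" | head -n 5
}
```

For example: `check_dump /data/db-backups/db-mysql-20140305-152229.sql.gz` should print the header comments mysqldump writes at the top of every dump.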

Automating the Script

The script should be placed in the /usr/local/bin directory, which is specifically set aside for locally installed, user-written system scripts.

The script should be made executable by the root user. Use the following command to flag the file as executable:

chmod 744 /usr/local/bin/mysql-backup.sh

To automate the execution of the script we will use cron. Add a line to your crontab with the command "crontab -e"; this will open an editor, where you add a line as follows:

30 22 * * * /usr/local/bin/mysql-backup.sh > /dev/null 2>&1

This runs the script at 10:30pm every day. Any console output is discarded, since the script logs its activity to disk.


Recovering a Backup

To recover a backup, use gunzip to decompress the file, then run mysql < backup-file-date-time.sql to replay the SQL file. This will overwrite any existing data.

Here is a typical example of what you need to run:

cd /data/db-backups
gunzip db-mybigdatabase-20140305-152229.sql.gz
mysql < db-mybigdatabase-20140305-152229.sql

Warning: Be very careful what you restore, as a mistake here will overwrite your live database.
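As an alternative to decompressing first, you can stream the dump straight into the client. The sketch below pipes zcat into mysql; restore_dump is a hypothetical helper, and its second argument exists only so the example can be exercised without a live server.

```shell
# restore_dump FILE [CLIENT] -- stream a compressed dump into the MySQL
# client without leaving an uncompressed copy on disk. CLIENT defaults
# to "mysql"; pass another command (e.g. "cat") for a dry run.
restore_dump() {
    zcat "$1" | ${2:-mysql}
}
```

For example, `restore_dump /data/db-backups/db-mybigdatabase-20140305-152229.sql.gz` replays the dump, while passing `cat` as the second argument lets you inspect the SQL that would be executed.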

Where to from here?

If you are backing up your server, consider automatically copying the files to another server and restoring them there automatically. This gives you a production copy of selected databases for in-house testing and auditing purposes.
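One way to do the off-site copy is a second crontab entry that mirrors the dump directory after the nightly backup finishes. This is a configuration fragment only; the host name backuphost, the remote path, and the 22:45 time slot are all assumptions to adapt, and it presumes SSH key access to the remote machine.

```shell
# Extra crontab entry: after the 22:30 backup, mirror the dump directory
# to another server with rsync over SSH. "backuphost" and the remote
# path are placeholders -- adjust for your environment.
45 22 * * * rsync -a /data/db-backups/ backuphost:/data/db-backups/ > /dev/null 2>&1
```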
