I have used variations of this backup script for quick backups of typical LAMP stack websites (e.g. those based on Drupal or WordPress). It saves the database along with the site files into a single compressed archive:
#!/bin/bash

# Check number of arguments
if [ $# -ne 5 ]; then
    echo "Wrong number of arguments provided. Run as:"
    echo "$0 dbhost dbname dbuser dbpass webroot"
    exit 1
fi

# Set arguments as more readable variable names
dbhost=$1
dbname=$2
dbuser=$3
dbpass=$4
webroot=$5

# Print argument values
echo "DB host: ${dbhost}"
echo "DB name: ${dbname}"
echo "DB user: ${dbuser}"
echo "DB pass: ${dbpass}"
echo "Webroot: ${webroot}"
sleep 3

# Assign a backup ID for this run
backup_id=${dbname}-$(date +'%Y-%m-%d-%H-%M-%S')

# Prepare the temp directory
echo "Preparing temp directory"
sleep 3
tempdir=/tmp/${backup_id}
mkdir -p "${tempdir}/webroot"
if [ $? -ne 0 ]; then
    echo "Could not make directory ${tempdir}/webroot. Exiting."
    exit 1
fi
echo "Temp directory: ${tempdir}"

# Dump the web DB to an SQL file
echo "Backing up database to ${tempdir}/${dbname}.sql"
sleep 3
cd "${tempdir}"
mysqldump --password="${dbpass}" --user="${dbuser}" --host="${dbhost}" --add-drop-table "${dbname}" > "${dbname}.sql"
if [ $? -ne 0 ]; then
    echo "Could not create ${tempdir}/${dbname}.sql. Exiting."
    exit 1
fi
echo "DB backup file: ${tempdir}/${dbname}.sql"

# Copy web files to temp directory
echo "Copying files from ${webroot} to ${tempdir}/webroot"
sleep 3
cp -fvrT "${webroot}" "${tempdir}/webroot"
if [ $? -ne 0 ]; then
    echo "Could not copy all files. Exiting."
    exit 1
fi

# Archive/compress tarball
echo "Packaging /tmp/${backup_id} to /tmp/${backup_id}.tar.gz"
sleep 3
cd /tmp
tar czvf "${backup_id}.tar.gz" "${backup_id}"
if [ $? -ne 0 ]; then
    echo "Could not create /tmp/${backup_id}.tar.gz. Exiting."
    exit 1
fi

# Clean up temp files and let user know where to find the backup file
echo "Removing temp directory ${tempdir}"
rm -vr "${tempdir}"
echo "Backup file: /tmp/${backup_id}.tar.gz"
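As a quick usage sketch, assuming the script is saved as backup.sh (the filename, database names, and credentials below are placeholders), a run against a local MySQL database might look like:

chmod +x backup.sh
./backup.sh localhost exampledb exampleuser 'examplepass' /var/www/html

The resulting archive lands in /tmp as exampledb-YYYY-MM-DD-HH-MM-SS.tar.gz.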
Depending on the website, there may be various non-web-accessible (private) directories to include in the script as well. Once the script is adapted for a particular website, it can be run from a cron job and extended to copy the archive to a backup server, as sketched below.
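One way to do that, assuming the script lives at a hypothetical /usr/local/bin/site-backup.sh and the backup server accepts SSH key authentication (hostnames and paths here are placeholders), is a small wrapper that runs the backup and then copies the newest archive offsite:

#!/bin/bash
# Hypothetical wrapper: run the backup, then ship the newest archive offsite.
/usr/local/bin/site-backup.sh localhost exampledb exampleuser 'examplepass' /var/www/html

# Pick the most recently created archive for this database
newest=$(ls -t /tmp/exampledb-*.tar.gz | head -n 1)

# Copy it to the backup server (destination host and path are placeholders)
scp "$newest" backupuser@backup.example.com:/srv/backups/

A crontab entry could then run the wrapper nightly, e.g. at 02:30:

30 2 * * * /usr/local/bin/site-backup-wrapper.sh >> /var/log/site-backup.log 2>&1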