Heal Your Church WebSite

Teaching, rebuking, correcting & training in righteous web design.

cPanel backup using an MS-DOS batch file and wget

Unless you want to end up like Blanche DuBois, never, ever depend on the kindness of strangers to safeguard your backups. This means keeping a set of backups that are far removed from where your site is hosted. This way if and when your webserver gets slammed by a streetcar named disaster, you’re in a position to quickly hop onto a new platform and get your church or charity’s site back online.

One of the biggest problems with offsite backups is when to run them and where to store them. Many of you have signed up with an inexpensive webhost offering the cPanel control panel, which requires you to log in with your browser to get at your backups via port 2082 (as opposed to the default HTTP port, 80). This is great for security reasons, not so great if you’d like to automate backups so they run at 3:00 AM.

So for the sake of a good night’s sleep, and to get your backups offsite, I’ve written a simple MS-DOS batch file that you can schedule to run as a “Scheduled Task” via your local PC’s Control Panel (not to be confused with your webserver’s cPanel). It requires six simple things:

  • That you have wget installed on your machine, which you can get for free at GnuWin32.
  • Be sure your DATE/T command delimits dates using a forward slash ‘/’ … otherwise edit the batch file to taste.
  • Either create a directory named c:\backups\foo or modify the batch file to suit your needs.
  • Edit the batch file, replacing ‘USR’ with your website account username.
  • Schedule a task with a parameter, in this case, the parameter is your website’s password.
  • That your PC’s clock and your webserver’s clock are on the same date when the script runs.
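For those who’d rather use the command prompt than the Scheduled Tasks wizard, the schtasks utility (included with Windows XP and later) can create the same job. This is only a sketch: the task name, script path, and 3:00 AM start time are my assumptions, and PASSWORD stands in for your website’s password, passed as the batch file’s one parameter:

```
schtasks /create /tn "SiteBackup" /tr "c:\sitebackup.bat PASSWORD" /sc daily /st 03:00
```

Note that some versions of schtasks expect the start time as HH:MM:SS, so check schtasks /? on your machine.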
@echo off
rem ---------------------------------------------
rem - got this script via http://www.healyourchurchwebsite.com
rem - syntax: c:\sitebackup.bat PASSWORD
rem ---------------------------------------------
cd c:\backups\foo

for /f "tokens=2-4 delims=/ " %%a in ('DATE/T') do set mdate=%%c%%a%%b
for /f "tokens=2-4 delims=/ " %%a in ('DATE/T') do set MM=%%a
for /f "tokens=2-4 delims=/ " %%a in ('DATE/T') do set DD=%%b
for /f "tokens=2-4 delims=/ " %%a in ('DATE/T') do set YY=%%c
IF %MM:~0,1%==0 SET MM=%MM:~1%
IF %DD:~0,1%==0 SET DD=%DD:~1%
set cdate=%MM%-%DD%-%YY%

@echo on
mkdir %mdate%
cd %mdate%

wget -c http://USR:%1@foo.com:2082/getbackup/backup-foo.com-%cdate%.tar.gz
wget -c http://USR:%1@foo.com:2082/getsqlbackup/database1.gz
wget -c http://USR:%1@foo.com:2082/getsqlbackup/database2.gz
wget -c http://USR:%1@foo.com:2082/getaliasbackup/aliases-foo.com.gz
wget -c http://USR:%1@foo.com:2082/getfilterbackup/filter-foo.com.gz

dir /od

Now I realize some of you Linux-loving uber-geeks are ready to leave a snarky remark about MS-DOS, batch files, Bill Gates and the like. So as not to lose your undying love and respect, here is the bash shell version … provided you know how to schedule a cron job and chmod as needed:

# ---------------------------------------------
# got this script via http://www.healyourchurchwebsite.com
# syntax: sh sitebackup.sh PASSWORD
# ---------------------------------------------

mdate=`date +%Y%m%d`
MM=`date '+%m'`
DD=`date '+%d'`
YY=`date '+%Y'`
cdate=${MM#0}-${DD#0}-${YY}
echo ${cdate}

mkdir -p ${HOME}/backups/${mdate}
cd ${HOME}/backups/${mdate}

wget -c http://USR:${1}@foo.com:2082/getbackup/backup-foo.com-${cdate}.tar.gz
wget -c http://USR:${1}@foo.com:2082/getsqlbackup/database1.gz
wget -c http://USR:${1}@foo.com:2082/getsqlbackup/database2.gz
wget -c http://USR:${1}@foo.com:2082/getaliasbackup/aliases-foo.com.gz
wget -c http://USR:${1}@foo.com:2082/getfilterbackup/filter-foo.com.gz

ls -la ${HOME}/backups/${mdate}
exit 0
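And since the script above assumes a cron job exists to call it, here is what that plumbing might look like; the script location, log file, and 3:00 AM run time are my assumptions, and PASSWORD again stands in for your site’s password:

```
# one-time setup: make the script executable
#   chmod 700 ~/sitebackup.sh
# then run `crontab -e` and add a line like this to fire it
# nightly at 3:00 AM, appending wget's chatter to a log:
0 3 * * * $HOME/sitebackup.sh PASSWORD >> $HOME/backups/backup.log 2>&1
```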

Got some other cool backup scripts you want to share? Wanna offer a tweak to the above? Don’t be shy, leave a comment.

Oh, and regardless of your operating system preferences, let’s be doers and not just readers … and make some backups before you install the brand-spanking-new Movable Type 3.0 Developer Edition.


  1. We have three cron jobs that run once a week via cPanel. The cron jobs dump SQL files from the specific database tables we want to back up (we have our main site db and our bugtracking db, and some tables we don’t back up). Those are simply mysqldumps to a directory under the root of the server (but not the web root). In that folder a PHP script gets run that simply zips up all of the .sql files in that directory and names the archive for the day it was executed. Then we download the zip files for backup.

    Our cron jobs look something like this (the first one specifies the tables, where the second one just gets all of the tables, and the last one runs the PHP script):
    cron1: mysqldump -u[USERNAME] -p[PASSWORD] --opt [DATABASE] [TABLE1 TABLE2 TABLE3 ...] > dbbackups/[FILENAME].sql
    cron2: mysqldump -u[USERNAME2] -p[PASSWORD2] --opt [DATABASE2] > dbbackups/[FILENAME2].sql
    cron3: php dbbackups/zipbackups.php

    Our PHP file just has this line in it (I suppose we could have put this into a cron job too, but this allows for porting).
    shell_exec("tar -zcvf ".date("m_d_y").".tar.gz *.sql");
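    For what it’s worth, the PHP step isn’t strictly necessary; the same dated tarball can be rolled straight from a cron’d shell command. A sketch, demonstrated in a throwaway directory with made-up dump files (in real life you’d cd into dbbackups instead):

    ```shell
    # stand-in for the dbbackups directory and its weekly mysqldump output
    dir=$(mktemp -d)
    echo "CREATE TABLE site (id INT);" > "$dir/site.sql"
    echo "CREATE TABLE bugs (id INT);" > "$dir/bugs.sql"

    # same job as the PHP shell_exec() line above: bundle every .sql
    # dump into a gzipped tarball named for today's date (m_d_y)
    (cd "$dir" && tar -zcf "$(date +%m_%d_%y).tar.gz" *.sql)

    ls "$dir"
    ```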

  2. Kristen, how do you get the backups offsite?

  3. Well, it’s not very automated or technical, but we just download the zip files via FTP, and then after a few times make a CD.