Quickly backup files with this bash script

This is something that I use on a regular basis on all of my servers. How many times have you been ready to edit a file and either didn’t make a backup copy, or made one but were real tired of typing out the copy command with a new name and a date stamp and blah blah blah? It’s not hard to do, but it gets old quick typing the same thing over and over again, plus you might not always name the copies the same way, so your backup files end up with different naming patterns and whatnot.

Don’t worry, I have an easy solution. I created a simple script that backs up the specified file and appends a time and date stamp to the end of it. I symlink this to the command ‘bu’ somewhere like /usr/bin so it’s always in the path of whatever user I might be (myself, root, backup, whoever), and then POW, it’s easy to back up files and they are always named the same way – you just type “bu filename”. Now, if you don’t like the way I name my file copies, feel free to customize this to suit your needs. Also, I currently have the script make the copy right next to the original file, but it would be easy to always copy the files to a backup directory somewhere if you wanted; the possibilities are endless!
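For example, if you saved the script somewhere like /usr/local/bin/backup_file.sh (the name and location here are just for illustration), creating the symlink would look something like this:

chmod +x /usr/local/bin/backup_file.sh
ln -s /usr/local/bin/backup_file.sh /usr/bin/bu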

OK, on to the script goodness:

#!/bin/bash

# Stop if no filename was given.
if [ "$1" == "" ]; then
  echo "No input given, stopping"
  exit 1
fi

# Pull the date and time apart into pieces for the backup file name.
YEAR=`date | awk '{print $6}'`
MONTH=`date | awk '{print $2}'`
DAY=`date | awk '{print $3}'`
TIME=`date | awk '{print $4}' | awk -F: '{print $1"-"$2"-"$3}'`

echo -n "Backing up the file named $1 ... "
/bin/cp -p "$1" "${1}_${YEAR}.${MONTH}.${DAY}_${TIME}" > /tmp/bu_run.log 2>&1
echo "done."

There you have it, a simple file backup script in bash that can save you time and many, many keystrokes. Drop me a comment and let me know what you think, or if you have any suggestions or improvements.

MySQL Database Backup Script

Here we go folks, I thought I would share a handy little script with you that I use to back up all of the databases on a particular Linux/UNIX server. I do this by getting a list of the databases, and then using mysqldump to dump them all to a text file. This seems to be the best way (short of replication) to get good clean backups of the data. Toss it into a cron job and you can have it done automagically. There isn’t anything yet to rotate files, but I might add that later. Also, I am going to try and rewrite this in Perl so our Windows (and other OSes that don’t have a shell like Bash) brethren can run this script as well. For now though, it’s written for Bash, but almost any shell would work I think.

OK, onto the script.  Continue reading
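The full script is in the continuation, but a bare-bones sketch of the approach described above (grab the list of databases, then mysqldump them) might look something like this. Here each database goes to its own file; the backup directory is a placeholder, and it assumes the MySQL credentials come from a ~/.my.cnf file:

#!/bin/bash

# Where the dump files will land -- adjust to taste.
BACKUP_DIR=/var/backups/mysql
STAMP=`date +%Y.%m.%d`

mkdir -p "$BACKUP_DIR"

# List the databases (-N drops the column header), then dump each one.
for DB in `mysql -N -e 'SHOW DATABASES'`; do
  mysqldump "$DB" > "$BACKUP_DIR/${DB}_${STAMP}.sql"
done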

grsync – a great backup and file sync tool

Greetings everyone, I am back with more information about backing up your files. I know, I know, talking about backups might be boring, but one day a good backup will save your butt, I guarantee it. Previously I posted an article extolling the virtues of rsync, a very powerful command line tool for syncing files both locally and across networks via SSH. This is great for command line addicts like myself, especially because you can use it in scripts, so that along with shared keys and keychain it becomes a powerful tool in your arsenal of sysadmin goodies.

Now, for folks that aren’t command line geeks, or maybe just want a quick and easy way to back up some files, there is a nifty little tool called grsync. This is (as you can probably guess from the name) a GUI for the command line rsync, making it much more user friendly. It’s also quite nice for pointing and clicking at what you want and then seeing the rsync command it will run, which makes it a handy learning tool too.

The home of grsync is here:

http://www.opbyte.it/grsync/

For Debian and Ubuntu users, you can find it in the repositories.
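The package name may change over time, but on a typical Debian or Ubuntu box installing it should be as simple as:

sudo apt-get install grsync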

Best Backup Tool For Budding Networks

One thing I have been doing for many years now, decades even, is backing up and restoring data. It’s easy to back up stuff at home: simply copy your valuable bits and bytes to an external hard drive or write them to a CD-R or DVD-R. This makes backups easy, though a bit cumbersome, especially as hard drives and data requirements get bigger and bigger. Think about all the digital content we have nowadays versus just a few years ago. Movies, music, games and more that get purchased and downloaded right off the net mean more and more gigabytes to back up.

Screenshot of BackupPC software

Still though, for saving the critical stuff like documents and pictures, CDs and DVDs are OK. But what about when you have more than one computer? With prices falling and computer technology getting more and more prevalent in the home, it’s not uncommon for households to have at least two computers in the form of a desktop and a laptop. However, I am seeing more and more households with computers for mom and dad, the kids, the grandparents, and then some laptops on top of all that! Whew! Now we are getting into one major pile of work to try and back all that up.

In the commercial world where you are backing up a data center full of servers and/or cubes laden with workstations, you buy commercial software like Veritas Backup Exec or NetBackup or Arcserve, etc.  Throw your data onto tapes inside a robotic tape library and manage it all from one central console.  Now, that’s all well and good, but it’s very expensive and doesn’t exactly fit in the average home very well.

So, where does that leave people like me, and I am sure many of you out there, that still have several computers to back up? We are caught in a kind of in-between place. Well, I am going to share some good stuff that I have found, and actually have been very impressed with. Continue reading

Rsync R Your Friend

Need to sync some files? Locally or remotely? How about re-thinking an old friend, rsync?

You may be like I was, and have discounted rsync for a long time due to the security risk imposed by running the “r” daemon on your servers. Guess what? You can not only use rsync to sync up local directories on the same server (this can be real handy for backups), but you can also sync from one server to another via SSH rather than the rsync daemon. This would be much like scp, only you can sync whole directory trees.

So. Let’s say you want to sync two local directories, how would you do that? Well, if we are syncing /export/datadir to /export/backupdir it would look something like this:

rsync -aruv /export/datadir/ /export/backupdir/

It’s just that easy. Now, those command line switches, what do they do? Check it out:

a = archive
r = recursive
u = skip files that are newer on the receiving end
v = verbose, tell me what's going on

There is another switch that is good when syncing between two separate servers, and that is the “z” switch. This tells rsync to use compression during the file transfer, thus saving bandwidth. Let’s see what the above would look like from one server to another, as if you are running the command from the server you are syncing to:

rsync -aruvhz --progress server1:/export/datadir/ /export/backupdir/

There are a couple of other options there, did you notice? I have added the “h”, which tells rsync to output information in human-readable format (GB, MB, K, etc.), and the --progress option, which tells rsync to report exactly that: the progress of each transfer. You can use these with local transfers too, so mix and match as you see fit.
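The same switches work in the other direction as well. For example, to push from the local box out to a remote server over SSH (the server name here is just a placeholder), something like this should do the trick:

rsync -aruvhz --progress /export/datadir/ server2:/export/backupdir/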