Keeping Your Database Safe

If you are involved with the Internet in any way beyond simple observation, there is a high likelihood that you work with a database. Whether you are an application developer or a weekend blogger, losing your database can mean many days spent recovering data, if it can be recovered at all.

I work on many custom apps as well as managing client WordPress installs. One fact of life I've come to accept is that eventually you will lose a database. It is not a matter of if, but when, and my professionalism (yours as well) will be gauged by how we handle a failure.

I've developed a database management routine, and I present it here in the hope that it helps you as well.

The first thing to remember is that backing up databases consumes resources, and resources directly incur cost. Choosing the correct management routine is a walk on a tightrope suspended between two pillars: cost and importance. So for blogs with a moderate following and one or two new posts each week, I set up database backups on a weekly schedule. Mission-critical apps, on the other hand, receive daily backups.
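As a rough sketch, those two schedules might look like this in a crontab (the script names and times here are illustrative placeholders, not from an actual install):

```
# m  h  dom mon dow  command
0    3  *   *   0    /usr/local/bin/backup_blog.sh      # weekly: Sundays at 03:00
0    3  *   *   *    /usr/local/bin/backup_critical.sh  # daily: every night at 03:00
```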

Regardless of the schedule chosen, the actual backup is performed by either a Perl or PHP script which dumps the database, gzips the SQL file, and FTPs (or SFTPs) that file to a third-party server. We schedule this script to run with cron.
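A minimal shell sketch of that dump-gzip-transfer routine looks like the following. Every name in it (the database, the backup user, the off-site host) is a placeholder of my own, and it assumes a MySQL database; adapt it to your setup:

```shell
#!/bin/sh
# Sketch of a dump -> gzip -> off-site transfer backup. All names below
# (myapp, backup, offsite.example.com) are hypothetical placeholders.

DB_NAME="myapp"
DB_USER="backup"
REMOTE="backups@offsite.example.com"

# Build a date-stamped filename so successive runs don't overwrite each other.
dump_name() {
  printf '%s-%s.sql.gz' "$1" "$(date +%Y-%m-%d)"
}

run_backup() {
  file="/tmp/$(dump_name "$DB_NAME")"
  # Pipe the dump straight into gzip; no uncompressed copy touches the disk.
  mysqldump --user="$DB_USER" "$DB_NAME" | gzip > "$file"
  # Push the archive off-site; sftp reads its commands from stdin.
  printf 'put %s backups/\n' "$file" | sftp "$REMOTE"
}

# Invoked from cron, e.g. nightly:  run_backup
```

The date stamp in the filename is what makes the rotation discussed below possible: each day's archive is a distinct file.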

It doesn't matter what type of server you send the zipped file to; the point is to store it outside your production server's data center. This will protect your data in case of a catastrophic physical disaster at the production data center.

In my particular case, I send backups to an account in the Rackspace Cloud. One could just as easily use Amazon or Media Temple, or any traditional server for that matter. I simply choose a cloud environment because data is stored across the cloud rather than on one physical box. This further insulates my data from the effects of physical damage to disks, etc.

It's also important to manage these backups after you initialize the schedule. Decide on one day each week to review them. For my mission-critical backups, I set aside one day to unzip the latest backup and confirm the data is correct. This may seem like overkill, but consider a circumstance in which an update applied to your server causes an error in your backup script, and from that point forward your backups contain only half the data. Try explaining that to your clients/customers.
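That weekly review can be partially scripted. A small helper like the one below (my own sketch, not part of the routine above) checks the archive's integrity and shows the first few lines so you can eyeball that it actually contains SQL:

```shell
#!/bin/sh
# Sanity-check a gzipped SQL dump before trusting it.
check_backup() {
  # gzip -t verifies the archive's integrity without extracting it.
  gzip -t "$1" || { echo "corrupt archive: $1" >&2; return 1; }
  # Peek at the first lines to confirm it really holds SQL statements.
  gunzip -c "$1" | head -n 5
}

# Example:  check_backup /backups/myapp-2024-01-01.sql.gz
```

An integrity check alone won't catch the half-a-database scenario, so it supplements the manual review rather than replacing it.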

It's also important to define your backup rotation up front and stick with it. How many days, weeks, or months of backups will you keep? Remember, the more data you retain, the more real cost you will incur. On the upside, being able to reset the database to exactly four days back when requested is very fulfilling.
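With date-stamped archives, enforcing the rotation can be a one-line `find`. The 28-day window and directory below are illustrative choices of mine, not a recommendation from the routine above:

```shell
#!/bin/sh
# Delete backup archives older than the retention window.
RETENTION_DAYS=28   # hypothetical window; pick one and stick with it

prune_backups() {
  # -mtime +N matches files last modified more than N days ago.
  find "$1" -name '*.sql.gz' -type f -mtime +"$RETENTION_DAYS" -delete
}

# Example:  prune_backups /backups
```

Running this from the same cron schedule as the backup itself keeps disk usage, and therefore cost, bounded.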

Fast ImageMagick Install on Ubuntu


So I just had to tackle adding ImageMagick to a new Ubuntu server, more specifically the PerlMagick bindings, as the client's app is written in Perl.

So here’s the deal. Your first impulse might be to use the beloved CPAN command. Don’t! CPAN was my first try as well; normally it’s awesome about taking care of dependencies, but in this case it was not.
Use Aptitude, Ubuntu’s kick butt package manager. Two lines you’ll need:
  1. aptitude install imagemagick
  2. aptitude install perlmagick
Done. Be sure to use the full “aptitude” command, as it handles dependencies a bit better than apt-get.
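Once both packages are in, a quick check (my addition, not part of the original steps) confirms the Perl bindings actually load; the version printed will vary by release:

```shell
# Ask Perl to load Image::Magick and report its version. A version line
# means PerlMagick and the ImageMagick libraries are wired up correctly;
# the fallback message means the install didn't take.
perl -MImage::Magick -e 'print "Image::Magick $Image::Magick::VERSION\n"' 2>/dev/null \
  || echo "Image::Magick not found"
```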