Backup and Compress a MySQL Database With One Command

I’ll show you how to restore the backup, too!

I like to use simple bash scripts to do various tasks. Backing up MySQL is one of them.

I recently decided to start compressing my MySQL backups, as I started including all databases in one fell swoop. I use bzip2 to compress the .sql files produced by mysqldump. bzip2 is standard on pretty much every *nix operating system, so you likely won’t need to install it.

I’m also using Tarsnap ("online backups for the truly paranoid") for my backups now, which is a great service, btw. So cutting the size down on the backups sent to Tarsnap will save me a bit of money. I’ll be doing an article later on that focuses entirely on Tarsnap; I’m pretty in love with it.

Anyway, here’s the command I use to backup and compress all databases on my MySQL server:

mysqldump -uroot -p --opt --all-databases | bzip2 -cq9 > /home/tyler/mysql-backups/backupname.sql.bz2

That will create a backup of all databases. In the -cq9 piece of the bzip2 command, -c writes the compressed output to stdout (so it can be redirected to a file), -q tells bzip2 to be quiet, and the number, 9, specifies the compression level bzip2 should use (9 is the maximum).

The script embedded in the Gist below is what I use to back up all my databases to /home/tyler/mysql-backups/, and then that folder gets backed up to Tarsnap.
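A minimal sketch of such a script might look like this. The backup directory and date-stamped filename mirror the command above, but the exact names and the "is mysqldump installed" guard are my own assumptions; in a real cron job you would also keep credentials in ~/.my.cnf rather than rely on -p's interactive prompt:

```shell
#!/usr/bin/env bash
# Sketch of an all-databases backup script; paths and filenames are assumptions.
BACKUP_DIR="$HOME/mysql-backups"
OUTFILE="$BACKUP_DIR/all-databases-$(date +%Y-%m-%d).sql.bz2"

mkdir -p "$BACKUP_DIR"

if command -v mysqldump >/dev/null 2>&1; then
    # Credentials are expected in ~/.my.cnf here, since -p would prompt
    # interactively and hang a cron job.
    mysqldump --opt --all-databases | bzip2 -cq9 > "$OUTFILE" \
        || echo "backup failed" >&2
else
    echo "mysqldump not found; skipping backup" >&2
fi
```

The date stamp keeps each day's backup in a separate file, which plays nicely with incremental services like Tarsnap.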

To restore the database, you’ll want to bunzip2 the .sql.bz2 file first:

bunzip2 backupname.sql.bz2

That will leave you with a backupname.sql file. Then bring the resulting .sql file into MySQL like so (note there’s no space between -p and the password; leave the password off entirely to be prompted for it instead):

mysql -uroot -pyourpasswordfornoprompt < backupname.sql

The databases will need to either already exist, or there will need to be CREATE DATABASE statements in the .sql file (dumps made with --all-databases include them by default). It’s up to you. I like to create my databases beforehand, but it’s just personal preference.
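If you’d rather not leave an uncompressed .sql file lying around, you can also decompress on the fly and pipe straight into mysql. Here’s a sketch; the tiny stand-in dump file is just for illustration, and the commented mysql invocation assumes the same -uroot login as above:

```shell
# Create a tiny stand-in backup so the pipeline below has something to chew on.
printf 'SELECT 1;\n' | bzip2 -cq9 > backupname.sql.bz2

# -c decompresses to stdout and leaves the .bz2 file in place.
# Against a real backup you would pipe this into mysql:
#   bunzip2 -c backupname.sql.bz2 | mysql -uroot -p
bunzip2 -c backupname.sql.bz2
```

This skips the intermediate .sql file entirely, which is handy when disk space is tight.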

That’s all there is to it. How do you take care of your backups? I’d love to hear how others are doing it. Comments are open.

There’s a thread going on at Hacker News, too.

Update: Made a slight modification to the code and gist suggested by sluggo.

Update 2: HackerNews user Nanzikambe suggested the method above will destroy disk I/O on your server. He suggests using ZFS snapshots instead. The example he posted is in the Gist below, and includes the ability to send the backup to a remote server. A good tutorial on backing up MySQL using ZFS can be found here.


Aggregate System and Application Logs with Papertrail

Frustration-free log management

I’ve been using Papertrail for a few months now, and absolutely love it. Being able to search logs across all my servers at once is crazy nice.

I can even get alerts when someone logs in via SSH, which, by itself, has made Papertrail well worth it.

A non-production server of mine was compromised, due to a since-rectified configuration issue. Papertrail notified me almost immediately, so I was able to act before any real damage was done.

There’s a variety of pricing plans, and there’s even a free-for-life plan, which includes plenty of features for most folks. I’m currently on the free plan, but plan on upgrading soon; I’m adding more servers and will need the extra space at Papertrail.

Do you use a log management service?


In addition to collecting logs from your servers, you can also send logs from your applications. Got a PHP application that’s erroring out for some reason? You can send that error to Papertrail for later investigation.

Same deal with Apache logs, MySQL logs, and pretty much every other piece of software that generates logs.

There aren’t many limits on what you can configure Papertrail to do for you. It’s very powerful.

I suggest you give it a try. Installation is super easy, especially if you’re using rsyslog. Below is a screenshot of their installation instructions. Doesn’t get much easier than that.
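For the curious, the rsyslog side of it amounts to appending a single forwarding rule to /etc/rsyslog.conf and restarting rsyslog. The host and port below are placeholders, since Papertrail assigns each account its own endpoint:

```
*.*    @logs.papertrailapp.com:12345
```

A single `@` forwards over UDP; `@@` would forward over TCP instead.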


Change your WordPress Theme Favicon

Really easy to do, but rarely done

Not sure how to create your favicon? I typically use GIMP, along with this tutorial. It has some good info on creating “proper” favicons, too, so it’s a worthwhile read even if you aren’t creating a favicon at the moment.

I like X-Icon Editor, too (it’s in the screenshot attached to this post), and there’s tons of converters for creating .ico favicon files.

Just modify the following and add it to your theme’s functions.php file, or add it to your WordPress functionality plugin.

function my_favicon() { ?>
<link rel="shortcut icon" href="/path/to/favicon.ico" >
<?php }
add_action('wp_head', 'my_favicon');

Change /path/to/favicon.ico to point to your favicon.ico file and you’re done!


Loading Scripts in WordPress Themes and Child Themes, Quickly

Here’s how I load scripts and styles in WordPress themes, in functions.php.

function load_scripts() {
    // load styles
    wp_enqueue_style( 'custom-css-1', get_stylesheet_directory_uri() . '/css/custom-css-1.css' );
    wp_enqueue_style( 'my-secret-script-css', get_stylesheet_directory_uri() . '/css/my-secret-script.css' );

    // load javascript
    wp_enqueue_script( 'my-secret-script', get_stylesheet_directory_uri() . '/js/my-secret-script.js' );
}
add_action( 'wp_enqueue_scripts', 'load_scripts' );

The above code would load up two CSS files, custom-css-1.css and my-secret-script.css. It would also load one JavaScript file, my-secret-script.js.

This post by Brian Krogsgard at WPCandy describes the proper way of loading scripts, using wp_register_script, as it takes dependencies into consideration and allows loading of scripts by their defined name. However, I think the wp_register_script methodology is lengthier and a bit more difficult to read, mostly due to the dependency stuff.

However, you do need to take dependencies into consideration if you’re using a JavaScript library, like jQuery.

wp_register_style should also be used to load styles with dependencies. This article at Tuts+ has a very good description and examples on loading CSS stylesheets using wp_register_style and wp_enqueue_style.

You should also be aware of the differences when authoring themes versus plugins, which Brian also explains in his post on WPCandy. That mostly involves swapping one WordPress function out for another.

Let me know if you run into any issues, comments are open!


Record Speedtest Results From the Command Line

For those who live in the command line as much as possible

Install speedtest-cli

speedtest-cli is a Python app that provides a command line interface for testing bandwidth using speedtest.net. Installation is simple, and it should work on Linux and OS X.

The Bash Scripts

There are two scripts, and they’re pretty self-explanatory. Results from the full-output script are stored in st_results in the current working directory; results from the simple-output script are stored in st_results_simple, also in the current working directory.

Results

Below are the results of two speedtests run with the full-output script, along with the sharing image URL.

Retrieving configuration…
Retrieving server list…
Testing from Mediacom Communications (…
Selecting best server based on ping…
Hosted by CHRJO (Council Bluffs, IA) [204.50 km]: 20.981 ms
Testing download speed………………………………….
Download: 32.44 Mbit/s
Testing upload speed…………………………………………..
Upload: 5.57 Mbit/s
Share results: .

Retrieving configuration…
Retrieving server list…
Testing from Mediacom Communications (…
Selecting best server based on ping…
Hosted by American Broadband (Blair, NE) [213.79 km]: 20.981 ms
Testing download speed………………………………….
Download: 33.03 Mbit/s
Testing upload speed…………………………………………..
Upload: 5.49 Mbit/s
Share results: .

You can get less verbose output by modifying the speedtest-cli flags. speedtest-cli --simple --share will produce very simple results that are a bit easier to read. Two tests with the simple script are below. Note the absolute crap speeds. Wonderful hotel wi-fi, hah!

Results

Ping: 13.351 ms
Download: 1.92 Mbit/s
Upload: 0.94 Mbit/s
Share results: .

Ping: 13.431 ms
Download: 1.80 Mbit/s
Upload: 0.93 Mbit/s
Share results: .

It’s a very, very simple way of logging the results, but it’ll do for most situations. When I get some free time this week, I’m going to combine the two scripts into one and make it accept a --simple argument to generate the simple log.
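Here’s roughly what I have in mind for the combined script. The log file names follow the ones mentioned above, but the overall structure and the "is speedtest-cli installed" check are my own sketch, not the finished script:

```shell
#!/usr/bin/env bash
# Sketch of a combined logging script; log file names match the ones above.
# Pass --simple for the short format, nothing for the full output.
if [ "${1:-}" = "--simple" ]; then
    FLAGS="--simple --share"
    LOG="st_results_simple"
else
    FLAGS="--share"
    LOG="st_results"
fi

{
    date                                   # timestamp each run
    if command -v speedtest-cli >/dev/null 2>&1; then
        speedtest-cli $FLAGS 2>&1          # capture errors in the log, too
    else
        echo "speedtest-cli not installed"
    fi
    echo                                   # blank line between entries
} >> "$LOG"
```

Appending rather than overwriting keeps a running history in each log file, which is exactly what you want once the results start going into SQLite.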

After that’s done, I’ll be dropping the results into a SQLite database. I’ve gotten pretty familiar with SQLite lately, so it shouldn’t be too difficult.

Once this stuff is logging to a SQLite database, I’ll put it up on GitHub. I can’t be the only one who would love to run SQL queries against this sort of personalized bandwidth data.

Update April 19, 2014: Updated the code in the GitHub Gist to include the date and time of the speedtest.