Russian Roulette in Bash

Wanna play a game?

[ $(( RANDOM % 6 )) == 0 ] && sudo rm -rf / --no-preserve-root || echo "*CLICK*"

Benchmarking: MD5 vs bcrypt


So I’ve recently been doing some tests, brute-forcing some hashes for two main reasons:

1. To see how well my hardware compares to others.

2. To find out for myself the effectiveness of secure password hashing between MD5 and bcrypt.

This post is not designed to be a guide on how to brute-force hashes.


MD5 hashing has been known to be a poor candidate for hashing secure passwords for some time, but developers continue to use it in their applications, unaware of the risks (or perhaps aware, but unconcerned).

One of the many failures of the MD5 algorithm is that it is designed to be very fast, and that’s exactly what you don’t need for secure password storage. On the other hand, bcrypt is designed to be a slower algorithm, which is far better for secure password storage.

But why is this? Well, the main problem with a fast hashing algorithm for secure password storage is that, with modern off-the-shelf hardware, it can take very little time to crack a hash via brute-force methods.


On my desktop, I have an AMD Radeon HD 6570 and an Intel Core i3-2100. They're not the fastest or the newest, but they're good enough for what I need daily.

So I tried brute forcing some hashes that I created to get some speed benchmarks and these are the results I got:

MD5 on GPU: 1243.3 MH/s (Million Hashes per Second)

MD5 on CPU: 27.55 MH/s

bcrypt on GPU: 543 H/s (Hashes per Second)

bcrypt on CPU: 64 H/s

You can see how much slower bcrypt is compared to MD5. Remember slower is better in this context.

What you can also see is how much quicker the GPU is at generating hashes compared to the CPU. This is because the GPU is a massively parallel processor: each candidate password can be hashed independently of the others, so thousands of hashes can be computed simultaneously across its many cores.

The following calculations show how long it could take to brute-force a hashed password containing 8 characters, with mixed upper and lowercase letters, and numbers. There are in total 218,340,105,584,896 different password combinations that could match this rule.

MD5 on GPU: ~2 days

MD5 on CPU: ~91 days

bcrypt on GPU: ~4,653,931 days

bcrypt on CPU: ~39,485,696 days

Although 2 days sounds like a long time to crack an MD5 hash, that's nothing compared to over 4 million days to crack bcrypt! Of course, these calculations don't take into account hash collisions, which MD5 is also well known for.
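These estimates come straight from dividing the keyspace by the hash rate. Here's a quick sketch in shell using the benchmark figures above (rounding will differ slightly from my numbers):

```shell
#!/bin/sh
# Worst-case brute-force time = keyspace / hash rate.
# Keyspace: 62^8 (a-z, A-Z, 0-9; 8 characters) = 218,340,105,584,896.
KEYSPACE=218340105584896

days() {
    # $1 = hash rate in hashes per second; prints worst-case days
    awk -v k="$KEYSPACE" -v r="$1" 'BEGIN { printf "%.0f\n", k / r / 86400 }'
}

echo "MD5 on GPU:    $(days 1243300000) days"
echo "MD5 on CPU:    $(days 27550000) days"
echo "bcrypt on GPU: $(days 543) days"
echo "bcrypt on CPU: $(days 64) days"
```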

You can also optimize the rules to reduce the number of combinations needed, or build a dictionary of the most commonly used passwords. You could also pre-calculate the hashes to make subsequent runs quicker.


For developers: Don’t use MD5 for secure password storage.

For crackers: Hashing on the GPU is by far quicker than on the CPU.

However, if you simply wanted a unique hash for a file, MD5 or SHA-1 might be a more suitable choice than bcrypt because of the speed difference.
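For example, fingerprinting a file from the shell, where the speed of MD5 and SHA-1 is a feature rather than a flaw:

```shell
#!/bin/sh
# Fast one-off file checksums (coreutils md5sum/sha1sum)
FILE=$(mktemp)
echo "hello" > "$FILE"

md5sum "$FILE"   # 32 hex characters, computed near-instantly
sha1sum "$FILE"  # 40 hex characters

rm -f "$FILE"
```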

Further Reading

Brute-force attack

MD5 Security

Managing Web Projects with Git



All you need is a web server with SSH access and Git installed.


The plan here is to have a website with two repositories: a bare repository (which we're calling Hub) alongside a conventional repository (called Prime) containing the live site. Two Git hooks link the pair, pushing and pulling changes between them.

Updates can be made on the live site and pushed back to the Hub repository with a git push. The Hub repository can also be cloned to a local computer running Git, which can then push changes back to Hub in turn.

Getting Started

The first step is to initialize the Prime Git repository in the live website directory and commit the pre-existing web files.

cd htdocs
git init
git add .
git commit -m "Initial import of pre-existing website files."

We can now create the empty Hub repository outside the web root directory.

mkdir hub_repo.git
cd hub_repo.git
git --bare init

Then go back into the Prime repository to set up the remote and push Prime into Hub.

cd ~/htdocs
git remote add hub ~/hub_repo.git
git remote show hub
git push hub master

Hooking It Together

post-update – Hub repository

Create a file in hub_repo.git/hooks/ called post-update, enter the following, and make it executable (chmod +x post-update).


#!/bin/sh

echo "**** Pulling changes into Prime [Hub's post-update hook]"

cd $HOME/htdocs || exit
unset GIT_DIR
git pull hub master

exec git update-server-info

post-commit – Prime repository

Now move to the Prime repository, go to .git/hooks/, and create an executable file called post-commit (again, chmod +x post-commit) containing:


#!/bin/sh

echo "**** pushing changes to Hub [Prime's post-commit hook]"

git push hub

With this hook in place, all changes made to Prime’s master branch are immediately available from Hub.
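The whole round trip can be tried out locally before touching a real server. The sketch below simulates Hub, Prime, and a local clone in a temporary directory (all paths and the commit identity are placeholders): a push from the clone should land on the "live site" via Hub's post-update hook.

```shell
#!/bin/sh
# Simulate the Hub/Prime setup locally in a temp directory.
set -e
ROOT=$(mktemp -d)
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

# Prime: the live site, with one pre-existing file
mkdir "$ROOT/htdocs"
cd "$ROOT/htdocs"
git init -q
echo "hello" > index.html
git add .
git commit -qm "Initial import of pre-existing website files."
git branch -M master

# Hub: the bare repository
git init -q --bare "$ROOT/hub_repo.git"
git --git-dir="$ROOT/hub_repo.git" symbolic-ref HEAD refs/heads/master
git remote add hub "$ROOT/hub_repo.git"
git push -q hub master

# Hub's post-update hook pulls each push into Prime
cat > "$ROOT/hub_repo.git/hooks/post-update" <<EOF
#!/bin/sh
cd "$ROOT/htdocs" || exit
unset GIT_DIR
git pull -q hub master
EOF
chmod +x "$ROOT/hub_repo.git/hooks/post-update"

# A local clone edits the site and pushes through Hub...
git clone -q "$ROOT/hub_repo.git" "$ROOT/local"
cd "$ROOT/local"
echo "updated" > index.html
git add .
git commit -qm "Edit from the local clone"
git push -q origin master

# ...and the hook has already updated the live site
cat "$ROOT/htdocs/index.html"
```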


Write a program that prints the numbers from 1 to 100. But for multiples of three print “Fizz” instead of the number and for the multiples of five print “Buzz”. For numbers which are multiples of both three and five print “FizzBuzz”.


A straightforward procedural solution in PHP:

<?php
for ($i = 1; $i <= 100; $i++) {
    if ($i % 15 == 0)
        echo 'FizzBuzz';
    elseif ($i % 5 == 0)
        echo 'Buzz';
    elseif ($i % 3 == 0)
        echo 'Fizz';
    else
        echo $i;
    echo '<br>';
}


And an object-oriented version:

<?php
class FizzBuzz {
    public function __construct($start = 1, $end = 100) {
        for ($i = $start; $i <= $end; $i++)
            echo $this->calc($i), '<br>';
    }

    private function calc($i) {
        if ($this->modulus($i, 15))
            return 'FizzBuzz';
        elseif ($this->modulus($i, 5))
            return 'Buzz';
        elseif ($this->modulus($i, 3))
            return 'Fizz';
        return $i;
    }

    private function modulus($foo, $bar) {
        return ($foo % $bar == 0);
    }
}

new FizzBuzz;

Monitoring Website Uptime with a Bash Script

Several sites exist that will alert you when a website goes down; however, it is possible to do this yourself.

Here is a simple bash script which can be put into a cronjob to run at a set interval (say, every 5 to 15 minutes).


#!/bin/bash

SITES=""  # Separate multiple websites with commas.
EMAILS="" # Separate multiple email addresses with commas.

for SITE in $(echo $SITES | tr "," " "); do
    if [ ! -z "${SITE}" ]; then

        CURL=$(curl -s --head $SITE)

        if echo $CURL | grep "200 OK" > /dev/null; then
            echo "${SITE} responded with a 200 OK status."
        else
            SUBJECT="Looks like ${SITE} is down"
            MESSAGE="The web server for ${SITE} has failed to respond with a 200 OK status."

            for EMAIL in $(echo $EMAILS | tr "," " "); do
                echo "$MESSAGE" | mail -s "$SUBJECT" $EMAIL
                echo $SUBJECT
                echo "Alert sent to $EMAIL"
            done
        fi
    fi
done
Modify the parameters to suit your needs and save the script. Run it in the terminal to make sure it works correctly.


If something’s going amiss, you can test a website directly using:

curl --head <site>

A line mentioning HTTP/1.1 200 OK is what you need to look for.

Keep in mind that once the cronjob is set up, if your websites are on the same server as this script and the cause of downtime is that server going down, you won't receive the alert email.
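For reference, a crontab entry (edit with crontab -e) to run the script every ten minutes might look like this; the script path and log path are placeholders:

```shell
# m    h  dom mon dow  command
*/10   *  *   *   *    /home/user/check_sites.sh >> /home/user/check_sites.log 2>&1
```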

STALLMANQUEST: A tribute to Richard Stallman

Speeding Up PHP Applications

Nothing you probably haven’t heard before. Just some quick notes which might help speed up a PHP application and my opinions on them.

I tested these theories by taking the difference of microtime() at the start and end of execution over 100,000 loop iterations. This is in no way a complete test.
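The same start/end timestamp idea can be sketched in shell rather than PHP (this assumes GNU date, which supports nanoseconds; the no-op : stands in for the statement under test):

```shell
#!/bin/bash
# Time a statement by diffing timestamps around a 100,000-iteration loop.
start=$(date +%s%N)             # nanoseconds since the epoch (GNU date)
for i in $(seq 1 100000); do
    :                           # replace with the code being measured
done
end=$(date +%s%N)
echo "elapsed: $(( (end - start) / 1000000 )) ms"
```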

  • Using else if is slower than elseif. Doing a find and replace for all your code is a quick way to achieve a good speed up.

    Opinion: Yes, interestingly this makes a significant difference.

  • Using preg_replace adds extra processing time. Try to use the built-in string functions instead.

Opinion: Yes, of course this made a difference. Even stacking functions such as strip_tags and trim together was faster.

  • Use of require_once and include_once adds extra work checking to see if the file has already been required/included. Rework your code to use just require or include.

    Opinion: Yes this made a difference.

  • Try using single quotes (') rather than double quotes (") where possible, i.e. when not using line breaks or carriage returns. Also, when using echo to output several strings, try separating them with commas (,) rather than concatenating with periods (.).

    Opinion: This made a very small, negligible difference to processing time.

And that’s it for now…

Follow up: I found this

The Problem With Facebook by 2veritasium

Single Point of Failure: The Day Google Forgot To Check Passwords by Tom Scott