The Joomla Honeypot Project Experiment

At itoctopus, we are paranoid about the security of our managed clients’ websites – as such, we are always researching new ways to better protect these websites against potential exploits.

Last week, we conducted a honeypot project on one of the largest websites that we manage. For those of you who don’t know what a honeypot project is, it is, in essence, the capture and analysis of the malicious traffic that a website receives in order to better protect it. Typically, a honeypot project consists of a bait to lure attackers (the same way honey lures bears), but, in the case of a very large, very high traffic website, an artificial bait is not necessary since the importance of the website is in itself a bait.

Before going into the details, we need to explain the security setup of the website:

  • The website’s backend was protected with an .htaccess password: This method (described here) effectively blocks all types of attacks on the backend. This means that we can completely forget about attacks on the backend and focus on the attacks targeting the frontend.
  • The “index.php” file was the only file allowed to be directly executed by the Apache web server (a minimal sketch of this rule follows this list): This ensures that any attack on the website must be funneled through the index.php file first, since direct access to every other PHP file on the Joomla website is blocked. This is super important since, as you will find out below, we are doing all the work in the defines.php file, which is loaded by the index.php file.
  • ModSecurity was enabled on the server: This means that many known attacks are blocked before even reaching the Joomla instance, which allows us to focus on new types of attacks.
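
Here is a minimal sketch of the second rule above, in Apache 2.4 syntax (this is an illustration only – the actual rules on the website were more involved, and the backend password protection is not shown):

<FilesMatch "\.php$">
	Require all denied
</FilesMatch>
<FilesMatch "^index\.php$">
	Require all granted
</FilesMatch>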

Now that we have explained the existing security setup of the website, we can start explaining our little honeypot project experiment. Are you ready? Let’s go!

Since Joomla loads the defines.php file before anything else (even before executing any serious code), we decided to place the code that captures all the data in the requests sent to the website in that file. So, we created a defines.php file in the root directory of the website and added the following PHP code to it:

<?php
	// Log file where every incoming request is recorded (direct web
	// access to it is blocked, since only index.php can be executed)
	$file = 'project-honeypot.php';
	$projectHoneypotData = 'Date: '.date("Y-m-d H:i:s")."\n";
	$projectHoneypotData .= 'IP: '.$_SERVER['REMOTE_ADDR']."\n";
	$projectHoneypotData .= '--SERVER Data--'."\n";
	$projectHoneypotData .= print_r($_SERVER, true);
	if (!empty($_GET)){
		$projectHoneypotData .= '--GET Data--'."\n";
		$projectHoneypotData .= print_r($_GET, true);
	}
	if (!empty($_POST)){
		$projectHoneypotData .= '--POST Data--'."\n";
		$projectHoneypotData .= print_r($_POST, true);
	}
	if (!empty($_FILES)){
		$projectHoneypotData .= '--FILES Data--'."\n";
		$projectHoneypotData .= print_r($_FILES, true);
	}
	// Append this request's data to the log; LOCK_EX prevents
	// interleaved writes from concurrent requests
	file_put_contents($file, $projectHoneypotData, FILE_APPEND | LOCK_EX);
?>

We saved the defines.php file, uploaded it back, and then left the website alone until the next day.

The next morning, we checked the project-honeypot.php file, and we were amazed by the amount of garbage that hit the website: malicious referrers/user agents (that were not caught by ModSecurity), weird GET parameters, base64-encoded POST values (some of these POST requests weren’t even relevant, since they were specially crafted for WordPress websites), and many, many attempts to upload hacked files.

The amount of data in that file was overwhelming. From there, we had two options: either blacklist all the bad requests that we found, or whitelist the good requests. We decided to go with the latter, simply because whitelisting is a much more efficient process than blacklisting, as blacklisting means that we would need to continually monitor the website for future bad requests.

So we checked all the clean requests on the website, whitelisted them all, and blocked everything else (essentially, any request that was not on the whitelist was 403’d). For example, a valid POST request was one where $_POST['option'] was set to 'com_search' (this is sent when someone searches for something on the website), so we whitelisted it.
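
To illustrate, here is a minimal sketch of what such a whitelist check might look like at the top of the defines.php file (the $allowedOptions list is hypothetical – the real whitelist was built from the website’s own clean traffic and also covered GET requests):

<?php
	// Hypothetical whitelist: only known-good 'option' values are
	// accepted in POST requests; everything else is 403'd
	$allowedOptions = array('com_search');
	if (!empty($_POST)){
		if (!isset($_POST['option']) || !in_array($_POST['option'], $allowedOptions, true)){
			header('HTTP/1.1 403 Forbidden');
			die();
		}
	}
?>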

Still, even after whitelisting what should be whitelisted, we kept recording all the whitelisted requests to the project-honeypot.php file. After 24 hours, we checked that file again, and we noticed that the absolute majority of the whitelisted requests were OK – however, there were a few bad requests among them. So, we had to blacklist those bad requests (at a granular level, any security work must include both whitelisting and blacklisting – whitelisting alone is not sufficient). We repeated this process for a few days, until we were sure that almost all malicious requests were being blocked by the server.
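
For example, a granular blacklist rule layered on top of the whitelist might look like this (the searchword check is a hypothetical illustration of a bad request hiding inside otherwise whitelisted search traffic):

<?php
	// Hypothetical granular blacklist: reject a whitelisted search
	// request when its search term looks like an injection attempt
	if (isset($_POST['option']) && $_POST['option'] == 'com_search'){
		if (isset($_POST['searchword']) && stripos($_POST['searchword'], 'base64_decode') !== false){
			header('HTTP/1.1 403 Forbidden');
			die();
		}
	}
?>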

At the same time, we monitored the Apache logs for legitimate requests that were 403’d (in other words, false positives blocked by our whitelisting), and we did find a couple, which we immediately whitelisted.
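
For example, assuming Apache’s default combined log format, listing the blocked requests is as simple as grepping for the 403 status code (the log path below is a placeholder):

grep ' 403 ' /path/to/apache/access_log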

Now, the million dollar question is: how effective was all this?

We can safely say that the work was very effective – we were much more comfortable with the security of the website, and we had no doubt in our minds that the website was much more secure than before. The whole experiment was so successful that we were confident enough to recommend this process to our other managed clients.

Was there a performance hit caused by the security work above?

Not at all – in fact, there was a performance gain! Yes – you heard that right – there was a performance gain! And that’s because all those blocked requests were no longer loading the Joomla environment (this reminds us of a previous post of ours where we explained how not to load the Joomla environment for non-existent files).

We hope that you found our post instructive and useful, and we hope that you will benefit from it to supercharge the security of your Joomla website. If you need help in doing that, then look no further. Just contact us and let us do the work for you quickly, cleanly, and affordably!
