Getting rid of the hacked files and spam links wasn’t the end of it
Dreamhost notified me that the load on my server was excessive and that they’d disabled StorageMojo.
Yikes! Had I been hacked again? A DDoS attack? What?
Building the correct mental model
In short order I brought up my SFTP client, my tracking site, the Dreamhost web panel and my son on chat. He had me toss a new index.html file into the site folder to let people know that the problem was being addressed.
On to problem solving
It took a while to figure it out because I’d never seen it before.
The load was coming from Google referrals for charming search terms that I’m going to misspell on purpose in hopes of not attracting similar traffic:
- download sh*mail
- downlode free 1ndian s3x movies
- pharmasuitical affiliate prom0
- 0rgish/behe*ding
- h1nd1 p0rn m0v1es
*Lots* of pee-oh-rn requests for many different ethnic types. Some things are universal – at least among guys.
There were no hacked files left on StorageMojo – I’d removed them all last week and they were still gone. But the tracking site was still logging requests for them, so for a while I thought they were there and for some reason I just couldn’t see them.
But then my son checked what happened when someone tried to follow one of the spam links. The site was delivering a “system error” message – not the static 404 page I’d expected – so the spam content wasn’t being served and really was gone. Presumably the processing behind that “system error” page created much of the extra load Dreamhost was seeing.
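If you want to run the same check against your own site, here is a minimal sketch in Python (the URL is a made-up placeholder, not one of the actual spam links):

```python
# Check what the server actually returns for an old spam URL.
# The URL below is a made-up placeholder, not a real link.
import urllib.request
import urllib.error

url = "https://storagemojo.com/some-old-spam-page/"

try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        # A 200 here would mean the spam content is still being served.
        print(resp.status, resp.reason)
        print(resp.read(200).decode(errors="replace"))
except urllib.error.HTTPError as e:
    # 4xx/5xx responses land here. A static 404 is cheap for the server;
    # a dynamically generated "system error" page is not.
    print(e.code, e.reason)
except urllib.error.URLError as e:
    print("Could not connect:", e.reason)
```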
For a while StorageMojo was getting thousands of hits an hour from these Google referrals. At some point Google must have re-crawled the site, seen that the content was no longer there, and stopped referring people.
Not a moment too soon!
So what was this all about?
My son hypothesized:
This looks like a two-step scheme…step one is that they hacked your site and got all those bad SEO files uploaded. Step two is to send lots of fake Google traffic through your site to increase PageRank.
Then I went one step further and checked out one of the spam pages that Google had cached. In big bright colors it told me that my XP system was infected with viruses and I should download their *free* virus scanner.
Whoa, scary. Except I’m on a Mac.
Botnet recruitment? I don’t know.
The StorageMojo take
I’ve made a number of changes to tighten up StorageMojo. As I researched this I found that there are many security “folk remedies” out there, but very little on what the high-priority issues are.
Keeping software up to date seems to be the critical success factor – and, sad to say, I’d been lax. In addition to keeping current, I’m now checking my site files more often, among other changes.
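One low-tech way to do that kind of check, sketched here with a placeholder path standing in for wherever your site lives, is simply to flag any file modified in the last few days:

```python
# Flag site files modified in the last N days - a crude but quick way to
# spot files an attacker may have touched. SITE_ROOT is a placeholder path.
import os
import time

SITE_ROOT = "/home/example/storagemojo.com"   # placeholder, not the real path
DAYS = 7

cutoff = time.time() - DAYS * 24 * 3600

for dirpath, _, filenames in os.walk(SITE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            mtime = os.path.getmtime(path)
        except OSError:
            continue
        if mtime > cutoff:
            print(time.ctime(mtime), path)
```

Run from cron, it at least surfaces files that changed when you weren’t making changes.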
Hopefully these requests will tail off as Google stops referring people. And StorageMojo can go back to being a quiet little site.
Thank you for your patience.
Comments welcome, of course.
I am glad that you shared all this because it has made me so much more aware of the risks to my own sites.
Running a web *application* clearly isn’t your core business. Why wouldn’t you outsource to, say, Blogspot (Google)? The cost is minimal (actually probably far lower, given how much time and effort you’ve just wasted), and the provider worries about the security/patching/uptime.
What’s the new B-school mantra? Something like “if it’s an aspect of your business that faces competitive pressure, you must keep it yourself; outsource everything else with a positive ROI.” Robin, your *content* is your core business, not running the website itself, right?
Tripwire (http://www.tripwire.com/products/enterprise/ost/) or its ilk is your friend for things like this. (I don’t know if your host allows/provides it; I use it on internal Linux servers.)
It will take a “snapshot” of everything on your system, then alert you to anything that has changed. I don’t believe it will prevent attacks, but it is certainly useful for finding out which files have changed, failed logins, etc. It helps you catch problems more quickly and ensure that you have cleaned up everything.
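The same snapshot-and-compare idea can be sketched in a few lines of Python; this is just the concept, not Tripwire itself, and the path is a placeholder:

```python
# Sketch of the snapshot-and-compare idea (the concept, not Tripwire itself):
# hash every file, save a baseline, then report anything added/removed/changed.
import hashlib
import json
import os
import sys

SITE_ROOT = "/home/example/storagemojo.com"   # placeholder path
BASELINE = "baseline.json"

def snapshot(root):
    hashes = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    hashes[path] = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue
    return hashes

current = snapshot(SITE_ROOT)

if not os.path.exists(BASELINE):
    with open(BASELINE, "w") as f:
        json.dump(current, f)
    print("Baseline written:", len(current), "files")
    sys.exit(0)

with open(BASELINE) as f:
    old = json.load(f)

for path in sorted(set(old) | set(current)):
    if path not in old:
        print("ADDED   ", path)
    elif path not in current:
        print("REMOVED ", path)
    elif old[path] != current[path]:
        print("CHANGED ", path)
```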
Hi Robin,
Sorry to hear about your troubles here, but it’s certainly a sobering & educational reminder of the realities of web “commerce” in this day and age!
At the risk of trivializing your technical problem space, let me share a simplistic best-practice from my early days of selling NetApp filers to Web 1.0 companies:
– Run your site from a locked-down storage-based read-only snapshot
Most blog pages are read-mostly, write-sometimes (posts, comments, etc.) workloads, which are still well suited to an update process that basically appends HTML changes to a “safe” edit area, which can subsequently be re-snapshotted and re-served back into the “wild”.
Best of all, you can build this yourself with ZFS if you’d like. Or perhaps borrow Ryan’s sage advice above and outsource the whole thing. A halfway compromise, of course, would be to merely outsource the storage piece to a proven web storage supplier instead of building your own!
Sorry, couldn’t resist! 😀
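For the curious, the ZFS mechanics behind that snapshot-and-re-serve idea could look roughly like this; the dataset names are made up, and it assumes a host where you actually control ZFS and the web root:

```python
# Rough sketch of the "serve from a read-only snapshot" idea with ZFS.
# Dataset names are made up; assumes root access on a ZFS-capable host,
# and that the web server can be pointed at the clone's mountpoint.
import subprocess
import time

EDIT_DATASET = "tank/web-edit"   # writable area where posts/comments land
LIVE_DATASET = "tank/web-live"   # read-only clone the web server serves from

def zfs(*args, check=True):
    return subprocess.run(["zfs", *args], check=check)

snap = f"{EDIT_DATASET}@publish-{int(time.time())}"

zfs("snapshot", snap)                            # freeze the current edit area
zfs("destroy", "-r", LIVE_DATASET, check=False)  # drop the old clone, if any
zfs("clone", snap, LIVE_DATASET)                 # re-serve the new snapshot...
zfs("set", "readonly=on", LIVE_DATASET)          # ...locked down read-only
```

In practice the web server’s document root would point at the live clone’s mountpoint, while posts and comments keep landing on the writable edit dataset.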
Well done!
I applaud your skills, tenacity and success, but I second ValB’s and Ryan’s suggestions.