The problem is straightforward to solve by taking advantage of the "If-Modified-Since" header from RFC 2616, section 14.25: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html

First, set the "Last-Modified" header on your web server, as explained here: http://kb.siteground.com/article/LastModified_HTTP_header_explained.html

Then have a robot ping the server at a reasonable interval and trip an alarm whenever your pages come back "replaced." A break-in will set off that alarm. It's what we do, and what many others do as well, to be notified of a potential attack on any of their sites. If your webmaster changes pages often, you can even embed unique values in the pages and have the robot query for them over HTTP as additional insurance, so the alarm doesn't go off on every legitimate edit.

There are many ways to mitigate the damage, or at least get a heads-up, if you're compromised. That's the kind of stuff we normally write about over at "the Island."
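As a rough sketch of what such a robot looks like, here is a minimal Python version. The timestamps and the tiny local server are placeholders standing in for your real site; the key idea is that a conditional GET returns 304 Not Modified while the page is untouched, and a fresh 200 (with a new Last-Modified) is what should trip your alarm:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the page's real modification time on your server.
PAGE_MTIME = "Tue, 01 Apr 2014 10:00:00 GMT"

class Handler(BaseHTTPRequestHandler):
    """Toy server that honors If-Modified-Since per RFC 2616 s14.25."""
    def do_GET(self):
        if self.headers.get("If-Modified-Since") == PAGE_MTIME:
            self.send_response(304)          # client's copy is still current
            self.end_headers()
        else:
            self.send_response(200)          # full page + its Last-Modified
            self.send_header("Last-Modified", PAGE_MTIME)
            self.end_headers()
            self.wfile.write(b"<html>home page</html>")

    def log_message(self, *args):
        pass  # keep the demo quiet

def page_changed(url, last_seen):
    """Conditional GET. Returns (changed?, newest Last-Modified value)."""
    req = urllib.request.Request(url, headers={"If-Modified-Since": last_seen})
    try:
        with urllib.request.urlopen(req) as resp:
            # Full 200 response: the page is newer than what we last saw.
            return True, resp.headers.get("Last-Modified", last_seen)
    except urllib.error.HTTPError as e:
        if e.code == 304:
            return False, last_seen          # Not Modified: no alarm
        raise

# Spin up the toy server on an ephemeral port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

# First probe: our stored timestamp is stale, so the page looks "changed";
# record the fresh Last-Modified for the next round.
changed, stamp = page_changed(url, "Mon, 01 Jan 2001 00:00:00 GMT")
print(changed)   # True

# Subsequent probes: 304 means all is well; a 200 here trips the alarm.
changed, _ = page_changed(url, stamp)
print(changed)   # False
server.shutdown()
```

A real robot would run this in a loop (cron works fine), alert via email or SMS on an unexpected 200, and update the stored timestamp only after a human confirms the change was legitimate.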