How Safe Is Your Website Or Blog? 3 Steps to a Better Night’s Sleep
By Scott Thomas
Would you know if your site had been hacked? Could you restore it if it had been hacked? These three steps will help you prepare for the worst.
If you follow technology news, it seems that high-visibility websites are being compromised (hacked) with astonishing frequency, to the point where at least one hacker group is using the threat of its attacks as a political weapon.
You might think that big sites, like major corporate or government sites, are so well protected that they can’t be hacked. You would be wrong. And what about small businesses? If anything, their sites are even less likely to be secured against threats, both because the business lacks the resources and because the hosting company controls the security of the site. So what can a business do to protect itself?
Here are three steps you can take today to reduce the pain of restoring your site, whatever the cause of the trouble.
1. Preparation: Make sure your host has regular backups
It seems like a no-brainer, really, but check with your hosting company: are they doing daily (nightly) backups? Most hosting companies use what are known as “virtual web servers” – several (sometimes many) websites are hosted on the same physical machine and actually share the same IP address (the numeric address used to find a site on the internet). If any one of the sites on that machine is compromised, the other sites are also at risk.
If they have to restore the websites from a backup, will it be restored as of the previous day? Or will it be from a week (or more) ago? If your site is updated regularly (which is a good thing!), a backup from more than one day ago may miss significant changes to your content.
2. Monitoring: Automatically monitor your website for changes
If you do nothing to monitor your website, you won’t know there is a problem until either you come across it, or (more likely) your potential guests spot it and mention it when they call. That could be quite some time after the problem occurs.
The solution is to create a way of having your site automatically monitored, so you can just check it daily to see if there have been changes made without your knowledge. One way to do that is to use a tool like Page2RSS. Put your website’s address (or any other page you would like to monitor) into the search box and click “To RSS”. Then take the resulting URL and add it to your favorite feed reader (if you don’t have one, either Google “feed readers” or just use Google Reader). Check the reader daily, and whenever there are any changes to your website, they will appear there. If you didn’t make the changes, you’ll know that someone else did!
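If you’d rather not depend on a third-party service, a small script can do the same job. Here is a minimal sketch, assuming a Linux or Mac machine with curl and a working mail command; the URL, folder, and email address are placeholders you’d replace with your own:

#!/bin/sh
# Sketch of a do-it-yourself change monitor (placeholder URL, folder,
# and address). Fetches the home page and compares it to the copy
# saved on the previous run.
URL="http://www.yoursite.com/"
SAVED="$HOME/site-monitor/homepage.html"
mkdir -p "$HOME/site-monitor"
NEW=$(mktemp)
curl -s "$URL" -o "$NEW"
if [ -f "$SAVED" ] && ! cmp -s "$NEW" "$SAVED"; then
    # The page differs from last time; send yourself an alert.
    echo "The page at $URL changed since the last check." | mail -s "Website change detected" you@example.com
fi
mv "$NEW" "$SAVED"

Run it nightly (from cron, for example) and you’ll get an email whenever the page changes, whether or not you remember to check a feed reader.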
3. Protection: Make your own backups
(a) Windows or Linux
Even though your web hosting company may be backing up your site regularly, things can go wrong when they restore the site from a backup. You can guard against this by making your own backups and storing them on your computer, an external drive, or an online storage service like Dropbox.
If you’re using Windows, the easiest way to back up your website is with the free utility Wget. (On Linux, Wget is usually included with the distribution, or can be easily installed with your package manager.) Detailed instructions on how to download your full website with Wget are here, and full documentation for Wget is here, but the short version is this: after installing Wget, open a terminal (command) window by typing “cmd” (without the quotes) into the Run or Search box in your Start menu (depending on your Windows version), then run:
wget --user-agent="Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/18.6.872.0 Safari/535.2" -m -k -K -E http://www.yoursite.com
This makes Wget identify itself to the web server as a Chrome browser (some sites block Wget), create a “mirror” or full backup of your site (-m), convert internal links so you can view the pages locally (-k), keep backup copies of the original pages whose links were converted (-K), and add an “.html” extension to any files that don’t already have one (-E). Adjust these options to your own preferences, of course.
Store that backup in a safe place, and set up the Windows Task Scheduler (or cron on Linux) to run the command daily, as in the sketch below. Note that the mirror option (-m) already turns on timestamping (-N), so on each run Wget downloads only the files that have changed on the server and overwrites the local copies; the -K option keeps a “.orig” copy of each converted page so that the timestamp comparison still works.
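As a sketch, here is what that scheduled job might look like; the backup folder and the path to wget.exe are placeholders you’d adjust for your own machine.

On Linux, add this line with “crontab -e” to mirror the site every night at 2:00 AM:

0 2 * * * wget -m -k -K -E -P /home/you/backups http://www.yoursite.com

On Windows, run this once from a command prompt to register an equivalent daily task:

schtasks /create /tn "WebsiteBackup" /sc daily /st 02:00 /tr "C:\wget\wget.exe -m -k -K -E -P C:\backups http://www.yoursite.com"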
(b) Mac
On a Mac you can use Wget if you are willing to compile it yourself. If you don’t know what that means, you are probably better off using a different backup tool. There are several listed on CNET and on Softpedia. Not having a Mac, we haven’t tested them, but they do appear to perform functions similar to what Wget does on other platforms.
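One more option, if you’re comfortable in the Terminal: a Mac package manager can install a prebuilt Wget so you don’t have to compile anything. Assuming you have Homebrew installed, for example, a single command does it:

brew install wget

After that, the same Wget command and options described above apply.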
(c) Backing up sites powered by a database
If your site is powered by a database (as most content management systems are, including WordPress, Drupal, and Joomla), you will need to take some extra steps. With Wget (or one of the Mac programs), you are actually downloading all the content of your site and converting it to HTML pages. That makes your site completely portable, but it doesn’t let you restore it directly back to the server.
First, you’ll need to be able to back up your database. If your web host offers a database tool like phpMyAdmin, you can simply use it to download (export) a complete backup of your database and save it locally; you restore it using phpMyAdmin’s import function. If not, you will need to locate and install a MySQL tool (or the comparable tool for whatever database you are using), such as MySQL Workbench, that allows you to remotely access the database and export it to your machine.
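If your database is MySQL, there is also a command-line route: the mysqldump utility that ships with MySQL. A minimal sketch, assuming your host allows remote database connections; the host, user, and database names here are placeholders:

mysqldump --host=mysql.yourhost.com --user=youruser -p --single-transaction yoursite_db > yoursite_backup.sql

To restore, feed the file back the other way:

mysql --host=mysql.yourhost.com --user=youruser -p yoursite_db < yoursite_backup.sql

The -p option prompts for your password interactively; for an unattended nightly run, put the password in a protected .my.cnf file rather than on the command line.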
Second, you’ll need to download not just the page content (which is what Wget was doing), but the actual files that make up the engine behind your website. To do that you’ll need FTP (or equivalent) access to your web server. Find an FTP program that allows for automatic downloading, install it, and set it up to automatically download all the files under your website’s directory on the server (usually a directory called “public_html”, “www”, “httpdocs”, or similar). Store that in a safe place and automate the backup to run nightly.
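As it happens, Wget can do this job too, since it speaks FTP as well as HTTP. A minimal sketch with placeholder credentials and server name (keep in mind that plain FTP sends your password unencrypted, so prefer an SFTP or FTPS tool if your host supports one):

wget -m --ftp-user=youruser --ftp-password=yourpassword ftp://ftp.yoursite.com/public_html/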
Now, if things change on your site, and they weren’t meant to change, you are in a position to know it quickly, and to be able to restore the site from your own backup if your web host can’t.
Sleep easier.