Quite a few websites, including big ones like Wikipedia, are joining together on Wednesday to protest the U.S.'s planned Stop Online Piracy Act. What looks like an easy proposition – just replace your site with a temporary page that explains the reasons for the blackout – is actually quite a bit more complicated. Google, after all, may crawl your site during this time, decide that your pages have changed, and re-index you accordingly. That could have disastrous consequences for your search engine rankings. While Google itself hasn't taken a position on the protest, Google UK employee Pierre Far has posted a number of helpful tips for webmasters who plan to participate in the blackout.
Here are his basic tips, though maybe his most important piece of advice is to keep things simple: don't mess with your DNS settings, and don't try to be too clever for your own good. You only want to participate in the blackout for one day, after all, not suffer the SEO consequences for months to come.
When you take your site down for planned downtime (and this is true for any planned downtime, not just this protest), make sure your site returns a 503 status code. What's a 503? It's in the same family as the 404 errors you have surely seen before – the ones websites show you when a page can't be found.
A 503, however, tells Google and other search engines that this is just a temporary outage. If you know how long the outage will last, you can even tell Google when to start crawling your pages again.
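To make this concrete, here is a minimal sketch of a blackout page as a Python WSGI app. The app name, the message text, and the Retry-After date are all illustrative – the point is simply that the protest page goes out with a 503 status and a Retry-After header announcing when the site comes back.

```python
def blackout_app(environ, start_response):
    """Hypothetical WSGI app: serve the protest page with a 503 status.

    The Retry-After header tells crawlers when to come back; it can be
    an HTTP date (as here) or a number of seconds, e.g. "86400".
    """
    body = b"<h1>This site is dark today to protest SOPA</h1>"
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Retry-After", "Thu, 19 Jan 2012 00:00:00 GMT"),  # illustrative end time
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

Any WSGI-capable server (or the equivalent setting in Apache or nginx) can serve this; the essential part is the status line and the Retry-After header, not the framework.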
As Pierre Far also notes, returning a 503 ensures that Google doesn't mistake your protest page for duplicate content – after all, many other sites participating in the blackout will be posting the same message as you.
Don't Mess With Your Robots.txt File
Far also tells webmasters what to do with their robots.txt file – the document that tells search engines what to crawl (and what not to crawl) on a site. His basic advice is to leave the file alone; just make sure your robots.txt file keeps returning a regular 200 status and not a 503.
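The setup Far describes – every page dark with a 503, but robots.txt still answering normally – can be sketched with Python's standard-library HTTP server. The handler class name and the response bodies are hypothetical; the behavior to copy is the split between the two status codes.

```python
import http.server
import threading
import urllib.error
import urllib.request

class BlackoutHandler(http.server.BaseHTTPRequestHandler):
    """Hypothetical blackout server: 503 for pages, 200 for robots.txt."""

    def do_GET(self):
        if self.path == "/robots.txt":
            # robots.txt keeps working normally so crawlers can still read it
            body = b"User-agent: *\nAllow: /\n"
            self.send_response(200)
        else:
            # every other page signals a temporary outage
            body = b"This site is dark today to protest SOPA"
            self.send_response(503)
            self.send_header("Retry-After", "86400")  # try again in a day
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

# Quick self-check on a local port.
server = http.server.HTTPServer(("127.0.0.1", 0), BlackoutHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

def status_of(path):
    try:
        return urllib.request.urlopen(base + path).status
    except urllib.error.HTTPError as err:
        return err.code

page_status = status_of("/")              # temporary outage
robots_status = status_of("/robots.txt")  # crawl rules still readable
server.shutdown()
```

In a real deployment you would express the same split in your web server's configuration rather than in application code, but the rule is identical: blackout pages get the 503, robots.txt does not.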