For many years, the best way to handle defunct pages on your website was to have the server return a 404 (Not Found) HTTP status code. Google has recently confirmed, however, that it now considers the 410 (Gone) response code a stronger signal that a page has been removed for good.
HTTP response codes give both users and search engine spiders information about what has happened to a page on a site. A user who stumbles upon a page issuing a 404 or 410 response will often just see the message “Page Not Found”. So, if both response codes look the same to the user, what is the benefit of using one over the other?
According to Google, when a page returns a 404, the crawler may still revisit it later to verify that it is truly defunct. What this means in terms of indexing is anyone’s guess, but using the 410 response code will at least ensure that Google never goes back to that page and instead gets to the more important pages on the website, thus facilitating crawlability.
The 410 response code should be used only when there is no other option, meaning the page cannot be redirected to a similar or corresponding page. So if you’re absolutely sure that a page no longer exists and will never exist again, a 410 is likely the right choice. It’s probably not worth the time or effort reconfiguring a server to change existing 404s to 410s, but using the 410 going forward will at least send Google the stronger signal it is looking for.
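To make the distinction concrete, here is a minimal sketch of how a server could issue 410 for pages removed for good and 404 for everything else. It is written as a Python WSGI app; the paths are hypothetical, and a real site would more likely configure this in the web server itself.

```python
# Hypothetical list of pages that existed once but are gone permanently.
GONE_PATHS = {"/old-promo", "/discontinued-product"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in GONE_PATHS:
        # 410 Gone: the page existed, but has been removed for good.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    # 404 Not Found: the page was never here, or we can't be sure it's gone.
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Page not found."]
```

Either way the visitor sees an error page; the difference is only in the status line the crawler reads.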
Deciding where certain pages of one’s site should live can be a challenging task for any webmaster. Many people, especially those new to developing websites, are perplexed by the question of whether to go the subdomain route (blog.example.com) or the directory route (www.example.com/blog).
The general consensus is that subdomains are usually reserved for sections that aren’t completely associated with the general theme of the site. Google, for example, has http://maps.google.com/, http://books.google.com/ and http://blogsearch.google.com/. From an SEO perspective, it has been said that “link juice” will not necessarily flow from the main domain to pages within a subdomain. This has proven not to be the case, however, as some webmasters have seen pages within their subdomains garner the same link value as their main domain.
The real question, though, is whether there is any duplicate content on the subdomain. Some webmasters fall into a trap where, because they haven’t set up their site architecture in a logical manner, pages on their subdomain duplicate pages from their main domain. While this will rarely incur a penalty within the search engines, it may prevent the search engines from crawling the most important pages on the website, given the “crawler caps” the search engine spiders have in place. Matt Cutts of Google has said that for newer webmasters, the subdirectory structure is the way to go until you are more confident with your site’s architecture. I tend to agree.
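One common cleanup for this trap is to 301-redirect (permanently redirect) the duplicated subdomain URLs onto their main-domain directory equivalents, so only one version gets crawled. A small sketch, assuming hypothetical hostnames:

```python
# Hypothetical mapping: pages duplicated on a subdomain get a 301
# redirect to the corresponding directory on the main domain.
DUPLICATE_HOSTS = {"blog.example.com": "http://www.example.com/blog"}

def canonical_location(host, path):
    """Return the 301 redirect target for a request, or None if no
    redirect is needed (the URL is already the canonical version)."""
    prefix = DUPLICATE_HOSTS.get(host)
    if prefix is None:
        return None
    return prefix + path
```

For example, a request for blog.example.com/my-post would be sent to www.example.com/blog/my-post, consolidating the two copies into one.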
Google’s Webmaster Tools is a central location for webmasters to view and update diagnostic data concerning their websites in relation to the Google algorithm. What are some of its SEO benefits?
From an SEO perspective, Google’s Webmaster Tools offers really pertinent information, such as “Top Search Queries” and “Links to Your Site”. “Top Search Queries” allows webmasters to analyze which search terms, or “keywords”, most often lead searchers to their site and which position in Google they rank for those keywords. “Links to Your Site” lets you see the websites that are linking to your site, and it even breaks down which pages those websites are linking to. This is invaluable to webmasters, as it is the only way to get a comprehensive view of which sites are linking to you; simply using the “link:” command in Google gives you only a very limited report of a site’s backlinks due to Google’s stringent guidelines on privacy.
Google’s Webmaster Tools also facilitates your website’s crawlability. By using its XML sitemap submission feature, you can submit an XML sitemap to help Google find the deeper-level pages on your website. Webmaster Tools will also let you view errors in your robots.txt file, “Page Not Found” errors, and even the last time the Google spider successfully accessed your homepage.
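The sitemap you submit is just an XML file listing your URLs in the sitemaps.org format. As a rough sketch, here is one way to generate a minimal sitemap in Python with the standard library (the URL is a placeholder; real sitemaps can also carry optional fields like last-modified dates):

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a minimal XML sitemap string listing the given URLs."""
    ET.register_namespace("", NS)  # serialize as the default namespace
    urlset = ET.Element("{%s}urlset" % NS)
    for loc in urls:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
    return ET.tostring(urlset, encoding="unicode")
```

You would save the output as something like sitemap.xml at your site root and submit that URL through Webmaster Tools.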
Personally, I consider Google’s Webmaster Tools one of the most useful SEO tools around, and there are many reasons to use it.