Articles in the Algorithm Updates & News Category

If you depend on organic website traffic for new and returning business, it’s important to understand the effects that search engine algorithm updates can have on your web traffic. Usually rolled out to combat search spam and improve the SERPs, algorithm updates can be confusing for webmasters and interactive marketers. Learn about the latest algorithm updates and what they mean for your website.

December 23 2011

Google Webmaster Tools Help: Crawl Errors

One of the most useful aspects of Google Webmaster Tools is the ability for webmasters to assess how “crawlable” their site is. In the “Diagnostics” section, you can see the reasons Google is unable to crawl and index certain pages of your website. Here are some of the issues Google will report on in this section:

  • “In Sitemaps”: This is where Google will show which URLs from an XML Sitemap you may have uploaded to Webmaster Tools (under Site Configuration >> Sitemaps) are inaccessible. Here, Google will display the URL it is having difficulty with and the associated problem:

[Screenshot: the Crawl Errors report in Google Webmaster Tools, listing a Sitemap URL alongside the error Google encountered]

In the above example, the errors could have occurred because the Sitemap contains older pages that have since been removed, and/or because a URL within the Sitemap has been intentionally restricted by the webmaster.

  • “Not Found”: If this section appears in the Diagnostics utility in Webmaster Tools, it could mean that Google has detected pages that return one of the most common HTTP responses: 404 Not Found. These errors can be tricky, as they may show up because Google has found links from external websites leading to pages that you have removed from your site. It could also mean that Google has detected links on your website that are “broken”; in that case, Google will show the page where the broken link resides so you can update or remove it.
  • “Restricted by robots.txt”: This section displays pages on the site that have been blocked from web spider crawling via the site’s own robots.txt file (e.g., www.example.com/robots.txt). A robots.txt file is a simple text file, uploaded to the root directory, that tells spiders which sections of the site to skip. This section is a useful way to confirm that the instructions you’ve entered into the robots.txt file are correct and functional; the sketch after this list shows one way to spot-check your Sitemap URLs for both “Not Found” responses and robots.txt restrictions.
  • “Unreachable”: This section will include pages from the site that are completely inaccessible to the search engine spiders due to onsite server or network errors. These errors will usually clear up after the webmaster/IT administrator has fixed the web server in question.
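
Google reports these issues after the fact, but you can run a rough spot-check yourself between crawls. The following is a minimal sketch (not an official Google tool) that assumes your Sitemap lives at /sitemap.xml and uses only Python’s standard library; the example.com addresses are placeholders for your own site. It walks the URLs listed in the XML Sitemap and flags any that return a 404 Not Found response or that your own robots.txt disallows.

# Minimal sketch: flag Sitemap URLs that return 404 or that robots.txt blocks.
# The example.com addresses below are placeholders for your own site.
import urllib.error
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"
SITEMAP_URL = SITE + "/sitemap.xml"  # assumed Sitemap location
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Load robots.txt so each URL can be tested the way a crawler would test it.
robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()

# Pull every <loc> entry out of the XML Sitemap.
with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)
urls = [loc.text.strip() for loc in tree.iter(SITEMAP_NS + "loc")]

for url in urls:
    if not robots.can_fetch("Googlebot", url):
        print("Restricted by robots.txt:", url)
        continue
    try:
        with urllib.request.urlopen(url) as page:
            status = page.status
    except urllib.error.HTTPError as err:
        status = err.code
    if status == 404:
        print("Not Found (404):", url)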

For a more comprehensive list of diagnostic errors found in Webmaster Tools, visit: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=35120

December 5 2011

How Does Yahoo Shutting Down Site Explorer Affect SEO?

In 2010, Yahoo announced that its organic search results would be powered by Bing, which led to many Yahoo properties being discontinued. So, as expected, on November 21st, Yahoo officially took down its free search analysis tool, “Site Explorer”. What does this mean for webmasters? Even though the data has been integrated into Bing Webmaster Tools, it seems evident that a robust (and free) online tool for checking things like backlinks against a search engine’s own index is gone. There are, of course, a plethora of third party tools, but those tools compile aggregated data from a multitude of different sources and are not a true reflection of the backlink data that a specific search engine actually has.

Bing Webmaster Tools has made some effective improvements over the last several months and the verified webmaster for a website will now be able to get even more comprehensive data. Bing Webmaster Tools will likely show an increased number of backlinks being reported for a website, as well as provide a central location for both Yahoo and Bing data.

However, many SEOs, myself included, will miss the practicality of the old Site Explorer. Site Explorer was the only free, public database from a major search engine that gave you an easy way to look at the number of backlinks to a specific page. While privacy was a concern, making that information public pushed webmasters to improve their own inbound link building techniques, and it made it easy for the more novice webmaster to take a quick glance at a competitor’s website for inbound link ideas, thus inspiring innovation in the world of SEO.

November 30 2011

Google +1 Button Tracking in Webmaster Tools

For those keeping up with the “Google +1 Button” hype, you may have realized by now that you can access its associated data via Google Webmaster Tools:

[Screenshot: the +1 Metrics section in Google Webmaster Tools]

From your dashboard, click to expand the “+1 Metrics” section; this will reveal reports on Search Impact, Activity, and Audience. In these reports, you will see:

The Search Impact report: Shows you impressions (how many times pages with the +1 button have appeared in search results) and clickthrough rate (how many people actually clicked through to the page from those results). This is important because you’ll want to be able to compare how effective the +1 button is on certain pages versus others.

The Activity report: Allows webmasters to view how many times a page has been “+1’d”, whether from the page itself or from search results.

The Audience report: Displays demographic data for the +1’s, such as age, location, and gender.

As with many of the specific numbers in Webmaster Tools, much of this data is an “average” or an “aggregate”. For instance, the data in the reports above is only displayed once a certain number of people have clicked on your +1 button.
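
To put the Search Impact figures in perspective: clickthrough rate is simply clicks divided by impressions. The short sketch below uses invented numbers (not real Webmaster Tools data) to compare a page’s CTR before and after the +1 button was added.

# Illustration only: the click and impression counts below are invented.
def clickthrough_rate(clicks, impressions):
    """Return the percentage of impressions that resulted in a click."""
    return 100.0 * clicks / impressions

before = clickthrough_rate(clicks=120, impressions=8000)  # page without the +1 button
after = clickthrough_rate(clicks=190, impressions=8200)   # same page with the +1 button

print(f"CTR before: {before:.2f}%")  # 1.50%
print(f"CTR after:  {after:.2f}%")   # 2.32%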

Learn all about it from a recent Google Webmaster Central Blog post:
http://googlewebmastercentral.blogspot.com/2011/06/1-reporting-in-google-webmaster-tools.html.
