Article Archive by Darren Franks


January 6, 2012

Google Panda: How to Approach Building Links

by Darren Franks

In a previous blog post entitled “Latest Google Algorithm Update – Now People Panic!”, as well as in the latest MoreVisibility YouTube video, I discussed the Google algorithm update known as “Panda” and how to address certain aspects of your site to ensure that a site-wide penalty isn’t incurred due to low-quality content.

Here are some tips on what to avoid so that your link building efforts remain in line with Google Panda:

  • Avoid submitting links to directories that have hundreds (or even thousands) of irrelevant links in their categories.
  • Avoid submitting a link to a site that has an inordinate number of ads on a page with little to no quality content.
  • Don’t rely solely on a submission site’s PageRank. The displayed PageRank is not always accurate in the first place, and its importance has been greatly reduced of late; it’s generally not a reliable gauge of a website’s authority.
  • Is the category/page you wish to submit your link to even in the search engine indexes? This sounds obvious, but if Google hasn’t crawled and indexed a page in a directory, it won’t attribute that inbound link to your site and your efforts will be in vain. Additionally, if a page in a directory is not indexed, it could indicate that Google has penalized that directory, or that poor programming is inhibiting crawler access. (A rough pre-submission check is sketched below.)
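
As that rough pre-submission check, the sketch below looks at two signals that commonly keep a directory page out of the index: a robots.txt rule blocking crawlers and a “noindex” meta tag. It is a minimal Python sketch using only the standard library; the directory URL is a hypothetical example, and passing both checks does not by itself prove the page is in Google’s index.

```python
# Minimal sketch: flag two common reasons a directory page would never make it
# into the index. This does NOT confirm the page is actually indexed by Google.
import urllib.robotparser
from urllib.parse import urlparse
from urllib.request import urlopen

def crawlability_report(page_url, user_agent="Googlebot"):
    parsed = urlparse(page_url)
    robots_url = f"{parsed.scheme}://{parsed.netloc}/robots.txt"

    # 1. Is the page blocked by the directory's own robots.txt?
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(robots_url)
    rp.read()
    allowed = rp.can_fetch(user_agent, page_url)

    # 2. Crude string check for a robots "noindex" meta tag in the page source.
    html = urlopen(page_url, timeout=10).read().decode("utf-8", errors="ignore")
    noindex = 'name="robots"' in html.lower() and "noindex" in html.lower()

    return {"robots_txt_allows_crawl": allowed, "meta_noindex_present": noindex}

# Hypothetical directory category page.
print(crawlability_report("http://www.example-directory.com/category/widgets.html"))
```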

Remember, Panda penalties impact your whole site and the effects can be drastic, so ensure that your link cultivation efforts aren’t thwarted by submissions to one or two low-quality directories. Major websites, including JCPenney, have been penalized over their link building practices, so no one is immune.

December 23, 2011

Google Webmaster Tools Help: Crawl Errors

by Darren Franks

One of the most useful aspects of Google Webmaster Tools is the ability for webmasters to assess how “crawlable” their site is. In the “Diagnostics” section, you can see the reasons Google is unable to crawl and index certain pages of your website. Here are some of the issues Google will report on in this section:

  • “In Sitemaps”: This is where Google shows which URLs it cannot access from an XML Sitemap that you may have uploaded to Webmaster Tools under Site Configuration >> Sitemaps. For each problem URL, Google displays the address along with the error it encountered:

[Screenshot: the Crawl Errors report in Google Webmaster Tools, listing each problem URL and the error Google encountered]

In the above example, the errors could have been caused by the Sitemap containing older, removed pages, and/or by a URL in the Sitemap having been intentionally restricted by the webmaster.
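
One practical way to catch stale Sitemap entries before Google reports them is to walk through the Sitemap yourself and flag URLs that no longer resolve. The Python sketch below is a minimal example of that idea, using only the standard library; the sitemap URL is hypothetical, and a production version would want politeness delays and retries.

```python
# Minimal sketch: fetch an XML Sitemap and report listed URLs that no longer
# return 200, e.g. pages that have since been removed from the site.
import xml.etree.ElementTree as ET
from urllib.request import urlopen, Request
from urllib.error import HTTPError, URLError

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(sitemap_url):
    xml_data = urlopen(sitemap_url, timeout=10).read()
    root = ET.fromstring(xml_data)
    for loc in root.iter(f"{SITEMAP_NS}loc"):
        url = loc.text.strip()
        try:
            status = urlopen(Request(url, method="HEAD"), timeout=10).status
        except HTTPError as e:
            status = e.code                    # e.g. 404 for a removed page
        except URLError as e:
            status = f"unreachable ({e.reason})"
        if status != 200:
            print(f"{url} -> {status}")

# Hypothetical sitemap location.
check_sitemap("http://www.example.com/sitemap.xml")
```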

  • “Not Found”: If this section appears in the Diagnostics utility in Webmaster Tools, it means Google has detected pages that return one of the most common header responses: 404 Not Found. These errors can be tricky, as they may show up because Google has found links from external websites leading to pages that you have removed from your site. They can also mean that Google has detected “broken” links on your own website; Google will show the page where each broken link resides so you can update or remove it.
  • “Restricted by robots.txt”: This section displays pages on the site that have been blocked from web spider crawling via the site’s own robots.txt file (e.g., www.example.com/robots.txt). A robots.txt file is a simple text file, uploaded to the root directory, that tells spiders which sections of the site to skip. This section is a useful way to see whether the instructions you’ve entered into the robots.txt file are correct and functional (a quick way to test them locally is sketched after this list).
  • “Unreachable”: This section includes pages that are completely inaccessible to the search engine spiders due to server or network errors. These errors will usually disappear once the webmaster or IT administrator has fixed the web server in question.
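
For the robots.txt report in particular, you can sanity-check your rules locally rather than waiting for Webmaster Tools to flag a surprise. The Python sketch below uses the standard library’s robots.txt parser; the rules and paths are hypothetical examples, not taken from any real site.

```python
# Minimal sketch: verify which paths a robots.txt file would block for a crawler.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

for path in ["/admin/settings.html", "/search?q=widgets", "/products/widget.html"]:
    url = "http://www.example.com" + path
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{url}: {verdict}")
```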

For a more comprehensive list of diagnostic errors found in Webmaster Tools, visit: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=35120

December 5, 2011

How Does Yahoo Shutting Down Site Explorer Affect SEO?

by Darren Franks

In 2010, Yahoo announced that its organic search results would be powered by Bing, which led to many Yahoo properties being discontinued. So, as expected, on November 21st Yahoo officially took down its free search analysis tool, “Site Explorer”. What does this mean for webmasters? Even though the data has been integrated into Bing Webmaster Tools, a robust (and free) standalone tool for checking things like backlinks is gone. There are, of course, a plethora of third party tools, but they aggregate data from a multitude of different platforms and are not a true reflection of the backlink data a specific search engine actually holds.

Bing Webmaster Tools has made some effective improvements over the last several months and the verified webmaster for a website will now be able to get even more comprehensive data. Bing Webmaster Tools will likely show an increased number of backlinks being reported for a website, as well as provide a central location for both Yahoo and Bing data.

However, many SEO practitioners, myself included, will miss the practicality of the old Site Explorer. It was the only free, public database from a major search engine that gave you an easy way to look at the number of backlinks to a specific page. While privacy was a concern, making that information public pushed webmasters to improve their own inbound link building techniques and made it easy for the more novice webmaster to take a quick glance at a competitor’s website for inbound link ideas, thus inspiring innovation in the world of SEO.
