Articles written in December 2011

December 28, 2011

The Google +1 Black Market


Google’s +1 button is a relatively new feature in search, but it has already begun to affect search results in a big way. So much so, in fact, that some people have sadly tried to build an unscrupulous business around it.

Lately, sites have been springing up that offer to sell +1’s for your website. For a fee, you can get anywhere from 50 to several thousand unique clicks for the +1 button on your site — a practice which goes directly against Google’s quality guidelines. In the biz, it’s something we refer to as “black hat SEO.”

While tactics like this may be tempting, and can even provide some short-term benefit, they can become detrimental or even disastrous in the long run. In the case of buying +1’s for your site, there are a number of possible ill effects.

You may be penalized at a later date — Google prides itself on providing quality search results, and it doesn’t take kindly to those who try to game the system. If a future algorithm update detects your purchased +1’s, you will have wasted your money and seriously harmed your website’s ranking in Google search.

It’s a spamming technique, and it lowers quality — Consider what the +1 button is: a relevancy indicator meant to enhance social search. By paying a few hundred unrelated users to +1 your site, you can hurt your ranking in the long term and obscure your brand’s overall message to consumers.

It can mess up your analytics — The “Audience” report in Google Analytics shows demographic and geographic information about the users who’ve +1’d pages on your site. It’s a great way to learn about your audience so you can cater to them better. Paying for a large number of unnatural +1’s will skew this data and ruin your chances of finding and targeting your actual, converting audience.

Any of these negative effects can harm your site. For long-term success, you should always follow Google’s best-practice guidelines and stick to “white hat” SEO techniques.

December 23, 2011

Google Webmaster Tools Help: Crawl Errors


One of the most useful aspects of Google Webmaster Tools is that it lets webmasters assess how “crawlable” their site is. In the “Diagnostics” section, you can see why Google is unable to crawl and index certain pages of your website. Here are some of the issues Google will report on in this section:

  • “In Sitemaps”: This is where Google shows which URLs are inaccessible from an XML Sitemap that you may have uploaded to Webmaster Tools under Site Configuration >> Sitemaps (a minimal sample Sitemap file appears after this list). Here, Google displays each URL it’s having difficulty with and the associated problem:


[Screenshot: the “In Sitemaps” error report, listing each problem URL and its associated error]


In the above example, the errors could have been caused by the Sitemap containing older, removed pages, and/or by URLs within the Sitemap that the webmaster has intentionally restricted.

  • “Not Found”: If this section appears in the Diagnostics utility in Webmaster Tools, it means Google has detected pages that issue one of the most common header responses: 404 Not Found. These errors can be tricky, as they may show up because Google has found links from external websites leading to pages that you have removed from your site. They can also mean that Google has detected “broken” links on your own website; in that case, Google shows the page where the broken link resides so you can update or remove it (a small script for spot-checking 404s is sketched after this list).
  • “Restricted by robots.txt”: This section displays pages on the site that have been blocked from web spider crawling via the site’s own robots.txt file. A robots.txt file is a simple text file, uploaded to the site’s root directory, that tells spiders which sections of the site to skip (a short example file appears after this list). This section is a useful way to verify that the instructions you’ve entered into the robots.txt file are correct and functional.
  • “Unreachable”: This section includes pages from the site that are completely inaccessible to search engine spiders due to onsite server or network errors. These errors should stop appearing after the webmaster or IT administrator has fixed the web server in question.
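
For reference, an XML Sitemap is simply a list of your site’s URLs in a standard XML format. Below is a minimal sketch of one, following the sitemaps.org protocol; the domain and date are hypothetical placeholders.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-12-23</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
      </url>
    </urlset>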
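
If you’d like to spot-check 404s on your own URLs, a few lines of Python (standard library only) can request each page and report its status code. This is just an illustrative sketch, and the URLs are hypothetical placeholders:

    from urllib.request import urlopen
    from urllib.error import HTTPError, URLError

    # Hypothetical URLs to check; replace with pages from your own site.
    urls = [
        "http://www.example.com/current-page.html",
        "http://www.example.com/removed-page.html",
    ]

    for url in urls:
        try:
            # A successful request prints the normal status code (e.g. 200).
            response = urlopen(url)
            print(url, "->", response.getcode())
        except HTTPError as err:
            if err.code == 404:
                print(url, "-> 404 Not Found: update or remove links to this page")
            else:
                print(url, "->", err.code)
        except URLError as err:
            # Server or network problem (the "Unreachable" case).
            print(url, "-> unreachable:", err.reason)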
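
And here is a short example of a robots.txt file. It tells all spiders to skip two directories (the directory names are hypothetical) and points them to the Sitemap:

    User-agent: *
    Disallow: /admin/
    Disallow: /old-pages/

    Sitemap: http://www.example.com/sitemap.xml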

For a more comprehensive list of the diagnostic errors found in Webmaster Tools, visit Google’s Webmaster Tools Help Center.

December 19, 2011

How to Optimize Your Images


When it comes to SEO, many people only think about text and code. Of course, these are major elements of good SEO, but you might be surprised by the sad state of image optimization in the field. Images can (and should) be optimized, and it’s not even difficult to do. When optimizing the images on your site or blog posts, remember these three things:

Optimize the Name of the Image: For the same reason you optimize a title or H1 tag, you should optimize the file name of your image. Search engines cannot “see” what an image is, but they can read its file name — which is why it needs to be accurate and keyword-rich. After all, no one searches for “img00759.jpg.”

Optimize the Alt Text: The alt attribute is not only another chance to help search engines understand your image; it’s also helpful from a user’s perspective. If a browser issue keeps the image from loading, the user will still have the alt text to tell them what they should be seeing. And visually impaired users who rely on software that reads the page aloud can only benefit from the image if there is alt text for the program to read. Keep these uses in mind when writing alt text; a brief before-and-after example follows.
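
To make the file name and alt text advice concrete, here is a sketch of the same image tag before and after optimization. The file names and alt text are hypothetical examples:

    <!-- Before: a generic file name and no alt text give search engines
         (and screen readers) nothing to work with. -->
    <img src="img00759.jpg">

    <!-- After: a descriptive, keyword-rich file name plus meaningful alt text. -->
    <img src="red-running-shoes.jpg" alt="A pair of red running shoes on a wooden floor">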

Optimize Only Relevant Images: Just as you can have irrelevant content, you can also have irrelevant images. Oftentimes, irrelevant images are used on a page to serve some design purpose; although they may be off-topic, they can add to the visual appeal of a page and are fine to use. However, images that don’t contribute meaningfully to the content of the webpage, or to the keywords you are trying to target, don’t need to be optimized; you can focus your efforts elsewhere.
