Articles written in June 2009

June 30 2009

Concerned About Your Bounce Rate?


You’ve worked extremely hard for months and have finally achieved first page positions in the search results for many of your important keywords, yet you’re still not happy with your site’s bounce rate. What could be wrong?    

Let’s start with a few definitions. A bounce is a single-page visit. A bounce rate is the percentage of visitors who arrive at one page and exit the site before viewing another page. So the real question is not what a bounce rate is, but rather: what can I do to improve (decrease) my site’s bounce rate?

The first thing you should do is check your site’s coding. Have all of your pages been tagged with the proper tracking code? If not, this could be the problem. If only your homepage is tagged, your Analytics account cannot record views of any other page, so every visit looks like a single-page visit and the reported bounce rate is artificially inflated.
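
If your site runs Google Analytics, for example, every page needs the same tracking snippet, typically placed just before the closing </body> tag. Below is a minimal sketch of the classic ga.js version of that snippet; UA-XXXXXX-X is a placeholder for your own account ID:

<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
var pageTracker = _gat._getTracker("UA-XXXXXX-X"); // placeholder account ID
pageTracker._trackPageview(); // records the pageview for this page
} catch(err) {}
</script>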

Is the website’s design or usability a factor? We all know about the importance of first impressions, and the same applies to your website. The presentation and design of the site can affect the bounce rate. Are the pages cluttered, or do irritating pop-ups appear when a visitor arrives at the site? Is the navigation intuitive enough that someone can easily find what he or she is looking for? Take the time to address these questions and ensure that the design and navigation are not creating obstacles that prevent your visitors from viewing other pages.

Do the page titles and descriptions correspond to the content on the page? Throughout the optimization process you have crafted metadata so that the titles and descriptions are compelling and keyword-rich, but if the content on the page does not match those title and description tags, you are setting your pages up for failure. Make sure that the titles and descriptions for all of your pages describe the content accurately.
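
For example, a page about a (hypothetical) line of handmade leather wallets might carry metadata like this:

<title>Handmade Leather Wallets | Example Store</title>
<meta name="description" content="Shop our collection of handmade, full-grain leather wallets. Free shipping in the US.">

If a searcher clicked that result and landed on a page about belts instead, a bounce would be the likely outcome.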

The search engines have advanced algorithms and do a decent job of providing searchers with relevant results. However, if you have optimized pages for keywords that aren’t what the searcher is expecting to find, you are going to have a difficult time keeping the visitor on your site. Taking the time to conduct keyword research is crucial: every page on the site needs to be optimized for precisely what it is about. There are often keyword variations that would make sense to optimize a page’s content around, and this is where keyword research matters most.

While there’s no magic number that counts as good or bad, it’s never too late to review the items above to ensure that you’re providing the best experience for the visitor, which can reduce the bounce rate. It’s essential to know your visitors: why they arrive at your site, and what they are looking for once they get there.

June 24 2009

Benefits of a Robots.txt


The robots.txt file is a simple text file (no HTML) that is placed in your website’s root directory to tell the search engines which parts of the site to crawl and which to skip. Many webmasters utilize this file to help the search engines index the content of their websites.

If webmasters can tell the search engine spiders to skip pages that they do not consider important enough to be crawled (e.g., printable versions of pages, .pdf files, etc.), then they have a better opportunity to have their most valuable pages featured in the search engine results pages. In essence, the robots.txt file is a simple way of easing the spiders’ job of returning the most relevant search results.
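
For example, assuming the printable pages live in a /print/ directory and the PDF downloads in a /pdf/ directory (both paths are hypothetical), the file might look like this:

# keep low-value utility content out of the crawl
User-agent: *
Disallow: /print/
Disallow: /pdf/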

That being said, I have seen many occasions where the robots.txt file has not been used in the best way possible. Webmasters are prone to making mistakes when installing it, and the repercussions can be severe. For instance, one simple instruction restricts all search engine spiders from crawling the entire site:

User-agent: *
Disallow: /

Without the forward slash, that Disallow line matches nothing and search engines are granted access to the entire site; so the inclusion of this one character in the robots.txt can prevent a website from showing in the search engines at all. There are reasons why webmasters might block crawling intentionally (the website is still relatively new, say, and certain pages still need tweaking for keyword density), but more often than not it is a mistake, and one that is usually only discovered after the site has been missing from the search engine indexes for months.
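
For contrast, here is the harmless version without the forward slash, which leaves every spider free to crawl the entire site:

User-agent: *
Disallow: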

Errors aside, another benefit of having a robots.txt file is that you can specify the location of your XML sitemap (read by Google, Yahoo, and the other major engines) with this simple instruction:

Sitemap: http://www.example.com/sitemap.xml

(This assumes the XML sitemap is located at the root of the domain; example.com is a placeholder for your own domain.)

This also makes the site easier for the search engine spiders to crawl. Even though the robots.txt file is a small aspect of the search engine optimization process, utilized correctly it can be a significant benefit.


June 16 2009

Bing vs. Google


Now that Bing has arrived, hopefully you’ve at least checked it out. Bing certainly has visual appeal, offering beautiful background images and tidbits of interesting information as you move your mouse around the landscape.


After all of the hype around MSN’s new search engine, I wanted to share some of the information that has been circulating from an eye-tracking study comparing Google and Bing. The full study can be found here, but below are a few of the main points.

  • Google and Bing do not differ in the amount of time searchers spend looking at the organic results. In this particular study, participants looked at the organic search results for an average of 7 seconds.
  • The attention given to sponsored links located above the organic results is high for both Google and Bing: more than 90% of participants looked in that area during each search. Sponsored links on the right, however, attracted more attention on Bing (about 42% of participants per search) than on Google (about 25% of participants per search).
  • Another difference between the two is the related-searches feature. Bing offers its related searches on the left, while Google’s sit below the organic results near the bottom of the page. Because of that placement, Bing’s related searches had greater visibility, attracting the attention of 31% of participants per search versus only 5% for Google’s.

One question that I have been asked frequently is: will Bing affect search engine optimization (SEO)? My thought is that it’s highly unlikely. Although Google, Yahoo, and Bing’s search results vary, Google is likely to remain on top. It has an incredibly strong brand, and searchers seem, for the most part, happy with the results it provides; that combination should keep competitors at bay for a while. After all, people Google things. Will we soon Bing things?
