Articles written in January 2008

January 30 2008

My site is 100% optimized. Can I stop working on SEO?


I cannot count the number of times a client has said those exact words to me. Truth be told, if you want to maintain a long-term presence online, it is absolutely essential that SEO be an ongoing process of adding new, search engine friendly content. Here are just a few of the many reasons why SEO should never remain stagnant.

SEO is a very dynamic industry; what the search engines valued most a year ago is no longer as important today. Google PageRank, for example, is still deemed an integral part of a site’s natural positioning; however, it is no longer the most mission-critical factor. Today, implementing a link building strategy is considered crucial to improving natural search. The engines (especially Google) weigh their organic results heavily on how many relevant links a site has, as well as the manner in which those links are obtained.

Your competition is likely doing everything they can to surpass you online. Think of it this way: if you’ve ever hired (or even thought about hiring) an SEO agency to optimize your site, you are in a competitive industry and should deduce that your competition is doing the same. Your web site should be viewed as a work in progress; the more new and optimized content your site has, the more information the search engine spiders have to crawl. Think Blogs, Social Media, etc.

The search engines, specifically Google, Yahoo and MSN, like it when you play by the rules. Sure, there are a variety of ways to trick or fool the engines to gain better rankings in the short term. Rest assured, these tactics will catch up with you and could eventually lead to your site getting banned from the engines. We at MoreVisibility always adhere to a best practices approach, follow the rules set forth by each engine and advise our clients to do the same.

January 25 2008

How to Resolve the Canonicalization Issue without Access to your Server


Resolving the canonicalization issue can be a big headache, because not all webmasters have the level of server access needed to control which version of their domain name is displayed to the world. It is important to note, however, that while the optimal way to ensure that all search engines see only one version of your domain is to permanently (301) redirect the non-canonical version to the canonical one, it is not the only way.
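
For reference, here is a minimal sketch of what that server-side fix typically looks like, assuming an Apache server with mod_rewrite enabled and example.com standing in for your own domain. The rest of this article covers what to do when you cannot make this kind of change:

    # .htaccess: permanently (301) redirect the bare domain to the www version
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]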

First, canonicalization is only a problem if search engines find your site under the “wrong” version. If the indexes show your site only under your preferred version, the issue is far less pressing.

If you want to find out whether your site has a problem, type a site: query for both versions of your domain into the query box of each search engine (example.com here is a placeholder for your own domain):
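
    site:example.com
    site:www.example.com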

If all the listings returned are for the same version of your domain, canonicalization is not yet a pressing issue for you.

However, this does not mean you are in the clear. Depending on how your site is built, all it would take is one other site linking to you with the non-canonical version, and your site could end up with pages in search engine indexes from both versions of your domain. The result is a canonicalization issue in which some pages don’t receive the link value they deserve, and that can affect rankings.

To prevent this with Google, just register with Webmaster Tools and set a preferred domain. Other search engines do not have this option, so if you’re worried about how your results may be displayed in Yahoo or MSN, a more universal fix may be in order.

What if you do have a problem? If your site is already showing pages from both versions of the domain in the index, using Google’s preferred domain tool is not the way to go. It can actually cause pages of your site to fall out of the index, and that is definitely not a good outcome. Poorly linked pages are better than no pages at all!

To make this a more universal fix, ensure that all the link references on your site are absolute rather than relative. What this means is that instead of linking to internal pages with a root-relative path like this (the page name below is purely illustrative):
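
    <a href="/products/index.html">Products</a>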

format all anchor tag link references to include the complete domain name of your preferred version, like this:
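
    <a href="http://www.example.com/products/index.html">Products</a>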

This way, even if a spider does find one of your pages under the wrong version, it will not proceed to crawl the rest of the site under the wrong domain.

This last solution will work to resolve the canonicalization issue even if the search engines have already found your site under both versions, although not as quickly. By changing all link references on the site to the absolute version of the preferred domain, the search engines will eventually come to prefer the pages from that preferred domain, because the pages of the non-preferred version will show fewer inbound links. Once the non-preferred pages have been replaced in the indexes, you can clinch the deal by setting your preferred domain in Google’s Webmaster Tools.

January 11 2008

Optimizing for Live Search — Why and How


Search engine optimization to many people means optimizing for Google, but there are good reasons why this is a shortsighted approach. For one thing, search engine algorithms are always changing, and a site that does well on all three major search engines is naturally more resilient to any changes one engine might make, because the criteria each engine uses are different. For another, not all searchers are as savvy as you might think. There are lots of internet users out there who only know how to search with the buttons that came with their browser, and if that browser is Internet Explorer, they’re probably searching on MSN.

A search for [presidential election] on Google, Yahoo and Live Search gives roughly the same results for Google and Live, and a radically different result for Yahoo. In addition, both Google and Live show big changes depending on whether you search for election or elections, while Yahoo’s results are largely unchanged. Results like this may cause a poor website owner to wonder whether it is possible to rank well for any keywords in all three search engines, but we can attest that it can be done. We have seen some of our clients do it by providing good quality sites with lots of relevant content. Relying only on Google for traffic can leave you high and dry when the algorithm changes, as we saw it do just a few short weeks ago.

Now that’s new Live Search has been up and running for a while, everyone might want to bone up on their Live Search webmaster site optimization recommendations. Here are a few we found while looking at Live’s webmaster help.

  • Create a robots.txt file or use meta tags to control how the MSNBot crawls your site (see the example after this list).
  • Validate your code and fix broken links.
  • Use simple URLs that do not change frequently.
  • Include targeted keywords in the visible text on the page.
  • Keep pages short – under 150 KB.
  • Provide at least one static text link to every page you want indexed.
  • Avoid using images to display content — use text.
  • Add a site map page to the site.
  • Keep every page only one or two clicks away from the homepage.
  • If you want your snippet in the search results to be attractive, put descriptive content close to the beginning of the page.
  • Keep each page on topic and ensure that each page has a clear and distinct purpose on the site.
  • Place descriptive text in a Description meta tag (also shown in the example below).
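
The first and last items in this list translate directly into small snippets. Here is a minimal sketch of each, assuming msnbot as the crawler’s user agent name and using placeholder paths and text:

    # robots.txt: keep MSNBot out of a section you don't want crawled
    User-agent: msnbot
    Disallow: /private/

    <!-- Page-level alternative: a robots meta tag in the page's head -->
    <meta name="robots" content="noindex, nofollow">

    <!-- A Description meta tag; engines may use this text as your snippet -->
    <meta name="description" content="A short, accurate summary of this page.">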

Webmasters who want to stay current can keep up by following Live’s Webmaster blog or by checking out their site in the new Webmaster Portal. Go there and let us know what you think.
