Article Archive by Author

December 13, 2007

Duplicate Sub-domain SERP Results Thwarted by the Host Crowding Filter

The latest search engine news, which turns out not to be so new after all, is that Google has begun including sub-domains in its “host crowding” filter, which restricts search results to two results from each domain. Matt Cutts reported on his blog that, despite early reports describing this as a change in progress, Google has actually been filtering duplicate sub-domains out of some search results for several weeks. In particular, searches that return fewer results (the long-tail searches) now contain a maximum of two results per domain, and that limit includes sub-domains. The days when one site could rule an entire page of search results are numbered.
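
To make the mechanics concrete, here is a rough Python sketch of what such a filter might do: collapse sub-domains down to their registered domain and keep at most two results per domain. This is an illustration only; the two-level domain split is a simplification (it mishandles domains like example.co.uk), and Google’s actual implementation is not public.

```python
from urllib.parse import urlparse

def crowd_filter(urls, per_domain=2):
    """Keep at most per_domain results per registered domain."""
    counts, kept = {}, []
    for url in urls:
        host = urlparse(url).hostname or ""
        # Collapse sub-domains: blog.example.com -> example.com
        domain = ".".join(host.split(".")[-2:])
        if counts.get(domain, 0) < per_domain:
            counts[domain] = counts.get(domain, 0) + 1
            kept.append(url)
    return kept

results = [
    "http://www.example.com/a",
    "http://blog.example.com/b",
    "http://shop.example.com/c",   # third hit for example.com: filtered out
    "http://other.org/d",
]
print(crowd_filter(results))
```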

This news should not come as a big surprise to any of MoreVisibility’s clients, as we have been warning of the dangers of duplicate content across domains and sub-domains for some time. What many may not realize, however, is that Google is not the only engine diligently weeding duplicate content out of its results. Comparable long-tail searches on Google and Yahoo reveal that both are doing a pretty good job of excluding duplicate results from the same domain and/or sub-domain. Live Search results are also being filtered, although the option to turn the filter off is a nice touch: based on an informal test, un-checking the “Group results from the same site” option on the Options page appears to disable it.

Duplication of pages in search results is a problem for everyone, searchers and site owners alike. It is good to see the search engines making headway on this duplicate sub-domain issue, so that more sites can appear in the results and searchers get a wider variety to choose from.

November 12, 2007

Lower PageRank: Not as Big a Problem as You Think

Lower PageRank has been a hot topic in SEO forums and blogs over the last couple of weeks. Many highly regarded sites have experienced drops in PageRank, and in some cases the drops are significant, as reported on SearchEngineLand last week. Complaints about lower-quality search results have sent Google back to its algorithm in a real battle with spam sites and others who would take over all the top search spots; lower PageRank for some sites is the result. The prime targets of Google’s efforts have been directories, blogs, and others that provide links for money.

Along with the falling PageRank have come reports of significant drops in rankings for some sites. Interestingly, there is no clear one-to-one relationship between lower PageRank and falling search rankings, which suggests that Google is discounting the value of PageRank in its algorithm.

This comes as no surprise to those who claim that PageRank has actually been devalued for some time now in favor of TrustRank, a method of evaluating links based less on their quantity and more on their quality. In particular, paid links from directories and blogs are expected to become less valuable to search rankings in the coming months.
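
For readers unfamiliar with the mechanics, PageRank is at its core a recursive measure of link value: a page’s score depends on the scores of the pages linking to it, which is exactly why one link from a strong page can outweigh many links from weak ones. Below is a minimal power-iteration sketch of the classic textbook formulation, not Google’s production algorithm; the toy graph is made up.

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Classic power iteration over {page: [pages it links to]}."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            targets = outlinks or pages   # dangling page: spread evenly
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

# "hub" receives two links; each leaf receives one link from a strong page.
web = {"hub": ["a", "b"], "a": ["hub"], "b": ["hub"]}
print(pagerank(web))
```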

So, why am I not worried? Because any good link strategy will cultivate inbound links with the idea of getting traffic – not just ranking – and because ultimately, content is still king. As search engine algorithms improve the quality of results, a well-designed site with good quality content will always rise to the top.

October 18, 2007

Three Rules for Higher Search Rankings

Without a doubt, the single most important factor in getting a good ranking in search engine results is whether or not the page is included in the search engine indexes. If your page is not in the indexes, there is simply no way it will ever achieve any kind of ranking in the results pages: not good, not bad, none at all.

So, here are the three basic rules for getting indexed by the search engines and, consequently, being well on your way to a top ranking in search engine results:

1. Make your pages easy for robots to find.

  • Build inbound links to the site from other sites that focus on the same search topic.
  • Add sitemaps: external XML sitemaps for Google and Yahoo are good, but you also need an internal, plain-HTML sitemap linked from every page of your site (a sample sitemap generator is sketched after this list).
  • Don’t bury your pages. Make every page available within one or two clicks of any other page on the site.
  • If you move, leave a forwarding address. In other words, if you change the name of a file, redirect the old filename to the new one (a quick redirect check is sketched after this list).
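
On the sitemap point above, here is a minimal Python sketch that writes a standard XML sitemap using only the standard library. The URLs and output filename are placeholders.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, path="sitemap.xml"):
    """Write a bare-bones <urlset> sitemap with one <loc> entry per URL."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
```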
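
And on the forwarding-address point, here is a quick way to spot-check that an old filename permanently (301) redirects to its replacement. This sketch uses the third-party requests library; the URLs are placeholders.

```python
import requests

def check_redirect(old_url, expected_new_url):
    """Report whether old_url returns a 301 pointing at expected_new_url."""
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location == expected_new_url
    print(old_url, "->", response.status_code, location,
          "OK" if ok else "PROBLEM")

check_redirect("http://www.example.com/old-page.html",
               "http://www.example.com/new-page.html")
```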

2. Make your links easy to find.
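
In practice, this means links should exist as plain HTML anchor tags that a robot can parse; links generated only by JavaScript may never be discovered. Here is a small illustration of what a simple parser sees, using only the standard library; the sample markup is made up.

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect href values from <a> tags, as a simple crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = """
<a href="/products.html">Products</a>
<span onclick="goTo('/hidden.html')">Hidden</span>
"""
finder = LinkFinder()
finder.feed(html)
print(finder.links)   # only /products.html is discovered
```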

3. Make each page unique.

  • Search engines don’t want to display page after page from the same site all talking about the same topic, so they will filter out pages their algorithms flag as duplicate content, or even pages from the same site with very similar themes (a rough similarity check is sketched after this list).
  • If each page is distinct, it has a better chance of being included in search engine indexes.
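
As a rough illustration of how a duplicate filter might compare two pages, here is a crude shingle-based similarity check in Python. Real search engines use far more sophisticated techniques; the shingle size and threshold below are arbitrary.

```python
def shingles(text, k=3):
    """Break text into overlapping k-word sequences."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(page_a, page_b, k=3):
    """Jaccard similarity of the two pages' shingle sets."""
    sa, sb = shingles(page_a, k), shingles(page_b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page1 = "our widgets are the best widgets on the market today"
page2 = "our widgets are the best widgets you can buy today"
score = similarity(page1, page2)
print(f"{score:.2f}", "near-duplicate" if score > 0.8 else "distinct")
```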