Duplicate Sub-domain SERP Results Thwarted by the Host Crowding Filter

- December 13, 2007

The latest search engine news is that Google has begun including sub-domains in its “host crowding” filter, which restricts search results to two results per domain. Matt Cutts reported on his blog that, despite early reports describing this as a change still in progress, Google has been filtering duplicate sub-domains out of some search results for several weeks. In particular, searches returning fewer results — the long-tail queries — now contain a maximum of two results per domain, and that limit includes sub-domains. The days when one site could rule an entire page of search results are numbered.
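To make the behavior described above concrete, here is a minimal sketch of that kind of filter — not Google's actual implementation, just an illustration of the rule "collapse sub-domains onto the parent domain, then keep at most two results per domain." The `registrable_domain` helper is a deliberately naive two-label heuristic invented for this example:

```python
from urllib.parse import urlparse

def registrable_domain(url):
    # Naive heuristic: treat the last two host labels as the "domain,"
    # so www.example.com and blog.example.com both map to example.com.
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def host_crowding_filter(results, limit=2):
    # Keep at most `limit` results per domain, counting sub-domains
    # toward the same domain's quota, in original ranking order.
    counts = {}
    kept = []
    for url in results:
        dom = registrable_domain(url)
        counts[dom] = counts.get(dom, 0) + 1
        if counts[dom] <= limit:
            kept.append(url)
    return kept
```

For example, a result list with three different sub-domains of the same site would be trimmed to the first two, while results from other domains pass through untouched.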

This news should not come as a big surprise to any of MoreVisibility’s clients, as we have been warning of the dangers of duplicate content across domains and sub-domains for some time. However, what many may not realize is that Google is not the only engine diligently weeding duplicate content out of its results. Comparable long-tail searches on both Google and Yahoo reveal that each is doing a pretty good job of excluding duplicate results from the same domain and/or sub-domain. Live Search results are also being filtered, although the option to turn the filter off is a nice touch: based on an informal test, un-checking the “Group results from the same site” option on the Options page appears to disable it.

Duplication of pages in search results is a problem for both searchers and site owners, so it is good to see the search engines making headway on this duplicate sub-domain issue: more sites can appear in search engine results, and a wider variety of results becomes available to searchers.


© 2016 MoreVisibility. All rights reserved