Duplicate content is a hot topic and has been for quite a while. It is also one of the most misunderstood issues in search engine optimization. Many webmasters and even some search marketers spend an extraordinary amount of time and resources trying to avoid the dreaded "Duplicate Content Penalty", when in fact a penalty derived from duplicate content is fairly rare and reserved specifically for sites that have been observed trying to manipulate search engine rankings directly; that is, search engine spammers.
The more common issue associated with duplicate content found by search engines is the "Duplicate Content Filter". When a search engine finds two or more pages with identical or even nearly identical content, it applies a filter that allows only one instance of the content to be returned in search results. This is in no way a penalty and does not affect the site as a whole, just the specific page as it relates to the specific search query. The goal of the search engines is to provide their users with "unique" content, and this filter helps ensure each page returned in the search results is unique.
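The search engines do not publish how this filter works, but one commonly cited technique for spotting near-identical pages is word shingling with a Jaccard similarity comparison. The sketch below is purely illustrative; the shingle size and similarity threshold are assumptions, not values any search engine has disclosed.

```python
# Illustrative near-duplicate detection via word shingling.
# The shingle size (k=3) and threshold (0.9) are assumed values for
# demonstration only; real search engine filters are not public.

def shingles(text, k=3):
    """Return the set of overlapping k-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two sets: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def near_duplicates(page_a, page_b, threshold=0.9):
    """Treat two pages as near-duplicates if their shingle sets mostly overlap."""
    return jaccard(shingles(page_a), shingles(page_b)) >= threshold
```

Two pages that share almost all of their three-word phrases would score near 1.0 and be treated as duplicates, while genuinely distinct pages would score near 0.0 and both remain eligible to appear in results.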
In the past couple of weeks, Google has published an article with some very specific information on how it sees and handles duplicate content, as well as some bullet points on issues to watch for concerning duplicate content. Additionally, another new US Patent relating to identifying and handling duplicate content has been granted to Google.
I admit it; I love the TV show "Vegas". One of the things I always found fascinating in that show was the facial recognition tool they use in the security room. It is great the way they scan through a database of faces and find a match to someone they just captured on the security cameras. It's basically a search engine for faces.
Good natural search engine rankings are understandably the goal of almost all websites. With the competition for these rankings so fierce in many markets, the major search engines try to keep the playing field level by not divulging too much information about how they rank sites. If the algorithms were public knowledge, many people would use this information to try to manipulate the rankings in their favor, and the smaller or busier webmaster would not have a chance. The fallout of this for SEO is that industry information is fluid. We know the engines change their algorithms on a regular basis, and almost every day new information about site optimization is uncovered and old information is discounted.
Morevisibility's Natural Search Blog is here to help keep our visitors up to date with these changes and informed about the industry as a whole. Website technology, design, content and marketing all play a major role in how a site performs in natural search. We will cover these factors along with engine-specific news and optimization factors, the tidal wave of Social Media Optimization, and general or theoretical topics.