Google recently launched its biggest algorithm update since last year's Panda. It's called Penguin, and it's another way Google is actively fighting search spam.
While Panda primarily targeted shallow content, Penguin is specifically an anti-spam update that punishes sites for techniques like keyword stuffing (using the exact same keyword an excessive number of times on a page) and cloaking (showing a crawler a different version of a webpage than a human user sees). It's easy to see whether your site has been affected by Penguin: just check your Google Analytics data for April 24, the day Penguin went live. Affected sites will see a sharp, immediate drop in traffic, indicating a Penguin penalization.
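If you want more than an eyeball check, comparing average daily traffic before and after the rollout date makes the drop explicit. Here is a minimal Python sketch; the function name and the sample numbers are my own, and in practice you would export the daily session counts from Google Analytics:

```python
from datetime import date

def traffic_drop_ratio(daily_sessions, cutoff):
    """Compare average daily sessions before and after a cutoff date.

    daily_sessions: dict mapping datetime.date -> session count
    cutoff: the date the algorithm update went live
    Returns after/before ratio, or None if either window is empty.
    """
    before = [v for d, v in daily_sessions.items() if d < cutoff]
    after = [v for d, v in daily_sessions.items() if d >= cutoff]
    if not before or not after:
        return None
    return (sum(after) / len(after)) / (sum(before) / len(before))

# Made-up example: steady ~1000 sessions/day, then a sharp drop on April 24
sessions = {date(2012, 4, d): 1000 for d in range(17, 24)}
sessions.update({date(2012, 4, d): 450 for d in range(24, 31)})
print(traffic_drop_ratio(sessions, date(2012, 4, 24)))  # 0.45
```

A ratio well below 1.0 that starts exactly at the update date is the pattern described above; a gradual decline is more likely something else.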
There has been some buzz about Google launching an update to penalize sites that have been “over-optimized.” Penguin looks to be just that, although the term “over-optimization” is a bit of a misnomer. Sites that have followed a balanced, white-hat SEO plan should not be negatively affected by Penguin; sites that have excessively optimized via keyword stuffing are the ones that need to watch out.
You may have been keyword stuffing by accident, with no intent to spam or otherwise game the system. Take the following URL as an example:
www.example.com/widgets-product-page/blue-widgets/size10widgets.aspx
It seems like a perfectly normal e-commerce URL ticking off the product category progression, right? However, Penguin may see it as stuffed with the keyword “widget.” A preferable URL would be:
www.example.com/widgets-product-page/blue/size10.aspx
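To see the difference concretely, you can count how many path segments of a URL contain a given keyword. A quick Python sketch, using hyphenated versions of the example URLs (the helper name is my own):

```python
def keyword_count_in_url(url, keyword):
    """Count how many URL path segments contain the keyword (case-insensitive)."""
    path = url.split("//")[-1].split("/", 1)[-1]  # drop any scheme and the host
    return sum(keyword.lower() in seg.lower() for seg in path.split("/"))

stuffed = "www.example.com/widgets-product-page/blue-widgets/size10widgets.aspx"
cleaner = "www.example.com/widgets-product-page/blue/size10.aspx"
print(keyword_count_in_url(stuffed, "widget"))  # 3
print(keyword_count_in_url(cleaner, "widget"))  # 1
```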
Another stuffing pattern Penguin looks for is internal links that point to the same webpage using the same anchor text every time. When writing the copy for your pages, use synonyms for your primary keywords for the best optimization. This not only helps people using different search terms find your content, it also helps you avoid keyword stuffing. By balancing your SEO plan and producing quality content, you will be able to rank higher and avoid any Penguin penalties.
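A simple way to audit this on your own site is to tally (anchor text, target URL) pairs from your internal links and flag the ones repeated verbatim. A rough Python sketch; the helper name and the threshold of 5 are arbitrary assumptions, and collecting the links from your pages is left to whatever crawler you already use:

```python
from collections import Counter

def repeated_anchor_texts(links, threshold=5):
    """Flag (anchor_text, target_url) pairs used more than `threshold` times.

    links: list of (anchor_text, target_url) tuples gathered from your site.
    Identical anchor text pointing at the same page over and over is the
    pattern described above; varying the wording avoids it.
    """
    counts = Counter((text.lower().strip(), url) for text, url in links)
    return {pair: n for pair, n in counts.items() if n > threshold}

links = [("blue widgets", "/widgets/blue")] * 8 + [
    ("our range of blue widgets", "/widgets/blue"),
    ("contact us", "/contact"),
]
print(repeated_anchor_texts(links))  # {('blue widgets', '/widgets/blue'): 8}
```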
In our last installment on website Metadata, we will cover the Keyword Tag. In the past, savvy webmasters and marketers would stuff this tag with many keywords, including words that had nothing to do with their site. When their databases were small, search engines used the Keyword Tag as a way to index pages, and people would sometimes abuse this to rise to the top undeservingly. As the databases grew and search engines became more sophisticated, they put a stop to spammers stuffing the Keyword Tag. As a result, less weight is placed on the Keyword Tag today.
By now I may have convinced you to scrap the Keyword Metatag altogether, but that is not my intention. To fully optimize a site, you must use all legitimate SEO practices, such as content optimization, link building and Metatag optimization, including the Keyword Tag. Search engines use many factors in determining ranking, and while, as mentioned above, they place reduced weight on the Keyword Tag, it is still part of the SEO Big Picture. Below are some tips for writing effective Keyword Tags that are not considered spamming:
– The Keyword Tag should contain only phrases that actually appear on the page
– Phrases should be unique to the page
– The primary keyword phrase should be listed first
– Target one primary keyword phrase and 2-3 supporting phrases for the tag
– Separate phrases with commas and use no more than 12 phrases per page
Things not to do when writing Keyword Tags include:
– Keyword Stuffing — repeating keywords over and over in the tag
– Unrelated keywords — placing possibly high-volume search terms that have nothing to do with the content of your site in the tag, in hopes of tricking the search engines
By following these simple steps, you will be well on your way to writing Keyword Tags that will help with the SEO big picture of your site.
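The do's and don'ts above lend themselves to a mechanical check. Below is a minimal Python sketch (the function name and the sample page are my own) that flags a tag with too many phrases, repeated phrases, or phrases that never appear in the page copy:

```python
def check_keyword_tag(tag_content, page_text, max_phrases=12):
    """Check a keyword meta tag against the guidelines above.

    tag_content: the comma-separated value of the keywords meta tag
    page_text: the visible copy of the page
    Returns a list of problems found (an empty list means the tag looks fine).
    """
    phrases = [p.strip().lower() for p in tag_content.split(",") if p.strip()]
    text = page_text.lower()
    problems = []
    if len(phrases) > max_phrases:
        problems.append(f"too many phrases ({len(phrases)} > {max_phrases})")
    if len(set(phrases)) != len(phrases):
        problems.append("repeated phrases (keyword stuffing)")
    for p in phrases:
        if p not in text:
            problems.append(f"phrase not found on page: {p!r}")
    return problems

page = "We sell blue widgets in size 10. Our blue widgets ship free."
print(check_keyword_tag("blue widgets, size 10, free shipping", page))
# ["phrase not found on page: 'free shipping'"]
```

Note the substring test is deliberately crude: it only approximates "appears on the page," but it is enough to catch a tag written without looking at the copy at all.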
I must sound like a broken record when I suggest (much too weak a word) that my clients follow all of the above, specifically with respect to the big players in the industry: Google, Yahoo and MSN. Search engines are smart; real smart. You might be able to pull the wool over their eyes in the short term, but eventually you will get caught if you try to use Black Hat techniques to gain better traction within the organic (free) results. Everyone wants to be #1 in Google for their core keywords; however, there is no way to get there (and remain there) other than by making your website as search-engine-friendly as possible. What does that really mean? In the simplest terms, there has to be a high degree of correlation between the keywords you want to show up under and the actual content on your website. The more targeted and relevant the content, the better chance you have of achieving natural positions for the keywords that are mission-critical to your business.
All of that being said, one of the most frustrating things for marketers is when they optimize their site for natural search, strictly following the rules and guidelines set forth by the engines and employing White Hat techniques, yet they (a) do not see significant progress being made and/or (b) see their competition showing up under their core keywords. Even worse, the competition is not following a Best Practices approach; in other words, they are doing things that are heavily frowned upon by the engines, such as keyword stuffing, duplicate content, redirects, etc.

A couple of important things to consider: Search Engine Optimization (SEO) must be viewed as an ongoing process, and it can take several months to reap the rewards of optimizing your site. The search engines are always changing their algorithms, so it is crucial to view SEO as a long-term commitment. Your competition might ignore Best Practices and use Black Hat methods to “get ahead,” but just because they are not being penalized at present does not mean they will not get caught eventually.

I recently had a client who saw their competition using Black Hat techniques and achieving great organic results. They decided to use the same techniques and were banned from Google as a result. The moral of the story is that the best things come to those who wait. If you optimize your site and “make nice” to the search engines, you will maintain a long-term presence in them.
Title: Being White Hat Pays Off in the Long Run!
Description: Discusses the importance of following a Best Practices approach
Keywords: White Hat, Black Hat, Best Practices, Search Engines, Marketers, keyword stuffing, duplicate content, redirects, Search Engine Optimization, SEO, Google, Yahoo, MSN.