Although many people have heard about Google’s Penguin Algorithm updates and know that they can have a large impact on their website’s presence within Google’s search engine results pages (SERPs), many of those same people aren’t aware of issues that may cause a website to be affected by them. While there are many things that can cause your website’s presence to be affected by these updates, we have seen one common pain point within the SEO industry. This is when websites have inbound links from site-wide elements that do not fall within Google’s Quality Guidelines.
Site-wide links include any link that can be found on every page of a website. These are often contained within a website’s site-wide footer or sidebar. For example, if site A has a link to a page on site B within its site-wide footer, this would create a link on every page of site A pointing to the page on site B. When this occurs, a new link to the page on site B is created any time site A creates a new page on its site. This can result in thousands to hundreds of thousands of links being created that originate from one website and point to a single page.
In some cases, this is normal. If a corporation has multiple websites and each contains a site-wide footer link pointing back to a main privacy policy page, this is not likely to negatively affect any of the sites. However, there are many reasons that site-wide links can negatively affect a website’s presence within Google’s SERPs. The first is that site-wide inbound links are against Google’s Quality Guidelines if used improperly. Google states that websites should avoid “widely distributed links in the footers of various websites.” Although it is unlikely that linking to a privacy policy page would be problematic, there are situations in which it can be.
One common case is when a website has site-wide inbound links from partner websites. To be clear, this is not always problematic, but improper linking can make it so. One reason is that the number of inbound links originating from a single domain can fluctuate dramatically. For example, if site A is an ecommerce website with hundreds of thousands of pages and contains a site-wide link to site B, thousands of links could be lost or gained each week depending on how often site A updates its product pages. When there are such large fluctuations in a site’s inbound link portfolio, Google’s algorithm will likely take notice.
If you are using a website for your business, and especially if you are using multiple websites, it is extremely important to check your inbound link portfolio regularly. You can find more information about who is linking to your website in our blog post about Google Webmaster Tools. If you notice a large number of inbound links originating from a single domain, take a closer look: ensure that any site-wide inbound links fall within Google’s Quality Guidelines, and consider having an SEO professional analyze all of the inbound links pointing to your website.
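As a rough illustration of that check, here is a short Python sketch (the function name, sample data, and 50% threshold are hypothetical, not any official rule) that flags referring domains accounting for a disproportionate share of a link portfolio, assuming you have exported a list of inbound link URLs from a tool such as Google Webmaster Tools:

```python
from collections import Counter
from urllib.parse import urlparse

def flag_sitewide_domains(inbound_links, share_threshold=0.5):
    """Return referring domains that account for more than
    share_threshold of all inbound links -- a hint that the links
    may originate from a site-wide footer or sidebar."""
    domains = Counter(urlparse(url).netloc for url in inbound_links)
    total = sum(domains.values())
    return {d: n for d, n in domains.items() if n / total > share_threshold}

# Hypothetical export: thousands of footer links from one partner domain,
# plus a couple of ordinary editorial links.
links = ["http://site-a.example/page%d" % i for i in range(5000)]
links += ["http://other.example/post", "http://blog.example/review"]
print(flag_sitewide_domains(links))  # site-a.example dominates the portfolio
```

A domain flagged this way is not automatically a problem (recall the privacy-policy example above), but it is exactly the kind of link source worth reviewing against Google’s guidelines.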
The end of September and beginning of October 2012 have proved to be busy times for Google. The search giant rolled out several updates that jostled the rankings of many websites. SEOs have also been busy trying to sort out the updates and analyze their impact. Here’s our rundown of the most important updates in chronological order.
Panda Update: Google rolled out its Panda algorithm for the 20th time during Sept. 27-30. Ordinarily, Google runs Panda about once per month to filter through search results and penalize sites with low-quality content. But the algorithm was upgraded for this 20th run, which had a dramatic impact on search results: the last several Panda updates affected about 1% or less of search queries, while this one affected 2.4% of English queries.
EMD Update: The Exact Match Domain (EMD) update was a brand new algorithm that Google pushed on Sept. 28, which affected about 0.6% of English searches. This update specifically targeted websites that had high rankings by virtue of their domain name being an exact match for search queries. However, this doesn’t mean that all EMDs will rank poorly. Google’s real goal with this update is to penalize low-quality websites that are only riding high because of their domain names. Having an EMD is fine, so long as the website is a source of good content. High-quality websites with exact match domains continued to rank well after the EMD update.
In a way, the EMD update is very similar to Panda — maintaining high-quality content on your website will keep you from being penalized. Google will roll out the EMD update again in the future (possibly every month, like Panda), but exact dates are not known.
Penguin Update: On Oct. 5, Google ran its Penguin update for the third time since it debuted in April. This had a small impact on search results, affecting only about 0.3% of English queries.
Top Heavy Update: Officially known as the page layout algorithm, the Top Heavy update rolled out for its second time on Oct. 9. This also turned out to be a relatively small update, affecting only about 0.7% of English searches.
Since all of these algorithms are being run periodically, it’s important to keep them in mind when modifying the content of your website. To check if any of these algorithms have impacted your site rankings, cross-reference your Google Analytics and Webmaster Tools data with the release date of each algorithm to see if there’s a correlation.
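As one way to perform that cross-reference, the sketch below compares average daily organic traffic in a window before and after each rollout date. The dates come from the updates described above; the traffic dictionary and function name are hypothetical stand-ins for data you would export from Google Analytics:

```python
from datetime import date, timedelta

# Rollout dates of the 2012 updates discussed above.
UPDATES = {
    "Panda #20": date(2012, 9, 27),
    "EMD": date(2012, 9, 28),
    "Penguin #3": date(2012, 10, 5),
    "Top Heavy #2": date(2012, 10, 9),
}

def traffic_change(daily_traffic, update_day, window=7):
    """Percent change in average daily traffic across an update date.
    `daily_traffic` is a hypothetical dict mapping date -> session
    count, e.g. exported from Google Analytics. Returns None when
    there is no pre-update traffic to compare against."""
    before = [daily_traffic.get(update_day - timedelta(days=i), 0)
              for i in range(1, window + 1)]
    after = [daily_traffic.get(update_day + timedelta(days=i), 0)
             for i in range(1, window + 1)]
    avg_before = sum(before) / window
    avg_after = sum(after) / window
    if avg_before == 0:
        return None
    return (avg_after - avg_before) / avg_before * 100

# Usage: traffic_change(my_analytics_export, UPDATES["Penguin #3"])
```

A sharp drop aligned with one of these dates is a correlation worth investigating, not proof of a penalty, so treat the output as a starting point for a closer look.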
At the end of the day, the rules for SEO really aren’t any different. All of these updates take aim at weak content and poor user experience. If you continue to create original content, follow best practices, and maintain your site with users first in mind and search engines second, the quality of your site will be rewarded in search rankings.
This week Google announced that changes to their algorithm were coming soon in the form of further Penguin and Panda “adjustments.” But if you’re creating good content — onsite and offsite — you have nothing to worry about. That’s because these adjustments are all about devaluing sites that use artificial means in order to rank well in the search engine results.
When we say “artificial,” we mean poor-quality content created in order to manipulate search engines. This is done mainly by keyword stuffing — onsite and off — and using unnatural language in order to rank for a particular keyphrase.
Remember that Google’s primary goal (as far as search is concerned) is to deliver the best possible user experience. To achieve that, they want to rank content that also delivers the best possible user experience. Content that is thin, overly simplified, or written in unnatural language to achieve rankings is out. Content that is original, helpful, and written for humans is in.
The best thing that you can do for your site — beyond creating excellent, search engine optimized, content — is to make sure that you don’t appear as a spammer to the search engine bots. Because Google is changing the definition of “spammer” all the time, there is a certain art to this.
We will still, and probably always, have to target keywords. And that’s a good thing. Keywords and phrases help guide content creators to the best possible ways to reach their audience. If you didn’t do keyword research, you might never know that potential customers are searching for a particular product in a particular way.
It’s what we do with those keywords that’s important.
As of now, overusing the same phrase without variations is out; using natural language is in. Go ahead and use your keyphrase in your title tags, description tags, and in your H1. From there, vary your language, sometimes using your target phrase and sometimes using a variation. Write the way you wrote before you wrote for the internet: vary your language and use synonyms, with the primary goal of communicating an idea, not landing on page one.
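As a rough, illustrative way to sanity-check your own copy for over-repetition, the sketch below estimates how much of a text is consumed by exact repetitions of a single phrase. The function name and sample copy are hypothetical, and the metric is not any Google-published rule; it simply makes repetitive versus varied language visible:

```python
import re

def keyphrase_density(text, phrase):
    """Share of a text's words consumed by exact repetitions of
    `phrase`. A high value suggests the copy repeats the phrase
    without variation instead of using natural language."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return hits * len(phrase_words) / max(len(words), 1)

stuffed = "Buy blue widgets. Blue widgets are best. Our blue widgets ship fast."
natural = "Buy blue widgets. These gadgets are sturdy and ship fast."
print(keyphrase_density(stuffed, "blue widgets"))  # high: phrase repeated verbatim
print(keyphrase_density(natural, "blue widgets"))  # lower: synonyms and variation
```

If the stuffed version scores noticeably higher than the varied one, that is the signal to rewrite with synonyms and natural phrasing, exactly as described above.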
Do this and you have a good chance of creating content that will survive any algorithm update.