Article Archive by Author

March 24 2008

Site Sculpting with the nofollow attribute: Clever SEO or Spam 2.0?

Site sculpting with the nofollow attribute is slowly becoming a hot topic in the SEO world, and the information circulating about it is not that clear. The nofollow link attribute was designed to relieve the beleaguered owners of forums and other open-content sites from the deluge of link spam that was clogging up their forum and general information pages. In a previous blog post on the true purpose of the nofollow link attribute, we discussed its correct usage. So far, it has worked pretty well for that purpose, but as with all good things, there may be a dark side.

Late last year, Matt Cutts implied that a proactive SEO use of the nofollow attribute could result in better PageRank for pages on your site, leading to a heightened interest in this new SEO technique. A good visual explanation of site sculpting can be found here: http://www.evisibility.com/blog/no-follow-tag/. As recently as this week, participants in the Organic Listings forum at SES New York were recommending this new technique for improving the rankings of important pages on your site.
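
To make the idea concrete, here is a minimal sketch of what sculpting looks like in practice: low-value internal links get rel="nofollow" while the links you care about are left alone. The page names and URLs below are hypothetical examples for illustration, not taken from any of the sources above.

    # A minimal sketch of "site sculpting": navigation links are generated from a
    # list of (label, url, sculpt) entries, and rel="nofollow" is added only to
    # links the owner considers truly unimportant for search. Page names and URLs
    # are made up for illustration.

    def nav_link(label, url, sculpt=False):
        """Return an HTML anchor, adding rel="nofollow" when the link is sculpted."""
        rel = ' rel="nofollow"' if sculpt else ''
        return '<a href="{0}"{1}>{2}</a>'.format(url, rel, label)

    links = [
        ("Products", "/products/", False),        # important: keep a normal link
        ("Contact Us", "/contact/", False),
        ("Log In", "/login/", True),              # no need to be indexed
        ("Printer-Friendly Page", "/print/", True),
    ]

    for label, url, sculpt in links:
        print(nav_link(label, url, sculpt))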

All this reminds me of a friend of mine who was convinced that he could make his back pain go away by gluing fridge magnets to his back. Magnetic fields may really have a medical use, but as far as I know, nobody has been able to show exactly how to place the fridge magnets on your back to maximum effect, so he really had no idea whether he was using them correctly or not. Therefore, he was either doing nothing whatsoever for his back (the most likely possibility) or he could potentially be doing damage.

Site sculpting seems to me to be a little like this, and a quick review of the most recent site sculpting buzz shows that I'm not alone in my confusion over the best use of this new SEO technique. I'm not endorsing all these opinions, just showing that there is some difference of opinion. One major problem is that not all search engines interpret the nofollow attribute exactly the same way:

How Google interprets the nofollow attribute:
http://www.google.com/support/webmasters/bin/answer.py?answer=81749&topic=8522

How Yahoo interprets the nofollow attribute:
http://help.yahoo.com/l/us/yahoo/search/indexing/nofollow.html

How MSN interprets the nofollow attribute (this is not explicitly mentioned in MSN HelpCentral, but this was their original announcement on the topic):
http://blogs.msdn.com/livesearch/archive/2005/01/18/nofollow-tags.aspx

These are just the major search engines. As far as we know, other engines like Ask.com do not respect it at all. In fact, there are still many questions about how search engines interpret the nofollow attribute.

All in all, we’re a little suspicious of the claims that Matt Cutts is endorsing the practice. Generally, Matt Cutts doesn’t promote techniques that could potentially manipulate search engine algorithms. This makes us worry that it won’t work and we will have wasted precious SEO time and effort. Or, worse, it will work but not to our advantage. As a result, we’re recommending that if a site owner wants to try it, they should be very careful to apply it only to links to pages that really and truly are unimportant and definitely do not need to be indexed.

The bottom line is that, as always, the best way to optimize your site is to provide only content and links that are valuable to your visitors. The homepage is the most valuable real estate on the site, and only the most important links should be found there. If there is a link on your homepage that you are thinking of adding a nofollow attribute to, then maybe the better question is why that link is there in the first place. In other words, instead of using the nofollow attribute to sculpt your site, try using your main navigation. In the end, it’s more durable and doesn’t depend on the ever-changing whims of search engine algorithms.

March 7 2008

Duplicate Content in Search Engine Indexes: Too Much of a Good Thing?

Having duplicate content on your site may not seem like it could cause a problem in search engine indexes. After all, the more keyword-relevant pages a site has in the indexes, the more likely it is that a page from the site will appear in the search engine results pages (SERPs) for that keyword, right? It’s true that duplicate content in search engine indexes is not the worst problem a site can have; it’s infinitely better than no content, for example. However, serving up duplicate content to search engines can cause problems. Although the major search engines are dedicated to crawling the entire web and indexing every single page, they are also constantly striving to present as many unique and relevant results to their users as possible. To do this, they have to filter out duplicate content, particularly when it occurs on the same site.

How do duplicate content filters work? Every search engine is different, and this is an aspect of search engines that is changing all the time. In fact, Google recently made major changes to the way it handles duplicate content. Prior to the fall of 2007, Google maintained two indexes: a main index, from which most search results were drawn, and a supplemental index. Pages in the supplemental index were much less likely to appear in the SERPs. Google has now eliminated the distinction between the two indexes and started using other methods to ensure that pages from a single site do not overwhelm the search results. Some of these include:

  • Grouping duplicate URLs into a “cluster” and consolidating their properties, including inbound links, into one URL, which is then displayed in the SERPs (a rough sketch of this clustering idea follows the list).
  • Displaying a maximum of two results from any one domain (including sub-domains) in the results pages, and providing a link to display more results if the searcher wishes.
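
The details of Google’s clustering are not public, but the general idea can be illustrated with a small, simplified sketch: URLs that differ only in superficial ways are normalized to one form and grouped together. The normalization rules, parameter names, and URLs below are assumptions for illustration only, not Google’s actual algorithm.

    # A simplified illustration of clustering duplicate URLs, assuming the
    # duplicates differ only in hostname case, a trailing "index.html", or
    # tracking/session parameters. Not Google's actual algorithm.
    from collections import defaultdict
    from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

    IGNORED_PARAMS = {"utm_source", "utm_medium", "sessionid"}  # hypothetical list

    def normalize(url):
        parts = urlparse(url)
        query = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
        path = parts.path
        if path.endswith("index.html"):
            path = path[:-len("index.html")]
        return urlunparse((parts.scheme, parts.netloc.lower(), path or "/",
                           "", urlencode(query), ""))

    urls = [
        "http://www.example.com/widgets/index.html",
        "http://WWW.EXAMPLE.COM/widgets/",
        "http://www.example.com/widgets/?sessionid=123",
    ]

    clusters = defaultdict(list)
    for u in urls:
        clusters[normalize(u)].append(u)   # group duplicates under one URL

    for canonical, members in clusters.items():
        print(canonical, "<-", members)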

So, if Google is taking care of this issue, why should we care? There are two main reasons:

  1. Search engines do not crawl all the pages of a site on every visit. How often a site is subject to a “deep crawl” depends on how important the search engines consider the site, but even very important sites are not fully crawled every time. How many pages are crawled can depend on how much time the search engines have allocated to crawling your site. If they waste that time collecting the same content over and over rather than crawling and indexing your unique pages, some of your content may not be included in search engine results as quickly as you would like.
  2. When Google chooses which URL to display, it may not be considering issues like which page has the best title, meta tags, or URL filename. If you have gone to the trouble of optimizing a specific page for search engines, your work is all for naught if a non-optimized page is displayed in the SERPs instead.

The bottom line is that a well-designed site that takes care to serve only one version of a page to both search engines and visitors will be crawled more efficiently and will be less confusing for visitors to navigate. Furthermore, as the site owner, you, and not some anonymous algorithm, will choose which pages are displayed. Google has provided some tips on how to streamline your site and avoid duplicate content issues. How important this is to your site can depend on many factors, but taking any advantage you can when competing for those all-important first-page positions is just good sense.
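
One common way to serve only one version of a page is to issue permanent (301) redirects from the duplicate forms of a URL to the form you want indexed. Here is a minimal sketch of that idea; the hostname and redirect rules are hypothetical and would normally live in your web server configuration rather than in application code.

    # A minimal sketch of redirecting duplicate URL forms (a "www." hostname and
    # a trailing "index.html") to one canonical version with a 301 redirect.
    # The hostname and rules are hypothetical.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    CANONICAL_HOST = "example.com"

    class CanonicalHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            host = self.headers.get("Host", CANONICAL_HOST).split(":")[0]
            path = self.path
            if path.endswith("index.html"):
                path = path[:-len("index.html")]
            if host != CANONICAL_HOST or path != self.path:
                # A permanent redirect tells crawlers which URL to keep in the index.
                self.send_response(301)
                self.send_header("Location", "http://%s%s" % (CANONICAL_HOST, path))
                self.end_headers()
                return
            body = b"<html><body>The one and only version of this page.</body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), CanonicalHandler).serve_forever()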

February 27 2008

Content Freshness as a Factor in Search

Content freshness is definitely a factor in search engine algorithms. For news items and blog posts in particular, freshness gives a boost in rankings. Several major search engines have filed patents for gathering historical data about pages, so not only is this a current factor, but it is likely one that search engines will keep adapting and improving over time to ensure that their users receive timely and relevant results.

The big question, of course, is what this means for site owners in terms of updating their content. Is it necessary to update pages frequently to get good results? The answer to this question is: it depends.

One of the main reasons a site should try to feature fresh content is that it can increase the frequency of visits by search engine spiders. Naturally, if you are updating pages on a regular basis, you want that to be reflected in search engine indexes, and you want the search engines to return in a timely fashion so that the new information can appear there. Studies of index freshness have shown that, for over 68% of pages, Google requires about two days before the page is visible in the index; Yahoo is quicker, with over 50% appearing within a day.

However, search engines don’t crawl your site every day unless they have a reason to. If they are only coming around every two weeks, a page that you updated today might not appear in the indexes for over two weeks, and that assumes they actually accessed the page, since they do not access every page on every visit. If your site is publishing timely news items or pages that need to be found quickly, content freshness is important. The good news is that these items are inherently fresh; search engines will notice this and will return often to find your new content.

So, what if you are not publishing news items every day but you still want search engines to visit often? Should you try to change the content of your homepage every couple of days? Optimizing the content of a page so that it will rank well for a key phrase can be an arduous and painstaking process involving a lot of tweaking and experimenting before the page is just right. Changing the content of the page every day just for the sake of change is not a good idea. What about just updating the page by adding a word here and there? This is also not a good idea for two reasons:

  1. As noted above, search engines are becoming more and more sophisticated in recognizing genuinely fresh content. This is unlikely to fool them.
  2. Updating pages just for the sake of updating may distract search engine spiders from your real new content, which could actually prevent them from finding it.

Content freshness is actually one of the best reasons to include a blog on your site. You can take the opportunity to provide your visitors with timely news about your company and industry and even feature the occasional quick link to any new pages that you may have added. A well-written blog post has the advantage of actually being new content. Just make sure you post regularly. A regular pattern of adding new pages of content is the best encouragement for search engines to return on a regular basis.

Furthermore, if you do manage to convince search engines to visit your site regularly, it’s a good idea to make it easier for them to find those new pages. In other words, don’t update pages just to make old content look new. Only update pages that really do contain new content, and then allow search engines to see that the content is new by setting your server to support conditional GET requests via the If-Modified-Since request header. That way, search engines are much more likely to find your new pages when they visit and add them to their indexes, where your potential new visitors can find them as soon as possible.
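
A rough sketch of what If-Modified-Since support looks like on the server side is shown below. It assumes pages are served as static files from a local directory (the directory name is made up for illustration); when a crawler’s If-Modified-Since date is at least as recent as the file’s modification time, the server answers 304 Not Modified instead of resending the page.

    # A minimal sketch of conditional GET: compare the crawler's If-Modified-Since
    # header with the file's modification time and answer 304 Not Modified when
    # nothing has changed. DOC_ROOT is a hypothetical directory of published pages.
    import os
    from datetime import datetime, timezone
    from email.utils import formatdate, parsedate_to_datetime
    from http.server import BaseHTTPRequestHandler, HTTPServer

    DOC_ROOT = "./site"

    class ConditionalGetHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            filename = os.path.join(DOC_ROOT, self.path.lstrip("/") or "index.html")
            mtime = datetime.fromtimestamp(os.path.getmtime(filename), timezone.utc)
            since = self.headers.get("If-Modified-Since")
            if since and parsedate_to_datetime(since) >= mtime.replace(microsecond=0):
                self.send_response(304)   # nothing new; the crawler keeps its cached copy
                self.end_headers()
                return
            with open(filename, "rb") as f:
                body = f.read()
            self.send_response(200)
            self.send_header("Last-Modified", formatdate(mtime.timestamp(), usegmt=True))
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), ConditionalGetHandler).serve_forever()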
