Creating unique, relevant description tag text for large dynamic sites can be a challenge. After all, if your website has 50,000 pages of products that are constantly changing, writing unique description tag text for each page would be virtually impossible, and the time it would take would be more effectively spent elsewhere (on link building, for example). Many large websites therefore program their sites to dynamically generate unique description tag text. This is fine, but there are still ways to optimize the process and make these tags the best they can be. For example, look at these description tags from Amazon.com:
Basically, these are all the same tag with roughly this format:
A community about KEYWORD. Tag and discover new products. Share your images and discuss your questions with KEYWORD experts.
While these tags are all different, the differences are slight, with the bulk of the words repeating across pages of the site. More seriously, all the tags start with the same string of characters: “A community about”.
Starting each tag with the unique key phrase “Italian cinema community” makes the tags look more distinct to search engines and gives the key phrase more prominence.
To create good dynamic description tags, consider these two strategies and associated tips:
1. Create the dynamic tags according to general Best Practices for description tags:
a. Place the full key phrase being targeted by the page as close to the beginning of the tag as possible.
b. Use full sentences and, if there is room, include a call to action such as “Buy PRODUCT NAME HERE at my store.”
c. Keep description tag text between 110 and 180 characters.
d. Avoid stuffing keywords into the description tag; repeating keywords too many times can trigger spam filters.
2. Create unique tags for top level pages targeted to your most competitive keywords and reserve the dynamic tags for dynamic content. This is particularly important if your top level pages are short on plain text content, because the description tag text may be the only content search engines can find to display in the snippet. Also, if the content of your top level pages is just lists of products, the pages will appear very similar to each other. A good description tag can help make them more distinct to search engines, increasing the chance that each will be recognized as important and deserving of a place in the main search results pages.
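The best practices in point 1 can be sketched as a simple template function. This is a minimal illustration, not code from any real site; the template wording, key phrase, and product name are hypothetical:

```python
def make_description(key_phrase, product_name, max_len=180):
    """Build dynamic description tag text following the best practices above:
    full key phrase first, full sentences, a call to action, and a length cap."""
    # Put the full key phrase at the very start of the tag.
    text = (f"{key_phrase} at its best. Tag and discover new products. "
            f"Buy {product_name} at our store today.")
    # Keep the tag within the recommended 110-180 character range;
    # truncate at a word boundary if the generated text runs long.
    if len(text) > max_len:
        text = text[:max_len].rsplit(" ", 1)[0].rstrip(" .,") + "."
    return text

print(make_description("Italian cinema community", "classic Fellini DVDs"))
```

Because the key phrase leads the template, every generated tag begins with its page's unique phrase rather than a shared boilerplate string like “A community about”.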
If you have no way of creating unique description and keywords tags for your dynamic pages, then it is better not to have them at all. However, in that case, make sure that there is sufficient plain text descriptive content on the page, so search engines have something to display under your link in the search engine results pages.
Last week, in a rare unified move, all three major search engines announced support for a new “canonical URL tag” designed to help search engines understand a website with multiple URLs displaying the same content. Basically, all a site owner needs to do is add this tag to the head section of all versions of a duplicated page. So, for example, this tag:
would be added to the head section of all the versions of the same page shown below:
By adding the canonical tag to all these potential versions of the page, it tells search engines that all these URLs are essentially the same page and should be treated as such. This allows them to easily determine which page should be listed and at the same time ensure that all the linking value for these pages is preserved and combined under one URL.
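For reference, the tag itself is a single line placed in the head section of each version of the page. The example.com URL below is hypothetical, chosen only to illustrate the format:

```html
<!-- placed in the <head> of every duplicate version of the page;
     the href points to the one preferred (canonical) URL -->
<link rel="canonical" href="http://www.example.com/product.php?item=red-widget" />
```

Duplicate variations of that URL (for example, versions with session IDs or sort parameters appended) would all carry this same tag, telling search engines to consolidate them under the one canonical address.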
The introduction of this new tag provides an alternate way for site owners to address duplicate content issues created by the way their site is designed. Up until now, the only solution that worked for all three search engines was to restrict robot access to duplicate pages using instructions in the robots.txt file, robots meta tags, or both. Website owners who have been using those methods and who decide to switch to the new tag will need to remove the access restrictions from their robots.txt files and/or remove the robots meta tags so that search engines can find the new canonical URL tags.
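The older robots-based approach described above can look like this. This is a hedged sketch of the general technique, not a prescription for any particular site:

```html
<!-- robots meta tag placed in the <head> of a duplicate page:
     do not index this version, but still follow its links -->
<meta name="robots" content="noindex, follow" />
```

The robots.txt alternative works similarly, using Disallow lines to block crawlers from the URL patterns that produce duplicates. As the paragraph notes, these instructions must be removed before switching to the canonical URL tag, since a blocked page can never be crawled and its canonical tag never seen.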
Unfortunately, for some websites, using the robots meta tags and robots.txt file may continue to be the only viable solution to duplicate content, because although this new tag addresses the issue of which page should be indexed, it does not resolve the crawling problem associated with duplicate URLs. Since search engine robots do not realize that these pages are all the same until after they have been crawled and indexed, they may still waste valuable crawling time accessing the same content, potentially delaying the indexing of unique content. Furthermore, all three search engines have indicated that they will view the canonical URL tag as a “suggestion” and will still use alternate means to determine which URL should be displayed in duplicate content situations. This is why the best course of action is not to give search engines duplicate URLs in the first place; robots.txt, robots meta tags, or the canonical URL tag should be used only if there is no way to program the site to be search engine friendly.
More details about this new tag can be found here:
Lately, we’ve been noticing something a little different in Google’s search results with some search queries resulting in more descriptive information in the result. The search result shown here illustrates this:
The normal size for a snippet description in Google’s search results has always been about 160 characters, but recently we have observed description snippets with as many as 317 characters. This kind of result was reported in Italian search results last November, and on Webmaster World some users reported being offered optional “long” descriptions in results. However, we are seeing these results lately with no special preference settings.
We can only speculate on what is triggering the longer descriptions. We have noted that the longer the search query, the longer the snippet tends to be. So, a search for a shorter, three-word query leads to shorter descriptions like this one:
Two- or three-word queries result in normal, short snippet sizes:
These results are not just local. They were reported here in Florida and a colleague up in Minnesota tells me he’s seeing them there too.
We have noticed that, as reported in the TechCrunch article mentioned earlier, the extra descriptive text is pulled from the page even if the page contains more text in the description tag so we see no reason to change existing Best Practices for description tag length.
Is this just something new that Google is testing or is this a real change in the way that Google displays search results? We’ll just have to wait and see.