Article Archive by Matt Crowley


October 1, 2012

Kicking Pagination Problems

by Matt Crowley

Quite a few websites use pagination, the distribution of on-page content across a series of pages, to spread products or content evenly across multiple pages. This can provide a good user experience on many sites, such as those with blog or product categories. If these category pages have too much content to realistically fit on one page, spreading it across multiple pages can be a smart choice.

However, pagination can create serious SEO issues. First, ranking and indexing signals such as inbound links may be diluted across all of the pages instead of accumulating on the most important page in the series, which can keep the main page from ranking as well as it should. Second, if static content such as a descriptive paragraph appears on every page, pagination can cause duplicate content issues.

Many websites have run into these problems, and Google has created some solutions. First, if your website or blog has an article that is broken up across multiple pages, the recommended solution is to implement rel=prev and rel=next tags on each page. These inform search engines that the pages are in a series and should be grouped together. More from Google about this approach can be found here.
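To make this concrete, here is a minimal sketch of what these tags could look like in the <head> of the second page of a hypothetical three-page article (the example.com URL and the page parameter are placeholders, not from Google's documentation):

<!-- On page 2 of 3: point back to page 1 and forward to page 3 -->
<link rel="prev" href="http://example.com/article?page=1" />
<link rel="next" href="http://example.com/article?page=3" />

The first page in the series would carry only a rel=next tag, and the last page only a rel=prev tag, so the chain has a clear beginning and end.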

The next example is a blog with categories that contain so many posts that they span many pages. If your blog appends a parameter to the URL of each page, such as page=1, page=2, page=3, you might run into issues where links are created to pages with no content, for example a link to page=133 when the category only has 10 pages. If this happens in one category, it is very likely to happen in others, so it is important to address it early. The best solution is twofold, but it must be applied carefully; if you are not technically knowledgeable, it is recommended that you contact your webmaster or an expert.
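As an illustration, a hypothetical ten-page category might produce a URL series like the following (example.com, the category name, and the page parameter name are all placeholders):

http://example.com/category/shoes?page=1
http://example.com/category/shoes?page=2
...
http://example.com/category/shoes?page=10

The trouble begins when a URL outside that range, such as ?page=133, still resolves to a live but empty page that search engines can crawl and index.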

First, create and verify a Google Webmaster Tools account if you do not already have one. Next, go to “parameter handling” under the “configuration” tab and click on “configure URL parameters.” This will lead you through the steps necessary to keep Google from crawling your paginated pages. Make sure you know exactly which parameters are being appended to your website’s URLs and causing the pagination issues. Finally, it is also recommended to implement the rel=canonical element on every page that is not the first page in the series, with each canonical element pointing to the first page. More on parameter handling from Google can be found here.
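As a minimal sketch, the canonical element on any page after the first in the hypothetical series above would sit in that page's <head> and point back to page one (again, the example.com URL is a placeholder):

<!-- On page 2 and beyond: consolidate signals onto page 1 -->
<link rel="canonical" href="http://example.com/category/shoes?page=1" />

This asks Google to treat the first page as the preferred version, so inbound link and ranking signals accumulate there rather than being spread across the series.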

Once again, it is very important to make sure you have the technical knowledge to implement either of these suggestions. Incorrectly implemented parameter handling can keep Google from crawling and indexing important pages; implemented correctly, it can save Google from crawling unimportant pages and serve it only the pages you want to do well in the search engines.

September 11, 2012

Search Resurgence by Bing

by Matt Crowley

In the ongoing battle between Google and Bing, the two search giants have both made incredible strides in search quality over the past few years. Although Google has held a vast majority of the market share for quite some time, Bing is determined to change search engine users’ perceptions and garner a bigger piece of the pie. This competition is increasingly beneficial to business owners with a strong web presence. As the search landscape grows, it is of the utmost importance not to commit your attention to Google alone.

Bing has become so confident in the quality of its search engine that it has even launched an online challenge against Google. The challenge, titled Bing It On, was introduced by Bing on September 6th and lets users perform a search and view results from both Google and Bing in a side-by-side comparison. Many people might remember a similar test called the Pepsi Challenge. Most elements that could identify the search engines have been stripped out, allowing for a truly blind comparison. After each search, the user is asked to choose which results page was more helpful. After five searches, the user is informed of which search engine they preferred, or whether the result was a draw.

Although this challenge will certainly not give Bing a major boost in search engine market share overnight, it may greatly help its reputation. There is no doubt that Bing is more of a major player than ever. Microsoft’s director of Bing, Stefan Weitz, was even quoted as saying, “if you strip the brands off and show people the results side by side we knew we’d win.”
The most recent search share results from comScore show that although Bing clearly lags behind Google, it is making strides to catch up.

This will be a long and arduous fight for search engine dominance, but users and business owners can benefit greatly in the meantime. The increase in search quality by both engines will highlight companies that adhere to strict SEO best practices and grow their web footprint. If you publish unique, quality content, maintain a search engine friendly site, and abide by SEO best practices, then a Bing resurgence can be a great traffic driver for your website.

September 4, 2012

Google’s Venice Update | A Hyperlocal Paradigm

by Matt Crowley

Major algorithm updates from Google are rarely disregarded. However, in a year full of Pandas and Penguins, there has been little ado about the Google Venice update. If you are a local business owner, the Venice update can help shine a bigger spotlight on your website or social media channels. This update rolled out on February 27, 2012 and fundamentally changed localized search. According to Google’s search quality highlights:

“Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.”

“Improved local results. We launched a new system to find results from a user’s city more reliably. Now we’re better able to detect when both queries and documents are local to the user.”

Google will now return localized results for broad match keywords if it determines that doing so would benefit the user. Local keywords are no longer required to trigger localized search results: if Google’s algorithms decide that a broad match search for a non-localized keyword such as “SEO” should return local results, they will do just that. Google accomplishes this by “auto-detecting” your location from your IP address. This does not require your consent and occurs even when browsing in an “incognito” window.

You can see your location setting by performing a search in Google and looking on the left-hand side of the results page, under the navigation. If you see your current city and state, Google has “auto-detected” your location through your IP address and will return more localized results than when your location is set more broadly, such as to “United States.”

The Venice update can give local businesses more impressions from local searchers who were not necessarily thinking about local businesses. It is more important than ever not only to have a strong presence on the web but also a strong presence in search. There are many channels through which you can accomplish this, and some great resources can be found on our blog under local search.
