Article Archive by Matt Crowley


October 2, 2012

Search Referrers in a Privacy Minded World

by Matt Crowley

Shortly after the iPhone 5 release and the iOS 6 software update, it was discovered that the default browser (Safari) uses Google’s Secure Search. This means that keywords searched for on iOS 6 will not be available to those who use Google Analytics to track organic traffic data. This keyword data, passed along with each visit, is known as the search referrer. If your website garners a lot of organic mobile traffic, this can be a daunting issue: you will no longer be able to see what a user typed into their iPhone to find your website if they are searching with Safari on iOS 6.
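To make the mechanics concrete, here is a minimal sketch in Python (not tied to any particular analytics package) of how the keyword normally travels in the referring URL’s query string, and how Secure Search simply omits it. The example URLs are illustrative only:

    from urllib.parse import urlparse, parse_qs

    def extract_keyword(referrer):
        """Pull the search keyword out of a Google referrer URL, if present."""
        params = parse_qs(urlparse(referrer).query)
        # Google passes the search terms in the "q" query parameter.
        return params.get("q", [None])[0]

    # A traditional search passes the keyword along in the referrer:
    print(extract_keyword("http://www.google.com/search?q=blue+widgets"))  # blue widgets

    # Secure Search strips the query string, so there is no keyword to report:
    print(extract_keyword("https://www.google.com/"))  # None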

For the time being, the issue with Safari is not a big one. However, it speaks to a much larger trend that is moving full steam ahead: the line being drawn between privacy and data on the internet. Marketers and business owners are becoming increasingly knowledgeable about who visits their websites, with this data often pulled from site usage and Google search data. Searchers increasingly want more privacy, while business owners want to understand their online users and customers better.

Google is also taking an increasingly conservative stance on search privacy, creating more and more scenarios in which a searcher’s keyword cannot be tracked in Google Analytics. If a user is signed in to any Google account, or is using a browser with Google Secure Search enabled, their search referrer will not be passed to the website’s Google Analytics account. However, Google earns the vast majority of its revenue from AdWords advertising, and it has decided to allow search referrer data to be passed through clicks on AdWords advertisements.

This creates a bit of a paradox. On the one hand, Google wants to be more privacy-oriented and pass less data to website owners who use its services. On the other hand, if the website owner is paying Google to advertise on its search engine, passing that same data is acceptable.

It is a two-sided battle between website owners who want to serve their users better and website visitors who want to retain more privacy. Whatever your view, many search engines and browsers are beginning to take privacy protection more seriously. It is becoming increasingly important to use multiple sources to understand your website’s users rather than relying solely on the keyword that brought them to the website in the first place.

October 1, 2012

Kicking Pagination Problems

by Matt Crowley

Quite a few websites make use of pagination, the practice of distributing on-page content such as products or blog posts across a series of numbered pages. This can provide a good user experience on blog or product category pages: if those pages have too much content to realistically fit on one page, spreading it across multiple pages can be a smart choice.

However, pagination can create serious SEO issues. First, ranking and indexing signals such as inbound links may be diluted across all pages in the series instead of accumulating on the most important page, which can keep that main page from ranking as well as it should. Second, if some static content, such as a descriptive paragraph, appears on every page, pagination can cause duplicate content issues.

Many websites have run into these problems, and Google has created some solutions. First, if your website or blog has an article broken up across multiple pages, the recommended solution is to implement rel=prev and rel=next tags on each page. These inform search engines that the pages form a series and should be grouped together. More from Google about this approach can be found here.
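As a rough sketch of what those tags look like in practice (the URL pattern below is hypothetical), each page in a multi-page article would emit link elements such as these, generated here in Python:

    def pagination_link_tags(base_url, page, total_pages):
        """Build the rel="prev" / rel="next" head tags for one page in a series."""
        tags = []
        if page > 1:
            # Every page except the first points back to its predecessor.
            tags.append('<link rel="prev" href="%s?page=%d" />' % (base_url, page - 1))
        if page < total_pages:
            # Every page except the last points ahead to its successor.
            tags.append('<link rel="next" href="%s?page=%d" />' % (base_url, page + 1))
        return "\n".join(tags)

    # Page 2 of a three-page article links both backward and forward:
    print(pagination_link_tags("http://www.example.com/article", 2, 3))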

The next example is a blog with categories, where each category contains so many posts that they span many pages. If your blog appends a parameter to the URL of each page, such as page=1, page=2, page=3, you might run into issues where links get created to pages with no content, such as a link to page=133 when the category only has 10 pages. If this is a problem in one category, it is very likely to happen in multiple categories, so it is important to address early. The best solution is twofold; however, it is important to be very careful with it. If you are not technically knowledgeable, it is recommended that you contact your webmaster or an expert.

First, create and verify a Google Webmaster Tools account if you do not have one already. Next, go to “parameter handling” under the “configuration” tab and click on “configure URL parameters.” This will lead you through the steps necessary to keep Google from crawling your paginated pages; make sure you know exactly which parameters appended to your website’s URLs are causing the pagination issues. Finally, it is also recommended to implement the rel=canonical element on each page that is not the first page in the series, with each canonical element pointing to the first page. More on parameter handling from Google can be found here.
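As a sketch of that canonical element (again with a hypothetical URL structure), every paginated page other than the first would carry a tag pointing back to page one:

    def canonical_tag(base_url, page):
        """Return a rel="canonical" tag pointing later pages in a series at page one."""
        if page == 1:
            return ""  # the first page is the canonical version; no tag needed
        # Every later page tells search engines that page one is the one to index.
        return '<link rel="canonical" href="%s" />' % base_url

    # Page 7 of a paginated category points back to the first page of the category:
    print(canonical_tag("http://www.example.com/widgets", 7))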

Once again, it is very important to make sure you have the technical knowledge to implement either of these suggestions. Implemented incorrectly, parameter handling can keep Google from crawling and indexing important pages; implemented correctly, it can save Google from crawling unimportant pages and present only the pages you want to do well in the search engines.

September 11, 2012

Search Resurgence by Bing

by Matt Crowley

In the battle between Google and Bing, the two search giants have both made incredible strides in search quality over the past few years. Although Google has held the vast majority of the market share for quite some time, Bing is determined to change search engine users’ perceptions and garner a bigger piece of the pie. This competition is becoming exceedingly beneficial to business owners with a strong web presence. As the search landscape grows, it is of the utmost importance not to commit your attention to Google alone.

Bing has become so confident in the quality of its search engine that it has even launched an online challenge against Google. The challenge, titled Bing It On, was introduced by Bing on September 6th and gives users the ability to perform a search and view results from both Google and Bing in a side-by-side comparison. Many people might remember a similar test, the Pepsi Challenge. Most elements that could identify the search engines have been stripped away, allowing for a truly blind comparison. After each search the user is asked to choose which results page was more helpful, and after five searches the user is informed of which search engine they preferred, or whether there was a draw.

Although this challenge will certainly not give Bing a major boost in search engine market share overnight, it may greatly help the engine’s reputation. There is no doubt that Bing is more of a major player than ever; Microsoft’s director of Bing, Stefan Weitz, was even quoted as saying, “if you strip the brands off and show people the results side by side we knew we’d win.”

The table below shows the most recent search share results from comScore. Although Bing is clearly lagging behind Google, it is making strides to catch up.

This will be a long and arduous fight for search engine dominance, but users and business owners can benefit greatly in the meantime. The increase in search quality by both engines will highlight companies that adhere to strict SEO best practices and grow their web footprint. If you publish unique, quality content, maintain a search engine friendly site, and abide by SEO best practices, then a Bing resurgence can be a great traffic driver for your website.
