SEO Blog

Your search for SEO best practices ends here.

On our SEO blog, MoreVisibility's SEO team offers insights and actionable information for novices and webmasters alike. Gain valuable information about technical SEO and learn the nuances of content production and optimization - for your website, mobile site, and offsite efforts. From "best practices" primers to thoughts on strategy and the intersection between SEO and usability, our SEO experts will guide you through today's pertinent SEO techniques and ideas.
To stay up to date on our SEO blog, subscribe to our feed.

April 10 2007

Mobile Enabled Web Sites Are Popping up Everywhere


Many web site owners would love to have a mobile version of their web site available. Not only is it cool and convenient to browse for the latest information while on the go, it is also becoming very popular. Soon most web sites will have a mobile version, and if you can’t offer this service you may lose visitors. Now is a good time to start learning about the technologies involved and the issues you may run into while deploying a mobile web site.

Most mobile-enabled sites serve syndicated versions of their standard site. This can create duplicate content issues, which should be avoided. I’m in the process of learning new ways to address this right now. One approach is to make sure your site is being crawled by the engines correctly: the standard SERPs should only index your standard content, and the mobile SERPs should only index your mobile content.

You have probably noticed that the big players in search are all starting to offer mobile versions of their search engines, designed to index mobile content. This means there will probably be a mobile SERP for each engine.

With a bit of research on Google’s site, I found a list of their user-agents, which can be used in your site’s robots.txt file. If my mobile content were served out of a /mobile/ directory and I wanted the site indexed correctly, I would add something like this to my robots.txt:

User-agent: Googlebot
Disallow: /mobile/

User-agent: Googlebot-Mobile
Disallow: /
Allow: /mobile/

This would tell the standard Googlebot crawler not to index the content in “/mobile” or any of its subdirectories. The next two rules would tell Googlebot-Mobile to index everything in the “/mobile” directory and ignore everything else. (Note that Allow is an extension honored by Google but not by every crawler.)
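A quick way to sanity-check rules like these is Python’s standard-library robotparser. One caveat: urllib.robotparser applies the first matching rule in file order rather than Google’s longest-match precedence, and it also uses the first group whose user-agent matches, so in this sketch the mobile group and its Allow line are listed first to make both interpretations agree (example.com is a placeholder host):

```python
from urllib import robotparser

# The Googlebot-Mobile group comes first and its Allow precedes its
# Disallow, so the stdlib parser's first-match semantics line up
# with Google's longest-match behavior for these paths.
RULES = """\
User-agent: Googlebot-Mobile
Allow: /mobile/
Disallow: /

User-agent: Googlebot
Disallow: /mobile/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# Standard Googlebot: allowed everywhere except /mobile/
print(rp.can_fetch("Googlebot", "http://example.com/about.html"))         # True
print(rp.can_fetch("Googlebot", "http://example.com/mobile/about.html"))  # False

# Googlebot-Mobile: allowed only inside /mobile/
print(rp.can_fetch("Googlebot-Mobile", "http://example.com/mobile/about.html"))  # True
print(rp.can_fetch("Googlebot-Mobile", "http://example.com/about.html"))         # False
```

Checking your rules this way before deploying them can catch a directive that blocks far more (or less) than you intended.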

This is just one way to avoid the issue. There is also the .mobi domain, which is reserved for mobile content; I will discuss this more another day. For further reading on mobile site development and the DotMobi domain, I recommend this excellent guide, available as a PDF:

DotMobi Mobile Web Developer Guide

April 10 2007

Blinded by the Flash – User Experience and SEO Have More in Common than You Think


A basic rule of website design is that the site should provide a positive user experience to its visitors. While it’s true that a site that’s invisible to search engine robots might not get many visitors, it’s also true that a site that fails to consider humans is unlikely to keep those visitors – much less sell them anything. Search engine optimization and user experience are both crucial for a successful site and, much as I hate to say it, user experience, like content, is king. Unfortunately, traditional web design often misses the mark on both fronts.

It’s common knowledge that JavaScript and Flash aren’t SEO-friendly. However, I was surprised to learn in this article on ruining the user experience that it’s not just search engines that don’t like sites that depend too heavily on Flash and JavaScript. In fact, nothing can make a dial-up user push that stop loading button faster than the sight of the Flash loading bar, or worse, a site that won’t let you in unless you agree to spend the next three hours downloading software so you can see its menus.

It turns out that even “simple” drop-down menus can irritate users. Usability research has found that users would much rather type state abbreviations directly into contact forms than fiddle with a drop-down menu. From an SEO standpoint, drop-down menus dilute keyword density, which can blur keyword relevance and make the page harder to optimize for keyword search. In addition, badly implemented JavaScript menus can make a site almost impenetrable to anyone whose browser or operating system isn’t supported, and most of the time, web designers are decidedly unsympathetic.
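To see concretely what “diluting keyword density” means, here is a toy calculation (illustrative only; real engines weigh far more than raw word counts, and the page and menu text here are made up):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words that are the keyword."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

body = "seo tips seo tricks"              # 4 words, "seo" appears twice
menu = "alabama alaska arizona arkansas"  # 4 words of state-menu text

print(keyword_density(body, "seo"))               # 0.5
print(keyword_density(body + " " + menu, "seo"))  # 0.25 - menu text halves the density
```

Every word of boilerplate menu text counts against the page total, which is why text-heavy navigation can wash out the terms you actually want the page to rank for.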

As a Mac user, I was highly amused when I heard the ironic tale of Apple Computer’s misfortunes with a page design. Apple once found itself presented with a page design that featured JavaScript menus that worked great on a PC but not on a Mac. It just goes to show that even obvious user requirements can be overlooked and designers need to beware of being blinded by the Flash (or in this case, JavaScript) at the expense of their clients, their customers and the search engines.

April 10 2007

Spam isn’t kosher – but you knew that – didn’t you?


Whenever a particular search engine optimization strategy is deemed to be less than kosher, a common condemnation is: “well, it might be interpreted as spamming”. For lots of our clients, that kind of comment isn’t viewed as helpful. Of course, they’re not spamming! What an idea! How could anyone see optimization for keyword search as spam?

Interestingly enough, identifying spam isn’t all black and white even for humans. When compiling the WEBSPAM-UK2006 database, researchers found that a central problem with many data sets in spam research was the lack of agreement between studies and even individual researchers as to what exactly qualified as spam. The identification process was analogized to that of confidently distinguishing pornography from art — in many cases, it is in the eye of the beholder.
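One standard way such studies quantify (dis)agreement between labelers is Cohen’s kappa, which discounts the agreement two annotators would reach by chance. A minimal sketch with made-up labels (the WEBSPAM-UK2006 project’s actual methodology may differ):

```python
def cohens_kappa(labels_a, labels_b):
    """Agreement between two annotators, corrected for chance agreement."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    categories = set(labels_a) | set(labels_b)
    # Chance agreement: probability both pick the same category independently
    expected = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Two hypothetical reviewers labeling the same four pages
a = ["spam", "spam", "ham", "ham"]
b = ["spam", "ham",  "ham", "ham"]
print(cohens_kappa(a, b))  # 0.5 - only moderate agreement despite matching 3 of 4
```

A kappa of 1.0 means perfect agreement and 0.0 means no better than chance, which is why raw percent-agreement figures can overstate how consistently humans recognize spam.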

So, what’s the answer? The growing focus on duplicate content filters in search engine algorithms over the past few years has shown that things can change and affect your rankings even when you haven’t changed anything at all. How can you be sure that next year your efforts at keyword content optimization won’t suddenly put your site on the wrong side of the dividing line between good rankings and banishment?
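Duplicate-content detection is commonly described in the literature in terms of “shingling”: break each page into overlapping word n-grams and compare the resulting sets. A toy sketch of the idea (not any engine’s actual algorithm):

```python
def shingles(text, k=3):
    """Set of overlapping k-word sequences from the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=3):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    return len(a & b) / len(a | b) if a | b else 0.0

page  = "welcome to our seo blog with tips on mobile sites"
copy  = "welcome to our seo blog with tips on mobile sites"
other = "a completely unrelated page about cooking pasta at home"

print(similarity(page, copy))   # 1.0 - exact duplicate
print(similarity(page, other))  # 0.0 - no shared shingles
```

A mobile page that simply syndicates the standard page’s text scores near 1.0 against it, which is one intuition for why engines might treat the pair as duplicates unless you steer their crawlers apart.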

We think the best way is to produce a quality site that you know your human visitors will appreciate. To do that, you need to keep on top of new developments in search engine optimization by reading blogs and online articles just like you’re doing right now. What do you think? Let us know.

© 2018 MoreVisibility. All rights reserved