While writing blog posts and documentation, I have often used example.com to stand in for any domain name. An Internet standard published in 1999 (RFC 2606) set aside example.com (as well as example.org and example.net) for documentation purposes. So if you were to click on a link to http://www.example.com in one of my posts, you wouldn't see a real website, just a simple placeholder page. Click on this link to see for yourself.
I’d like to demonstrate a fun little trick you can use to amaze your friends.
The page you see when you go to http://www.example.com is completely indexable by the search engines. There's not a lot of content, but you would think the engines would have indexed it exactly as your browser shows it to you. It turns out, though, that there is a robots.txt file that blocks all spiders from all content inside www.example.com. (If you ever forget how to create a basic robots.txt file, you can use this one as a guide.) Alright, now for the punch line. Let's see what the search engines really have indexed for http://www.example.com. Go to www.google.com and type "site:example.com" (without the quotes). What do you see? If you see only one result, click on the link to repeat the search with the omitted results included.
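For reference, a robots.txt that blocks all spiders from an entire site (which is what the file on www.example.com does) is just a two-line plain-text file placed at the root of the domain:

```
# Applies to every crawler; disallows every path on the site
User-agent: *
Disallow: /
```

The asterisk matches any user agent, and `Disallow: /` covers every URL on the domain. A well-behaved spider fetches /robots.txt before crawling anything else and honors these rules, which is exactly why the indexed pages we're about to find are so surprising.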
I see 10,400 results now. There are pages like example.com/blah/ and www.example.com/concepts. Unfortunately, the Google search results page does not show links to a cached version for any of these results, so we can't see exactly what Google has indexed from these pages, but we can visit the pages ourselves. Well, I tried that, and every page I went to came back with "Not Found." It's logical to conclude that those pages never existed, yet notice that some of the results were crawled by Google within the past few hours. Impossible, no?
You can try this search on other search engines too.
My feeling about this strange phenomenon is that it is either Google's own testing, or other people testing or somehow tricking Google into adding these pages to its index. It may also be limited to certain data centers.
Whatever is causing this, I’m sure Google knows about it, but doesn’t feel the need to do anything about it. This phenomenon may also get you thinking about how search engines are supposed to work.
Defining the term "linkbaiting" is not an easy task. It encompasses many different techniques, ranging from running contests to antagonizing other bloggers in the hope that they will retaliate and link back to you. You can also provide other webmasters with tools (with embedded links back to your own site) that they can put on their sites.
There are many types of linkbaiting. As outlined above, being controversial will always get your blog post or online article noticed and inspire someone to link to you from their site. Getting attention this way is not new. People have been doing this for years. Howard Stern isn’t popular just because he has a good radio voice. People take notice when he says something outrageous. This is a great business model and something that could be applied to your site. Controversial posts can get immediate and strong reactions. YouTube has gained notoriety for its plethora of wacky videos that people love posting on their MySpace profiles and showing to their friends.
People also love linking to "lists". Readers seem to latch onto them and enjoy them for their simplicity, and lists can be very persuasive when every item gets straight to the point and is summarized succinctly. Using humor in your blog post or article can also be great linkbait, as people tend to enjoy funny observations and like passing them on.
Even though the term "linkbaiting" seems negative, it is really just a technique for getting people to link to you naturally and getting your site or blog noticed. The negative connotations associated with linkbaiting have arisen because of "Blackhat" SEOs using unethical ways to get people to link to their content. Examples of Blackhat linkbaiting include deliberately antagonizing a popular blogger in a comment just so he will engage in conversation with you and link back to your site; this is sometimes referred to as "comment spamming". Using a bizarre headline to gain notice and then simply copying and pasting someone else's article is another example of shady linkbaiting. Blackhat linkbaiting is not just bad for your reputation; it can also get you penalized by the search engines, and that is something you never want to risk.
All in all, having a purely informational website with dull content just doesn’t cut it anymore. When trying to vie for web supremacy, it is always a good practice to be as creative as possible. There really isn’t a definitive list or technique for baiting links, but people seem to be creating new forms of linkbait all the time.
The best pages for customer conversions on a website are the ones that provide the content the user is looking for. Content is king. Good copywriting and a well-designed site that takes users to the exact page they are looking for are the keys to getting more customer conversions, whether you are driving traffic to your site through search engines or through the newer social media marketing channels.
Providing quality content has become increasingly important for effective internet marketing. Google's algorithm updates and personalized search efforts at the end of 2008 made that clear, and there is no reason to believe that this will change in 2009. To start off the New Year, Google's Matt Cutts asked for user feedback on what new web spam Google could target to further improve its results this year. The response was overwhelming, with many good ideas for Google improvements submitted (see the comments for Matt's summary). From all the ideas, Google chose to put its first focus for 2009 on making its results pages even more relevant by working on ways to remove or demote "noresults" review and cost-comparison web spam shopping pages in the search results, and I, for one, couldn't be happier.
We’ve all seen these “noresults” pages. You type your key phrase into Google (usually something quite specific) and among the list of results you find a listing that reads something like this:
Search for “insert your key phrase here” Find “insert your key phrase here” reviews and price comparisons at Bigonlineshoppingsite.com
Search results for “insert your key phrase here” — Big Online Shopping Site has reviews for “insert your key phrase here”.
However, when you click on the link, you find that the site doesn't really have content to match your query: either there are no reviews, or the site never had any content matching that phrase in the first place. They simply set their system up to automatically generate pages for popular search queries to try to capture your click. I'm not alone in finding this particularly frustrating. Aside from concerns about the accuracy of the Google Maps results, this was perhaps the most popular complaint from Matt's readers. In fact, many commenters felt that this effort should be extended beyond review and cost-comparison shopping sites to large retailers that also display results with essentially no content.
What this means is that, as visitors, we can look forward to even better, more relevant results from Google. For retailers, it means that another old "trick" for luring visitors onto your site will be biting the dust, and that, more than ever, the quality of the content on your site will be first and foremost in determining your rankings in Google's results pages.