Article Archive by Author

August 5, 2008

Outbound Links: A Brief (Re)Consideration


Anyone familiar with search engine optimization (SEO) has likely cautioned others about outbound links. For those new to the term, outbound links are the links that start on your website and point to other locations online. The reason for the cautious words is that, because these links originate from your site, they are, in basic terms of online reputation management, the most valuable links you have, and they must be used very carefully.

The cautionary tales told in SEO regarding outbound links usually involve linking to a penalized domain (which can be harmful to your rankings), unstable links (links that continually change, which are viewed as an indicator of link farms or link networks), being in a “bad link neighborhood” (the result of linking to sites that have been penalized or banned), and paid or free reciprocal and free-for-all linking schemes.

It’s perhaps easier to think of it in this way. When contemplating adding an outbound link to any page on your site, ask yourself, “Is this a site I want my company to be associated with?” As mentioned earlier, this question can be regarded as part of the important online reputation management dimension of the Internet.

If trusted, recognized, or authoritative websites are believed to be safe, what about useful and relevant websites? One of the primary concerns usually voiced about outbound links is that of losing traffic as visitors click away from your site. If the outbound links from your website are useful and relevant, there is a very good chance that your site won’t “leak” traffic. Let’s take a brief look at social media, in particular a site such as Digg.com. Under their model, the outbound links generated by their user base’s article submissions result in users returning to the Digg site to see what other sites of interest they might find. The same can be true of your website. If you offer useful and relevant links, your site visitors are more likely to come back to discover more.

Of course, the ancillary benefit of having stable, useful, trusted outbound links is the increased likelihood of those sites linking back to you. That shouldn’t be the driving force behind adding outbound links, but it can yield positive results.

There is one final note to keep in mind whenever an appropriate outbound linking opportunity arises, and it ties in with usefulness and relevance: there’s more to linking out to another site than simply adding the link. To provide value to the site user (and then pass that value on to you), the outbound link should be placed where it is most useful and relevant. For example, don’t just put outbound links on your homepage — include them in relevant areas of the site, such as at the end of the body text of a top-level page on a related subject. Ultimately, you should always be sure that any outbound link is of value to your business and reflects well on you.

July 22, 2008

Google Webmaster Central


While most of us are primarily familiar with Google as a search engine, there are several products and services offered by the company that can help webmasters gain visibility into their websites. While Google Analytics is one of the better known of this suite of tools (we have an in-house Web Intelligence team as well as a web analytics blog which features articles on it), there is another very useful tool webmasters can use. Conveniently enough, it’s called Google Webmaster Central.

Just like Google Analytics, Google Webmaster Central is offered free of charge. All you need to gain access to these services is a Google account. But once you are in, what can you do? Well, more than you might realize. Once you verify that you are the website owner (by adding a short verification code to your site’s metadata or by uploading a validation HTML file), Google Webmaster Central offers a fairly robust selection of services, among which are:

  • Diagnostic information, such as the ability to identify “crawl errors”. Google Webmaster Central will show you the number of each type of crawl error the search engine has found on your website (with links to the individual URLs with errors).
  • Top search queries. You are able to view and research the top searches that bring visitors to your site from the Google search engine results pages (SERPs).
  • Visibility into what the GoogleBot sees. This is basically a detail of the words used most often in your website. As the search engine’s spiders are essentially “blind” (i.e. they can’t see the images used on your site), the way in which relevance to a particular search term is determined is from reading the words on the site’s pages. Knowing what the GoogleBot sees can help you with your search engine optimization (SEO) efforts, as well as your web accessibility compliance efforts.
  • A listing of external links. This will show you a list of the pages on your site that have external links pointing to them, along with the number of links to each page. From this listing you can also click through to see the list of external URLs.
  • A listing of internal links. This list is presented in an alphabetized format, showing links from your website to other pages within the site (this is also referred to as inter-linking).
  • Statistical information for RSS/ATOM feeds. With Google Webmaster Central you can obtain information on the number of subscribers to each feed on your website via Google Reader and iGoogle. It’s important to note that, if your site offers feeds using a service like FeedBurner, the data in Google Webmaster Central may not match the data from FeedBurner. The reason is that there’s currently no way for site owners to upload a FeedBurner file to the domain or to put an authentication/verification meta tag on the home page. Without this authentication, feeds served up via FeedBurner can’t get added to Google Webmaster Central.
  • A listing of site links. This is the list of links and titles that Google has generated for the site and that appears in the SERPs.
  • Identification of site issues. Your site may have content problems. If there are any issues with missing, duplicate, or short titles or meta descriptions, you can find this information as part of the webmaster toolset (a rough sketch of this kind of check appears below).
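
To make that last point a bit more concrete, here is a rough sketch, in Python and using only the standard library, of the general kind of check Google Webmaster Central automates: scanning pages for missing, very short, or duplicate title tags and meta descriptions. The file names and length thresholds below are made up for the illustration, and this is not how Google itself performs the analysis.

    from collections import defaultdict
    from html.parser import HTMLParser

    class TitleMetaParser(HTMLParser):
        """Collects a page's <title> text and its meta description."""
        def __init__(self):
            super().__init__()
            self.title = ""
            self.description = ""
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self._in_title = True
            elif tag == "meta" and attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "").strip()

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data.strip()

    def audit(paths, min_title=10, min_description=50):  # thresholds are arbitrary examples
        titles = defaultdict(list)
        for path in paths:
            parser = TitleMetaParser()
            with open(path, encoding="utf-8") as page:
                parser.feed(page.read())
            if not parser.title:
                print(path, "is missing a title")
            elif len(parser.title) < min_title:
                print(path, "has a short title:", parser.title)
            if not parser.description:
                print(path, "is missing a meta description")
            elif len(parser.description) < min_description:
                print(path, "has a short meta description")
            titles[parser.title].append(path)
        # Flag any title text shared by more than one page.
        for title, pages in titles.items():
            if title and len(pages) > 1:
                print("duplicate title", repr(title), "on:", ", ".join(pages))

    # Hypothetical local copies of a site's pages:
    audit(["index.html", "about.html", "products.html"])

The real service runs this kind of analysis across your whole site automatically, which is part of what makes it worth registering for.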

As Google has the lion’s share of Internet search traffic (with an active reach of 59.41%*), understanding how Google views your site, and diagnosing potential problems, is crucial to increasing your site’s visibility. Learning how Google’s robots crawl and index your website, learning what drives traffic to your site so you can refine your SEO efforts, and actually telling Google about your site through Google Webmaster Central can all help to improve your site’s crawlability.

*Figure 1: Nielsen Online — Top Online Web Brands in the U.S.

By using the Google Webmaster Central service and its various tools, you can obtain information on how Google (and, by extension, Yahoo, MSN, and the other search engines) sees your website. Google Webmaster Central is an excellent way to obtain direct, expert support, diagnose any site errors, and improve your site’s search visibility.

July 8, 2008

The ABC’s of SEO


How do the search engines know how to find what you are looking for? When you enter a query (a word, a phrase, or a series of words) into Google, MSN, or Yahoo, how do they go about returning sites that relate to your search?

Let’s try to answer this in a straightforward and somewhat simplified way. When it comes to search engine optimization, or SEO, there is usually mention of complex algorithms and predictive analytics. Let’s see if we can boil things down to a basic, real-world example.

Say you have a website, www.example.com. Typically, you would like Internet users to find your site and read more about your products or services. You have your content, titles, descriptions, and keywords, you’ve tackled your in-site linking and inbound linking, and you may even have a social media marketing plan in place. But how does it all come together on the back end? If you are doing everything right in terms of SEO best practices, why and how do your search engine rankings change?

At present there are well over a billion pages of indexable content on the Internet. The search engines act as a way for us to sort through all of that information and, in turn, use it to answer a question. The search engines collect and categorize information so they can help to answer the most basic question, that of relevance: how your query, the search term we mentioned earlier, relates to the information contained in the search engines’ massive databases. The question is really a matter of determining which web pages are most relevant to the terms A, B, and C.

With over a billion pages to work through, the search engines have to manage that information in a way that ensures less relevant information doesn’t appear at the forefront of the search engine results pages (SERPs). This isn’t part of some conspiracy — it’s a matter of trying to make the results as useful as possible, hence the focus on high relevance versus low relevance.

But how do the search engines determine what’s relevant? For this SEO blog post, we’ll define “relevant” as describing those web pages whose terms most closely match the words (keywords) the web searcher typed into the search engine.
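
To make that working definition concrete, here is a deliberately over-simplified sketch in Python. It scores a handful of made-up pages by the fraction of the query’s words that appear in each page’s text, then lists them from most to least relevant; it bears no resemblance to the engines’ actual, proprietary algorithms.

    import re

    def relevance_score(query, page_text):
        """Toy relevance: the fraction of query words found in the page's text."""
        query_words = set(re.findall(r"[a-z0-9]+", query.lower()))
        page_words = set(re.findall(r"[a-z0-9]+", page_text.lower()))
        if not query_words:
            return 0.0
        return len(query_words & page_words) / len(query_words)

    # Hypothetical pages and query, purely for illustration.
    pages = {
        "www.example.com/widgets": "Blue widgets for sale. Our widgets ship fast.",
        "www.example.com/blog":    "Tips on caring for widgets and other tools.",
        "www.example.com/about":   "About our company and its history.",
    }
    query = "blue widgets"

    # Print the pages from most to least relevant to the query.
    for url in sorted(pages, key=lambda u: relevance_score(query, pages[u]), reverse=True):
        print(round(relevance_score(query, pages[url]), 2), url)

Real engines weigh far more signals than simple word overlap, but even a toy version like this shows why the words on a page matter so much.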

This relevance is determined by the search engines assigning a rough score (in Google’s case, PageRank is one such measure), which in turn shapes how websites are listed in the SERPs. Obviously, with the vast number of websites and web pages, there is a wide variety of ways to measure and score relevance, some or all of which may be employed by the search engines. As these algorithms are proprietary to Google, MSN, and Yahoo, we don’t know exactly what is being used or in what way, but first-hand experience, sound anecdotal evidence, and a wealth of research and observation all lead to one conclusion when it comes to search engine relevance.

Content is king and the text is the thing. What counts as the text on your website? It’s what appears in the title tags, URLs, anchor text, image alternate text, the comments (if you have a blog), the description and keywords meta tags, and the formatted (or unformatted) visible text areas of a page. You can cultivate the relevance of pages on your site through in-site linking: the text found on one page can be supplemented by information associated with that page (such as a link), as well as by related pages that link to it.

Why is text so important in determining relevance? Much like a human visitor uses the text on a page to figure out what the site is about and where to navigate next, the search engines do the same. The placement of the visible text, as well as how it is emphasized and used, helps the search engines understand what the pages on your site are about. Building keyword density and using keywords with prominence can help to assert a page’s relevance.
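
The engines publish no formulas for either measure, so the small Python sketch below simply uses two common working definitions: density as the keyword’s share of all the words on the page, and prominence as how close to the top of the text the keyword first appears. Both the definitions and the sample copy are illustrative only.

    import re

    def words(text):
        """Split page copy into lowercase words, ignoring punctuation."""
        return re.findall(r"[a-z0-9]+", text.lower())

    def keyword_density(keyword, text):
        """The keyword's share of all the words on the page."""
        tokens = words(text)
        return tokens.count(keyword.lower()) / len(tokens) if tokens else 0.0

    def keyword_prominence(keyword, text):
        """1.0 if the keyword is the very first word, falling toward 0.0 near the end."""
        tokens = words(text)
        if keyword.lower() not in tokens:
            return 0.0
        return 1.0 - tokens.index(keyword.lower()) / len(tokens)

    # Hypothetical page copy, purely for illustration.
    page_text = ("Widgets are our specialty. We design widgets, build widgets, "
                 "and ship widgets anywhere in the country.")

    print("density:   ", round(keyword_density("widgets", page_text), 3))
    print("prominence:", round(keyword_prominence("widgets", page_text), 3))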

Changing the text on your pages changes their relevance. This is why rankings and positioning on the SERPs change. Even changing the links, images, or something off-site, such as directory descriptions, can tell the search engines that they should re-evaluate your site to ensure it is still relevant to the previously associated terms, or perhaps now has greater relevance to a new set of words or phrases.

The thought to keep in mind is that you aren’t the only one changing or optimizing text — your competitors are doing the very same with their web content. Changes made by other websites within your industry or space can indirectly influence your relevance, because the search engines collect and compile everything they know about all the websites and web pages they find as they crawl the Internet, and relay that information back to a search engine user based on how it relates to his or her query. So as you optimize www.example.com, www.example.net and www.example.org are likely performing similar activities.

Refining and targeting your content with unique and relevant keywords, keeping your content fresh, and cultivating trusted and relevant inbound links to your site are just some of the ways you can help your website keep its relevance in the search engine algorithms. Content remains king, and relevance and usefulness are the underlying forces that ensure content will remain supremely important. Understanding how important content is to the search engines will help both your SEO efforts and your site to grow.
