Article Archive by Author

May 8 2009

Five Tips for Creating SEO Friendly URLs

SEO-friendly URLs that contain keywords relevant to the page can help improve your search engine rankings for a couple of reasons.

First, they can give the page a little boost for those keywords in the rankings.

Second, if the URLs are formatted correctly, the keywords in the URL can serve double duty as anchor text whenever anyone links to the page, which is another good way to boost a page's keyword relevance.

For example, this link: https://www.morevisibility.com/press-articles.php automatically has anchor text that includes the words "morevisibility," "press," and "articles," so just linking to it gives the page a boost for any query containing those words.

However, just stuffing a bunch of keywords into a URL isn't necessarily going to give you the best results. To create powerhouse URLs, follow these five tips:

  1. Separate keywords in URLs with dashes. People tend to separate the words in a search query with spaces, and a dash is the URL equivalent of a space, so dashes help your URL match user queries more closely. (The code sketch after this list shows one way to apply these tips.)
  2. Place the primary key phrase as close to the root of the domain as possible.
  3. Include no more than five keywords in each URL. Remember that any keywords in your domain name are already included, so don't repeat them; doing so wastes space and could even trigger a spam filter. For example, using the word example twice in a URL like www.example.com/examples-of-examples.php wastes valuable URL real estate on repetitions of a single word.
  4. If your site structure allows, position the words in the same order that you expect a user to type them into a search query. Search engines tend to give higher rankings to pages that feature the queried key phrase in the same order the searcher used.
  5. Keep the URL as short as possible; fewer than 78 characters is optimal for sharing the URL in an e-mail or marketing it on popular social networking sites like Facebook and Twitter. URL shortening services are fine, but people who worry about catching computer viruses are more likely to click on a URL from a domain they recognize and trust.
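
To make these tips concrete, here is a minimal Python sketch of a slug builder that applies tips 1, 3, and 5: it lowercases a key phrase, separates the words with dashes, drops words that already appear in the domain name, keeps at most five keywords, and flags over-long URLs. The function name, the .php extension, and the 78-character check are illustrative assumptions, not a prescribed implementation.

```python
import re

def seo_slug(phrase, domain="www.example.com", max_words=5):
    """Build a dash-separated URL slug from a key phrase (illustrative sketch)."""
    # Tip 1: lowercase the phrase and treat anything that is not a letter
    # or digit as a word boundary, so spaces become dashes.
    words = re.findall(r"[a-z0-9]+", phrase.lower())

    # Tip 3: drop words that already appear in the domain name, then keep
    # at most five keywords.
    domain_words = set(re.findall(r"[a-z0-9]+", domain.lower()))
    words = [w for w in words if w not in domain_words][:max_words]

    # Tips 2 and 4 concern placement and word order, which this sketch leaves
    # to the caller: the phrase is used in the order it is given.
    url = "https://" + domain + "/" + "-".join(words) + ".php"

    # Tip 5: flag URLs that would be awkward to share.
    if len(url) > 78:
        print("Warning: URL is longer than 78 characters:", url)
    return url

print(seo_slug("Example SEO Friendly URL"))
# -> https://www.example.com/seo-friendly-url.php
```
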
Posted in: Tips & Tools

May 7 2009

The Impact of Page Load Time on Search Engine Rankings

Page load time is an important factor in website optimization, if only because users who have to wait too long for your pages to appear can become impatient, stop the load, and go to another website. In fact, this is one of the big reasons I was so fond of Google right from the start: their clean homepage design had one big advantage over their competitors at the time, in that it came up quickly and gave me what I wanted right away without making me wait for pictures and other Flashy stuff to load. So we know that fast load time is good for users, but what effect does load time have on search engine rankings? This question comes up quite often; when Matt Cutts of Google recently asked for topic suggestions for his latest video, it was the number one question.

So, can a delay in page load time affect your search engine rankings? The short answer is yes. Even if load time is not directly a factor in the search engine algorithms, slow load time could lead to a loss of rankings for a website — particularly a very large website. The reason is that in order for your pages to rank to their full potential in the search engine results, search engines need to have accurate information about them. Both the content on the page and the linking structure of the page are important factors in the search engine algorithm.

To illustrate further, consider this diagram as a rough model of a site's linking structure. If the page represented by the node highlighted in yellow isn't crawled by the search engine spider, the links that originate from it may not be found either. That can affect not only the ranking of the top page but also the rankings of all the pages beneath it, even if the search engine spiders can find those pages another way, because the engines won't have the full picture of how the pages are linked together.
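
As a rough illustration of that idea (and not a model of how any search engine actually crawls), the toy Python sketch below walks a small, made-up link graph twice, once in full and once with the highlighted page skipped, and counts how many link relationships are discovered each time. The page names and the graph itself are invented for the example.

```python
from collections import deque

# A toy link graph: each page lists the pages it links to (made-up structure).
LINKS = {
    "home":      ["category", "about"],
    "category":  ["product-1", "product-2", "product-3"],  # the highlighted node
    "about":     [],
    "product-1": ["product-2"],
    "product-2": [],
    "product-3": [],
}

def crawl(start, skip=None):
    """Breadth-first crawl; returns the set of link edges actually seen."""
    seen_pages, seen_links, queue = set(), set(), deque([start])
    while queue:
        page = queue.popleft()
        if page in seen_pages or page == skip:
            continue  # a skipped page's outgoing links are never read
        seen_pages.add(page)
        for target in LINKS.get(page, []):
            seen_links.add((page, target))
            queue.append(target)
    return seen_links

full = crawl("home")
partial = crawl("home", skip="category")  # "category" timed out / wasn't crawled
print(len(full), "links found in a full crawl")               # 6
print(len(partial), "links found when one page is skipped")   # 2
```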


If search engines do not see the full linking structure of your site because it takes too long to crawl the links, that linking information is not included in the index and the page will rank lower in the search results than it deserves. This is also why an XML sitemap alone is not enough to get individual pages indexed and ranked: search engines have to see how the pages fit together as well.

Search engines cannot give you credit for something they don't see. The spiders have a limited amount of time to crawl a site, and if page load time is too long, they are less likely to crawl your site fully, which can affect your rankings. Recently, Live.com's Webmaster Blog ran a four-part series on optimization issues for large websites, with excellent advice for webmasters on optimizing content, site structure, and server configurations. Helping spiders get at your content as quickly and efficiently as possible is an important aspect of search engine optimization, so load time should always be a high priority in any large-scale website optimization project.
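
A practical first step is simply measuring how long your key pages take to respond. The sketch below times a few URLs with Python's standard library; the URLs and the one-second threshold are placeholders, and server response time is only one part of full page load time, since images, scripts, and rendering add more on top.

```python
import time
import urllib.request

# Placeholder URLs; substitute your own key landing pages.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products.php",
]

for url in PAGES:
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()  # pull the full HTML, not just the headers
        elapsed = time.time() - start
        flag = "SLOW" if elapsed > 1.0 else "ok"
        print(f"{url}: {elapsed:.2f}s [{flag}]")
    except Exception as exc:
        print(f"{url}: failed ({exc})")
```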

April 2 2009

How Important are the Results of the Keyword Density Tool for Search Engine Optimization?

The keyword density tool is a standard part of any search engine optimization arsenal, but how important are its results for determining how well a page is optimized? What role does keyword density play in the search engine algorithms?

First, as far as we know, keyword density has never been the formula used by search engines to calculate keyword relevance for a page. Unfortunately, we don’t know what the true formula is (it’s a secret and Google won’t tell).

Most believe that it is actually based on term weight and that would probably be a better way to evaluate keyword use on a web page. However, one of the main components for calculating term weights is the total number of times the keyword is used on the internet. We have no way of knowing what that number is because even if we could crawl the whole internet and find all the pages and all the words, we still wouldn’t know if our crawl was the same as Google’s or Yahoo’s or any other search engine. This is why we can’t create a tool to more sharply evaluate keyword use on a page using term weight — we don’t have access to all the data. (And it may also explain part of the reason why you see different results across search engines — they have different indexes).
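
For reference, the term-weighting scheme most often meant in this discussion is TF-IDF, which weighs how often a term appears in a document against how widely it appears across the whole collection. The sketch below uses a tiny made-up collection to stand in for the index we cannot see; the documents and the exact smoothing are illustrative assumptions, not any engine's formula.

```python
import math

# A tiny, made-up collection standing in for the index we cannot see.
documents = [
    "seo friendly urls help search rankings",
    "keyword density tools measure keyword use",
    "page load time affects crawling and rankings",
]

def tf_idf(term, doc, docs):
    """One common TF-IDF formulation (illustrative only)."""
    words = doc.split()
    tf = words.count(term) / len(words)                     # how often the term appears in this document
    containing = sum(1 for d in docs if term in d.split())  # how many documents contain the term
    idf = math.log(len(docs) / (1 + containing))            # rarer terms get a higher weight
    return tf * idf

print(round(tf_idf("keyword", documents[1], documents), 3))   # ~0.135: frequent here, rare elsewhere
print(round(tf_idf("rankings", documents[0], documents), 3))  # 0.0: appears in most documents, so low weight
```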

However, there are some things we do know:

The number of times that the key phrase (or parts of it) appears on the page carries some weight. We know this because we have seen rankings change after keywords are added to pages. We have also seen sites incur penalties for stuffing keywords into meta tags, alt attributes, or fine print at the bottom of the page, a practice commonly referred to as "keyword stuffing." Too many unnaturally placed keywords can trigger a spam filter and cause rankings to drop. This appears to be algorithmic, so we have to assume that counting the keywords on a page has some place in the algorithm, however small.

The total amount of page content carries some weight. Pages with more content tend to rank better, all other things being equal.

Overall keyword theme relevance carries some weight — search engines evaluate the content based on factors like proximity of the keywords to each other, ordering on the page, position in the overall body content, positioning in important places on the page, as well as the thematic relationship between the key phrase and other words on the page. In some cases, synonyms of the key phrase can even count toward keyword relevance. So, the algorithm is far more complex than a simple keyword density calculation, but keywords undoubtedly do feature in the calculation.

Finally, we know that if search engines cannot properly interpret the code on a page, they cannot "read" the content properly. Parts of the code may end up included in the text index for the page, which can lower the page's overall keyword relevance, and the engines may even have trouble seeing the keywords on the page at all, which affects any keyword weighting. While they likely try to filter out this kind of noise in their indexes, we do not recommend leaving it to the robots; it's best to control what they see whenever possible.

So, we use keyword density analysis tools as a quick way to look at the content on websites the way that a spider might see it. If the keyword density is too high, it can be a signal that the page may trigger a spam filter. If the keyword density analyzer doesn’t find the keyword at all or the density is very low, it can indicate that the page is not appropriately targeted. Because the keyword density tool is basically a simple spider, if it cannot properly “read” the content, it could indicate problems with the coding on the site.
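
For clarity, the density calculation itself is simple: the number of words taken up by the key phrase divided by the total number of words on the page, expressed as a percentage. The sketch below shows that calculation on plain text that has already had its HTML stripped; the sample text and the word-splitting regular expression are only illustrative.

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words taken up by the key phrase (simple sketch)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count non-overlapping occurrences of the whole phrase, in order.
    hits, i = 0, 0
    while i <= len(words) - len(phrase_words):
        if words[i:i + len(phrase_words)] == phrase_words:
            hits += 1
            i += len(phrase_words)
        else:
            i += 1
    return 100.0 * hits * len(phrase_words) / len(words)

sample = ("Keyword density tools estimate how often a key phrase appears "
          "relative to the total number of words on the page.")
print(round(keyword_density(sample, "keyword density"), 1))  # -> 10.0
```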

That said, if the content on the page is properly focused on the page's keyword theme and the keyword appears in all of the important places (title tag, description tag, headline, and a couple of times in the main text content, ideally in the first and last sentences), the text is probably fine even if you don't reach the "optimal" keyword density. Furthermore, if your key phrase is very long (more than three words), using it 4% of the time would make the page very unwieldy, so you may not even want to use the keyword that many times. Content should always be written for people first, with the keyword density tool used as a reality check to make sure the page will also work for spiders. A truly optimized page has content that is useful for visitors and attracts quality inbound links.
