Article Archive by Darren Franks

August 17, 2011

How to Add Keywords to Web Pages in a Logical Way

by Darren Franks

Below are some SEO-friendly ways to target a specific “theme” or keyword on the important pages of your site:

  • Write compelling content: The foundation of your website is content that is well written and has a specific purpose (am I selling something, presenting information, etc.?).
  • Once the specific content goals of your website are established, write down the theme for all of your important pages and try to narrow the theme down into a two to five word keyphrase.
  • Take the list of keyphrases for all of your pages and perform the ever-important keyword research. Free tools, such as the Google Keyword Tool, and paid tools, such as Wordtracker, are very effective for researching things like search volumes and KEI (Keyword Effectiveness Index).
  • Once a keyword decision has been made, incorporate those words on pages of your site in an SEO- and user-friendly way; the chosen keyphrase for each page should be incorporated into the meta tags in the <head> section of the page.
  • The title tag should contain the keyphrase for the page as close to the beginning of the tag as possible, as this piece of meta data is weighted the most heavily by the search engines.
  • The description meta tag should contain, essentially, descriptive and compelling ad copy that incorporates the primary keyphrase in a natural way; the better and more unique your descriptions are, the greater the probability that they will show up in the search results as the “snippet” of information below the title for the page.
  • The keywords meta tag is now ignored by Google and Bing, so there is no need to over-think this one, but at least include the keyphrase for the page, as keywords could be used again down the road to assign relevancy to a web page.
  • Learn about all of this and more on the MoreVisibility YouTube channel: SEO Video Tutorials.
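Putting those points together, the head section of a page targeting a hypothetical keyphrase like “organic dog food” might look something like this (the keyphrase, domain and copy are illustrative only, not a real example):

```html
<head>
  <!-- Title tag: keyphrase as close to the beginning as possible -->
  <title>Organic Dog Food - Healthy Meals from Acme Pets</title>

  <!-- Description: compelling ad copy using the keyphrase naturally;
       this often becomes the "snippet" shown in search results -->
  <meta name="description" content="Shop organic dog food made from locally sourced ingredients. Free shipping on orders over $25.">

  <!-- Keywords: currently ignored by Google and Bing, but harmless to include -->
  <meta name="keywords" content="organic dog food">
</head>
```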
June 29, 2011

Google Tools 101: Page Speed Online

by Darren Franks

The relatively new (and free) Google Page Speed Online testing utility is an invaluable tool for any web developer looking to identify which elements of code could be cleaned up to make their site faster. Why is this useful for SEO?

First, user experience should always be the primary goal for any website; how can I sell my product if the user gets frustrated with how long it takes for a page to load? Second, Google now considers page load time as a ranking signal in their algorithm, meaning that they will give ranking priority to sites that have optimized their pages for loading in people’s browsers.

What are some of the suggestions you are likely to see once you enter the URL of the domain you wish to check? What’s nice about this tool is that Google gives you a prioritized list of items to tackle and essentially says, “If you clean up these things first, you will likely have better results”:

Example of Google’s Page Speed Online tool

Page Speed Online suggestions could include optimizing (compressing) images, combining external JavaScript files so there are fewer requests to a server and removing CSS files that are no longer used.  
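The script-combining suggestion, for instance, can be sketched like this (the file names are hypothetical):

```html
<!-- Before: three separate requests to the server -->
<script src="/js/menu.js"></script>
<script src="/js/slider.js"></script>
<script src="/js/tracking.js"></script>

<!-- After: the same code concatenated into one file, so the
     browser makes a single request -->
<script src="/js/combined.js"></script>
```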

The very first thing you see once you have entered the URL is a score out of 100; this will alert you to the severity of the issues that were found. A high score indicates little room for improvement, while a lower score indicates that more optimization is needed to reduce page load time. Have you used Page Speed Online recently?

June 24, 2011

Tell Google: “Do Not Enter” (And Bing Too)

by Darren Franks

As we (or most of us) know, we do not want the search engine spiders crawling around and indexing certain parts of our websites. For instance, it is probably not wise to have spiders crawling the secure (https://) sections of a site.
We also don’t want the Googlebot or the Bingbot crawling and indexing pages that would be a poor choice as a target for search, like PDF files, Excel documents or images. This is where the robots.txt file comes in.

The robots.txt file, uploaded to the root directory of the site, tells the spiders which directories and pages to skip. Why do we want this? If someone were to find a PDF file or a Flash file in a search result and click on it, those types of documents generally don’t contain links leading back to the rest of the site and can be a “dead-end” for both the search engines and the user. A simple “Disallow” instruction in the robots.txt file will prevent non-SEO-friendly pages from showing up in search results for your desired keyphrases.
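A minimal robots.txt along these lines might look like the following (the directory and file names are hypothetical):

```
# Applies to all well-behaved crawlers, including Googlebot and Bingbot
User-agent: *

# Keep dead-end document and image directories out of the index
Disallow: /pdfs/
Disallow: /images/

# A single file can also be blocked
Disallow: /brochure.pdf
```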


© 2021 MoreVisibility. All rights reserved.