Below are some SEO-friendly ways to optimize important pages of your site:
Google's relatively new (and free) page speed testing utility, available at http://pagespeed.googlelabs.com/, is an invaluable tool for any web developer looking to identify which elements of code could be cleaned up to make their site load faster. Why is this useful for SEO?
First, user experience should always be the primary goal for any website; how can I sell my product if the user gets frustrated with how long a page takes to load? Second, Google now considers page load time a ranking signal in its algorithm, meaning it gives ranking priority to sites that have optimized their pages to load quickly in people's browsers.
What are some of the suggestions you are likely to see once you enter the URL you wish to check? What's nice about this tool is that Google gives you a prioritized list of items to tackle, essentially saying, "If you clean up these things first, you will likely see better results."
The very first thing you see after entering the URL is a score out of 100, which indicates the severity of the issues that were found. A high score indicates little room for improvement, while a lower score means more optimizing is needed to reduce page load time. Have you used Page Speed Online recently?
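Page Speed Online grades the whole rendering pipeline, but as a quick, rough complement you can time the raw HTML fetch from the command line with curl. This is just a sketch; `www.example.com` is a placeholder, so substitute the page you actually want to measure:

```shell
# Time the initial HTML fetch: DNS resolution, time to first byte, and total.
# This ignores images, CSS, and scripts, so treat it as a lower bound on load time.
curl -o /dev/null -s -w \
  'DNS lookup: %{time_namelookup}s\nTime to first byte: %{time_starttransfer}s\nTotal: %{time_total}s\n' \
  https://www.example.com/
```

Running this a few times and averaging gives a crude baseline you can re-check after acting on the tool's suggestions.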
As most of us know, we do not want search engine spiders crawling and indexing certain parts of our website. For instance, it is generally unwise to have spiders crawling the secure port of the server, i.e. the secure parts of our sites (https://).
We also don’t want the Googlebot or the Bingbot crawling and indexing pages that would make poor targets for search, like PDF files, Excel documents, or images. This is where the robots.txt file comes in.
The robots.txt file, uploaded to the root directory of the site (www.example.com/robots.txt), tells the spiders which directories and pages to skip. Why do we want this? If someone were to find a PDF file or a Flash file in a search result and click on it, those types of documents generally don’t contain links leading back to the rest of the site and can be a “dead end” for both the search engines and the user. A simple “Disallow” instruction in the robots.txt file will prevent non-SEO-friendly pages from showing up in search results for your desired keyphrases:
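As a minimal sketch of such a file (the directory names here are hypothetical examples, not paths from any real site), the rules below block compliant crawlers from a secure area and from document downloads. The snippet embeds the rules in a Python string and checks them with the standard library's urllib.robotparser, which lets you verify the Disallow lines behave as intended before uploading the file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; adjust the paths to match your own site.
robots_txt = """\
User-agent: *
Disallow: /secure/
Disallow: /downloads/pdf/
Disallow: /downloads/excel/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Paths under a disallowed directory are off-limits to compliant crawlers,
# while ordinary pages remain crawlable.
print(parser.can_fetch("Googlebot", "/downloads/pdf/brochure.pdf"))  # False
print(parser.can_fetch("Googlebot", "/products/widget.html"))        # True
```

Because `User-agent: *` applies to all crawlers, both Googlebot and Bingbot fall under the same rules; you can add a separate `User-agent:` block if one crawler needs different treatment.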