SEO doesn’t end with your website. You can use offsite SEO techniques to improve your reach and achieve brand ubiquity. Learn how to enhance your SEO strategy with tips for offsite optimization, brand expansion, and content marketing. MoreVisibility’s SEO experts can guide you through the most up-to-date offsite SEO techniques and methodologies.
Keyword targeting your website for natural search results can be daunting, and if done incorrectly it can actually hurt your site's performance in the search engines. One way keyword targeting can go horribly wrong is when you try to target a laundry list of keywords on a single page (often the homepage) in hopes of getting good positions in the search results. Jamming multiple keyword phrases onto one page is called keyword dilution, and it can cause your site to drop in the search engine rankings.
What many people don’t realize is that a major factor determining how many keywords a website can target is the size of the site. It boils down to this: you should target only one to two unique keyword phrases per page. So if your site has only ten pages, the maximum number of unique keyword phrases it can support is 20. It may be even less than that, depending on the competitiveness of the phrases. For example, if a keyword phrase you want your site to rank highly for is very competitive, you will want it to be the only phrase targeted on that page, at roughly a 4% keyword density.
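To make the 4% figure concrete, here is a minimal sketch of how keyword density can be computed. The formula below (phrase occurrences times phrase length, divided by total words) is one common convention, not an official search engine metric, and SEO tools differ in exactly how they count:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percent of the page's words devoted to `phrase`.

    One common convention: (occurrences * words in phrase) / total words * 100.
    Different SEO tools use slightly different formulas.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0:
        return 0.0
    # Slide a window of the phrase's length across the page's words.
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return hits * n / len(words) * 100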
But what if there are several variations of a keyword phrase that is really important to the website? Which variation do you choose? The answer lies in balancing the popularity and the competitiveness of the keywords, and that’s where keyword research comes into play. Find out which version of the phrase users are actually searching for. From there, decide which phrase your site has the best chance of ranking highly for based on how competitive it is. Then choose the number of keywords to target according to the number of pages on your site. If you want to expand that keyword list, you will have to create new pages with unique content.
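The popularity-versus-competition trade-off can be sketched as a simple score. Everything here is hypothetical: the search and competing-page counts are invented for illustration, and a plain searches-to-competition ratio is just one possible heuristic, not a formula any search engine or keyword tool prescribes:

```python
def opportunity(searches: int, competing_pages: int) -> float:
    """Crude score: popular phrases with few competing pages score higher."""
    return searches / max(competing_pages, 1)

# Hypothetical monthly-search and competing-page counts for three
# variations of the same phrase (illustrative numbers only).
candidates = {
    "running shoes": (90_000, 5_000_000),
    "trail running shoes": (12_000, 400_000),
    "waterproof trail running shoes": (1_500, 20_000),
}

# Best opportunities first.
ranked = sorted(candidates, key=lambda p: opportunity(*candidates[p]), reverse=True)
```

By this measure the most specific variation wins: far fewer searches, but dramatically less competition, so a small site has a realistic chance of ranking for it.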
Content freshness is definitely a factor in search engine algorithms. For news items and blog posts in particular, freshness gives a boost in rankings. Several major search engines have filed patents for gathering historical data about pages, so this is not only a current factor but one that search engines will likely continue adapting and improving over time to ensure that their customers receive timely and relevant results.
The big question, of course, is what this means for site owners in terms of updating their content. Is it necessary to update pages frequently to get good results? The answer: it depends.
One of the main reasons a site should try to feature fresh content is that it can increase the frequency of visits by search engine spiders. Naturally, if you are updating pages on a regular basis, you want that reflected in search engine indexes, and you want the search engines to return in a timely fashion so that new information can appear there. Studies of index freshness have shown that, for over 68% of pages, Google requires about two days before the page is visible in the index. Yahoo is quicker: over 50% appear within a day.
However, search engines won’t crawl your site every day unless they have a reason to. If they only come around every two weeks, a page you updated today might not appear in the indexes for over two weeks, and that assumes they actually accessed the page, since they do not access every page on every visit. If your site publishes timely news items or pages that need to be found quickly, content freshness is important. The good news is that such items are inherently fresh; search engines will notice this and return often to find your new content.
So, what if you are not publishing news items every day but still want search engines to visit often? Should you try to change the content of your homepage every couple of days? Optimizing a page so that it ranks well for a key phrase can be an arduous, painstaking process, involving a lot of tweaking and experimenting before the page is just right, so changing its content every day just for the sake of change is not a good idea. Neither is "updating" the page by adding a word here and there: trivial edits can undo that careful optimization, and they give search engines nothing genuinely new to index.
Content freshness is actually one of the best reasons to include a blog on your site. You can take the opportunity to provide your visitors with timely news about your company and industry and even feature the occasional quick link to any new pages that you may have added. A well-written blog post has the advantage of actually being new content. Just make sure you post regularly. A regular pattern of adding new pages of content is the best encouragement for search engines to return on a regular basis.
Furthermore, if you do manage to convince search engines to visit your site regularly, it’s a good idea to make those new pages easier to find. In other words, don’t update pages just to make old content look new. Only update pages that really do contain new content, and then let search engines see that the content is new by setting your server to support Conditional GET via the If-Modified-Since request-header field. That way search engines are much more likely to find your new pages when they visit and put them into their indexes, where your potential new visitors can find them as soon as possible.
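The server-side decision behind a Conditional GET can be sketched as follows. This is an illustrative standard-library Python fragment, not a drop-in configuration for any particular web server; most servers can enable this behavior for static files without any code:

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from typing import Optional

def conditional_get_status(last_modified: datetime,
                           if_modified_since: Optional[str]) -> int:
    """Return 304 (Not Modified) when the crawler's copy is still current,
    otherwise 200 (send the full page)."""
    if if_modified_since:
        try:
            cached_as_of = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200  # unparsable header: fall back to a full response
        # Only compare timezone-aware dates; HTTP dates are always GMT.
        if cached_as_of.tzinfo is not None and last_modified <= cached_as_of:
            return 304
    return 200

page_updated = datetime(2009, 3, 1, 12, 0, tzinfo=timezone.utc)
# A spider that last fetched the page on March 5 sends that date back:
spider_header = format_datetime(datetime(2009, 3, 5, tzinfo=timezone.utc), usegmt=True)
```

In practice the main thing to verify is that your pages, including dynamically generated ones, send an accurate Last-Modified header at all; without it, spiders have to download the full page every visit to discover nothing has changed.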
There are many link strategies available that can help your site achieve the rankings you are looking for. You can submit your site to directories, write articles and push out press releases. One practice to watch out for is link farming. This black hat practice has been around for a long time and continues today. Be wary of link farm schemes and other tactics like this.
The idea behind link farming is to get as many links to your site as possible. It doesn’t matter whether the linking sites are relevant, since search engines would supposedly consider a site more “popular” because so many links point to it. Most links in a link farm have no subject-matter relationship to one another. Typically a page in the farm carries a laundry list of hyperlinked keywords (anchor text) pointing to the various member sites. When search engines detect that a link farm has formed, they penalize everyone involved, dropping their rankings. Some sites have reported increased rankings at first, but soon after report falling lower than where they started.
Signs of a Link Farm
If you run across sites that have one or more of these characteristics, do not participate. If you are still unsure, contact your Strategist at MoreVisibility, who will gladly help you determine the right link building strategy.