Articles in the 'Robots.txt File' Tag


June 24 2011

Tell Google: “Do Not Enter” (And Bing Too)

by Darren Franks

As most of us know, we do not want search engine spiders crawling and indexing certain parts of our websites. For instance, it is probably not wise to have spiders crawling the secure part of the server, i.e. the secure sections of our site (https://). We also don’t want the Googlebot or the Bingbot crawling and indexing pages that make poor targets for search, like PDF files, Excel documents or images. This is where the robots.txt file comes in.

The robots.txt file, uploaded to the root directory of the site (www.example.com/robots.txt), tells the spiders which directories and pages to skip. Why do we want this? If someone were to find a PDF file or a Flash file in a search result and click on it, those types of documents generally don’t contain links leading back to the rest of the site and can be a “dead-end” for both the search engines and the user. A simple “Disallow” instruction in the robots.txt file will prevent non-SEO-friendly pages from showing up in search results for your desired keyphrases.
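As a simple sketch (the directory names below are hypothetical; a real file would list your own site's paths), a robots.txt that keeps all spiders out of secure areas and document folders might look like this:

    # Applies to every spider; use "User-agent: Googlebot" or "User-agent: Bingbot" to target one engine
    User-agent: *
    # Hypothetical paths - substitute the directories you actually want skipped
    Disallow: /secure/
    Disallow: /pdfs/
    Disallow: /excel/
    Disallow: /images/

Each “Disallow” line blocks crawling of everything under that path, while an empty “Disallow:” value allows the whole site.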


May 11 2009

It’s Important to Focus on Many Aspects of SEO

by Darren Franks

Focusing on just one aspect of SEO can hinder more than help your optimization efforts. For instance, keyword research alone can prove futile if search engines are unable to crawl your site. Likewise, having too much code on the page can inflate the code-to-content ratio, diluting the density of the targeted keyword for that page.

Also, if you set up your robots.txt file incorrectly, omitting the instructions that keep spiders away from certain pages, one’s optimization efforts can be thwarted. If a large website with thousands of pages wants only its most important pages indexed, the spiders may spend their crawl on less valuable pages and never reach the ones that matter most. Another factor that can hinder a website’s performance is a lack of quality inbound links to the site. You could have the best web content in the world, but if people are not linking to it, the resulting lack of traffic can render the rest of the optimization worthless.

It’s also important to ensure that all of the meta data on the site is as unique as possible. To broaden your reach in the search engines across multiple search terms, keeping your titles, descriptions and keywords as unique as possible can make a world of difference in search engine indexing.
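As a hypothetical illustration (the page names and wording below are invented, not from any real site), unique meta data for two pages might look like this:

    <!-- Page one: title and description are written for this page alone -->
    <title>Blue Widgets | Example Store</title>
    <meta name="description" content="Shop our selection of blue widgets, available in three sizes.">
    <meta name="keywords" content="blue widgets, buy blue widgets">

    <!-- Page two: no tag is copied verbatim from page one -->
    <title>Red Widgets | Example Store</title>
    <meta name="description" content="Browse durable red widgets with free shipping options.">
    <meta name="keywords" content="red widgets, buy red widgets">

Duplicated titles or descriptions give the engines no way to tell pages apart, so writing each one per page gives every page its own chance to rank for its own terms.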

To conclude, it is wise to take a holistic approach to SEO rather than focusing on a single aspect. To truly give your website the best chance to rank well in the search engine results pages, fixing multiple SEO issues can only serve to give you that extra edge.
