Your Robots.txt File and Why Frequent SEO Audits Are Crucial

Chuck Forbes - December 20, 2023

I have had the opportunity to work with many clients during my tenure at MoreVisibility, and no matter the client or goal, our team has succeeded because we all have a passion for finding out the “why” within digital marketing strategies. I was recently discussing SEO strategy with a client, more specifically “why” some of their organic rankings took a hit. With help from our SEO experts and from hearing what the client was seeing in their reporting, we were able to pinpoint a major flaw on their site: the robots.txt file was not properly placed. I thought this was a great example to share, not only for the knowledge of how to use a robots.txt file, but also to shed light on the importance of frequent SEO audits. Too often, SEO site audits are put on the back burner because SEO rankings can take longer to optimize than paid advertising. Naturally, if a company is looking to show results fast, optimizing its paid advertising strategy can be where most of the time is spent. Here is a refresher on robots.txt files, along with some tips on how thinking about SEO differently can lead to a more stable auditing calendar for your team.

What is a Robots.txt file?

Every search engine deploys crawlers that look over your website in order to serve the best search results to their users at the right time. These crawlers have a full-time job, as websites change frequently. From adding new content and rebranding to removing pages and linking social media accounts, search engine crawlers decide on a cadence for reviewing your site so they stay up-to-date with your changes. A robots.txt file directs search engine crawlers when they are on your site, telling them which pages they can access and which pages are not important. It must be placed at the root of your domain (for example, www.example.com/robots.txt), or crawlers will not find it. Your robots.txt file acts like an instruction manual for search engine crawlers so your site is not overloaded with crawler requests, and it helps crawlers understand your site more efficiently. Think of it like assembling furniture: if you bought furniture and it came without instructions, you would be left determining which pieces are important, where they go, and how everything needs to be assembled. That can turn a simple one-hour project into a whole day, and the same concept applies if your site does not have an updated and properly placed robots.txt file. One important caveat: the robots.txt file is not a reliable tool for keeping pages out of search results. This is a common mistake; a page blocked in robots.txt can still be indexed if other sites link to it, so your SEO team should use a ‘noindex’ tagging strategy for those pages you do not want search engines to show at all.
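As a quick refresher, here is what a minimal robots.txt file can look like. The paths and domain below are hypothetical examples, not recommendations for any specific site:

```text
# Applies to all crawlers
User-agent: *
# Ask crawlers to skip low-value sections (example paths)
Disallow: /internal-search/
Disallow: /cart/

# Point crawlers to the XML sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

For pages you want kept out of search results entirely, the robots.txt file is not the right tool; a noindex directive on the page itself, such as `<meta name="robots" content="noindex">`, tells search engines not to include that page in their index.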

What Can Happen if Your Robots.txt File is Not Used Properly?

While there is no way to know with one hundred percent certainty which factors search engine ranking algorithms use, or the weight each one carries, we can draw educated conclusions based on our expertise and our history of finding out the “why” for our SEO clients. Using the example of the client I mentioned at the start of this blog, along with my knowledge from previous client work, here are some negative impacts companies have seen when not deploying a proper robots.txt file – or in some cases, not having one at all!

  • Videos on the site that used to rank well took a large hit in rankings.
  • Site data, such as the reported number of indexed pages in Google Search Console, was drastically different than in previous time periods. Google Search Console also reported more alerts and errors for the SEO team to review.
  • Companies with an international presence saw ranking differences across countries, but those differences didn’t always align with where the heaviest competition was.
  • Traffic data from Google Analytics also showed discrepancies when comparing time periods.

These four factors are enough to make you question the validity of your data, a situation all digital marketers want to avoid. Properly using a robots.txt file is not a large undertaking for a digital team, but the consequences of not keeping it updated can be great.

How Often Should I Audit My Site?

As I mentioned in the opening, how frequently you audit your site is less about resources and more about how you think about SEO. Working with clients, you develop relationships with marketing teams that boast different levels of digital marketing knowledge. Because of this, you get good at explaining technical factors in layman’s terms. I like to think about the pieces of my digital marketing strategy like real transportation. For example, paid advertising is my airplane. It can be a fighter jet or a commercial airliner – what matters is the speed at which it can get you from one place to the next. Not as fast as my airplane is SEO – or as I like to think about it, my cruise ship. Elegant and just as stable as a ride on my plane, except the cruise ship moves differently. Between the size of the ship and traveling in water, I need clear coordinates for where I am sailing. Unlike my plane, which can change direction any minute and reroute to a new destination, my SEO cruise ship is a force moving in one direction that will take time and patience to change. Before I even set sail, the cruise ship requires much more pre-planning and prep time from the staff and captain – the same goes for your SEO. However, what makes a successful cruise is all the crew members knowing their responsibilities and always thinking ahead so the ride is seamless for the passengers (aka your website visitors).

If you think of SEO in this way, then it becomes clear that auditing your site is something that should never stop. Instead of taking one month out of the year to deep-dive into your website, you can develop monthly health checks and work through a checklist of priorities with your team as the ‘cruise ship’ keeps moving. I like to think about digital strategy in this manner when trying to figure out the “why” with my team – consistent best practices are still the best way to garner SEO wins.
