In a previous post, Google versus Bing: Webmaster Tools, I discussed the virtues of both Google's and Bing's webmaster tools websites. Today, I will give an overview of Bing's main categories and how they can be used to improve your website's functionality, both for SEO and for the user:
The Overview page, much like the Dashboard in Google Webmaster Tools, gives you a "snapshot" of the most pertinent data for your site, such as recent trends in crawling, indexing and traffic. Newly added sites begin showing data within three days of gaining access.
In this section, you can view six months of crawl data, such as the number of pages crawled and any crawl errors. This is a very useful place to identify potential problems with Bingbot accessing pages on your site to include in its index.
See which of your site's pages are in the Bing index. You can view your inbound link data here, too.
What kind of traffic is being driven to your site? This section provides six months of traffic data and analyzes search query performance over time.
Index Explorer (Index Tracker)
One really helpful new addition is the ability to locate any "broken" pages on the site, such as 404 (removed) and 500-level (server error) responses:
What makes this new tool even more useful is the ability to filter your results and display the errors on your site page by page. You can even locate directories on your site that have been infected with malware.
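As a complement to Index Explorer's filters, here is a hedged sketch in Python that checks a list of URLs yourself and buckets responses the same way; the URLs you would pass in are your own pages, and this script is an independent check, not part of Bing's tooling:

```python
# Sketch: spot-check pages for the same error classes Index Explorer
# surfaces (404s and 5xx server errors). URLs are placeholders.
import urllib.request
import urllib.error

def fetch_status(url):
    """Return the HTTP status code for a URL, including error responses."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def classify(status):
    """Bucket a status code the way the error filters do."""
    if status == 404:
        return "removed (404)"
    if 500 <= status <= 599:
        return "server error (5xx)"
    return "ok"

# Example: report = {url: classify(fetch_status(url)) for url in my_urls}
```

Feeding this the URLs from your sitemap gives you a quick second opinion between crawls.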
Crawl Delay Function
There is also a new crawl delay function, which allows you to ask Bing to crawl more slowly during peak business hours and faster during off-peak hours, allowing for better page load times for your visitors.
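If you prefer to manage crawl rate outside the Webmaster Tools interface, Bing also honors a Crawl-delay directive in robots.txt. A minimal sketch; the value is illustrative, not a recommendation:

```
# Ask Bingbot to pause between requests; the 10-second
# value here is an example, not a recommended setting.
User-agent: bingbot
Crawl-delay: 10
```

Note that a robots.txt delay applies all day, whereas the Webmaster Tools function lets you vary the rate by hour.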
Crawl Parameters for AJAX
The relatively new (and free) Google page speed testing utility, http://pagespeed.googlelabs.com/, is an invaluable tool for any web developer looking to identify which elements of code could be cleaned up to make their site faster. Why is this useful for SEO?
First, user experience should always be the primary goal for any website: how can I sell my product if the user gets frustrated with how long a page takes to load? Second, Google now considers page load time a ranking signal in its algorithm, giving ranking priority to sites that have optimized how quickly their pages load in people's browsers.
What are some of the suggestions you are likely to see once you enter the URL of the domain you wish to check? What’s nice about this tool is that Google gives you a prioritized list of items to tackle and essentially says, “If you clean up these things first, you will likely have better results”:
The very first thing you see once you have entered the URL will be a score out of 100; this will alert you to the severity of the issues that were found. A high score indicates little room for improvement, while a lower score indicates more optimizing to reduce page load time. Have you used Page Speed Online recently?
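Alongside the tool's score, it can be handy to take a rough timing of your own. This is only a hedged sketch of a client-side fetch timer, not Page Speed Online's methodology, and the URL you pass in is a placeholder for your own page:

```python
# Sketch: time a full download of one page. Real page load time also
# includes rendering and subresources, which this does not measure.
import time
import urllib.request

def time_fetch(url):
    """Return seconds elapsed while downloading the response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

# Example: print(time_fetch("http://www.example.com/"))
```

Running it a few times before and after a cleanup gives you a quick before/after comparison to go with the tool's prioritized list.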
As most of us know, we do not want the search engine spiders crawling around and indexing certain parts of our website. For instance, it is probably unwise to have spiders crawling the secure port of the server, i.e. the secure parts of our websites (https://).
We also don't want Googlebot or Bingbot crawling and indexing pages that would be a poor choice as a target for search, like PDF files, Excel documents or images. This is where the robots.txt file comes in.
The robots.txt file, uploaded to the root directory of the site (www.example.com/robots.txt), tells the spiders which directories and pages to skip. Why do we want this? If someone were to find a PDF file or a Flash file in a search result and click on it, those types of documents generally don't contain links leading back to the rest of the site and can be a "dead end" for both the search engines and the user. A simple "Disallow" instruction in the robots.txt file will prevent non-SEO-friendly pages from showing up in search results for your desired keyphrases:
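To illustrate, here is a minimal robots.txt along those lines; the directory names are hypothetical placeholders for wherever your own site keeps this content:

```
# Hypothetical robots.txt for www.example.com; the paths below are
# placeholders for your own secure areas and document directories.
User-agent: *
Disallow: /secure/
Disallow: /pdfs/
Disallow: /downloads/
```

The blank-line-free block applies the Disallow rules to all spiders; you could target Googlebot or Bingbot individually with separate User-agent groups.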