Google released a way to submit new and updated URLs for indexing within Google Webmaster Tools. Using the Fetch as Googlebot feature, the search engine will be prompted to crawl the URL that is submitted. Google doesn’t guarantee that every page they crawl will be indexed, but this new feature seems to speed up the evaluation process.
The new feature can help in several situations.
That said, XML Sitemaps are still the best way to provide a complete list of your website’s URLs to Google and encourage Googlebot to crawl and index those pages. However, this new feature is ideal for times when you add new pages to your site or make a major update.
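For reference, an XML Sitemap follows the simple standard format defined by the Sitemaps protocol. A minimal sketch is below; the domain and dates are placeholders, and only the loc element is required for each URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One url entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-08-01</lastmod>       <!-- optional: last modification date -->
    <changefreq>weekly</changefreq>     <!-- optional: hint, not a command -->
    <priority>1.0</priority>            <!-- optional: relative to your own pages -->
  </url>
  <url>
    <loc>http://www.example.com/new-page.html</loc>
    <lastmod>2011-08-15</lastmod>
  </url>
</urlset>
```

Once the file is uploaded to your site (typically at the root), you can submit it under the Sitemaps section of Webmaster Tools.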
How to submit a URL
First, use Diagnostics > Fetch as Googlebot to fetch the URL you want to submit to Google. If the fetch is successful, you will see a new “Submit to index” link appear next to the fetched URL.
Once you click “Submit to index,” a dialog box appears that lets you choose whether to submit only that one URL, or that URL along with all of the pages linked from it.
You can submit up to 50 URLs a week via Fetch as Googlebot within the Google Webmaster Tools platform. However, Google limits submissions that include a URL and all of its linked pages to 10 per month.
This new feature is not the “end-all” solution to getting your pages indexed, but it shows how Google is continuing to make strides toward interacting with site owners.
Let’s talk about Bing Webmaster Tools (even if no one else is) and how it compares to Google’s version. I say that with all due respect, because, well, Bing has certainly tried. Below are the major features of each:
Google Webmaster Tools has (that Bing doesn’t):
Bing Webmaster Tools has (that Google doesn’t):
It’s interesting that even though Google seems to have many more options within its Webmaster Tools, Bing offers some genuinely comprehensive data that Google does not. Bing’s crawl data is much more granular in that it will give you the specific date a page was found by its robots. It’s also notable that Bing gives people the ability to resubmit a URL to be crawled, while Google only gives people the ability to remove a URL or page from its index. On the whole, Google Webmaster Tools reads more like a “configuration” tool, and Bing Webmaster Tools more like a “diagnostic” tool.
Regardless of which one you prefer, Bing has certainly made headway over the past year or so. Its reinstated inbound links data makes its webmaster tools more robust, but Google’s version still seems to one-up its competitor in that you are given more opportunities to configure different parts of your website for Google’s crawlers.
Any good webmaster with a predilection for a well-optimized site should regularly peruse Google Webmaster Central (googlewebmastercentral.blogspot.com) for the latest Google features. The updates on that blog are really useful for keeping up with the current innovations from the world’s most popular search engine.
Some recent blog posts of note include:
It can be a daunting task to sift through all of the information on the Internet about how Google wants you to design your site for indexing and ranking. Webmaster Central is a useful resource if you are unsure how to design your site for SEO.