Articles in The 'Webmaster Tools' Tag

August 5 2011

New Way To Submit URLs To Google

by Mike Siers

Google has released a way to submit new and updated URLs for indexing from within Google Webmaster Tools. Using the Fetch as Googlebot feature, the search engine will be prompted to crawl the URL that is submitted. Google doesn’t guarantee that every page it crawls will be indexed, but this new feature seems to speed up the evaluation process.

The new solution can help in several situations:

  1. New site/page launch: If you’ve just launched a new site or added new pages, you can ask Google to find and crawl them immediately.
  2. URL Reconsideration: If you have recently updated key pages on the site and would like Google to index the latest version, you can submit the updated pages for reconsideration.
  3. Accidental Indexing: If you find that Google has indexed a page you did not want shown, you can resubmit the URL to update the cached version after you’ve removed the page from your site.

That said, XML Sitemaps are still the best way to provide a complete list of your website’s URLs to Google and encourage Googlebot to crawl and index those pages. This new feature, however, is ideal for times when you add new pages to your site or make a major update.
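For reference, an XML Sitemap is just a list of <url> entries following the sitemaps.org protocol. The sketch below uses hypothetical URLs and dates purely for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, the rest are optional hints -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-08-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/new-page.html</loc>
    <lastmod>2011-08-05</lastmod>
  </url>
</urlset>
```

Once saved (commonly as sitemap.xml at the site root), the file can be submitted to Google through the Sitemaps section of Webmaster Tools.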

How to submit a URL
First, use Diagnostics > Fetch As Googlebot to fetch the URL you want to submit to Google. If the fetch is successful, you will see a new “Submit to index” link appear next to the fetched URL.


Once you click “Submit to index,” a dialog box appears, letting you choose whether to submit only that one URL or that URL along with all of the pages linked from it.


You can submit up to 50 individual URLs a week via Fetch as Googlebot within the Google Webmaster Tools platform. Submissions that include a URL along with all of its linked pages, however, are limited to 10 per month.
This new feature is not the be-all and end-all of getting your pages indexed, but it shows how Google is continuing to make strides toward interacting with site owners.

April 26 2011

Google versus Bing – Part 1: Webmaster Tools

by Darren Franks

Let’s talk about Bing Webmaster Tools (even if no one else is) and how it compares to Google’s version. I say that with all due respect, because, well, Bing has certainly tried. Below are the major features of each:

Google Webmaster Tools has (that Bing doesn’t):

  • Ability To Test Robots.Txt
  • Remove URL Or Directory From Index
  • Set Up A Change Of Address
  • Links To Your Site
  • Ability To Block Sitelinks

Bing Webmaster Tools has (that Google doesn’t):

  • Submit URL To Be Indexed Or Re-Crawled
  • Ability To View Average Click Position
  • Number Of Pages With Crawl Errors By Day
  • Date Page Was Discovered

It’s interesting that even though Google seems to have many more options within its Webmaster Tools, Bing offers some genuinely comprehensive data that Google does not. Bing’s crawl data is more granular in that it will give you the specific date a page was found by its robots. It’s also notable that Bing lets people resubmit a URL to be crawled, while Google only lets people remove a URL or page from its index. Google Webmaster Tools, however, seems to offer more “configuration” tools, while Bing Webmaster Tools reads more as a “diagnostic” tool.

Regardless of which one you prefer, Bing has certainly made headway over the past year or so. Its reinstatement of inbound link data makes its webmaster tools more robust, but Google’s version still seems to one-up its competitor by giving you more opportunity to configure different elements of your website for Google’s crawlers.

December 20 2010

Have you Checked out Google Webmaster Central Lately?

by Darren Franks

Any good webmaster who has a predilection for a well-optimized site should always be perusing Google Webmaster Central for the latest Google features. The updates on their blog are really useful for keeping up with the current innovations from the world’s most popular search engine.

Some recent blog posts of note include:

  1. Search queries with top pages: This post announces the addition of the ability for site owners to use Webmaster Tools to analyze impression, click, and position data for their top pages.
  2. More queries show additional results from a domain: Instead of the usual 2 results per domain for certain search queries, Google will now serve around 4, meaning that some sites will have much more of a presence for exact matches on their company name or brand.
  3. Control crawling and indexing of your site: Google has created a new subdomain which explains, in detail, how to correctly set up a site’s robots.txt file and robots meta tags.
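To illustrate the two mechanisms mentioned in that last post, here is a brief sketch of each; the paths and URLs below are hypothetical examples, not recommendations. A robots.txt file, placed at the site root, controls which paths crawlers may fetch:

```
# robots.txt -- controls crawler access by path
User-agent: *
Disallow: /private/    # hypothetical directory to keep out of crawls
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```

A robots meta tag, by contrast, controls indexing on a per-page basis:

```html
<!-- In a page's <head>: let crawlers follow links but keep this page out of the index -->
<meta name="robots" content="noindex, follow">
```

The distinction matters: robots.txt stops crawling, while the meta tag stops indexing of a page that has been crawled.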

It can sometimes be a daunting task to sift through all of the information on the Internet about how Google wants you to design your site when it comes to indexing and ranking. Webmaster Central is a useful resource if you are on the fence about designing your site for SEO.

© 2024 MoreVisibility. All rights reserved.