Google released a way to submit new and updated URLs for indexing within Google Webmaster Tools. Using the Fetch as Googlebot feature, the search engine will be prompted to crawl the URL that is submitted. Google doesn’t guarantee that every page it crawls will be indexed, but this new feature seems to speed up the evaluation process.
The new feature can help in several situations.
That said, XML Sitemaps are still the best way to provide a complete list of your website’s URLs to Google and encourage Googlebot to crawl and index those pages. However, this new feature is ideal for times when you add new pages to your site or make a major update.
How to submit a URL
First, use Diagnostics > Fetch as Googlebot to fetch the URL you want to submit to Google. If the fetch is successful, a new “Submit to index” link will appear next to the fetched URL.
Once you click “Submit to index,” a dialog box will appear, giving you the option to submit only that one URL, or that URL along with all of the pages linked from it.
You can submit up to 50 individual URLs per week via Fetch as Googlebot within Google Webmaster Tools. However, Google limits submissions that include a URL along with all of its linked pages to 10 per month.
This new feature is not the “end-all” solution to getting your pages indexed, but it shows how Google is continuing to make strides toward interacting with site owners.