“Fetch as Google” is an option available to webmasters to check how a web page looks to Googlebot when it crawls that page. The tool can be used effectively for the following purposes:
- Troubleshooting web pages to improve their performance in search engine results pages.
- Submitting pages for indexing whenever their content changes significantly.
- Finding the pages causing issues when a site is infected by malware.
How to Use “Fetch as Google”?
You need a Google Webmaster Tools account, with your site added and verified, in order to use this option. Log in to your Google Webmaster Tools account and navigate to the “Fetch as Google” option under the “Crawl” section.
Enter the URL of the page you want to troubleshoot or whose content has changed significantly, or leave the box blank to fetch your website’s home page. Choose the Googlebot type “Web” and click the “Fetch” button.
Troubleshooting Crawling Issues
After fetching a URL, the status is shown as “Success” or “Failed” under “Fetch Status”. Clicking the success or failed link shows the HTTP response received from your site’s server. You can use this information to analyze the crawling issue in detail and correct it so that Googlebot can crawl the page successfully next time.
A successful response from the server is shown in the picture below:
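You can reproduce a similar success/failure check outside the tool with a plain HTTP request. The sketch below is a minimal approximation using Python’s standard library: it fetches a page with a Googlebot-style User-Agent and maps the HTTP response code to a “Success” or “Failed” label. The User-Agent string and URL are illustrative, and this is only an approximation — the real Googlebot also crawls from Google’s own IP ranges, which this script cannot imitate.

```python
import urllib.error
import urllib.request

# Googlebot's desktop User-Agent string (illustrative; Google may change it).
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_status(code):
    """Map an HTTP status code to a Fetch Status-style label."""
    return "Success" if 200 <= code < 300 else "Failed"

def fetch_as_googlebot(url):
    """Fetch a URL with a Googlebot-style User-Agent; return (label, HTTP code)."""
    request = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(request) as response:
            return fetch_status(response.status), response.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses arrive as HTTPError; report them as a failed fetch.
        return fetch_status(err.code), err.code

# Example (hypothetical URL):
# label, code = fetch_as_googlebot("http://www.example.com/page.html")
```

A “Failed” label here corresponds to the server returning an error response; the HTTP code tells you whether it is a missing page (404), a server error (5xx), or something else.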
Submitting URLs for Indexing
After successfully fetching a URL, you will see a “Submit to Index” link. Clicking the link gives you the following two options for indexing your pages:
- Submit only the fetched URL – select this option if your page is new or was updated recently.
- Submit the fetched URL along with all linked pages – select this option if your whole site has changed recently.
This lets webmasters request an immediate crawl instead of waiting until Googlebot crawls the page on its next scheduled visit. Google normally indexes the new content within a few hours of submission, but this is not guaranteed.
Detecting Pages Affected by Malware
Whenever a site is infected by malware, the source code seen in a web browser and the source code crawled by Googlebot can differ. This means your site may appear in search results for keywords irrelevant to your content. In this case, you can use the “Fetch as Google” option to find out exactly what Googlebot sees when crawling your page and take corrective action.
Google Webmaster Tools also offers a malware detection tool that helps you identify whether your site is infected.
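One way to spot this kind of mismatch locally is to fetch the same page twice — once with a browser User-Agent and once with a Googlebot-style one — and compare the bodies. The sketch below is a rough heuristic only, and it assumes the malware cloaks based on the User-Agent header; malware that cloaks on Googlebot’s IP addresses will not be caught this way, which is exactly why “Fetch as Google” is the more reliable check. The User-Agent strings are illustrative.

```python
import hashlib
import urllib.request

# Illustrative User-Agent strings for a browser and for Googlebot.
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def content_fingerprint(body):
    """Return a short hash of a page body so two fetches can be compared."""
    return hashlib.sha256(body).hexdigest()[:12]

def looks_cloaked(browser_body, googlebot_body):
    """True when the page served to a browser differs from the page served to Googlebot."""
    return content_fingerprint(browser_body) != content_fingerprint(googlebot_body)

def fetch_with_ua(url, user_agent):
    """Fetch a URL while presenting the given User-Agent string."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request) as response:
        return response.read()

# Example (hypothetical URL):
# cloaked = looks_cloaked(fetch_with_ua(url, BROWSER_UA),
#                         fetch_with_ua(url, GOOGLEBOT_UA))
```

Note that legitimately dynamic pages (rotating ads, timestamps) will also differ between fetches, so a mismatch is a prompt for manual inspection, not proof of infection.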
Fetching a Redirected URL
During “Fetch as Google”, Googlebot does not follow redirects, so you need to enter the final URL that visitors see in the browser’s address bar. You will see the status “Redirected” when you try to fetch a redirected URL.
Clicking the link shows the HTTP response code, such as “301 Moved Permanently”, and a “Follow” button for fetching the target URL.
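You can observe the same “301 Moved Permanently” response yourself by making a request that, like “Fetch as Google”, does not follow redirects. The sketch below uses Python’s standard library; the redirect-disabling handler is a common urllib idiom, not part of the tool itself, and the URL is illustrative.

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the 3xx response itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        # Returning None makes urllib raise HTTPError with the redirect status.
        return None

def is_redirect(code):
    """True for HTTP redirect status codes such as 301 and 302."""
    return 300 <= code < 400

def fetch_without_following(url):
    """Return (status code, Location header or None) for the first response."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        response = opener.open(url)
        return response.status, None
    except urllib.error.HTTPError as err:
        # A 3xx surfaces here; its Location header names the redirect target.
        return err.code, err.headers.get("Location")

# Example (hypothetical URL):
# code, target = fetch_without_following("http://example.com/old-page")
```

The `Location` header returned here is the target URL — the same one the tool’s “Follow” button would fetch next.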
Limits of “Fetch as Google” Option
- There are limits on using the “Fetch as Google” option as well as on submitting your URLs through it.
- The limits apply at the account level, not the site level, which means fetching any site added to your Webmaster Tools account counts against your quota.
- The limits are valid for one month; if your first URL is fetched today, the counter resets one month from today.
| Limit | Quota | Period | Scope |
|-------|-------|--------|-------|
| Number of fetches | 500 | Per week | Per account |
| Number of URL submissions | 500 | Per week | Per account |
| Number of URL and all linked pages submissions | 10 | Per month | Per account |