What is Fetch as Google in Search Console?
Google Search Console is a great way to understand search engine issues and optimize your site for Google. “Fetch as Google” is an option that lets webmasters see how a web page looks to Googlebot when it crawls that page. This tool can be used effectively for the following purposes:
- Troubleshooting web pages to improve their performance in search engine results pages.
- Submitting pages for indexing whenever their content changes significantly.
- Finding out which pages are causing issues when a site is hacked or infected with malware.
How to Use “Fetch as Google”?
You need a Google Search Console account with your site added and verified before you can use this option. Log in to your Google Webmaster Tools account and navigate to the “Fetch as Google” option under the “Crawl” section.
Enter the URL of the page you want to troubleshoot or whose content has changed significantly, or leave the box blank to fetch your website’s home page. Choose the Googlebot type as “Desktop” and click the “Fetch” button.
Troubleshooting Crawling Issues
After fetching a URL, the status will show as “Complete” or “Partial” under the “Status” column. Clicking the success or failure link shows the HTTP response received from your site’s server. Use this information to analyze the crawling issue in detail and correct it so that Googlebot can crawl the page successfully next time.
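The HTTP response that Fetch as Google reports is simply what your server returns to a client identifying itself as Googlebot. As a rough offline sketch using only Python’s standard library (a throwaway local server stands in for your real site; the path `/ok` and the page body are made-up), you can reproduce that check yourself:

```python
import http.server
import threading
import urllib.request

# Throwaway local server standing in for your real site (assumption:
# against a live site you would request the actual page URL instead).
class PageHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/ok":
            body = b"<html><body>Reachable page</body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)  # a fetch of any other path fails

    def log_message(self, *args):
        pass  # keep the example's console output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), PageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Identify as Googlebot, the same way the crawler does.
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/ok",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    status = resp.status      # 200 corresponds to a successful fetch
    html = resp.read().decode()

server.shutdown()
print(status)
```

A 4xx or 5xx status here is the same failure Fetch as Google would surface, so fixing it locally fixes the crawl.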
A successful response from the server is shown in the picture below:
Fetch and Render
Google later introduced an additional “Fetch and Render” option to help webmasters who do not know how to analyze the raw text content fetched by Googlebot. If you want to see how the fetched page looks visually, click the “Fetch and Render” button instead of the “Fetch” button when fetching a URL. After a successful fetch, click the status and view the rendered page under the “Rendering” tab.
Googlebot not only fetches the given URL but also accesses all resources linked on that page. If there are blocked resources Googlebot can’t access, those links are listed below the rendered page.
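Blocked resources usually trace back to rules in your robots.txt file. A small sketch using Python’s standard `urllib.robotparser` (the robots.txt rules and resource URLs below are made-up examples) shows how to check which linked resources a Googlebot fetch would be denied:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for the site being fetched.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /assets/blocked/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Hypothetical resources linked from the rendered page.
resources = [
    "https://example.com/assets/blocked/app.js",
    "https://example.com/css/site.css",
    "https://example.com/private/tracker.js",
]

# Anything can_fetch() rejects for Googlebot would appear in the
# blocked-resources list below the rendered page.
blocked = [url for url in resources if not rp.can_fetch("Googlebot", url)]
for url in blocked:
    print("blocked:", url)
```

Unblocking a resource is then a matter of relaxing the matching `Disallow` rule.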
Submitting URLs for Indexing
After successfully fetching a URL, you can submit the page for indexing by clicking the “Submit to Index” button. This provides the following two options for indexing your pages:
- Crawl only this URL – select this option to instruct Googlebot to crawl only this page’s content; links within the page will not be crawled.
- Crawl this URL and its direct links – select this option to instruct Googlebot to crawl the page along with all pages it links to.
The “Submit to Index” option saves webmasters from waiting for Googlebot’s next scheduled visit by requesting an immediate crawl. Although crawling usually happens within a few minutes of submission, whether the page is actually indexed still depends on Google’s general webmaster guidelines and your robots directives.
Detecting Pages Affected by Malware
When a site is infected with malware, the source code seen in a web browser and the code crawled by Googlebot can differ. As a result, your site may appear in search results for keywords irrelevant to your content. In this case, you can use the “Fetch as Google” option to find out exactly what Googlebot sees when crawling your page and take corrective action.
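One way to spot this kind of cloaking yourself is to save the HTML your browser receives and the HTML returned from a fetch as Googlebot, then compare the two. A rough sketch using Python’s standard `difflib` (the two snippets below are fabricated stand-ins for the saved snapshots, and the 0.8 threshold is an arbitrary choice):

```python
import difflib

# Fabricated snapshots: what a browser sees vs. what Googlebot is served.
browser_html = (
    "<p>Welcome to our bakery. We bake fresh bread and cakes every morning.</p>"
)
googlebot_html = (
    "<p>cheap pills buy now casino online poker discount meds fast shipping</p>"
)

# Similarity ratio in [0, 1]; a low value for pages that should be
# identical is a strong hint that malware is cloaking content.
ratio = difflib.SequenceMatcher(None, browser_html, googlebot_html).ratio()
suspicious = ratio < 0.8  # arbitrary threshold for this sketch

print(f"similarity: {ratio:.2f}, suspicious: {suspicious}")
```

On a clean site the two snapshots should be nearly identical, so any large divergence is worth investigating.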
Google Search Console also offers a malware detection tool that helps identify whether your site is infected.
Fetching a Redirected URL
During “Fetch as Google”, Googlebot does not follow redirects, so you need to enter the final URL that a visitor would see in the browser’s address bar. If you try to fetch a redirected URL, the status shows as “Redirected”.
Clicking the link shows the HTTP server response code, such as “301 Moved Permanently”, along with a “Follow” button for fetching the target URL.
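You can see the same raw 301 response outside Search Console by making a request that does not follow redirects. A minimal sketch with Python’s standard `http.client`, using a throwaway local server (with made-up paths `/old-page` and `/new-page`) in place of a real redirecting site:

```python
import http.client
import http.server
import threading

# Throwaway local server: /old-page permanently redirects to /new-page.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            body = b"final page content"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's console output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we see the raw 301 —
# the same response Fetch as Google reports for a redirected URL.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/old-page")
resp = conn.getresponse()
status = resp.status                   # 301
location = resp.getheader("Location")  # the target you would “Follow”
resp.read()
conn.close()
server.shutdown()
print(status, location)
```

The `Location` header is the URL the “Follow” button fetches next.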
Limits of the “Fetch as Google” Option
- There is a limit on both fetching URLs and submitting them under the “Fetch as Google” option.
- The limits apply at the account level, not the site level, which means fetching any site added to your webmaster tools account counts against the quota.
- The limits are valid for one month or week; if your first URL is fetched today, the counter resets a month or week from today.
When you submit a fetched page for crawling, Google shows the restriction and the remaining submission quota for your account. The limits for the “Fetch as Google” option are consolidated below for reference:
| Limit | Quota | Period | Scope |
|---|---|---|---|
| Number of fetches | 500 | Per week | Per account |
| Number of URL submissions (Crawl only this URL) | 500 | Per month | Per account |
| Number of URL and all linked pages submissions (Crawl this URL and its direct links) | 10 | Per month | Per account |