Google Search Console is a great way to understand search engine issues and optimize your site for Google. “Fetch as Google” was an option for webmasters to check how a web page looks when Googlebot crawls it. In the new Search Console interface, Google renamed the tool “URL Inspection” with similar functions. In this article, let us explore what Fetch as Google was and how to use the URL Inspection tool.
Related: Complete guide to Google Search Console.
When to Use URL Inspection Tool?
You can use this tool for the following purposes:
- Find the index status of any webpage on your verified domains.
- Submit pages for crawling and indexing whenever you change the content.
- Troubleshoot webpages to improve their performance on search engine result pages.
- Find the pages causing issues when a site is hacked or infected with malware.
- View the live version and the crawled page to find any differences.
How to Use URL Inspection in Google Search Console?
- Log in to your Search Console account and navigate to the “URL Inspection” option in the sidebar.
- Enter the URL of the web page you want to inspect in the search box and press enter.
- It will take a few seconds for Google to fetch the page's status from the index.
- You will see the status of the submitted URL in the Google index. Remember, this is the status of the page as crawled by Googlebot Smartphone; your live page may differ from it.
- If you have modified the content of the page, or you find that the current indexing status is not correct, click the “Test Live URL” button.
- Google will now fetch content from the live page directly and show you the status.
- If the current status is green, click the “Request Indexing” link. You will see a pop-up showing that indexing of the URL has been requested and the page has been added to a priority queue.
- The status will change to “URL is available to Google” along with “Indexing Requested”. If needed, you can request indexing again; however, submitting multiple requests will not change the queue priority in Google.
The indexing request saves webmasters from waiting until Googlebot's next scheduled crawl by asking Googlebot to crawl the page immediately. Though crawling generally happens within a few minutes of your submission, indexing of the page depends on Google's webmaster guidelines and robots directives. Generally, after a few days you will find the updated content available in Google search results.
Fixing Errors Before Indexing Request
You may find different types of errors on the submitted page when using the URL Inspection tool. The “Enhancements” section shows issues with mobile-friendliness, breadcrumbs, sitelinks, rich snippets, etc. For example, review snippets may show warnings due to missing fields in the schema. However, you can still request indexing for a URL that has warnings.
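As a quick local check before re-requesting indexing, you can validate a page's review markup yourself. The Python sketch below parses a JSON-LD block and reports which Review properties are absent; the required-field list here is an illustrative assumption, not Google's authoritative set:

```python
import json

# Illustrative set of properties a Review rich result typically needs;
# treat this as an assumption, not Google's official requirement list.
REQUIRED_REVIEW_FIELDS = {"itemReviewed", "reviewRating", "author"}

def missing_review_fields(jsonld: str) -> set:
    """Return the required Review fields absent from a JSON-LD block."""
    data = json.loads(jsonld)
    if data.get("@type") != "Review":
        return set()  # not a review snippet; nothing to check here
    return REQUIRED_REVIEW_FIELDS - data.keys()

snippet = """{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": {"@type": "Product", "name": "Example Widget"},
  "reviewRating": {"@type": "Rating", "ratingValue": "4"}
}"""
print(missing_review_fields(snippet))  # the 'author' field is missing
```

A page that passes such a local check can still carry warnings in Search Console, so use the tool's own report as the final word.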
When the submitted URL is invalid with an error, you first need to fix the error before requesting indexing.
Similar to enhancement errors, there can also be coverage errors, such as a 403 HTTP error. You have to fix these coverage errors before submitting the page for reindexing in Google search results.
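Before requesting indexing, it helps to confirm the page actually returns 200 OK, since 4xx and 5xx responses surface as coverage errors. A minimal sketch, using hardcoded example statuses rather than live HTTP requests:

```python
def is_indexable_status(status: int) -> bool:
    """Only URLs returning 200 OK can be indexed; client/server errors
    (4xx/5xx) and redirects must be fixed or resolved first."""
    return status == 200

# Hypothetical URL/status pairs standing in for real HTTP responses.
pages = [("/about", 200), ("/private", 403), ("/old-page", 301)]
for url, status in pages:
    verdict = "ok to request indexing" if is_indexable_status(status) else "fix before requesting indexing"
    print(f"{url} ({status}): {verdict}")
```

In practice you would feed this check with real response codes (e.g. from your server logs or an HTTP client) before clicking “Request Indexing”.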
In summary, fix all coverage and enhancement errors before clicking the “Request Indexing” link.
Troubleshooting Crawling Issues
Many of us use the URL Inspection tool to submit new and modified pages to Google. However, you can also use it to view the crawled content from the Google index as well as from the live page. This information is useful for troubleshooting and for finding rendering issues with Googlebot. After fetching a Google-indexed URL, click the “View Crawled Page” link. Similarly, click the “View Tested Page” link after fetching the live page's content.
- HTML – view the source code of the page in this section.
- Screenshot – a screen capture of your webpage as rendered by Googlebot Smartphone.
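To spot differences between the indexed copy and the live page, you can paste the two HTML sources (from “View Crawled Page” and “View Tested Page”) into a simple diff. A sketch using Python's standard difflib, with short illustrative HTML strings:

```python
import difflib

def html_diff(crawled: str, live: str) -> list:
    """Unified diff between the Google-indexed copy and the live page."""
    return list(difflib.unified_diff(
        crawled.splitlines(), live.splitlines(),
        fromfile="crawled", tofile="live", lineterm=""))

# Hypothetical example: the live page was edited after Google's last crawl.
crawled = "<h1>Welcome</h1>\n<p>Old intro paragraph.</p>"
live = "<h1>Welcome</h1>\n<p>Updated intro paragraph.</p>"
for line in html_diff(crawled, live):
    print(line)
```

Lines prefixed with `-` exist only in the indexed copy and lines with `+` only on the live page, which makes stale or missing content easy to spot.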
Whenever you see crawl errors in your Search Console account, clicking on the error link takes you to a detailed error screen. Here too you have the option to use “URL Inspection” to see how Googlebot crawls your page and troubleshoot accordingly.
Detect Affected Pages by Malware
When a site is infected with malware, the source code seen in the web browser and the code crawled by Googlebot can differ. This means your site may appear in search results for keywords irrelevant to your content. In this case, you can use the “URL Inspection” option to find out exactly what Googlebot sees when crawling your page and take corrective action.
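As a rough first pass over the crawled HTML, you can scan for common injection markers such as hidden links or off-topic keywords. The patterns below are hypothetical examples only; real malware cleanup needs a proper security scan:

```python
import re

# Hypothetical spam markers for illustration; extend for real use.
SPAM_PATTERNS = [
    r"display\s*:\s*none[^>]*>.*?<a\s",   # link hidden via inline CSS
    r"(cheap|replica|casino)\b",          # off-topic keyword stuffing
]

def suspicious_fragments(html: str) -> list:
    """Return fragments of crawled HTML matching known spam patterns."""
    hits = []
    for pattern in SPAM_PATTERNS:
        hits += re.findall(pattern, html, flags=re.IGNORECASE | re.DOTALL)
    return hits

# Example of injected markup a browser user would never see on screen.
injected = '<div style="display:none"><a href="https://spam.example">casino</a></div>'
print(suspicious_fragments(injected))  # flags the hidden link and the keyword
```

If the crawled copy trips such patterns while your editor's copy of the file looks clean, that mismatch is a strong sign of injected code on the server.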
Google Search Console also offers a security issues detection tool that helps identify whether your site is infected.
Limits of URL Inspection Tool
- By default, Google uses Googlebot Smartphone as the user agent to crawl and render the page. There is no way to change the crawler manually to test for desktop or other devices.
- During URL inspection, Googlebot does not follow redirects, so you need to enter the final URL that the visitor sees in the browser's address bar. You will see the status “Redirected” when you try to fetch a redirecting URL.
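Since the tool reports “Redirected” instead of following the chain, you may want to resolve the final URL yourself before pasting it in. A sketch that walks a redirect chain, here simulated with a plain dict standing in for real HTTP Location headers:

```python
def final_url(start: str, redirects: dict, max_hops: int = 10) -> str:
    """Follow a redirect chain (dict: source URL -> target URL) until a
    non-redirecting URL is reached; guard against loops and long chains."""
    url, seen = start, set()
    while url in redirects:
        if url in seen or len(seen) >= max_hops:
            raise RuntimeError(f"redirect loop or too many hops at {url}")
        seen.add(url)
        url = redirects[url]
    return url

# Hypothetical chain: HTTP -> HTTPS -> renamed page.
chain = {
    "http://example.com/old": "https://example.com/old",
    "https://example.com/old": "https://example.com/new",
}
print(final_url("http://example.com/old", chain))  # → https://example.com/new
```

With a real site you would build the dict from actual 3xx responses (or let an HTTP client follow redirects) and inspect only the terminal URL.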
- Earlier, there were explicit limits both for using the “Fetch as Google” option and for submitting URLs through it. The limits applied at the account level, not the site level, which means fetching any site added to your Webmaster Tools account reduced the quota. The limits were also rolling per week or month: if your first URL was fetched today, the counter reset a week or month from today.
| Action | Limit | Period | Scope |
|---|---|---|---|
| Number of fetches | 500 | Per week | Per account |
| Number of URL submissions (Crawl only this URL) | 500 | Per month | Per account |
| Number of URL and all linked pages submissions (Crawl this URL and its direct links) | 10 | Per month | Per account |
The current URL Inspection tool does not show any limits when submitting a page. However, the limit appears to be 10 submissions per day, and you will see a quota-exceeded message when you go beyond that for the day.
Note that Bing's URL Submission tool offers a limit of 10,000 URLs per day. Google's limit of 10 (or even 100, for that matter) is nowhere near Bing's.
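If you batch your submissions, a small counter can keep you under the observed cap. A sketch assuming the 10-per-day limit (which Google may change at any time):

```python
from datetime import date

class IndexingQuota:
    """Track manual 'Request Indexing' submissions against an assumed
    daily cap (10/day as observed; not an official documented limit)."""

    def __init__(self, daily_limit: int = 10):
        self.daily_limit = daily_limit
        self.day = date.today()
        self.used = 0

    def request(self, url: str) -> bool:
        today = date.today()
        if today != self.day:            # counter resets each day
            self.day, self.used = today, 0
        if self.used >= self.daily_limit:
            return False                 # over quota: defer to tomorrow
        self.used += 1
        return True

quota = IndexingQuota()
results = [quota.request(f"https://example.com/page-{i}") for i in range(12)]
print(results.count(True))  # → 10 accepted, the last 2 deferred
```

This only mirrors the limit locally, so the URLs that return False are the ones to queue for the next day rather than resubmit immediately.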
The Fetch as Google option in the old Search Console was exhaustive and offered clear details of its capabilities and limits. In the new interface, the URL Inspection tool is simplified, with additional options to show coverage and mobile usability issues. You can use this tool to submit new and modified content, as well as to troubleshoot your pages for problems in Google Search results.