Google Search Console is a wonderful way to understand search engine issues and optimize your site for Google. “Fetch as Google” is an option available for webmasters to check how a web page will look when Googlebot crawls that page. Google renamed the tool to “URL Inspection” in the new interface, with slightly modified functions. In this article, let us explore what Fetch as Google is and how to use the URL Inspection tool.
Related: Guide to Google Search Console.
When to Use URL Inspection?
You can use this tool for the following purposes:
- Find the index status of any webpage on your verified domains.
- Submit pages for crawling and indexing whenever you have changed the content.
- Troubleshoot webpages to improve their performance in search engine result pages.
- Find the pages causing issues when a site is infected with malware.
- View the live version and the crawled page to spot any differences.
Since both the old and new Search Console interfaces are still available, we will explain both scenarios. To use the tool, you need a Google Search Console account with your site added and successfully verified.
How to Use URL Inspection in New Search Console?
Login to your Search Console account and navigate to “URL Inspection” option from the sidebar.
Enter the URL of the web page you want to request crawling for in the search box and hit the enter key. The tool offers many options across three major sections.
- Check the index status of the page in the first section. If you have recently modified the web page, click the “Request Indexing” option. Google will show you a confirmation like the one below, indicating you have requested indexing of the page.
- View the “Crawled Page” to check the details of the indexed page on Google. You can also click the “Test Live URL” button to fetch the live content and compare it with the indexed content.
In addition to submitting a web page for re-indexing, you can also check the coverage and mobile usability status using the URL Inspection tool.
How to Use Fetch as Google in Old Search Console?
Login to your Google Search Console account and click on “Go to Old Version” option available at the bottom of the sidebar. When you are in old version, navigate to “Fetch as Google” option available under “Crawl” section.
Enter your page URL, or leave the box blank to fetch your website’s home page. The URL entered here should be the page you want to troubleshoot or whose content you have changed significantly. Choose the Googlebot type “Desktop” and click the “Fetch” button.
Note: choose the Googlebot type “cHTML” or “Mobile XHTML/WML” to check how the page is seen by the Googlebot-Mobile crawler.
Troubleshooting Crawling Issues
After fetching a URL, the status will show as “Complete” or “Partial” under “Status”. Clicking the success or failed link will show you the HTTP response received from the site’s server. This information can be used to analyze the crawling issue in detail and correct it so that Googlebot can crawl the page successfully next time.
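Outside Search Console, you can approximate this check yourself by requesting the page with Googlebot’s user-agent string and inspecting the raw HTTP response. This is only a rough sketch using Python’s standard library; the helper names are illustrative and not part of any Google tooling, and the user-agent value is Googlebot’s published desktop identifier.

```python
import urllib.request

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_googlebot_request(url):
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

def fetch_as_googlebot(url):
    """Fetch the URL as Googlebot; returns (HTTP status, body bytes).

    Requires network access; a 4xx/5xx response raises
    urllib.error.HTTPError, mirroring the failed statuses the tool reports.
    """
    with urllib.request.urlopen(build_googlebot_request(url)) as resp:
        return resp.status, resp.read()
```

Note that real Googlebot also executes JavaScript and verifies itself via reverse DNS, so a response that matches here is a useful signal, not a guarantee the page renders identically for Google.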
A successful response from the server is shown in the picture below:
Fetch and Render
Google later introduced the additional Fetch and Render option to help webmasters who do not know how to analyze the text content fetched by Googlebot. If you want to see the visual look of the fetched page, click the “Fetch and Render” button instead of the “Fetch” button when fetching a URL. After a successful fetch, click on the status and view the rendered page under the “Rendering” tab.
Googlebot not only fetches the given URL but also accesses all linked resources on that page. If there are blocked resources that Googlebot cannot access, those links will be listed below the rendered page.
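Resources usually end up blocked through robots.txt rules, and you can reproduce a basic version of this check with Python’s standard robots.txt parser. The ruleset below is a made-up example, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules: Googlebot may fetch everything except /private/.
robots_lines = [
    "User-agent: Googlebot",
    "Disallow: /private/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# A stylesheet outside the blocked path is fetchable...
print(parser.can_fetch("Googlebot", "https://example.com/css/style.css"))
# ...while a script under /private/ would show up as a blocked resource.
print(parser.can_fetch("Googlebot", "https://example.com/private/app.js"))
```

Running each linked resource of a page through a check like this tells you which ones Googlebot will skip when rendering.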
Submitting URLs for Indexing
After successfully fetching a URL, you can submit the page for indexing by clicking the “Submit to Index” button. Clicking it will provide you the following two options for indexing your pages:
- Crawl only this URL – select this option to instruct Googlebot to crawl only this page’s content; links within the page will not be crawled.
- Crawl this URL and its direct links – select this option to instruct Googlebot to crawl the page and all links in the page.
The Submit to Index option saves webmasters from waiting until Googlebot next crawls the page by requesting an immediate crawl. Though crawling will usually happen within a few minutes of your submission, indexing of the page depends on Google’s general webmaster guidelines and robots directives.
Detect Affected Pages by Malware
Whenever a site is affected by malware, the source code seen in the web browser and the code crawled by Googlebot could differ. This means your site may appear in search results for keywords irrelevant to your content. In this case, use the “Fetch as Google” option to find out what exactly Googlebot sees when crawling your page and take corrective actions.
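One rough way to spot this kind of cloaking outside Search Console is to fetch the page twice, once with a browser user-agent and once with Googlebot’s, and compare the two responses. The sketch below covers only the comparison step; the function name is illustrative and the heuristic is an assumption, not an established detection method:

```python
import hashlib

def looks_cloaked(browser_html: str, googlebot_html: str) -> bool:
    """Return True when the two responses differ, a possible cloaking sign.

    Hashing keeps the comparison cheap for large pages. Dynamic content
    (timestamps, rotating ads) can cause false positives, so treat this
    only as a first-pass signal, not proof of infection.
    """
    digest = lambda s: hashlib.sha256(s.encode("utf-8")).hexdigest()
    return digest(browser_html) != digest(googlebot_html)

# Identical responses: no cloaking signal.
print(looks_cloaked("<p>welcome</p>", "<p>welcome</p>"))   # False
# Googlebot served injected spam: flag for manual review.
print(looks_cloaked("<p>welcome</p>", "<p>buy pills</p>"))  # True
```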
Google Search Console also offers a malware detection tool that helps identify whether your site is infected.
Fetching Redirected URL
During “Fetch as Google”, Googlebot does not follow redirects, hence you need to enter the final URL that the visitor will see in the browser’s address bar. You will see the status “Redirected” when you try to fetch a redirected URL.
Clicking the link will show the HTTP server response code “301 Moved Permanently” and a “Follow” button for fetching the target URL again.
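You can surface the same 301 response yourself by disabling redirect following before fetching. The handler and helpers below are a sketch using Python’s standard library, not part of any Google API:

```python
import urllib.error
import urllib.request

class NoRedirectHandler(urllib.request.HTTPRedirectHandler):
    """Surface 3xx responses instead of silently following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise HTTPError for the 3xx

def is_redirect(status):
    """True for any 3xx redirection status code (301, 302, 307, 308, ...)."""
    return 300 <= status < 400

def inspect_url(url):
    """Return (status, Location header or None); requires network access."""
    opener = urllib.request.build_opener(NoRedirectHandler)
    try:
        with opener.open(url) as resp:
            return resp.status, None
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get("Location")
```

A 301 here means the Location target is the URL worth fetching next, which is effectively what the tool’s “Follow” button does for you.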
Limits of Fetch as Google Option
- There are limits on both fetching URLs and submitting them for indexing under the “Fetch as Google” option.
- The limits apply at the account level, not the site level, which means fetching any site added to your Search Console account counts toward the limit.
- The limits roll over per week or per month depending on the activity; if your first URL is fetched today, the counter will reset a week or a month from today.
When you submit a fetched page for crawling, Google shows the restriction and pending submission limits for your account. The limits for the Fetch as Google option are consolidated below for reference:
| Activity | Maximum Count | Duration | Applicability |
|---|---|---|---|
| Number of fetches | 500 | Per week | Per account |
| Number of URL submissions (Crawl only this URL) | 500 | Per month | Per account |
| Number of URL and linked pages submissions (Crawl this URL and its direct links) | 10 | Per month | Per account |
When you see crawl errors in your Search Console account, clicking an error link will take you to a detailed error screen. Here too you have the option to use “Fetch as Google” (as shown in the below picture) to see how Googlebot crawls your page and troubleshoot accordingly.
Summary
The Fetch as Google option in the old Search Console is exhaustive, with clear details of its capabilities and limits. The URL Inspection tool in the new interface is simpler and adds coverage and mobile usability options.