8 SEO Tools to Get Insights from Google Search Console URL Inspection API

Google announced last week the launch of a new Google Search Console URL Inspection API. This gives third-party applications bulk access to information that previously could only be retrieved one URL at a time through the URL Inspection Tool in the Google Search Console interface.

With a quota of 2,000 queries per day and 600 queries per minute per Search Console property (which can be a domain, subdomain, or subdirectory), the release opens the door for SEO tools and platforms to integrate Google’s index coverage information, such as:

  • Crawlability and indexability status
  • Last crawl time
  • Sitemaps that include the URL
  • Canonical URL selected by Google
  • Structured data identified for rich results eligibility
  • Mobile usability status

All of these come with additional details that aid in technical SEO analysis and debugging.
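The fields above map onto the API’s request and response structure. As a rough illustration, here is a sketch of the request body and a parser for the main coverage fields; the sample response below is hypothetical, with field names following the API’s documented response shape:

```python
def build_inspection_request(url: str, site_url: str) -> dict:
    """Body for POST https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"""
    return {"inspectionUrl": url, "siteUrl": site_url}


def summarize_inspection(response: dict) -> dict:
    """Pull the coverage fields listed above out of an API response."""
    result = response.get("inspectionResult", {})
    index = result.get("indexStatusResult", {})
    return {
        "verdict": index.get("verdict"),                   # crawlability/indexability status
        "coverage_state": index.get("coverageState"),
        "last_crawl_time": index.get("lastCrawlTime"),
        "sitemaps": index.get("sitemap", []),              # sitemaps that include the URL
        "google_canonical": index.get("googleCanonical"),  # canonical selected by Google
        "mobile_usability": result.get("mobileUsabilityResult", {}).get("verdict"),
    }


# Hypothetical sample response for illustration only
sample = {
    "inspectionResult": {
        "indexStatusResult": {
            "verdict": "PASS",
            "coverageState": "Submitted and indexed",
            "lastCrawlTime": "2022-02-01T10:00:00Z",
            "sitemap": ["https://example.com/sitemap.xml"],
            "googleCanonical": "https://example.com/page",
        },
        "mobileUsabilityResult": {"verdict": "PASS"},
    }
}
print(summarize_inspection(sample))
```

This is only a sketch of the data shapes; the tools below wrap the same request and response for you.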

A few SEO professionals have already seized this opportunity, developing new free tools, sharing scripts, and setting up SEO crawler integrations for this data. Here are a few:

Free tools

Free tools may be the best way to quickly validate the status of a specific group of URLs.

1. Google Bulk Inspect URLs by Valentin Pletzer

Pletzer has developed a new free tool called “Google Bulk Inspect URLs,” which provides what is probably the easiest way to get data from the URL Inspection API. There is no registration or complex configuration required: just allow access to the Google Account linked to your Google Search Console, select the property you want to check, and paste the URLs you want to verify.

Google Bulk Inspect URLs by Valentin Pletzer

The tool, which processes data in the browser, displays the status from the various fields available in the URL Inspection API in a table that you can browse or export in CSV or Excel format.

2. MyDomain.Dev by Lino Uruñuela

MyDomain.dev, developed by Lino Uruñuela, is a free tool that requires registration and lets you access Google Search Console data available through the API without the constraints of the Search Console interface, with reports that segment and group the data for easier analysis.

In addition to the existing reports for performance data, the tool now also provides access to URL inspection information through a new section. First, grant access to your Google Account linked to Search Console when signing up. Next, go to the “Index Coverage (bulk)” section, select the desired property, then paste the URLs to validate to get their status in an easy-to-browse table that lets you filter, copy, or export the data in CSV, Excel, or PDF format.

3. URL Inspection API in Sheets by Mike Richardson

For those who don’t want to use a new tool and prefer Google Sheets, Mike Richardson has developed and made freely available a Google Sheets template using Apps Script. You can copy it and follow the instructions shared directly in the sheet to create a free Google service account to run it.

Once you have added the required key, email address, client ID and ownership information, paste the URLs to check and get their latest crawl information, coverage, indexing status, and the user-declared and Google-selected canonicals.

SEO crawler integrations

SEO crawlers can be the best way to get your pages’ Google coverage status data into a more comprehensive technical SEO analysis, supplementing (and validating) the data from your SEO crawl simulations.

However, it is important to keep the daily API quota in mind when using SEO crawlers. You might want to crawl by area or category, run list crawls of your most valuable URLs, or register new properties for category/subcategory directories, since their quotas are counted independently.
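To make the quota math concrete, here is one possible batching strategy in Python: split the URL list into daily chunks of 2,000 and pace requests to stay under 600 per minute. The limits come from the API announcement; the pacing approach itself is just an illustrative sketch, not an official recommendation:

```python
import time

DAILY_QUOTA = 2000       # queries per day, per property
PER_MINUTE_QUOTA = 600   # queries per minute, per property


def daily_batches(urls, quota=DAILY_QUOTA):
    """Yield one batch of URLs per day of crawling."""
    for start in range(0, len(urls), quota):
        yield urls[start:start + quota]


def paced_requests(urls, per_minute=PER_MINUTE_QUOTA, sleep=time.sleep):
    """Yield URLs one at a time, pausing when the per-minute quota is hit."""
    for i, url in enumerate(urls):
        if i and i % per_minute == 0:
            sleep(60)  # wait for the next one-minute window
        yield url


urls = [f"https://example.com/page-{n}" for n in range(4500)]
batches = list(daily_batches(urls))
print(len(batches))     # 3
print(len(batches[0]))  # 2000
```

With 4,500 URLs, the sketch above spreads inspection over three days; registering subdirectories as separate properties would multiply the daily budget instead.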

1. Screaming Frog SEO Spider [16.6 Update]

Screaming Frog was the first SEO crawler to support the new URL Inspection API integration, announcing a new version (16.6) named “Romeo”.

The integration is simple and explained in the release notes, which describe how to select the option within the existing Google Search Console API access to populate new columns in the “Search Console” tab (as well as the global “Internal” tab).

The report also includes new filters to directly surface URLs with Google coverage issues, which can be assessed alongside Google Search Console performance data, likewise included via the API integration.

2. Sitebulb [Version 5.7]

The other “super-fast” crawler release to support Google’s new URL Inspection API came yesterday from Sitebulb, which announced version 5.7. It takes advantage of the existing Google Search Console integration and only requires checking the “Get URL data from Search Console URL Inspection API” option when configuring an analysis.

Sitebulb has now enabled a brand new “URL Inspection” report featuring many clickable charts and tables for the different fields, segmenting their values for easier analysis rather than aggregating them into a single table.

When you click on the charts for the different fields, you are taken directly to the report showing those URLs. You can also combine this with other metrics available in the tool by adding columns to the table, or click the “Open URL Inspection” option to be taken directly to the Google Search Console report and see that page’s information there.

3. FandangoSEO

Another crawler that announced URL Inspection API integration yesterday is FandangoSEO, a cloud-based SEO crawler. In addition to fetching URL inspection data to display pages’ “Google indexing status,” it will also notify you whenever Google changes the indexing status of your pages.

Free scripts

If you’re a little more technical and prefer to run a script in the terminal, there are alternatives for you as well.

1. Google Index Inspection API Node.js Script by Jose Luis Hernando

Jose Luis Hernando developed and made available a free script via GitHub with step-by-step instructions. Make sure you have Node.js installed on your machine, install the necessary modules, and set up OAuth 2.0 client ID credentials from your Google Cloud Platform account.

2. Google URL Inspection API with Python

If you prefer Python, Jean-Christophe Chouinard has written a tutorial with Python code to interact with the URL Inspection API. The tutorial walks through the entire process, from creating the API project and your service account credentials in Google Cloud to structuring the API response.
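The general flow such a script follows can be sketched with just the Python standard library, assuming you already have an OAuth 2.0 access token (for example, obtained from a service account). The token value, URL, and property name below are placeholders, not real credentials:

```python
import json
import urllib.request

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"


def build_request(url: str, site_url: str, token: str) -> urllib.request.Request:
    """Build the authenticated POST request for one URL inspection."""
    body = json.dumps({"inspectionUrl": url, "siteUrl": site_url}).encode()
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def inspect(url: str, site_url: str, token: str) -> dict:
    """Send the request and decode the JSON response (requires a valid token)."""
    with urllib.request.urlopen(build_request(url, site_url, token)) as resp:
        return json.load(resp)


req = build_request("https://example.com/page", "sc-domain:example.com", "TOKEN")
print(req.get_method())  # POST
```

Chouinard’s tutorial uses Google’s client libraries instead, which handle token acquisition and refresh for you; the sketch above only shows the shape of the underlying HTTP call.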

Dig Deeper Into Your Google Coverage Status

Given the rapid adoption after just a few days, many other SEO tools and platforms will likely integrate the new insights from Google Search Console’s URL Inspection API. I can’t wait to use them!

Although the API currently has a daily quota limit, remember that it applies per property, not per domain (you can also register your category/subcategory directories as properties). This is already an important first step toward getting Google’s crawling and indexability status directly, much faster than ever before.


The opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.



About the Author

Aleyda Solis is an SEO consultant and founder of Orainti, speaker and author, who also offers SEO tips in the Crawling Mondays video series, the latest SEO resources in the #SEOFOMO newsletter, and a free SEO learning roadmap at LearningSEO.io. She is also the co-founder of Remoters.net, a remote work hub offering a free remote job board, tools, guides and more to enable remote work.
