GSC Coverage Analysis Report: Fix Common Issues

Are you tired of wondering why your website pages are not appearing in search results? Have you checked your GSC Coverage Analysis Report and found a list of problems and affected pages? If so, you’re not alone. Many website owners run into the same problems and need help fixing them.

A website’s indexing status can be tricky to understand and maintain. The Google Search Console (GSC) Coverage Analysis Report is a great tool to help you understand what’s happening. It provides a list of issues preventing your pages from being indexed, such as crawl errors, noindex tags, and blocked resources. But sometimes, the results can be hard to interpret.

In this post, we’ll explain some common indexing issues that can occur and help you solve them. 

“Excluded by ‘noindex’ tag”

This issue occurs when a page has a “noindex” tag in the HTML code, which tells Google not to index the page. 

The “noindex” tag can be removed by editing the HTML code of the affected page. It’s important to verify that the tag is removed from the correct page and that it hasn’t accidentally been added to any other pages.

One way to fix this issue is to use a tag management tool such as Google Tag Manager, which lets you manage and edit the tags on your website without manually editing the HTML code. Another way is to use a plugin or extension for your CMS that manages the “noindex” setting on your pages, such as Yoast SEO for WordPress.
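If many URLs are affected, you can script a quick before-and-after check. Below is a minimal sketch in Python, assuming the requests and beautifulsoup4 packages and placeholder URLs; it simply flags pages that still return a “noindex” directive in the robots meta tag or the X-Robots-Tag response header.

```python
# Minimal sketch: flag pages that still carry a "noindex" directive.
# The URLs below are placeholders; replace them with pages from your report.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://example.com/page-1",   # hypothetical affected URL
    "https://example.com/page-2",   # hypothetical affected URL
]

for url in PAGES:
    response = requests.get(url, timeout=10)
    header = response.headers.get("X-Robots-Tag", "")
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_content = meta.get("content", "") if meta else ""
    if "noindex" in header.lower() or "noindex" in meta_content.lower():
        print(f"noindex still present: {url}")
```

Running a check like this after removing the tags confirms that nothing still carries the directive before you ask Google to recrawl the pages.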

“Indexed, not submitted in sitemap” 

If a page is indexed by Google but not included in your sitemap, you’ll encounter this issue. This can happen if the page wasn’t properly added to the sitemap or if the sitemap wasn’t submitted to Google correctly.

The “Indexed, not submitted in sitemap” issue can be fixed by ensuring that all the pages you want to be indexed by Google are included in your sitemap and that the sitemap is submitted to Google correctly.

You can start by reviewing your sitemap to ensure all the pages are included. If any pages are missing, add them to the sitemap. To do this, use a sitemap generator tool that automatically creates a sitemap for your website based on its structure and content. Also, ensure that the sitemap isn’t blocked by robots.txt and is accessible for Google to crawl.

Once your sitemap is updated, submit it to Google in Google Search Console under the Sitemaps section. This allows Google to crawl and index the pages included in the sitemap.
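To make the review less manual, here is a minimal sketch, assuming Python with the requests package, a placeholder sitemap URL, and a hypothetical list of pages you expect to be indexed; it downloads the sitemap and reports any expected page that isn’t listed (it handles a plain URL sitemap, not a sitemap index).

```python
# Minimal sketch: report pages missing from an XML sitemap.
# The sitemap URL and expected page list are placeholders.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"          # hypothetical
EXPECTED_PAGES = {                                        # hypothetical
    "https://example.com/",
    "https://example.com/products/",
    "https://example.com/blog/gsc-coverage-report/",
}

xml_text = requests.get(SITEMAP_URL, timeout=10).text
root = ET.fromstring(xml_text)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
listed = {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

for url in sorted(EXPECTED_PAGES - listed):
    print(f"Missing from sitemap: {url}")
```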

“Submitted URL marked ‘noindex’”

This issue happens when a page is included in the sitemap but has a “noindex” tag, which tells Google not to index the page. 

You can fix the “Submitted URL marked ‘noindex’” issue by removing the “noindex” tag from the affected page and making sure that Google indexes the page.

  1. First, check the HTML code of the affected page and identify the “noindex” tag. 
  2. Once you have identified the tag, remove it by editing the page’s HTML code.
  3. After removing the “noindex” tag, notify Google that the page should be indexed. Do this by resubmitting the sitemap in Google Search Console, which prompts Google to recrawl the page and index it (a minimal API sketch follows this list).
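If you prefer to resubmit programmatically, here is a minimal sketch, assuming the google-api-python-client and google-auth packages, a hypothetical service-account key file with access to the property, and placeholder site and sitemap URLs; it calls the Search Console API’s sitemaps.submit method, which is equivalent to resubmitting the sitemap in the UI.

```python
# Minimal sketch: resubmit a sitemap through the Search Console API.
# The key file, site URL, and sitemap URL below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical key file with Search Console access
    scopes=["https://www.googleapis.com/auth/webmasters"],
)

SITE_URL = "https://example.com/"                 # hypothetical property
SITEMAP_URL = "https://example.com/sitemap.xml"   # hypothetical sitemap

service = build("searchconsole", "v1", credentials=credentials)
service.sitemaps().submit(siteUrl=SITE_URL, feedpath=SITEMAP_URL).execute()
print("Sitemap resubmitted:", SITEMAP_URL)
```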

“Crawled - currently not indexed”

This issue means that a page on your website is crawled by Google but is not indexed. This can happen if the page has a “noindex” tag, is blocked by robots.txt, or has other technical issues that prevent it from being indexed. 

To fix this issue, identify the affected pages by checking your GSC Coverage Analysis Report. Then check for common causes of this issue, such as the “noindex” tag, robots.txt block, or other technical issues.

  • If a “noindex” tag causes the issue, remove the tag by editing the page’s HTML code, as explained previously.
  • If a robots.txt block is the cause, check your robots.txt file to ensure that the affected pages aren’t blocked, and if they are, remove the block so the pages can be indexed (see the robots.txt check sketch after this list).
  • If the issue is caused by other technical issues, such as broken links, missing or duplicate content, or incorrect redirects, use web debugging tools such as Lighthouse to identify and fix the specific issues. 
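For the robots.txt check in the list above, a minimal sketch using only the Python standard library (with placeholder URLs) can test whether Googlebot is allowed to fetch each affected page:

```python
# Minimal sketch: check affected URLs against the live robots.txt file.
# The robots.txt URL and page URLs are placeholders.
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://example.com/robots.txt"   # hypothetical
AFFECTED_PAGES = [                               # hypothetical URLs from the report
    "https://example.com/blog/post-1",
    "https://example.com/category/widgets/",
]

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()

for url in AFFECTED_PAGES:
    if not parser.can_fetch("Googlebot", url):
        print(f"Blocked by robots.txt: {url}")
```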

“Indexed, though blocked by robots.txt”

This issue indicates that a page is indexed by Google but blocked by robots.txt, which is a file that tells search engines not to crawl certain pages on your website. This can happen if the robots.txt file is not set up correctly or a page is accidentally blocked.

Fix this issue by identifying and removing the block in the robots.txt file that prevents the affected pages from being crawled.

  1. Check your GSC Coverage Analysis Report to identify the affected pages. 
  2. Check your robots.txt file to see if they are blocked. 
  3. If the pages are blocked, edit the robots.txt file and remove the block for the affected pages. Be careful when editing the robots.txt file, as a mistake can block important pages from being indexed.

You can also use tools like Google Search Console or other SEO tools to check the robots.txt file and see which pages are blocked by it. This will give you a clear picture of which pages are blocked and which ones should be allowed to be indexed.
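To see exactly which directive is responsible before you edit the file, here is a minimal sketch, assuming Python with the requests package and a placeholder domain; it prints every Disallow rule in robots.txt together with the user-agent group it belongs to (it handles simple rules only, not full wildcard matching).

```python
# Minimal sketch: list Disallow rules per user-agent group in robots.txt.
# The robots.txt URL is a placeholder; only simple directives are handled.
import requests

ROBOTS_URL = "https://example.com/robots.txt"   # hypothetical

agents = []
collecting_agents = False
for raw_line in requests.get(ROBOTS_URL, timeout=10).text.splitlines():
    line = raw_line.split("#", 1)[0].strip()     # drop comments and whitespace
    if not line or ":" not in line:
        continue
    field, value = (part.strip() for part in line.split(":", 1))
    if field.lower() == "user-agent":
        if not collecting_agents:
            agents = []                          # a new group starts here
        agents.append(value)
        collecting_agents = True
    else:
        collecting_agents = False
        if field.lower() == "disallow" and value:
            print(f"[{', '.join(agents)}] Disallow: {value}")
```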

“Product Issues Detected” 

This issue results from problems with the structured data on a product page, which can prevent Google from correctly displaying the page in search results.

To fix this issue, start by checking the affected pages in your GSC Coverage Analysis Report. Next, look for common causes such as missing or incorrect structured data, incorrect product information, or other technical issues.

  • If missing or incorrect structured data causes the issue, use a validator such as Google’s Rich Results Test (the successor to the Structured Data Testing Tool) to find and resolve the specific issues. This might include adding the missing structured data or correcting the existing markup so that it’s implemented correctly and follows Google’s guidelines (a small JSON-LD check sketch follows this list).
  • If the issue is caused by incorrect product information, check and update the product information on the affected pages to ensure that it’s accurate and relevant.
  • If the issue is caused by other technical issues (broken links, missing or duplicate content, or incorrect redirects), use web debugging tools to identify and fix the issues. 
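To spot-check the markup yourself, here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages, a placeholder product URL, and an assumed list of fields to check (confirm the current requirements in Google’s product structured data documentation); it pulls the JSON-LD blocks from the page and reports Product objects with missing fields.

```python
# Minimal sketch: check JSON-LD Product markup on a page for missing fields.
# The page URL and the field list are assumptions, not authoritative.
import json
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/products/widget"   # hypothetical
EXPECTED_FIELDS = ["name", "image", "offers"]      # assumed minimal set

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        print("Invalid JSON-LD block found on the page")
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        if isinstance(item, dict) and item.get("@type") == "Product":
            missing = [f for f in EXPECTED_FIELDS if f not in item]
            if missing:
                print(f"Product markup is missing: {', '.join(missing)}")
```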

“Soft 404”

A soft 404 occurs when a page shows the visitor a “not found” message but the server returns a success status code (such as 200) instead of a real 404. It’s called “soft” because it isn’t a true 404 error, yet search engines still treat it as a problem because it can prevent them from correctly identifying and indexing the page. This issue can lead to a poor user experience and lower rankings in search results.

Start by identifying the affected pages, then use the URL Inspection tool to examine the rendered content and the returned HTTP status code. If you have many URLs to triage, the sketch after the list below can help flag likely soft 404s.

  • If the page and content are no longer available, return a 404 (not found) or 410 (gone) response code for the page. These status codes tell search engines that the page doesn’t exist and that they shouldn’t index the content.
  • If the page or content has moved somewhere else, return a 301 (permanent redirect) to send visitors to the new location. This keeps their browsing experience intact and also notifies search engines of the page’s new location.
  • If the page and content still exist but are missing critical resources or display a prominent error message during rendering, try to identify why the resources can’t be loaded. Common causes include resources blocked by robots.txt, too many resources on the page, server errors, or slow-loading or oversized resources.
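As a rough triage step before opening each URL in the URL Inspection tool, here is a minimal sketch, assuming Python with the requests package, placeholder URLs, and an assumed list of error phrases; it flags pages that return HTTP 200 but whose content looks like an error page.

```python
# Minimal sketch: flag likely soft 404s among a list of URLs.
# The URLs and error phrases are assumptions; match them to your error template.
import requests

PAGES = [                                  # hypothetical URLs from the report
    "https://example.com/old-product",
    "https://example.com/discontinued-category/",
]
ERROR_PHRASES = ["not found", "no longer available", "page doesn't exist"]

for url in PAGES:
    response = requests.get(url, timeout=10)
    body = response.text.lower()
    if response.status_code == 200 and any(p in body for p in ERROR_PHRASES):
        print(f"Possible soft 404 (returned 200): {url}")
    elif response.status_code in (404, 410):
        print(f"Hard 404/410 already returned: {url}")
```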

Conclusion 

The GSC Coverage Analysis Report is a valuable tool for identifying and addressing common issues that can prevent your pages from being indexed and appearing in search results.

Remember to regularly monitor your GSC Coverage Analysis Report to ensure that new pages aren’t affected by these issues and that the previously affected pages are indexed properly. Doing so will improve your website’s visibility, traffic, and search engine rankings in the long run. Keep in mind that it may take time for Google to recrawl and reindex the affected pages after the sitemap is updated and submitted.
