Find and Fix Index Coverage Errors in Google Search Console

Adam Steele
Nov 2, 2024

Index coverage errors wreaking havoc on your search visibility?

Stress no more. I’ll help you clear them right up.

Less-go!

How to Check Index Coverage Errors in Google Search Console

Before I start calling out all the index coverage errors you might come across in Google Search Console (GSC), let me quickly show you how to check whether you have any in the first place.

It’s super simple:

  1. Log into GSC using the Gmail account you use to manage the website in question.
  2. Click Pages from the left-hand side navigation menu.

Here, you’ll see a bar graph showing the pages that are and are not currently indexed. In my example site, there are 68 pages in total: 48 not indexed and 20 indexed. To see why those pages aren’t indexed, simply scroll down the page.

And now we’ve landed on the reasons why those 48 pages aren’t indexed. For this site, we have noindex tag issues, redirects, 404s, and crawling issues.
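
By the way, if you’d rather check specific URLs programmatically, Search Console also offers a URL Inspection API. Here’s a rough Python sketch using the third-party requests library; it assumes you’ve already obtained an OAuth access token with the Search Console scope, and the endpoint and field names are my recollection of the v1 API, so double-check them against Google’s reference before relying on them.

import requests

# Placeholder token; in practice you'd obtain this via an OAuth 2.0 flow with the
# https://www.googleapis.com/auth/webmasters.readonly scope.
ACCESS_TOKEN = "ya29.your-access-token"
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def inspect_url(page_url, property_url):
    # Ask GSC how it currently sees one URL within your verified property.
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"inspectionUrl": page_url, "siteUrl": property_url},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

result = inspect_url("https://example.com/some-page/", "https://example.com/")
index_status = result["inspectionResult"]["indexStatusResult"]
print(index_status.get("coverageState"))  # e.g. "Submitted and indexed"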

Common Index Coverage Errors: Diagnosing the Red Flags đŸš©

There’s no shortage of index coverage errors that GSC might throw your way. To help you diagnose the trouble, here’s each error with a quick explanation and a step-by-step fix:

Server error (5xx): The server returned a 500-level error when Googlebot requested the page. To fix it:
  1. Check server logs and configuration.
  2. Upgrade hosting or optimize website code.
  3. Fix code errors, disable faulty plugins, and ensure scripts function correctly.
  4. Review firewall rules and database connections.
  5. Contact the hosting provider for maintenance/update schedules and assistance.
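
Before digging through logs, it helps to confirm what the URL is actually returning right now. Here’s a minimal Python sketch, using the third-party requests library and a made-up example URL:

import requests

URL = "https://example.com/broken-page/"  # swap in the URL GSC is flagging

try:
    # HEAD is lighter than GET, but some servers reject it, so fall back if needed.
    response = requests.head(URL, allow_redirects=True, timeout=15)
    if response.status_code == 405:
        response = requests.get(URL, allow_redirects=True, timeout=15)
    print(f"{URL} -> HTTP {response.status_code}")
    if 500 <= response.status_code < 600:
        print("Server-side error: check logs, plugins, and hosting resources.")
except requests.RequestException as exc:
    print(f"Request failed outright (DNS, timeout, connection refused): {exc}")
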
Redirect error: There are issues with redirect chains, loops, or bad destination URLs. To fix it:
  1. Check redirect setup for accuracy (URL, type, etc.).
  2. Fix redirect chains and loops.
  3. Ensure the target page is indexable and accessible.
  4. Test redirects manually and with a crawler.
  5. Consult server logs and hosting provider if needed.
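
Walking the redirects yourself makes chains and loops obvious. A small Python sketch (requests library, hypothetical URL) that prints every hop:

import requests

URL = "https://example.com/old-page/"  # hypothetical redirecting URL

try:
    # requests follows up to 30 redirects by default and records each hop in .history.
    response = requests.get(URL, allow_redirects=True, timeout=15)
    for hop in response.history:
        print(f"HTTP {hop.status_code}  {hop.url}")
    print(f"Final: HTTP {response.status_code}  {response.url}")
    if len(response.history) > 1:
        print("Chain detected: point the original URL straight at the final destination.")
except requests.TooManyRedirects:
    print("Redirect loop detected.")
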
URL blocked by robots.txt: The page is blocked by your robots.txt file. To fix it:
  1. Check robots.txt for “Disallow:” directives blocking the URL.
  2. Use GSC’s robots.txt report (or a third-party robots.txt testing tool) to identify which rule is blocking the URL.
  3. Remove or adjust blocking directives to allow access.
  4. Resubmit the URL through GSC for re-crawling.
  5. Monitor the Pages report for changes in status.
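
You can also sanity-check a rule locally with Python’s built-in robotparser before resubmitting anything (hypothetical site and URL below):

from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # hypothetical site
parser.read()

url = "https://example.com/blog/some-post/"
for agent in ("Googlebot", "Googlebot-Image", "*"):
    verdict = "allowed" if parser.can_fetch(agent, url) else "BLOCKED"
    print(f"{agent:16} {verdict} for {url}")
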
URL marked ‘noindex’: The page has a “noindex” directive, which prevents indexing. To fix it:
  1. Check the page source code for noindex robots meta tag in the <head> section.
  2. Check for the X-Robots-Tag noindex HTTP header in server settings.
  3. Verify robots.txt file for “Disallow:” directives blocking the URL.
  4. If intentional, remove the URL from your sitemap.
  5. If unintentional, remove the noindex directive and resubmit the URL to GSC.
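
Steps 1 and 2 are easy to automate. Here’s a rough Python sketch (requests plus the standard-library HTML parser, hypothetical URL) that looks for a noindex directive in both the X-Robots-Tag header and the robots meta tag:

import requests
from html.parser import HTMLParser

URL = "https://example.com/hidden-page/"  # hypothetical URL flagged as noindex

class RobotsMetaFinder(HTMLParser):
    # Collects the content of <meta name="robots"> and <meta name="googlebot"> tags.
    def __init__(self):
        super().__init__()
        self.values = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() in ("robots", "googlebot"):
            self.values.append(attrs.get("content", ""))

response = requests.get(URL, timeout=15)
header = response.headers.get("X-Robots-Tag", "")

finder = RobotsMetaFinder()
finder.feed(response.text)

print("X-Robots-Tag header:", header or "(none)")
print("robots meta tags:", finder.values or "(none)")
if "noindex" in header.lower() or any("noindex" in v.lower() for v in finder.values):
    print("noindex found: remove it if you actually want this page indexed.")
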
Soft 404: The page returns a “not found” message without a proper 404 response code. To fix it:
  1. Check for thin or irrelevant content on the page.
  2. Ensure the page has a clear purpose and provides value to users.
  3. Improve content quality by adding relevant information, images, or videos.
  4. If the page is no longer needed, remove it or create a proper 404 error page.
  5. If the page should exist, ensure it’s linked internally and accessible to crawlers.
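
There’s no official signal you can read for a soft 404, but a rough heuristic check helps triage a long list of URLs. A Python sketch (requests library, hypothetical URL, and admittedly crude thresholds):

import requests

URL = "https://example.com/maybe-soft-404/"  # hypothetical URL from the report
NOT_FOUND_PHRASES = ("page not found", "nothing here", "no longer available")

response = requests.get(URL, timeout=15)
body = response.text.lower()

says_missing = any(phrase in body for phrase in NOT_FOUND_PHRASES)
looks_thin = len(body) < 2000  # very rough proxy for thin content

if response.status_code == 200 and (says_missing or looks_thin):
    print("Possible soft 404: the page returns 200 but reads like 'not found' or is very thin.")
    print("Either beef up the content or return a real 404/410.")
else:
    print(f"HTTP {response.status_code}; nothing obviously soft-404-ish here.")
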
Blocked due to unauthorized request (401): The page requires authorization, which blocks Googlebot. To fix it:
  1. Remove authorization requirements for the page if it should be publicly accessible.
  2. If authorization is necessary, verify Googlebot’s identity through specific tokens or IP whitelisting (if supported by your server).
  3. Check for server, firewall, or CDN settings that might be blocking Googlebot.
  4. If the page shouldn’t be indexed, block it in robots.txt to optimize the crawl budget.
  5. Consult your hosting provider or server administrator for assistance.
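
For step 2, Google’s recommended way to confirm a request really comes from Googlebot is a reverse DNS lookup followed by a forward lookup back to the same IP. A small Python sketch (the IP is just a placeholder you’d pull from your access logs):

import socket

def is_googlebot(ip_address):
    # Reverse lookup, check the domain, then confirm the forward lookup matches.
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(hostname) == ip_address
    except socket.gaierror:
        return False

print(is_googlebot("66.249.66.1"))  # placeholder IP from your server logs
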
Not found (404): The page returned a 404 (not found) error. To fix it:
  1. Restore the page if it was accidentally deleted.
  2. Redirect the URL to a relevant page if the content has moved.
  3. Create a custom 404 page to guide users.
  4. Remove broken links pointing to the 404 URL.
  5. If the page is intentionally removed, use a 410 status code.
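
For step 4, you can scan a page for internal links that still point at dead URLs. A rough Python sketch (requests plus the standard-library HTML parser, hypothetical site) that flags 404s and 410s:

import requests
from html.parser import HTMLParser
from urllib.parse import urljoin

PAGE = "https://example.com/blog/"  # hypothetical page to scan

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(PAGE, href))

collector = LinkCollector()
collector.feed(requests.get(PAGE, timeout=15).text)

for link in sorted(set(collector.links)):
    if not link.startswith("https://example.com"):
        continue  # only check internal links
    status = requests.head(link, allow_redirects=True, timeout=15).status_code
    if status in (404, 410):
        print(f"Broken internal link on {PAGE}: {link} (HTTP {status})")
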
Blocked due to access forbidden (403): The page returns a 403 (forbidden) error, which Googlebot cannot bypass. To fix it:
  1. Check server, firewall, or CDN settings for accidental blocks on Googlebot.
  2. Verify file and folder permissions to ensure Googlebot has access.
  3. Review .htaccess file for any restrictive rules.
  4. Temporarily disable security plugins to rule out conflicts.
  5. If access is intentionally restricted, no fix is needed; you can also block the URL in robots.txt so Googlebot stops requesting it.
  6. Consult your hosting provider or server administrator for assistance.
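
A quick way to tell whether the 403 targets crawlers specifically is to request the page with and without a Googlebot user-agent string. Keep in mind this only spoofs the UA; some firewalls also verify Googlebot by IP, so treat it as a hint rather than proof. A hypothetical Python sketch:

import requests

URL = "https://example.com/forbidden-page/"  # hypothetical URL from the report

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for label, ua in USER_AGENTS.items():
    status = requests.get(URL, headers={"User-Agent": ua}, timeout=15).status_code
    print(f"{label:10} -> HTTP {status}")

# A 200 for the browser UA but 403 for the Googlebot UA usually points to a
# firewall, CDN rule, or security plugin blocking crawlers.
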
URL blocked due to other 4xx issue: The page encountered a 4xx error not otherwise specified. To fix it:
  1. Identify the specific 4xx error code returned for the URL (use GSC’s URL Inspection tool or a server header checker).
  2. Address the underlying issue based on the error code:
  ‱ 400 Bad Request: Check for incorrect URL formatting or invalid request parameters.
  ‱ 402 Payment Required: Ensure payment processing is functioning correctly for paid content.
  ‱ 405 Method Not Allowed: Verify that the request method (GET, POST, etc.) is allowed for the URL.
  ‱ 408 Request Timeout: Optimize page load speed or server response times.
  ‱ 411 Length Required: Provide the Content-Length header in the HTTP request.
  ‱ 413 Payload Too Large/414 URI Too Long: Reduce the size of the request or shorten the URL.
  ‱ 429 Too Many Requests: Reduce crawl rate or adjust server settings to handle more requests.
  ‱ Other 4xx errors: Research the specific error code and its potential causes.
  3. Test the page after implementing the fix to ensure it’s accessible.
  4. Resubmit the URL to Google Search Console for re-crawling.
  5. Monitor the Pages report for changes in status.

Crawled – currently not indexed: Google crawled the page but decided not to index it at this time. To fix it:
  1. Ensure the page has valuable, unique content and is well-optimized.
  2. Check for any unintentional noindex directives or robots.txt blocks.
  3. Improve internal linking to the page from other relevant pages on your site.
  4. Build high-quality backlinks to the page to increase its authority.
  5. Submit the URL through Google Search Console for faster indexing.
  6. If the page is low-quality or not essential, consider removing or noindexing it.
Discovered – currently not indexed: Google found the page but hasn’t crawled it yet, usually because it postponed the crawl to avoid overloading your server. To fix it:
  1. Request indexing for faster discovery by submitting the URL through GSC.
  2. Optimize crawl budget and prevent Googlebot from crawling low-quality pages to free up crawl resources for important pages.
  3. Check for and address any crawl errors that might hinder Googlebot’s access to the page.
Alternate page with proper canonical tag: The page is an alternate version (e.g., AMP or mobile) that correctly points to the canonical page. To fix it (if needed):
  1. The page points to a canonical page, which is indexed, so no fix is needed.
  2. However, you could verify that the canonicalized page is indexed and accessible to crawlers.
  3. Check for conflicting canonical tags or redirects.
  4. If the alternate page is not needed, consider removing it or 301 redirecting it to the canonical version.
  5. Use the URL Inspection tool in GSC to test how Google sees the canonical tag.
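
To see what canonical signal the raw HTML is actually sending (steps 2 and 3), you can pull the rel=canonical link yourself. A Python sketch with a hypothetical alternate URL; note it won’t catch canonicals injected by JavaScript:

import requests
from html.parser import HTMLParser

URL = "https://example.com/amp/some-post/"  # hypothetical alternate page

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonicals = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonicals.append(attrs.get("href"))

response = requests.get(URL, timeout=15)
finder = CanonicalFinder()
finder.feed(response.text)

print("Link header:", response.headers.get("Link", "(none)"))  # canonicals can also live here
print("rel=canonical in HTML:", finder.canonicals or "(none)")
if len(finder.canonicals) > 1:
    print("Conflicting canonical tags found: keep exactly one.")
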
Duplicate without user-selected canonical: The page duplicates another page but doesn’t declare a canonical URL, so Google chose a different URL to index. To fix it:
  1. Ensure the canonical tag points to the correct page and verify it’s indexed.
  2. Check for conflicting canonical tags or redirects and fix them if necessary.
  3. Consider removing the alternate page or 301 redirecting it to the canonical version.
  4. Use the URL Inspection tool in GSC to test how Google sees the canonical tag.
Duplicate, Google chose different canonical than user: Google disagrees with your canonical choice and indexed a different page instead. To fix it:
  1. Review Google’s chosen canonical and assess if it’s the most appropriate version.
  2. If Google’s choice is suitable, align your canonical tags to match their selection.
  3. If your preferred version is better, improve its on-page optimization and internal linking to signal its importance.
  4. Build high-quality backlinks to your preferred version to boost its authority.
  5. Use the URL Inspection tool to test how Google perceives both versions and their canonical tags.
Page with redirect: The non-canonical URL redirects to another page, so it isn’t indexed itself. To fix it:
  1. Verify the redirect is implemented correctly and leads to the intended destination.
  2. Fix any redirect chains or loops that might be hindering indexing.
  3. Ensure the final destination page is indexable and accessible to crawlers.
  4. If the redirect is no longer needed, remove it to improve page speed.
  5. Monitor your website’s performance and traffic after implementing any changes.

Phew! Quite an exhaustive list, right? Now for a wee touch of prevention.

Preventive Measures: Keeping Your Website in Tip-Top Shape đŸ’Ș

Problem fixed—let’s not stop there. Take a proactive approach and prevent those indexing issues from popping up again.

Here’s what to keep an eye on:

Technical SEO

  ‱ A clear site structure makes your website easy to navigate and a joy to explore. Use a clear hierarchy, logical internal linking, and descriptive URLs to make it easy for both users and search engines to find their way around.
  ‱ Make sure your robots.txt file is up-to-date and accurately reflects which pages you want Googlebot to crawl and index.
  • Redirects guide visitors to the right destination. But beware of those confusing detours. Avoid redirect chains and ensure your redirects are implemented correctly to prevent indexing issues and frustrated users.
  • Don’t let broken links lead to a dead end for your website visitors (and Googlebot!). Regularly check for and fix broken links to ensure a smooth user experience and prevent indexing issues.
  • A reliable hosting provider with stable servers is essential for preventing server errors (those pesky 5xx errors) and ensuring consistent website accessibility for both users and search engine crawlers.
  • Website security breaches can lead to various indexing issues, including hacked content, malware infections, and server downtime. Implement robust security measures like strong passwords, regular software updates, and security plugins to protect your website and prevent indexing problems.
  • Google has been advocating for a secure web for years, and HTTPS is a confirmed ranking factor. Ensure your website has a valid SSL certificate and is served over HTTPS to avoid security warnings and potential indexing issues.
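
As a quick spot check on those last two points, the sketch below (hypothetical domain, Python with the requests and standard ssl modules) confirms that plain HTTP ends up on HTTPS and reports roughly when the SSL certificate expires:

import socket
import ssl
import time

import requests

DOMAIN = "example.com"  # hypothetical domain

# 1) Does http:// redirect through to https://?
response = requests.get(f"http://{DOMAIN}/", allow_redirects=True, timeout=15)
print("Final URL after redirects:", response.url)
if not response.url.startswith("https://"):
    print("Warning: plain HTTP does not end up on HTTPS.")

# 2) How long until the certificate expires?
context = ssl.create_default_context()
with socket.create_connection((DOMAIN, 443), timeout=15) as sock:
    with context.wrap_socket(sock, server_hostname=DOMAIN) as tls:
        cert = tls.getpeercert()

days_left = int((ssl.cert_time_to_seconds(cert["notAfter"]) - time.time()) / 86400)
print(f"Certificate expires on {cert['notAfter']} (about {days_left} days from now)")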

Content Optimization

  • Google loves fresh, original content that provides real value to its users. So, ditch the duplicate content and focus on creating informative, engaging, and well-written pages that deserve a spot in the index.
  • Use relevant keywords naturally throughout your content to signal to Google what your page is all about.
  • Keep your content up-to-date and relevant. Regularly review and update your pages to ensure they’re providing the most accurate and valuable information.
  • Duplicate content giving you a headache? Use canonical tags to tell Google which version of a page is the boss, preventing confusion and boosting your SEO.

Regular Monitoring

  • Regular backups are your safety net in case of unexpected events like server crashes, hacking attempts, or accidental deletions. Having a recent backup can help you quickly restore your website and minimize downtime, which can indirectly impact your indexing and SEO performance.

Conclusion and Next Steps

You’ve conquered those pesky index coverage errors and optimized your website for peak performance. High five!

But don’t get complacent—SEO is an ongoing race to the top of search results, and staying ahead requires an insatiable thirst for everything related to search.

It’s a heck of a lot to keep up with.

That’s where Loganix comes in. We live and breathe search and are here to make SEO a breeze.

Ready to take your (or your client’s) website to the next level?

👉 Head over to our SEO services page, and let’s chat.


Written by Adam Steele on November 2, 2024

COO and Product Director at Loganix. Recovering SEO, now focused on understanding how Loganix can make the work-lives of SEO and agency folks more enjoyable and profitable. Writing from beautiful Vancouver, British Columbia.