If you’re a website owner who wants to keep track of how your site is performing on Google, Google Search Console is the tool for the job. One of its most helpful features warns you of crawl errors. Crawl errors happen when Googlebot, Google’s web crawler, can’t access a page on your site, and they can hurt your rankings by preventing search engine bots from viewing your content. They stem from a range of causes, such as server errors, broken links, or pages that no longer exist, so it’s important to fix them as soon as you can. Fortunately, Google Search Console provides detailed reports on the URLs causing the errors. With these reports, you can pinpoint the problem and take the necessary action to fix it.
The Top Google Search Console Errors You Need to Know
1. Server Errors
Server errors occur when search engine bots cannot access your website due to server issues. Common server errors include 500 internal server errors, 502 bad gateway errors, 503 service unavailable errors, and 504 gateway timeout errors. These can occur for various reasons, such as server downtime, overload, or misconfiguration. Sometimes, Google’s bots just happened to crawl your site at an inopportune moment when your site was down. If your site has since been restored, this error should resolve on its own the next time Google crawls your site.
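The common thread in the errors above is the 5xx status-code range. As a minimal sketch (the labels below are the standard HTTP reason phrases, not Search Console's own wording), a site-audit script might bucket status codes like this:

```python
# Sketch: classify HTTP status codes the way a crawl audit might.
# Any 5xx response counts as a server error for crawlers.

SERVER_ERRORS = {
    500: "Internal Server Error",
    502: "Bad Gateway",
    503: "Service Unavailable",
    504: "Gateway Timeout",
}

def is_server_error(status_code: int) -> bool:
    """True for any status in the 5xx range."""
    return 500 <= status_code <= 599

def describe(status_code: int) -> str:
    """Human-readable label, falling back to the bare code."""
    return SERVER_ERRORS.get(status_code, f"HTTP {status_code}")
```

A transient 503 during downtime and a persistent 500 from a misconfiguration look identical in a single crawl, which is why rechecking after the site is restored matters.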
2. 404 Errors
404 errors occur when a page on your website is no longer available and the search engine bot cannot find it. This can happen for various reasons, such as a deleted page or a changed URL. To fix 404 errors, redirect the old URL to the new URL using a 301 (permanent) redirect. If no suitable replacement page exists, create a custom 404 page with alternative navigation options for your website visitors.
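In practice a 301 redirect is configured in your web server or CMS rather than in application code, but the logic can be sketched in a few lines. The paths below are hypothetical examples, assuming a simple old-URL-to-new-URL map:

```python
# Hypothetical sketch of a redirect map for retired URLs. In a real
# deployment these rules live in nginx/Apache config or your CMS.

REDIRECTS = {
    "/old-blog-post": "/blog/new-post",    # URL changed
    "/discontinued-item": "/products",     # page deleted; send to category
}

def resolve(path: str):
    """Return (status, target): 301 if a redirect exists, else 404."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None  # serve the custom 404 page here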
3. Redirect Errors
Redirect errors occur when a search engine bot cannot follow a redirect from one page to another. This can happen for various reasons, such as a redirect loop (where Page A redirects to Page B, which redirects right back to Page A), a redirect chain (where Page A redirects to Page B, which redirects to Page C), or a redirect to a broken link. To fix redirect errors, check that each redirect is correctly configured and ensure the target page is accessible.
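Loops and overly long chains can be caught mechanically. Here is a hedged sketch: the redirect map is a stand-in for the live `Location` headers a real audit tool would follow, and the five-hop limit is an assumption, not a documented Googlebot threshold:

```python
# Sketch: walk a redirect map and flag loops and long chains.
# `redirect_map` is a hypothetical {source: target} dict.

def trace_redirects(start, redirect_map, max_hops=5):
    """Follow redirects; return (final_path, hop_count) or raise."""
    seen = [start]
    path = start
    while path in redirect_map:
        path = redirect_map[path]
        if path in seen:
            raise ValueError("Redirect loop: " + " -> ".join(seen + [path]))
        seen.append(path)
        if len(seen) - 1 > max_hops:
            raise ValueError("Redirect chain too long")
    return path, len(seen) - 1
```

Collapsing a chain (A → B → C) into a single hop (A → C) is usually the fix, since each extra hop adds latency and another point of failure.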
4. Robots.txt Errors
Robots.txt errors occur when a search engine bot cannot access your website’s robots.txt file. This file tells search engine bots which pages to crawl and which to ignore. To fix robots.txt errors, check to see if the file is correctly configured and ensure it is accessible to search engine bots.
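One quick way to sanity-check your rules is Python's standard-library parser. This is a minimal sketch with a made-up robots.txt; it checks the directives' logic, not whether the file is reachable at `https://yoursite.com/robots.txt`:

```python
# Sketch: validate robots.txt rules with the standard-library parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Public pages should be crawlable; the admin area should not be.
print(rp.can_fetch("Googlebot", "/blog/some-post"))  # True
print(rp.can_fetch("Googlebot", "/admin/login"))     # False
```

A misplaced `Disallow: /` is one of the most damaging misconfigurations, since it blocks the entire site from crawling.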
5. Soft 404 Errors
Soft 404 errors occur when a page on your website displays a “not found” message, but the server returns a 200 status code. This can happen, for example, when a page no longer exists but the server serves a generic “not found” page instead of an error. To fix soft 404 errors, configure your server to return a proper 404 (or 410) status code for pages that no longer exist.
6. Excluded by ‘Noindex’ Tag
This report lists pages that Google found but skipped because they carry a “noindex” directive. If you see any pages in this category that you believe should be indexed, remove the noindex meta tag (or X-Robots-Tag HTTP header) from the page so Google can index it.
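To check a page's HTML for the directive, you can scan its meta tags with the standard-library parser. A minimal sketch; note it only inspects the HTML, not the `X-Robots-Tag` header, which can also set noindex:

```python
# Sketch: detect a noindex directive in a page's <meta> tags.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html: str) -> bool:
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex
```

Pages often pick up a stray noindex from a CMS setting or a staging-environment template, so checking the rendered HTML directly is worthwhile.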
7. Discovered – Currently Not Indexed
This is often where newly published content lives. “Discovered” means that Google knows the page exists but hasn’t crawled it yet. You can wait until Google gets around to crawling the page, or request indexing manually in Google Search Console.
8. Crawled – Currently Not Indexed
This means that Google knows the page exists and has crawled it, but deemed it unworthy of being indexed at this time. The first step is to check whether it’s a false positive: in Google, search for “site:” followed by the URL to see whether the page is indexed and what Google picks up as the SEO meta title and description. Common reasons Google deems pages unworthy of indexing are duplicate content and thin content.
Improve Your Website’s Search Engine Rankings
Maintaining a successful online presence requires consistent monitoring and swift corrective action, because crawl errors can adversely affect your website’s search engine rankings. By addressing the most common crawl errors, you ensure that search engine bots correctly crawl and index your content, enhancing your rankings and online visibility.
Google crawl error alerts can be daunting, but they’re beneficial, and in many cases, no or minimal action needs to be taken. At our Nashville SEO agency, all critical crawl errors are monitored and dealt with as part of our monthly SEO packages.
If you have any questions about Google Search Console errors and how to address them, feel free to contact us for an SEO consultation.