Search Console crawl error: “Submitted URL blocked by robots.txt”

First of all, let us be clear that Yoast SEO plugins do not automatically add anything to the robots.txt file.

In this article, we’ll take a look at the Google Search Console error ‘Submitted URL blocked by robots.txt’. What does it mean, and how can you fix it?

What does the error mean?

The word ‘error’ indicates that this page is not indexed. In other words, Google tried to crawl and index your page, but couldn’t. Why not? Well, ‘submitted URL’ indicates that you submitted the page for indexing. That means that the URL is listed in a sitemap. But… there’s a problem: your submitted URL is blocked by your site’s robots.txt file.
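
For example, a robots.txt file with a rule like the one below (a made-up example; your own rules will differ) blocks every URL that starts with /news/, so a sitemap entry such as https://www.example.com/news/my-post/ would trigger exactly this error:

    User-agent: *
    Disallow: /news/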

How to fix a ‘submitted’ error

Depending on what you want, there are different ways to fix a ‘submitted’ error:

  • If you want Google to crawl and index your page, then you should fix the issue that prevents that from happening.
  • If you don’t want Google to index your page, you should remove the URL from your sitemap. Google will notice the changes when it visits your site again. If you don’t want to wait until Google’s next visit, you can also resubmit the edited sitemap in the Sitemaps report of Google Search Console. Are you new to Google Search Console? Then please read this beginner’s guide first.
  • If you don’t want Google to index your page, you could also check your Sitemaps report and delete any sitemaps that contain the URL of the page. In addition, make sure that no sitemaps listed in your robots.txt file include this URL (see the example after this list).
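
For reference, a sitemap listed in a robots.txt file looks like the line below. The exact filename depends on your setup (Yoast SEO, for example, generates a sitemap index at /sitemap_index.xml), and example.com is a placeholder for your own domain:

    Sitemap: https://www.example.com/sitemap_index.xml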

How to test and update your robots.txt file

Let’s assume that you want Google to crawl and index your page. How can you find and fix the cause of the problem? Let’s take a closer look at the actual robots.txt file.

Test your robots.txt file

You can use the robots.txt Tester tool in Google Search Console to test whether your URL can be crawled. Follow the steps as described in this support article from Google. The tool will highlight the part of the file (the rule) that causes the blocking. Note that the tool is just for testing; you can’t make any changes to the actual file there.
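
If you’d like a quick local check as well, Python’s standard library ships with a robots.txt parser. Here’s a minimal sketch; www.example.com and /blocked-page/ are placeholders for your own domain and URL:

    # Check whether Googlebot may crawl a given URL, using Python's
    # built-in robots.txt parser (urllib.robotparser).
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")  # placeholder domain
    parser.read()  # fetches and parses the live robots.txt

    url = "https://www.example.com/blocked-page/"  # placeholder URL
    if parser.can_fetch("Googlebot", url):
        print(f"{url} can be crawled by Googlebot")
    else:
        print(f"{url} is blocked by robots.txt")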

Update your robots.txt file

As soon as you know what’s causing the problem, you can update your robots.txt file by removing or editing the rule. Typically, the file is located at http://www.[yourdomainname].com/robots.txt. Keep in mind that crawlers only read a robots.txt file at the root of a host, so each subdomain of your site can have its own file. The robots.txt Tester tool can help you locate your file. Just click ‘See live robots.txt’.
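
For instance, if the tester highlights a rule like Disallow: /news/ but you only want to keep crawlers out of a drafts subfolder, you could narrow the rule instead of removing it entirely (these paths are hypothetical):

    # Before: blocks everything under /news/
    User-agent: *
    Disallow: /news/

    # After: blocks only the drafts subfolder
    User-agent: *
    Disallow: /news/drafts/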

If you are using a web hosting service, it may well be that you can’t access or edit this file yourself. In that case, contact your host and ask for assistance.

Submit your updated file to Google

Once your robots.txt file is updated, you can submit it to Google by following these steps.
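
Once you’ve done that, you can verify that the live file really changed by fetching it yourself, for example with a short Python snippet (replace www.example.com with your own domain):

    # Fetch and print the live robots.txt to confirm the update went through.
    import urllib.request

    with urllib.request.urlopen("https://www.example.com/robots.txt") as response:
        print(response.read().decode("utf-8"))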
