Fix Blocked Resources In Google Search Console

Last update: 26 June, 2016

Google Search Console displays a list of all resources that Google cannot access due to a robots.txt restriction. And by all, we mean all. If you embed videos or maps from other sources, including Google Maps, you may see a blocked resource warning.

Learn more about why you should not block resources here.

Do I Have Blocked Resources?

You can open the blocked resources list here. You’ll need to be logged in and select your domain from the list.

I have some. How do I fix them?

Unfortunately, you may not be able to fix all of them. You can only fix the ones whose robots.txt file you host, own, or have the rights to edit. Once you’ve fixed the ones you can, you must decide if the remaining blocked resources are necessary for your site. If they are necessary, reach out to the domain owner and ask them to adjust their robots.txt file to allow Google access. If the domain owner is unwilling or unable to modify the file, you can either find another method for this necessary function or ignore the warning in Google.

How do I know which ones I can change?

Thankfully, Google also includes a direct link to the robots.txt file so you can easily identify which file is causing the warning.

  1. Open the blocked resources list here and select your domain.
  2. Drill down on each resource in the list until you see a pop-up with the error details: [screenshot: gsc_robots_error]

Now you have all the information you need to determine if you can fix the issue.

  • URL: This is the page on your site displaying something that Google cannot access due to a robots.txt restriction.
  • Resource: This is what Google cannot access due to a robots.txt restriction.
  • robots.txt: This is the location of the robots.txt file causing the issue.

Is the robots.txt file on a site or domain you own? If yes, please edit your robots.txt file. If the file is part of your WordPress install, you can even use our plugin to edit it. If the file is not on a site or domain you own, please contact the domain or site owner and request they modify the file.
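For example, a common cause on WordPress sites is a broad Disallow rule that also blocks the CSS and JavaScript Google needs to render your pages. A sketch of the fix is below; the paths are hypothetical, so substitute the ones shown in your own blocked resources report:

```
# Before: this rule blocks Google from everything under /wp-content/,
# including theme stylesheets and plugin scripts
User-agent: *
Disallow: /wp-content/

# After: keep blocking only what must stay private, and explicitly
# allow the asset directories Google needs to render the page
User-agent: *
Disallow: /wp-content/uploads/private/
Allow: /wp-content/themes/
Allow: /wp-content/plugins/
```

Note that a more specific Allow rule can override a broader Disallow, so you do not have to unblock the whole directory to clear the warning.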

I changed the robots.txt file. Now what?

Use the robots.txt Tester to verify that Google sees the changes you made. If you use a cache plugin or server cache, it may take up to 24 hours for Google to update. Clearing your cache will speed up this process.

After the changes are visible in the robots.txt Tester, Google should remove the blocked resources the next time it crawls your site for updates. If you don’t want to wait, you can use the Fetch as Google tool to speed up the process.
