Did you recently receive an email from Google Search Console warning that Googlebot cannot access the CSS and JS files on your site? Many webmasters (including myself) are reporting that they received the message, causing a flurry of forum posts. While some people seemed panicked at first, this notice doesn’t mean that Google can’t see your site at all; it only means that certain parts of it can’t be accessed. There’s no reason to sound the alarm, but we always want our websites to be as “Google friendly” as possible to help gain organic exposure.

It looks like Google is now sending these messages out en masse through Search Console to notify webmasters that Google is having trouble accessing certain parts of their sites. The email itself does come off as a bit alarming, since we of course want Google to be able to crawl our sites easily. So, what does this all mean?

It looks like most of the people reporting this message are running WordPress sites. Most WordPress installations block the /wp-admin/ folder in the robots.txt file. The original intent of blocking this folder was to keep the core WordPress files out of search results and make the installation more secure. However, if you link to certain sections of your /wp-admin/ folder, those URLs can still be discovered, which defeats the purpose.

With website security as a priority, it would make sense to keep these folders blocked. However, what most don’t know is that WordPress implemented a fix for this a while back that allows these pages to be accessed by Google while keeping security top of mind. As Joost de Valk points out, we should allow WordPress to handle this on its own:

 “If you don’t do anything, WordPress has (by my doing) a robots meta x-http header on the admin pages that prevents search engines from showing these pages in the search results, a much cleaner solution.”

What Caused The Email?

After looking at my site’s robots.txt file, I found the following rules that appear to be causing issues with Googlebot. Either through a plugin or a link on my site, I must have been making a reference to the /wp-includes/ folder.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
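To see the effect of these rules for yourself, you can check them locally with Python’s built-in robots.txt parser. This is just a quick sanity check, and example.com below is a placeholder for your own domain:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules found on my site (example.com is a placeholder domain).
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is blocked from the bundled scripts under /wp-includes/ ...
print(rp.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery/jquery.js"))  # False
# ... but regular content is still crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/blog/my-post/"))  # True
```

This makes it easy to confirm that the disallow rule really is what keeps Googlebot away from the scripts WordPress ships in /wp-includes/.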

Ok great, now what?

The quick fix would be to update the robots.txt file to explicitly allow the resources that Google is trying to access:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/
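Running the updated file through Python’s robots.txt parser again (with example.com standing in for your domain) shows that theme and plugin assets are now explicitly crawlable:

```python
from urllib.robotparser import RobotFileParser

# The "quick fix" robots.txt from above; example.com is a placeholder domain.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Theme and plugin assets are explicitly allowed for every crawler.
print(rp.can_fetch("Googlebot", "https://example.com/wp-content/themes/twentyfifteen/style.css"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/wp-content/plugins/some-plugin/script.js"))  # True
```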

The second method is a bit more refined, since we can allow access just for Googlebot while keeping the disallow rules in place for other user-agents.

User-agent: Googlebot
Allow: *.css
Allow: *.js

# Other bots
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
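Note that the * wildcard in those Allow rules is an extension that Google supports; not every parser honors it (Python’s built-in robotparser, for instance, treats paths as plain prefixes). The sketch below is my own simplification of how that wildcard matching works, not Google’s actual code:

```python
import re

def robots_rule_matches(pattern: str, path: str) -> bool:
    """Rough sketch of Google-style rule matching: '*' matches any run of
    characters, and a trailing '$' anchors the pattern to the end of the URL."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# Under the Googlebot group above, stylesheet and script URLs match the Allow rules:
print(robots_rule_matches("*.css", "/wp-content/themes/twentyfifteen/style.css"))  # True
print(robots_rule_matches("*.js", "/wp-includes/js/jquery/jquery.js"))  # True
# Other crawlers still hit the Disallow rules:
print(robots_rule_matches("/wp-admin/", "/wp-admin/admin-ajax.php"))  # True
```

In other words, Googlebot gets a green light on any .css or .js URL, while everything else keeps the original restrictions.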

Once you are done editing your robots.txt file, you can test the update with the Fetch as Google tool in Search Console and check the results to make sure you have resolved the issue.

At the end of the day, this is a change Google rolled out about a year ago; it is only now sending out notices. That being said, we always want to do everything we can to make our sites as Google friendly as possible, and this is a quick adjustment that only takes a few minutes to implement.

Did you receive one of the emails? Have another solution to share? Let me know in the comments!
