If you run a WordPress-powered site or blog, you have probably received (or will receive) a warning from the Google Search Console team saying that Googlebot cannot access the CSS and JS files on your domain.
Don’t be alarmed if you received this warning. It sounds bad, but there is an easy fix. The email contains links to instructions on how to fix the issue, but those instructions are not very easy to follow.
Googlebot and other search spiders request your website’s robots.txt file before they crawl your pages. Just as .htaccess holds rules for your web server (blocking IP addresses, redirecting URLs, enabling gzip compression, and so on), robots.txt holds a set of rules for search engines.
Those rules are the reason you received the “Googlebot cannot access CSS and JS files” warning. A robots.txt file contains a few lines that either block or allow crawling of files and directories. If your JS files are blocked, Googlebot cannot crawl that code, so it cannot fully render your pages and may misjudge them, in the worst case treating the hidden code as spam or a link-scheme violation. The same logic applies to CSS files.
If you haven’t received this warning email yet, instead of waiting for the email, you can take action right away.
The robots.txt file tells search engine robots which parts of your site to crawl and which parts to avoid. When a search bot or spider arrives to index your site, it checks the robots.txt file first and follows its directions on which pages to index or skip.
If you are using WordPress, you will find the robots.txt file in the root of your WordPress installation. If you don’t have one, simply create a new text file, name it robots.txt, and upload it to the root directory of your domain using FTP or the cPanel file manager. You can edit the file the same way: over FTP, through cPanel, or with a plugin.
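If you prefer the command line, the create-and-check step can be sketched like this (the file is created locally; the upload to your site’s root still happens via FTP or cPanel):

```shell
# Create a minimal robots.txt locally.
# "Disallow:" with no path blocks nothing, so everything stays crawlable.
cat > robots.txt <<'EOF'
User-agent: *
Disallow:
EOF

# Sanity-check the contents before uploading to your domain's root.
cat robots.txt
```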
There are a few things you should add to your robots.txt file, along with your sitemap URL. Adding the sitemap URL helps search engine bots find your sitemap file and therefore index your posts and pages faster.
A sample robots.txt file typically contains a comment, a few Disallow lines for directories you want to keep out of search, and a Sitemap line pointing at your own sitemap URL.
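A minimal sketch along those lines (example.com and the sitemap path are placeholders; note that it does not block wp-includes or the theme/plugin folders, which is where your CSS and JS live):

```
# disallow all files in these directories
User-agent: *
Disallow: /wp-admin/
Disallow: /cgi-bin/

Sitemap: https://example.com/sitemap.xml
```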
If your site has been blocking Googlebot from accessing those files, it’s good that you know about it so you can deal with the issue. Google gives better rankings to user-friendly websites: sites that are fast and offer a good user experience.
This is how Google saw the KasaReviews site before the fix:
By default, WordPress does not block search bots from accessing any CSS or JS files. However, some site owners may accidentally block them while adding extra security measures or by using a WordPress security plugin. This restricts Googlebot from indexing CSS and JS files, which may affect your site’s SEO performance. There’s an easy fix, and it involves editing your site’s robots.txt file.
If a term like robots.txt sounds new to you, don’t worry; you are not alone. It’s common lingo in the SEO industry but not so well known among bloggers.
On a WordPress blog, you can edit your robots.txt file over FTP (with a client such as FileZilla), with the cPanel file manager, or with the file editor built into Yoast SEO. The Yoast SEO plugin lets you edit both your robots.txt and .htaccess files from the WordPress dashboard: click SEO > Tools, then click File editor.
On this page, you can view and edit your robots.txt file. Depending on how your file is configured, removing the offending Disallow lines will fix most of the warnings. You will most likely see that your site has disallowed access to some WordPress directories, like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Some users may notice that their robots.txt file is empty or does not exist at all. If Googlebot does not find a robots.txt file, it automatically crawls and indexes all files. So why are you seeing this warning?
On rare occasions, some WordPress hosting providers proactively block bots’ access to default WordPress folders. You can override this in robots.txt by explicitly allowing access to the blocked folders:
User-agent: *
Allow: /wp-includes/js/
Once you are done, save your robots.txt file. Then visit the Fetch as Google tool and click the Fetch and Render button. Compare the new fetch results with the old ones, and you should see that most of the blocked-resource issues have disappeared. Here is how Google rendered KasaReviews after the robots.txt file was edited and fixed. You should aim for your whole site to be rendered in full.
After you modify your blog’s robots.txt file, test a few URLs of your website with the robots.txt Tester in Google Search Console (formerly Webmaster Tools). If Googlebot is not able to crawl a resource, the tool highlights the line that is causing the error. To resolve the issue, delete or adjust the highlighted line.
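You can also sanity-check your rules offline with Python’s built-in urllib.robotparser, which evaluates the same Allow/Disallow syntax. The rules below are a hypothetical example mirroring the fix described above; paste in your own file’s contents. One caveat: Python’s parser applies rules in order (first match wins) rather than Google’s longest-match rule, so the Allow line is listed before the broader Disallow it carves an exception out of.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules; replace with the contents of your own robots.txt.
rules = """\
User-agent: *
Allow: /wp-includes/js/
Disallow: /wp-includes/
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A JS file under the explicitly allowed folder is crawlable...
print(rp.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery.js"))  # True

# ...but the admin area stays blocked.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))  # False
```

This is only a quick local approximation; the Search Console tester remains the authoritative check for how Googlebot itself reads your file.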
Hope this helped. If you have any questions, feel free to ask in the comment section or via the contact form. If you have something to add, or if I have made a mistake, feel free to correct me.
Hello, my name is Matija but everybody calls me Kasa. I started this site to earn lots of money so that I never have to work again. Just lay on a beach, drinking cocktails day after day while hot, beautiful chicks fight for my attention. OK, now seriously: I love making websites, especially in WordPress. I hope the tips, tutorials, comparisons, and product reviews on this site prove helpful for your business.