It looks like, as of today, Google has begun reporting on properties in Google Search Console that are not allowing access to the website’s CSS and JavaScript files.
According to the message from Google:
Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in sub-optimal rankings.
CSS files are required to allow a browser to assemble and format a website in the way the designer or administrator intended. Without CSS, a website is little more than a collection of code, plain text, and inline images stacked one under another. As far as I am aware, there hasn’t been any change in the way Google can see your website. If you’re not sure what Google can see on your site, try the Fetch and Render function in Google Search Console. The Render result shows you two images: the first is how Googlebot sees your site, and the second is how the public would see it.
If the two versions of the Fetch and Render don’t look any different, then you have no real problem with how Google sees the site or allocates rank, and you may not need to change any settings in the robots.txt file. As I discover more about this new message over the course of today and the coming week, I’ll add comments below.
With WordPress websites, a common configuration for the robots.txt file is to include a Disallow directive for the admin folder, and that makes perfect sense because you don’t want any of your admin-end content being displayed in Google search. However, many installations also include a Disallow directive for the /wp-includes/ folder, and this is what is causing Google to trigger the message for your GSC property. If you want to allow Google to crawl the /wp-includes/ folder in its entirety, simply remove the Disallow directive for that folder; to block other sub-folders within it, add Disallow directives for those sub-folders individually, as in the example below.
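To make that concrete, here is a minimal sketch of what such a WordPress robots.txt might look like; the sub-folder name on the last line is purely a placeholder for anything you still want to keep out of the crawl:

    User-agent: *
    Disallow: /wp-admin/
    # /wp-includes/ is no longer disallowed, so Googlebot can fetch the JavaScript and CSS stored there.
    # To keep a particular sub-folder out of the crawl, disallow it individually (placeholder name):
    Disallow: /wp-includes/private-sub-folder/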
If you’d like more info on how to use Google Search Console, visit here for a comprehensive step-by-step explanation.
I recommend you follow Google’s steps to check your website’s robots.txt settings and, if you are running a WordPress website, amend the file to remove the Disallow directive for /wp-includes/.
____________________________
Here’s how to fix this issue:

1. Identify blocked resources
Use the “Fetch as Google” feature to identify those resources that robots.txt directives are blocking. (Button: Fetch as Google)

2. Update your robots.txt file
Remove the restrictions on your site’s CSS and JavaScript files from your robots.txt directives and test the changes with the robots.txt Tester. Then update your revised robots.txt file on your site and submit it to Search Console. (Button: Test Robots.txt) A sketch of what a revised file might look like on a WordPress site follows these steps.

3. Validate the fix using “Fetch as Google”
Fetch and Render your page with the “Mobile: smartphone” option selected to double-check that Googlebot for smartphones renders your content properly.
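If you would rather keep /wp-includes/ (or any other folder) disallowed, one alternative for step 2 is to rely on Googlebot’s support for Allow directives and wildcard patterns (* and $) to open up just the stylesheets and scripts. This is only a sketch, and crawlers other than Googlebot may not support these extensions:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    # Googlebot lets a longer, more specific Allow rule override a shorter Disallow,
    # so the CSS and JavaScript inside the blocked folder can still be fetched:
    Allow: /wp-includes/*.css$
    Allow: /wp-includes/*.js$

Whichever approach you take, run the result through the robots.txt Tester before submitting it, as described in step 2.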