According to the message from Google:
CSS files are required to allow a browser to assemble and format a website in the way the designer or administrator intended. Without CSS, a website is little more than a collection of code, plain text, and inline images stacked one under another. There hasn’t been any change in the way Google can see your website, as far as I am aware. If you’re not sure what Google can see on your site, try the Fetch and Render function in Google Search Console. The render command will show you two images: the first is an impression of how Google sees your site, and the second shows how the public sees it.
If the two versions of the Fetch and Render don’t look any different, then you have no real problem with how Google sees the site or allocates rank. You may not need to change any setting in the Robots.txt file, but as I discover more about this new message over the course of today and the coming week, I’ll add comments below.
With WordPress websites, a common configuration for the Robots.txt file is to include a Disallow command for Admin folder resources, and that makes perfect sense because you don’t want any of your admin-end content being displayed in Google search. However, many installations also include a Disallow command for the /wp-includes/ folder, and this is what is causing Google to trigger the message for your GSC property. If you want to allow Google to crawl the /wp-includes/ folder in its entirety, you can simply remove the Disallow command for that folder. To block other sub-folders within this folder, just add commands for those folders individually.
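As a sketch, a WordPress Robots.txt that keeps the admin area blocked while leaving /wp-includes/ crawlable might look like this (adjust the paths to your own installation):

```
User-agent: *
Disallow: /wp-admin/
# No Disallow for /wp-includes/ here, so Googlebot can fetch
# the CSS and JS files it needs to render your pages.
```

If you only want to block specific sub-folders inside /wp-includes/, add a separate Disallow line for each one instead of blocking the whole folder.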
If you’d like more info on how to use Google Search Console, visit here for a comprehensive step-by-step explanation.
I recommend you follow Google’s steps to check your website’s Robots.txt settings, and amend the file to remove the Disallow command for /wp-includes/ if you are running a WordPress website.
Here’s how to fix this issue:
1. Identify blocked resources
Use the “Fetch as Google” feature in Google Search Console to identify the resources that robots.txt directives are blocking.
2. Update your robots.txt file
Remove the Disallow directives that block the flagged CSS and JS resources — on a WordPress site, typically the rule for /wp-includes/.
3. Validate the fix using “Fetch as Google”
Fetch and Render your page with the “Mobile: smartphone” option selected to double-check that Googlebot for smartphones renders your content properly.
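Before (or after) editing the file on your server, you can also sanity-check your robots.txt rules locally. A minimal sketch using Python’s standard-library `urllib.robotparser`, with example rules mirroring the WordPress setup described above (the example.com URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Example rules: admin area blocked, /wp-includes/ left crawlable.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Admin pages stay blocked for Googlebot...
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))
# ...while CSS/JS under /wp-includes/ remains crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery/jquery.js"))
```

This only checks the rules as written; the Fetch and Render check in Google Search Console remains the authoritative confirmation of what Googlebot actually sees.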