
Googlebot cannot access CSS and JS files – Crawl errors

by | Apr 15, 2021

In Google Search Console, we may come across the "Googlebot cannot access CSS and JS files" crawl error.

In order to understand a page, Googlebot needs to view it with the accompanying CSS and JavaScript files.

However, if Google cannot load them, it will cause errors in Google Search Console’s coverage report.

As part of our Google Cloud Platform services, we assist our customers with several WordPress queries.

Today, let us see how to fix the “Googlebot cannot access CSS and JS files” error on the WordPress site.


Googlebot cannot access CSS and JS files – Crawl errors

Google focuses on providing better rankings to user-friendly websites.

To determine the user experience, Google needs to be able to access the site's CSS and JavaScript files.

By default, WordPress does not block search bots from accessing any CSS or JS files.

However, we may accidentally block them while adding extra security measures or when using a WordPress security plugin.

The major cause of this error is the accidental blocking of these resources using a .htaccess file or robots.txt.

This restricts Googlebot from accessing CSS and JS files, which may affect the site's SEO performance.
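To see concretely how such a rule blocks a crawler, the standard library's robots.txt parser can evaluate a hypothetical set of rules (the domain, theme name, and paths below are illustrative, not from the article):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules that block plugin and theme assets.
rules = """
User-agent: *
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The stylesheet is disallowed, so Googlebot's render will differ
# from what a regular visitor sees.
print(parser.can_fetch("Googlebot", "https://example.com/wp-content/themes/mytheme/style.css"))

# A normal page is still crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/index.php"))
```

Because the `User-agent: *` group applies to all crawlers, Googlebot inherits the same Disallow rules.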


How to fix this?

As a first step, we ensure we are not blocking static resources via the robots.txt or .htaccess file in the website's root folder.

If we are, we need to find the files Google is unable to access on our website.

To see how Googlebot sees our website, click on Crawl » Fetch as Google in Google Search Console.

Then, click on the fetch and render button.

Once done, the results appear side by side, showing what a user sees and what Googlebot sees when it loads our site.

Any difference in the data means that Googlebot is not able to access CSS/JS files. In addition, we can see the links of CSS and JS files it was unable to access.

To find a list of blocked resources, go to Google Index » Blocked Resources.

Each of them will show the links to actual resources that Googlebot cannot access.

Mostly, it will be the CSS styles and JS files added by our WordPress plugins or theme.

In such a case, we need to edit the site’s robots.txt file which controls what Googlebot sees.

The robots.txt file will be in our site's root directory; to edit it, we connect to the site using an FTP client.

If we use the Yoast SEO plugin, we can edit it from the SEO » Tools page by clicking on File Editor.

For instance, below we see the site has disallowed access to some WordPress directories:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/

Now we need to remove the lines that block Google’s access to CSS or JS files on our site’s front-end.

Typically, they will be in the plugins or themes folders. In addition, we need to remove the line blocking /wp-includes/, since it holds core scripts such as jQuery.
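After removing those lines, the example robots.txt above reduces to something like the following, which keeps wp-admin private but no longer blocks front-end assets:

```
User-agent: *
Disallow: /wp-admin/
```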

Sometimes the robots.txt file is either empty or does not exist at all. If Googlebot does not find a robots.txt file, it crawls and indexes all files.

At times, a few WordPress hosting providers may proactively block access to default WordPress folders for bots.

We can override this via:

User-agent: *
Allow: /wp-includes/js/

Once done, save the robots.txt file.

Finally, visit the Fetch as Google tool again and click on the fetch and render button. Compare the fetch results to confirm the resolution.
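We can also verify the corrected rules programmatically before waiting on a re-fetch. This sketch (again with illustrative paths and domain) parses the fixed robots.txt and confirms the previously blocked script is now reachable:

```python
from urllib.robotparser import RobotFileParser

# The corrected rules: admin area stays blocked, wp-includes JS is allowed.
rules = """
User-agent: *
Disallow: /wp-admin/
Allow: /wp-includes/js/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# jQuery shipped in wp-includes is now reachable by Googlebot.
print(parser.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery/jquery.min.js"))

# The admin area remains off-limits, as intended.
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))
```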




In short, the major cause of this error is the accidental blocking of CSS and JS resources from search bots using a .htaccess file or robots.txt.



