Googlebot cannot access CSS and JS files – Crawl errors

In Google Search Console, we may come across the error “Googlebot cannot access CSS and JS files” in the crawl reports.

In order to understand a page, Googlebot needs to view it with the accompanying CSS and JavaScript files.

However, if Google cannot load them, it will cause errors in Google Search Console’s coverage report.

As part of our Google Cloud Platform services, we assist our customers with several WordPress queries.

Today, let us see how to fix the “Googlebot cannot access CSS and JS files” error on a WordPress site.

 

Googlebot cannot access CSS and JS files – Crawl errors

Google focuses on providing better rankings to user-friendly websites.

To determine the user experience, Google needs to be able to access the site’s CSS and JavaScript files.

By default, WordPress does not block search bots from accessing any CSS or JS files.

However, we may accidentally block them while adding extra security measures or when using a WordPress security plugin.

The major cause of this error is the accidental blocking of these resources using a .htaccess file or robots.txt.

This restricts Googlebot from accessing CSS and JS files, which may affect the site’s SEO performance.
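For instance, a hardening rule like the hypothetical .htaccess snippet below would make the server return a 403 Forbidden to Googlebot for every stylesheet and script (the bot names and rule are purely an illustration, not taken from any particular plugin):

<IfModule mod_rewrite.c>
RewriteEngine On
# Deny known crawlers access to stylesheets and scripts (an anti-pattern)
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot) [NC]
RewriteRule \.(css|js)$ - [F,L]
</IfModule>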

 

How to fix this?

As a first step, ensure we do not block static resources through the files in the website’s root folder (robots.txt and .htaccess).

If we do, the next step is to find the exact files Google is unable to access on our website.

To see how Googlebot sees our website, go to Crawl » Fetch as Google in Google Search Console.

Then, click on the Fetch and Render button.

Once done, the result will appear in a row; opening it shows what a user sees and what Googlebot sees when it loads our site.

Any difference between the two means that Googlebot was not able to access some CSS/JS files. In addition, we can see the links of the CSS and JS files it was unable to access.
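To cross-check from our end, we can request one of the reported files with a Googlebot user-agent string. Note that this only reveals server-level blocks such as .htaccess rules, since robots.txt is honored by the crawler rather than enforced by the server; the URL below is a placeholder:

import urllib.error
import urllib.request

# Placeholder URL; substitute a CSS/JS link reported by Fetch as Google.
url = "https://example.com/wp-content/themes/our-theme/style.css"
googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
req = urllib.request.Request(url, headers={"User-Agent": googlebot_ua})

try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status)  # 200 means the server itself is not blocking Googlebot
except urllib.error.HTTPError as err:
    print(err.code)  # 403 suggests a server-level block, e.g. an .htaccess rule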

To find a list of blocked resources, go to Google Index » Blocked Resources.

Each of them will show the links to actual resources that Googlebot cannot access.

Mostly, these will be CSS and JS files added by our WordPress plugins or theme.

In such a case, we need to edit the site’s robots.txt file, which controls what Googlebot sees.

To edit it, we connect to our site using an FTP client; the file will be in the site’s root directory.

If we use the Yoast SEO plugin, we can edit it from the SEO » Tools page by clicking on File Editor.

For instance, below we see the site has disallowed access to some WordPress directories:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/

Now we need to remove the lines that block Google’s access to CSS or JS files on our site’s front-end.

Typically, these lines cover the plugins and themes folders. In addition, we need to remove the wp-includes line, since that folder holds core scripts such as jQuery.
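After removing those lines, a minimal robots.txt might look like the sketch below. Keeping wp-admin disallowed while allowing admin-ajax.php is a common convention, not a requirement:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php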

Sometimes the robots.txt file is either empty or does not even exist. If Googlebot does not find a robots.txt file, it simply crawls and indexes all files.

At times, a few WordPress hosting providers may proactively block access to default WordPress folders for bots.

We can override this via:

User-agent: *
Allow: /wp-includes/js/
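If the host blocks the plugin and theme folders as well, the same pattern extends to them, for example:

User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/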

Once done, save the robots.txt file.

Finally, visit the Fetch as Google tool again and click on the Fetch and Render button. Then compare the fetch results to confirm the resolution.
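As an extra local sanity check (a sketch only; the Search Console tools remain authoritative), Python’s standard library can parse the live robots.txt and tell us whether a given URL is crawlable for Googlebot. Both URLs below are placeholders:

from urllib import robotparser

# Placeholder URLs; replace with the real site and a reported asset.
ROBOTS_URL = "https://example.com/robots.txt"
ASSET_URL = "https://example.com/wp-content/plugins/some-plugin/script.js"

rp = robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetch and parse the live robots.txt

# True means the current rules allow Googlebot to crawl the asset
print(rp.can_fetch("Googlebot", ASSET_URL))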


 

Conclusion

In short, the major cause of this error is the accidental blocking of CSS and JS resources from search bots through the .htaccess file or robots.txt.
