Many of our clients started receiving this message in Google Search Console this week:
Googlebot Cannot Access Your JavaScript and CSS Files
Last October Google updated its Webmaster Guidelines and explained that blocking access to your JavaScript and CSS files could soon hurt your ranking in search results. It appears that this week they started making good on that promise. The problem is that most sites are not intentionally blocking these files.
On a website you can use a file called “robots.txt” to indicate which content search engine bots (e.g. Googlebot) are allowed to crawl and index on your site. If this file is too restrictive, Google will now start penalizing your site in its search rankings. Google is trying to make sure that what Googlebot sees when it visits your site is the same thing a user would see, because a user’s browser pays no attention to the robots.txt file. If Googlebot cannot fetch your CSS and JavaScript, it cannot render the page the way a visitor sees it.
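To see how this plays out, here is a small sketch using Python’s standard-library robots.txt parser. The `/assets/` path and file names are hypothetical; the point is that a blanket `Disallow` on a directory also blocks the CSS and JavaScript inside it, while adding `Allow` lines for those subpaths fixes the problem without opening up the rest of the directory.

```python
from urllib.robotparser import RobotFileParser

# A restrictive robots.txt (hypothetical paths) that blocks the
# directory where the site keeps its CSS and JavaScript.
blocking_rules = """\
User-agent: *
Disallow: /assets/
"""

# A corrected version: Allow lines for the CSS and JS subdirectories
# are listed before the broader Disallow, so they take effect first.
fixed_rules = """\
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Disallow: /assets/
"""

def can_fetch(rules, url, agent="Googlebot"):
    """Return True if `agent` may fetch `url` under the given rules."""
    parser = RobotFileParser()
    parser.modified()                  # mark the rules as loaded
    parser.parse(rules.splitlines())
    return parser.can_fetch(agent, url)

print(can_fetch(blocking_rules, "https://example.com/assets/css/site.css"))  # False
print(can_fetch(fixed_rules, "https://example.com/assets/css/site.css"))     # True
```

Running a check like this against your own robots.txt is a quick way to confirm whether the files named in the Search Console warning are actually reachable by Googlebot.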
If you get this warning in your Google Search Console but don’t know how to remedy the problem, please don’t hesitate to call or email us and we can help you get it fixed before your search rankings drop.