Blocking JavaScript and CSS Could Harm Your SEO

Google has just updated its Webmaster Guidelines, and the new guidelines could be bad news for sites that block JavaScript and CSS files. In an announcement published on the Google Webmaster Central blog, Google explained that its indexing system has been updated so that it now behaves more like a real browser, which means that CSS and JavaScript files are being interpreted by the search engine.

Google gives webmasters some quite clear advice: they should allow Googlebot to access the JavaScript, CSS and image files that are displayed to end users, as this will provide optimal rendering and indexing for their websites. If webmasters disallow the crawling of CSS and JavaScript via their robots.txt, this will directly harm how well Google's algorithms render and index their content, resulting in less than optimal rankings (a short robots.txt example appears at the end of this article).

Google has warned webmasters that they should no longer view Googlebot as a text-only browser. The rendering engine does not support every technology, but it has become much more sophisticated. Page loading speed remains incredibly important, however, as does accessibility at a basic level.

To help webmasters understand how their site looks to Google, the search engine has updated the Fetch as Google tool, which can be found in the Webmaster Tools admin panel. This tool simulates how Google sees websites when it crawls them. When Fetch as Google is used in Fetch mode, it will crawl the URL it is given, and users can inspect the response that the website provides. This is a quick and simple way of checking that there are no network connectivity or security issues with the website. The Fetch and Render mode, on the other hand, will fetch the page the way that Google would see it and then render it as if it were being viewed by a normal end user with a relatively modern browser. This mode is useful because it provides a visual representation of any differences between the way a page is displayed to Google and the way it is displayed to an end user.

If it has been a while since you updated your SEO implementation, there is a high chance that your website is blocking CSS and JavaScript requests from Googlebot. It is a good idea for all webmasters to take a moment to examine their sites and relax such restrictions.
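For illustration, here is a minimal sketch of the kind of robots.txt rule that causes the problem described above, followed by the fix. The /css/ and /js/ directory names are hypothetical placeholders; check your own robots.txt for whichever paths hold your stylesheets, scripts and images.

    # Problematic: these rules hide rendering assets from all crawlers,
    # including Googlebot (the /css/ and /js/ paths are hypothetical)
    User-agent: *
    Disallow: /css/
    Disallow: /js/

The simplest fix is to remove those Disallow lines, since crawling is allowed by default; an empty Disallow makes this explicit:

    # Fixed: no paths are disallowed, so crawlers can fetch the assets
    # needed to render the page
    User-agent: *
    Disallow:

After making a change like this, the Fetch and Render mode described above can confirm that the page Google renders now matches what visitors actually see.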
