Will removing certain libraries when bots visit the web page affect SEO?
I included jQuery and Bootstrap mainly for user experience. When I checked my site's speed on Google's PageSpeed, it recommended that I remove those libraries to boost page speed.
Now, if I remove those libraries permanently, I would be compromising the site's responsiveness. So I decided to add a server-side check to determine whether a bot or a browser is visiting my web page. If a browser visits the page, I include the libraries; if a bot visits, I leave them out.
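To make it concrete, here is roughly the check I have in mind, sketched with Node/Express (the bot list and file paths are just examples):

    const express = require('express');
    const app = express();

    // Example bot patterns only; a real list would be longer.
    const BOT_PATTERN = /googlebot|bingbot|duckduckbot/i;

    app.get('/', (req, res) => {
      const isBot = BOT_PATTERN.test(req.get('User-Agent') || '');
      // Only emit the library tags for regular browsers.
      const libs = isBot ? '' :
        '<link rel="stylesheet" href="/css/bootstrap.min.css">' +
        '<script src="/js/jquery.min.js"></script>';
      res.send('<!DOCTYPE html><html><head>' + libs +
               '</head><body>Page content</body></html>');
    });

    app.listen(3000);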
Is this the right way? Will this affect SEO? Is there any other way I can achieve this?
I'm asking because, done this way, the bot's rendering and the browser's rendering will differ (though the text and other content remain the same). Is that cloaking?
I found the Q&A "Does blocking content for some users and showing it to Google for indexing affect SEO" but this does not totally answer my situation.
Some people use robots.txt to block CSS and JavaScript from Googlebot, and here you would hide them with a different technique (perhaps by checking the user agent or IP address), but Google doesn't like that. It will certainly affect SEO.
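For illustration, the robots.txt approach mentioned above looks something like this (the paths are hypothetical):

    User-agent: Googlebot
    Disallow: /css/
    Disallow: /js/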
Bootstrap contains a lot of CSS that you may not be using at all. I used Bootstrap recently, and when I ran a site audit via Chrome DevTools, it reported that 90% of the CSS was never used on my website. So why load it? I removed that unnecessary CSS from my Bootstrap file. The same applies to jQuery.
So optimize your framework: remove the unnecessary parts and host the files on your own server. If your site is minimal and doesn't need much CSS, you can put the CSS directly in the head section, which saves an extra HTTP request to the server.
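For example, a minimal page can ship its critical CSS inline in the head instead of requesting a stylesheet (illustrative snippet; the class names are made up):

    <head>
      <!-- Delivered with the HTML itself: no extra HTTP request -->
      <style>
        body { margin: 0; font-family: sans-serif; }
        .container { max-width: 960px; margin: 0 auto; }
      </style>
    </head>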
The key to loading JavaScript quickly is to load it only when required and without blocking page rendering.
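For example, the standard defer and async attributes download scripts in parallel without blocking rendering (a minimal sketch; the file paths are examples):

    <!-- defer: fetched in parallel, executed after HTML parsing finishes -->
    <script src="/js/jquery.min.js" defer></script>
    <!-- async: fetched in parallel, executed as soon as it arrives -->
    <script src="/js/analytics.js" async></script>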
Fooling the page speed tool with a server-side check is foolish. It won't improve your end users' experience of your site, nor will regular Googlebot be fooled; in fact, it may negatively impact your ranking in the mobile index because the page served to the bot is no longer responsive.
The best advice is to look into efficient JS loading techniques and optimize the site with RequireJS or a similar module loader for on-demand JS.
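A minimal sketch of on-demand loading with RequireJS (the CDN URL and jQuery version are just examples):

    <script src="/js/require.js"></script>
    <script>
      requirejs.config({
        paths: {
          // RequireJS path values omit the .js extension
          jquery: 'https://code.jquery.com/jquery-3.7.1.min'
        }
      });
      // jQuery is downloaded only when this require() call runs
      require(['jquery'], function ($) {
        $(function () {
          // page enhancements go here
        });
      });
    </script>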