Could AJAX + caching be seen as cloaking?

@Bryan171

Posted in: #Ajax #Cache #Cloaking #Seo

I have a website where we use a technique based on a combination of AJAX + caching to speed up loading times. Basically, when a page has a section whose content is slow to retrieve, we first check whether it's cached. If it is, we serve the content; if it's not, we serve a placeholder and then make an AJAX call from the client to retrieve the content, which is then cached for subsequent requests.

As a consequence, sometimes you get the entire page content in the first request, and sometimes you get those placeholders, which are filled immediately with the responses of the AJAX requests.
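In pseudocode terms, the flow looks roughly like this (a minimal sketch assuming an Express-style TypeScript server with an in-memory cache; the routes and names such as cache and getSlowContent are illustrative, not our actual ASP.NET code):

```typescript
// Sketch of the described pattern: inline the slow section if cached,
// otherwise emit a placeholder that the client fills in via AJAX.
// All identifiers here are hypothetical stand-ins.
import express from "express";

const app = express();
const cache = new Map<string, string>(); // sectionKey -> rendered HTML

app.get("/page/:id", (req, res) => {
  const key = `counts:${req.params.id}`;
  const cached = cache.get(key);
  // Cached: serve the section inline. Not cached: serve a placeholder
  // whose data-src tells the client what to fetch.
  const section =
    cached ??
    `<div id="counts" data-src="/ajax/counts/${req.params.id}">Loading…</div>`;
  res.send(`<html><body><h1>Listings</h1>${section}</body></html>`);
});

app.get("/ajax/counts/:id", async (req, res) => {
  const key = `counts:${req.params.id}`;
  let html = cache.get(key);
  if (!html) {
    html = await getSlowContent(req.params.id); // the expensive query
    cache.set(key, html); // subsequent page views get it inline
  }
  res.send(html);
});

// Stand-in for the slow retrieval (e.g. result counts per category).
async function getSlowContent(id: string): Promise<string> {
  return `<ul><li>Category counts for ${id}</li></ul>`;
}

// Client side (browser), sketched as a comment: fetch the placeholder's
// data-src and inject the response:
//   const el = document.getElementById("counts");
//   if (el?.dataset.src) {
//     fetch(el.dataset.src)
//       .then((r) => r.text())
//       .then((html) => { el.innerHTML = html; });
//   }

app.listen(3000);
```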

You can see an example in the result counts by category in the right-hand column of this page:
www.inzoco.com/crits/2-1-3-28-185-0-28079-0-0/listado-piso-en-alquiler-en-madrid-madrid.aspx
I'm worried that it could be seen as cloaking by search engines, because if you request a page whose content isn't cached and then request the same page again, you get different responses: the first with the placeholders and AJAX requests, and the second with all the content rendered.


1 Comment


@Kristi941

Cloaking is when you intentionally serve different content to search engines than to your users in order to manipulate search results. That isn't the case here.

Search engines will only see different content if the cache is created and saved after the page's initial visit, and even then only if the search engine is the first to view the page. So this scenario will only happen for a fraction of your pages, and only once per page. That's not really a problem.

But if you want to avoid this altogether, you may want to consider creating a cron job that runs when your site's traffic is lowest (probably overnight, around 3 AM) to cache the content of any new pages that don't yet have a cache generated by their initial page view.
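For instance, a hypothetical cache-warming script along these lines could be scheduled from cron (sketch only; listNewPageUrls is a placeholder for however you enumerate pages without a cache entry):

```typescript
// Hypothetical cache-warming script, e.g. scheduled nightly with:
//   0 3 * * * node warm-cache.js
// It simply requests each uncached page so the server generates and
// stores the slow sections before any visitor (or crawler) arrives.
async function warmCache(): Promise<void> {
  const urls = await listNewPageUrls();
  for (const url of urls) {
    try {
      await fetch(url); // first hit triggers generation + caching
      console.log(`warmed ${url}`);
    } catch (err) {
      console.error(`failed to warm ${url}:`, err);
    }
  }
}

// Placeholder: e.g. read from your sitemap or a database of pages
// that don't yet have a cache entry.
async function listNewPageUrls(): Promise<string[]> {
  return ["https://www.example.com/page/1"];
}

warmCache();
```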

An alternative option, or one you may wish to implement in addition to the above, is to serve links to the complete content and then remove them when using JavaScript/AJAX to load the content into the page. That way search engines always have access to the complete content. This is known as progressive enhancement, and it's recommended because it also gives users without JavaScript access to that content.
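As a sketch of that idea, assuming the server renders a plain link (marked here with a hypothetical load-section class) pointing at a URL that returns the full content, a small client-side script could swap the link for an in-page load:

```typescript
// Progressive-enhancement sketch: the server always renders a real link
// to the full content, so crawlers and no-JS users can follow it. When
// JavaScript runs, clicking the link loads the content in place instead.
// Assumed markup: <a class="load-section" href="/counts/123">View counts</a>
document
  .querySelectorAll<HTMLAnchorElement>("a.load-section")
  .forEach((link) => {
    link.addEventListener("click", async (event) => {
      event.preventDefault(); // no-JS visitors still get the plain link
      const response = await fetch(link.href); // same URL the link targets
      const html = await response.text();
      const container = document.createElement("div");
      container.innerHTML = html;
      link.replaceWith(container); // inline the full content
    });
  });
```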
