For performance reasons: have different pages for search engines and users

@Annie201

Posted in: #Ajax #Googlebot #GoogleSearch #Performance #Seo

I have a SEO question for single page applications.

We're working on a SPA (not yet public) that displays tables of numeric data to users.
The data volume is large, the numeric aggregations are costly to compute, and consistent state must be maintained for a large number of concurrent users. All of this puts heavy strain on the servers.

Since this is an AJAX application, we serve crawlers HTML snapshots following the recommended practices (e.g. the ?_escaped_fragment_= crawling scheme).
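
(For context, the scheme works like this: the crawler rewrites a #! URL such as /#!table=12345 into a request for /?_escaped_fragment_=table=12345, and the server answers that request with a pre-rendered HTML snapshot. A minimal sketch of the server side, assuming Node/Express and a hypothetical renderSnapshot() helper, not our actual code:)

import express from "express";

// Hypothetical stub: in reality this would build the pre-rendered
// HTML snapshot for the given application state.
async function renderSnapshot(state: string): Promise<string> {
  return `<html><body>snapshot for state: ${state}</body></html>`;
}

const app = express();

app.get("/", async (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (typeof fragment === "string") {
    // Crawler request: /#!table=12345 arrives as ?_escaped_fragment_=table=12345.
    res.send(await renderSnapshot(fragment));
  } else {
    // Human user: serve the SPA shell; the app loads its data via AJAX.
    res.sendFile("index.html", { root: "public" });
  }
});

app.listen(3000);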

The application is optimized for the usage patterns a human user generates, but a crawler indexing the entire site would cause a load spike.

The application's architecture allows us to scale both vertically and horizontally but that means more money spent on hardware. A cheaper solution would be to serve the search engines the HTML content with random data in it, instead of the real numbers that are costly to generate.

Everything would be the same as what a user sees in the browser, just with random numbers. Search engines wouldn't make sense of the numbers anyway, so there is no point in loading the servers to compute them.

As I said, my question is SEO-related. We're not trying to do SEO tricks or deceive search engines; we just want to lower the load on the server. But we're worried this could hurt the application's ranking. Would it?


2 Comments


@Heady270

Since this is an AJAX application anyway, write the data into the pages separately from your escaped fragments. A user's browser would make two AJAX requests:


GET /fragment?id=12345: returns the text and HTML for the screen, with a placeholder for the data.
GET /data?id=12345: returns the actual data to write into the screen (perhaps as JSON).


You could then have a robots.txt like:

User-agent: *
Disallow: /data


That way users still get the data, but search engine bots never request it. The bots can still get the text and other information from the /fragment URL using the normal ?_escaped_fragment_= crawling method.
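
A minimal client-side sketch of that two-request pattern (TypeScript; loadScreen and the #screen and #data-placeholder element IDs are just hypothetical names for wherever your app renders):

// Load one screen: first the crawlable HTML fragment, then the real data.
async function loadScreen(id: number): Promise<void> {
  // 1. The fragment is the same content crawlers get via ?_escaped_fragment_=.
  const fragmentRes = await fetch(`/fragment?id=${id}`);
  document.querySelector("#screen")!.innerHTML = await fragmentRes.text();

  // 2. /data is disallowed in robots.txt, so bots index the fragment above
  //    but never trigger this expensive request.
  const dataRes = await fetch(`/data?id=${id}`);
  const data: unknown = await dataRes.json();
  document.querySelector("#data-placeholder")!.textContent = JSON.stringify(data);
}

A nice side effect: since the expensive aggregation lives behind its own URL, you can cache or throttle it independently of the crawlable content.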


@BetL925

In my opinion, you shouldn't serve different content to users and to search engines; that's called cloaking, and as you probably know, it's a bad SEO practice. Even if you don't intend to manipulate search results, Google's bots can't tell the difference between your intent and an attempt to trick them; they're only bots.

When you say search engines wouldn't care about fake data in the tables, you're actually wrong. Search engine bots want the same data as users: they need it so that a user who clicks a result in the SERPs actually finds the content that was indexed.

Actually, I think you already found the solution, even if it is expensive. Apart from optimizing your site's architecture, I don't see how you could improve the site's performance (and thus its SEO).
