Handling SEO for infinite pages that require slow external API calls

I have an effectively infinite number of pages on my site that rely on an external API. Generating each page takes about a minute. Links on the site point to these pages, and when a user clicks one, the page is generated while they wait. Since I cannot pre-create them all, I'm trying to figure out the best SEO approach for handling these pages.
Options:

1. Serve really simple placeholder pages to web spiders, and only generate the full data for real users. I'm a little afraid Google will see this as low-quality content, which might also look duplicated.

2. Put these pages under a dedicated directory (e.g. /non-generated/) and disallow it in robots.txt (a minimal sketch of that rule follows this list). The problem is that I don't want users to deal with a different URL when sharing the page or making sense of it. I thought about redirecting real users from that URL back to the regular hierarchy, effectively 'fooling' Google into never reaching those pages, but again I'm not sure Google would look kindly on that.

3. Let Google crawl these pages. The main problem is that I can't control the rate of the API calls, and my site would also appear slower than it really is from a spider's perspective (if it crawled only the already-generated pages, it would look much faster).
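For context, option 2 would boil down to something like this in robots.txt, using the example /non-generated/ directory from above:

```
User-agent: *
Disallow: /non-generated/
```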
Which approach would you suggest?
If I were you, I'd make the link "nofollow" until the page has been generated once. Once it has been generated, store it in a cache or create a static page from it, and then make the link "follow" so the content can be indexed.
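A minimal sketch of that "nofollow until cached" pattern, assuming a Flask app with file-based caching; the route, the `fetch_from_external_api` helper, and the cache layout are all illustrative, not from the original post:

```python
import os
from flask import Flask

app = Flask(__name__)
CACHE_DIR = "cache"  # generated pages are stored here as static HTML


def cache_path(slug: str) -> str:
    return os.path.join(CACHE_DIR, f"{slug}.html")


def fetch_from_external_api(slug: str) -> str:
    # Stand-in for the slow (~1 minute) external API call described in the post.
    return f"<html><body>Content for {slug}</body></html>"


def link_to(slug: str, title: str) -> str:
    """Render a link; mark it nofollow until the target page has been cached."""
    rel = "" if os.path.exists(cache_path(slug)) else ' rel="nofollow"'
    return f'<a href="/pages/{slug}"{rel}>{title}</a>'


@app.route("/pages/<slug>")
def page(slug):
    path = cache_path(slug)
    if os.path.exists(path):
        with open(path) as f:  # already generated: serve instantly
            return f.read()
    html = fetch_from_external_api(slug)  # slow path, first visitor pays the cost
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "w") as f:  # cache for subsequent visitors and crawlers
        f.write(html)
    return html
```

Once a page exists on disk, `link_to` stops emitting `rel="nofollow"`, so crawlers only ever follow links to pages that will load fast.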
Optionally, consider generating all of those pages in an overnight cron job so they won't take that long to load when visited; a rough sketch follows.
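A rough sketch of that overnight pre-generation, reusing the cache layout from the previous snippet; `known_slugs` and the scheduling line are hypothetical stand-ins for however you enumerate and schedule your pages:

```python
# Schedule with cron, e.g.:  0 2 * * * /usr/bin/python3 warm_cache.py
import os
import time

CACHE_DIR = "cache"


def known_slugs():
    # Hypothetical: yield the identifiers of the pages you want pre-built.
    return []


def fetch_from_external_api(slug: str) -> str:
    # Stand-in for the slow (~1 minute) external API call.
    return f"<html><body>Content for {slug}</body></html>"


def warm_cache():
    for slug in known_slugs():
        path = os.path.join(CACHE_DIR, f"{slug}.html")
        if os.path.exists(path):
            continue  # already generated on a previous run or by a visitor
        html = fetch_from_external_api(slug)
        with open(path, "w") as f:
            f.write(html)
        time.sleep(1)  # throttle so the external API isn't hammered


if __name__ == "__main__":
    os.makedirs(CACHE_DIR, exist_ok=True)
    warm_cache()
```

Since the page set is effectively infinite, this can only pre-build a prioritized subset, but combined with the nofollow-until-cached links it keeps crawlers on fast pages.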