Disabling frequent reloads of visitor
On my website I have some pages that make connections to 3-4 sites, check the results against my database, and gather a full table. This table includes all the data the visitor would need.
You can think of it as a scores table for the latest 100 basketball, football, and volleyball matches. Users see who won, the latest score, the location of the game, betting data, etc.
The source data changes every 15 minutes or so. So if the user reloads the page within this period, he will see nearly the same table (with only small differences) after a lot of CPU work and database querying.
I don't save the table data in my web storage, because the data becomes invalid after 1 hour.
My concern is that if a visitor reloads the page, this whole process is done again and again. How can I solve this?
The first ideas that come to mind:
Saving page data in the browser's localStorage and storing the last access time of the page in a cookie. When the page is reloaded, check the time difference: if it is smaller than 15 minutes, get the data from local storage and show the same page. But in this scenario I am limited by the 5 MB storage limit and browser compatibility.
Saving page data in my database and serving it from there. The problem with this is that the data is nearly 200 entries per page. If I use this model on 50 pages, I will have too many tables and too much data, all of which becomes invalid within an hour.
Saving page data in my web storage. This is the better option at the beginning, but it is not scalable. When the visitor count becomes big, this won't be maintainable.
I would be happy if you could share some ideas or better practices.
Thank you
The first thought I have is that if a few reloads are causing a problem, then you are certainly doing something wrong somewhere. You should never be making connections to other sites on every page load, as that sucks for the user experience in the first place, regardless of reloads. Furthermore, you are at the mercy of those other sites always being up and running and returning valid data.
Your best bet is almost certainly caching the data, of which there are several levels. If the data on the external sites changes every 15 minutes, then run a separate cron job to fetch that data and store it on your servers (e.g. if you are fetching an XML file, store that file). This will massively speed up the data processing every time a page loads.
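As a minimal sketch of that fetcher in Node/TypeScript (the feed URL, cache path, and function names here are made up for illustration; any scheduler, including a real cron entry, works the same way):

```typescript
// Hypothetical scheduled fetcher: pulls the external feed every 15 minutes
// and stores a local copy, so page loads never hit the remote site directly.
import { mkdir, writeFile } from "fs/promises";

const FEED_URL = "https://example.com/scores.xml"; // placeholder for one of the source sites
const CACHE_FILE = "./cache/scores.xml";           // local copy that page loads read instead
const INTERVAL_MS = 15 * 60 * 1000;                // matches the 15-minute refresh cycle

async function refreshFeed(): Promise<void> {
  try {
    await mkdir("./cache", { recursive: true });
    const res = await fetch(FEED_URL);
    if (!res.ok) throw new Error(`Upstream returned ${res.status}`);
    await writeFile(CACHE_FILE, await res.text());
    console.log(`Feed refreshed at ${new Date().toISOString()}`);
  } catch (err) {
    // If the remote site is down, keep serving the last good copy.
    console.error("Feed refresh failed, keeping old cache:", err);
  }
}

refreshFeed();                         // fetch once on startup
setInterval(refreshFeed, INTERVAL_MS); // then every 15 minutes
```

This also covers the "at the mercy of other sites" problem: a failed fetch just leaves the previous copy in place.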
Second, see if you are able to cache the results of the data processing, i.e. the table that you output. If the output is the same for every user, this is dead simple. If every user has different settings and so on, it will take up a lot more space, but there is at least something you can do there.
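For example, a sketch of that second level as an in-memory cache keyed by page and user settings (all names are illustrative, not from the question):

```typescript
// Hypothetical cache for the processed table, keyed by page + settings.
type CacheEntry = { html: string; expiresAt: number };

const TTL_MS = 15 * 60 * 1000; // same window as the source data
const tableCache = new Map<string, CacheEntry>();

function getCachedTable(pageId: string, settingsKey: string, build: () => string): string {
  const key = `${pageId}:${settingsKey}`;
  const hit = tableCache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.html; // reuse within 15 minutes

  const html = build(); // the expensive part: turn source data into the final table
  tableCache.set(key, { html, expiresAt: Date.now() + TTL_MS });
  return html;
}
```

If the output really is identical for everyone, `settingsKey` collapses to a constant and you hold exactly one entry per page.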
Regarding your ideas:
Local storage does not actually have a hard limit, but the browser will prompt users if you try to store more than 5 MB. I highly doubt you are anywhere near that amount anyway.
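If you do go this route, note that the timestamp can live next to the data itself rather than in a cookie. A minimal browser-side sketch, with made-up key names:

```typescript
// Hypothetical client-side cache: reuse the table if it is under 15 minutes old.
const CACHE_KEY = "scoresTable";
const MAX_AGE_MS = 15 * 60 * 1000;

function loadCachedTable(): string | null {
  const raw = localStorage.getItem(CACHE_KEY);
  if (!raw) return null;
  const { savedAt, html } = JSON.parse(raw) as { savedAt: number; html: string };
  return Date.now() - savedAt < MAX_AGE_MS ? html : null; // null means rebuild
}

function saveTable(html: string): void {
  localStorage.setItem(CACHE_KEY, JSON.stringify({ savedAt: Date.now(), html }));
}
```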
200 entries times 50 pages is absolutely nothing when it comes to databases: they are built to store millions of entries per table. It does mean running a lot of update queries every 15 minutes, though.
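You also don't need many tables: a single cache table with one row per page and an expiry column is enough. A sketch using SQLite via better-sqlite3 (the schema and names are assumptions, not anything from your setup):

```typescript
// Hypothetical server-side cache table: one row per page, overwritten each cycle.
import Database from "better-sqlite3";

const db = new Database("cache.db");
db.exec(`CREATE TABLE IF NOT EXISTS page_cache (
  page_id    TEXT PRIMARY KEY,
  payload    TEXT NOT NULL,    -- the ~200 entries, serialized as JSON
  expires_at INTEGER NOT NULL  -- unix ms; rows past this are treated as stale
)`);

function savePage(pageId: string, payload: unknown): void {
  db.prepare(
    "INSERT OR REPLACE INTO page_cache (page_id, payload, expires_at) VALUES (?, ?, ?)"
  ).run(pageId, JSON.stringify(payload), Date.now() + 15 * 60 * 1000);
}

function loadPage(pageId: string): unknown | null {
  const row = db
    .prepare("SELECT payload FROM page_cache WHERE page_id = ? AND expires_at > ?")
    .get(pageId, Date.now()) as { payload: string } | undefined;
  return row ? JSON.parse(row.payload) : null;
}
```

Because rows are replaced in place, the stale data cleans itself up instead of accumulating.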
Yes, this could take up a lot of space, but storage is cheap. If you need a fast solution, then do this now and keep looking at other options. There is often little point in being fully scalable until you are closer to actually needing to scale.