Impact of storing image sources into custom data attributes
I am working on a project which has a huge number of images. Some pages reach up to 15MB of images alone. Naturally, before even thinking of putting something like that online, I want to reduce the number of images that get loaded during the initial page load. To do that I am thinking about using a form of lazy loading, where the images get loaded only as they are requested.
The technical principle is pretty simple. The src values for the lazy-loaded images point to a tiny placeholder, and the real URLs are stored in a data attribute like this:
<img src="data:image/png;base64,..." data-src="path/to/image/image.jpg">
After a user action the data-src value is simply copied into src through JavaScript.
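Something along these lines would do the copy (the loadDeferredImage helper and the #model-3 selector are just placeholder names for this sketch):

function loadDeferredImage(img) {
  if (img.dataset.src) {
    img.src = img.dataset.src;        // copy the real URL into src
    img.removeAttribute('data-src');  // so the same image is never loaded twice
  }
}

// e.g. reveal every deferred image inside one model's container
document.querySelectorAll('#model-3 img[data-src]').forEach(loadDeferredImage);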
What I am worried about here is accessibility, and whether there would be any negative impact on how Google and other search engines see this optimization. Do search engines even index values stored in data attributes? My guess is no, because data attributes are custom data and have no real meaning.
I have thought about adding an additional <noscript> tag next to every image, just to make sure search engines do get to index the image URLs:
<img src="data:image/png;base64,..." data-src="path/to/image/image.jpg">
<noscript>
  <img src="path/to/image/image.jpg">
</noscript>
As a matter of fact, I even thought about putting all the image URLs inside a single <noscript> tag, hidden and located somewhere near the end of the page. The main reason is that if JavaScript is disabled there is basically no way to trigger the on-demand image load, so putting <noscript> tags next to individual lazy-loaded images doesn't make much sense.
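Roughly what I have in mind, placed near the end of <body> (paths are placeholders):

<noscript>
  <img src="path/to/model1/image1.jpg">
  <img src="path/to/model1/image2.jpg">
  <!-- ...one plain img tag for every deferred image on the page... -->
</noscript>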
I would simply like to minimize the impact that such a huge number of images has on page load times and make sure Google actually knows about the images and indexes them.
EDIT
Lazy loading is not triggered by scrolling; in fact, the proper term would probably be deferred images, because the pages behave more like small SPAs.
Every page has around two dozen versions of a single product. Each version has four images, each weighing approximately 150-200KB. Clicking on an icon for a specific version should display those four images in a carousel-type rotation (image rotation is triggered manually).
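The wiring for that could look something like this sketch (the .version-icon class, the data-model attribute, and the carousel- id prefix are all invented names):

document.querySelectorAll('.version-icon').forEach(function (icon) {
  icon.addEventListener('click', function () {
    // load the four images for the selected version, once
    var panel = document.getElementById('carousel-' + icon.dataset.model);
    panel.querySelectorAll('img[data-src]').forEach(function (img) {
      img.src = img.dataset.src;
      img.removeAttribute('data-src');
    });
  });
});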
Servers are more than capable of handling all the bandwidth; my only concern is client-side user experience. Of course those two are bound together, but the point is that it's the client side that controls how the requests are made and optimized.
As already explained, the given ~15MB comes mainly from the huge number of images, not primarily from the size of each individual image (24 models * 4 images per model * 150KB ≈ 14MB).
Reducing the number of images is not an option. It's a requirement set by marketing.
Further reducing image size could be a problem because one of the project goals is to be optimized for as many screen sizes as possible, large screens like 1440p included. We do have a srcset in place which serves screen-size-specific images to optimize bandwidth, but that still doesn't solve the problem of serving such a huge number of images.
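Combining the two techniques, a deferred responsive image would carry both attributes (file names and breakpoints here are made up):

<img src="data:image/png;base64,..."
     data-src="product-1024.jpg"
     data-srcset="product-640.jpg 640w, product-1024.jpg 1024w, product-2560.jpg 2560w"
     sizes="(max-width: 640px) 100vw, 50vw">

The reveal step would then copy data-srcset into srcset the same way it copies data-src into src, so the browser still picks the screen-size-specific file.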
The only reason why it would be preferable for images to get picked by search engines is that our current visits are often triggered by potential customers actually searching for specific images (Google Image Search).
My primary concern is that if I let the page load all the images during the initial page load, so that they become immediately available when a user clicks on a different model, then I am facing terrible user experience. There is also a possibility that someone picks a model at the end of the page before the images have actually loaded, which would leave the visitor staring at a big white chunk of nothing. However, once the images are loaded the user experience is flawless, with no waiting to switch models.
On the other hand, on-demand loading would have at least two huge advantages: the initial page load time and the total amount of bandwidth used would be dramatically reduced. Loading all the images upfront essentially assumes that every user will view every available model. On the negative side, on-demand requests would require the user to wait for images to load every time he or she selects a new model. I've seen things go bad in situations like that on mobile devices, which can sometimes have issues with even small amounts of data because they can't always maintain a stable connection.
It's a trade-off dilemma. Somehow, after summarizing things like this, I am leaning more towards the second solution - loading the images on demand.
1 Comment
In your case, I do not think how you present the code matters as much as file size.
I cannot see any situation where you would ever need to load images that big.
Google is big on file size and load times. Even if at most one photo loaded per page, the result would not be good.
If the files were in a format where a larger file size is normal (e.g. WAV, MP4), then poor user experience and search engines would be of no concern here.
No matter how you hide or present the code, expect a bad result: as you said, you want it to be indexed, so you are telling the bot crawlers to come right on in and view the code.
From a user experience point of view, I cannot see it being a pleasant one, unless I am missing something.
Your best bet is to shrink the files, display more than one image at a time, and allow the user to select a larger image view and/or a download link.
This will eliminate many possibilities of poor user experience and poor ranking results.
Note: even if you use lazy loading or scroll-triggered loading, users tend to scroll down fast on an initial visit to get a visual overview, so in both cases lazy loading and similar events will not be effective.
Photographers normally have large image files.
Whether it is a single image or a single image in a storyboard design, I always try to steer clients away from doing either. A well-presented gallery with access to larger images on click improves UX and search ranking, in addition to freeing up server resources.
Edit based on your Edit:
Interesting scenarios
Initial page images load...
Toggling/Add Class
display:none hides the content > the browser pre-loads the content > content-rich source code. Image search links back to the content-rich page.
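A rough sketch of that toggle (class and id names are invented): everything stays in the markup, so the source remains content-rich and the browser pre-loads all the images, while switching models is just a class swap with no network request.

.model-panel        { display: none; }
.model-panel.active { display: block; }

function showModel(id) {
  document.querySelectorAll('.model-panel').forEach(function (p) {
    p.classList.toggle('active', p.id === id);
  });
}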
Serve External Content
Load the main 24 images > the user clicks an image and a modal loads external model content (the remaining 4 images and any other info) > a preloader for good UX. Image search links back to the individual sub-model page. Where's all the content for that page now, eh?
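Sketched out, the modal route could be as simple as this (the URL and element id are placeholders, and a real version would probably extract only the needed fragment from the response instead of injecting the whole page):

function openSubmodelModal(url) {
  var modal = document.getElementById('modal');
  modal.innerHTML = '<p class="spinner">Loading...</p>'; // preloader while fetching
  modal.style.display = 'block';
  fetch(url) // e.g. the standalone sub-model page
    .then(function (response) { return response.text(); })
    .then(function (html) { modal.innerHTML = html; });
}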
In short, it should have: user-controlled bandwidth, good UX, content-rich pages (or a CTA) for visitors entering from image search, indexed content, and reasonable page size and page load times.
So how about:
Models treated as category pages (the page loads 24 models)
On click, a modal opens with a preloader > external content loads (delays in this case are anticipated and therefore a good UX)
The external content should be designed with article-page and modal-friendly layouts
Image search visitors view the entire content of the exact product they were looking at in the first place
Add a CTA that links back to the corresponding model page
Combine this with content-related directory names and URLs:
example.com/category/model/submodel_page
example.com/photos/category/model/submodel/abc.jpg