This is an archived snapshot of W3C's public Bugzilla bug tracker, decommissioned in April 2019.

Bug 17842 - <img>: Feature to make <img> elements not load their images until needed
Status: RESOLVED MOVED
Product: WHATWG
Classification: Unclassified
Component: HTML
Version: unspecified
Hardware: Other
OS: All
Importance: P3 enhancement
Target Milestone: 2018 Q1
Assignee: Ian 'Hixie' Hickson
QA Contact: contributor
Whiteboard: blocked on dependencies, picture
Depends on: 25715, picture
Reported: 2012-07-18 07:01 UTC by contributor
Modified: 2017-07-01 19:07 UTC
CC List: 25 users

Description contributor 2012-07-18 07:01:05 UTC
This bug was cloned from bug 16830 as part of operation convergence.
Originally filed: 2012-04-23 15:06:00 +0000
Original reporter: Josh T. <mephiles@live.co.uk>

================================================================================
 #0   Josh T.                                         2012-04-23 15:06:13 +0000 
--------------------------------------------------------------------------------
The BBC recently updated their BBC News mobile website. One of the ways they optimised it for mobile devices was by deferring image loading until after the page has loaded.

I haven't looked into how this is done in detail, but they used DIVs as placeholders in the positions where the images would go. For example:

<div class="delayed-image-load" data-src="http://static.bbci.co.uk/news/200/media/images/59388000/jpg/_59388680_59388679.jpg"></div>

After the page has loaded, the DIV is changed into an IMG element.
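
A minimal sketch of the kind of swap script this implies, reusing the class name and data-src attribute from the example above (the BBC's actual code may differ):

window.addEventListener('load', function () {
  var placeholders = document.querySelectorAll('div.delayed-image-load');
  for (var i = 0; i < placeholders.length; i++) {
    var img = document.createElement('img');
    // Copy the deferred URL into a real image and swap it in.
    img.src = placeholders[i].getAttribute('data-src');
    placeholders[i].parentNode.replaceChild(img, placeholders[i]);
  }
}, false);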

Clearly, this isn't very semantic, and wouldn't work at all if JavaScript is disabled. But they have a good reason for doing this, which is why I believe there is a need for a way of deferring the loading of images until after the page has been parsed. This could also be applied to other embedded content elements, like IFRAME and OBJECT.

The way I think this should be done is by using the DEFER attribute, which would work in the same way as it does in the SCRIPT element.
================================================================================
 #1   Odin Hørthe Omdal                               2012-05-08 13:16:06 +0000 
--------------------------------------------------------------------------------
Lots of pages do this, even normal web pages: load (fade in) on scroll.

I think there was another bug for this. I believe it's a hint worth having. For painting it'd be nicer to actually have height/width set though.
================================================================================
Comment 1 Odin Hørthe Omdal 2012-07-18 08:47:19 UTC
Another thing, I totally forgot this bug here, but I also proposed having

   <img defer>

for the artistic use case for responsive images.

That way you can do your media query/breakpoint checks in JavaScript and load the appropriate image based entirely on your client-side interpretation.
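
For example, a sketch (the data-small/data-large attribute names are made up for this illustration, and assume the deferred image hasn't been fetched yet):

document.addEventListener('DOMContentLoaded', function () {
  var imgs = document.querySelectorAll('img[defer]');
  // A client-side breakpoint check, instead of markup-driven selection.
  var wide = window.matchMedia('(min-width: 600px)').matches;
  for (var i = 0; i < imgs.length; i++) {
    imgs[i].src = imgs[i].getAttribute(wide ? 'data-large' : 'data-small');
  }
});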
Comment 2 Ian 'Hixie' Hickson 2012-12-02 04:08:33 UTC
Not sure I understand comment 1 (if you're doing this in JS, why not insert the image in JS too?).

Regarding the original comment, though, I'm not sure I really follow the idea. What does deferring the images do? The page won't parse any faster, as far as I can tell. What's the benefit for not loading images right away?
Comment 3 Josh Tumath 2012-12-02 23:01:44 UTC
(In reply to comment #2)
> Regarding the original comment, though, I'm not sure i really follow the
> idea. What does deferring the images do? The page won't parse any faster, as
> far as I can tell. What's the benefit for not loading images right away?

I think a good use case for this has been presented by the BBC's mobile Web site. It allows 'essential' smaller images like icons to be loaded whilst larger images aren't loaded until all of the text has been. Doing this with DIVs in JavaScript seems ugly and the images won't be displayed if JavaScript is disabled or the script file fails to load. Obviously this would only benefit users with very slow mobile data connections.
Comment 4 Odin Hørthe Omdal 2012-12-03 09:20:12 UTC
There are many benefits. You have to ask yourself: why do so many web pages do this today?

Just among the 2-3 pages I visit daily, the two biggest newspapers in Norway do this deferred image load. A blogging network site called blogg.no also does it on *all* blogs across their entire network, and they're the biggest blog network in Norway. So going into any random food or fashion blog, all images are <div>s; they don't load until they are scrolled into view.

The reason they do that is to save money on bandwidth, I would guess. Because they set width and height, the loading is not irritating at all. The pictures don't just "pop in"; they fade in quite beautifully if you're scrolling faster than they load.


So there's two modes I'd see something like this having:

1.  The default behaviour, which could have an in-spec hook like @srcset for implementation-specific behaviour. The default could be: don't let these images/resources slow down the site, so don't load them until everything else is finished. When you do load, fetch the image and fade it in (basically what the BBC, Norwegian news sites, blogg.no etc. do). It could (and probably should, though maybe not by default) also defer stuff that's "too far down" on the page. But that can also be done using the js-based solution:

2.  Have a js-API which would "take control" of the image (I don't know how - preventDefault some event? calling the_img.load()? setting the_img.src = "my_custom_url.jpg"?). In this way you could start doing some advanced image selection algorithms again. And if the default (1.) behaviour lazily loads every non-visible image on the page, you could override that behaviour if you're blogg.no and really don't want to do the extra image loads.


The benefit of this, of course, is that if you have no JavaScript, or an old browser, you still get the pictures, whereas with the current <div>-based hacks, you don't get anything.

Opera Mini (which is not a conforming browser bla bla) actually has had quite a few problems with these lazy-loading sites, like our newspapers and the blogging sites. If these had been real <img>s instead, it'd be easy to declaratively load everything anyway, even though the site really wanted to optimize for not loading on the more common user agents.

Also, mobile browsers could have a default where they don't load images until they see the user scroll; at that point they could warm up the 3G connection and actually load all the rest of the images. Whereas the desktop version could always (lazily) load every picture on the page if it wants to optimize for that.




So, the reason for not doing this in pure js is fallback, code beauty, and getting some nice, common behaviour "for free" - while being able, as a web author, to extend that if you wish. Another nice thing is that it makes opt-in deferred image loading possible again, which Opera did in fact try, but found that it broke the web.
Comment 5 Ian 'Hixie' Hickson 2012-12-03 23:24:00 UTC
Delaying the image until it's needed in the rendering makes sense to me, but that's not just delaying until after onload, that's delaying possibly forever.

Is that what this is requesting?

Delaying until after onload doesn't make sense to me. In particular, delaying the load of the images shouldn't make any difference regarding how fast the original file loads, since unless the HTML file is just ridiculously large, it's going to have been downloaded and parsed long before the requests for the images start returning data.
Comment 6 Marcos Caceres 2012-12-04 10:02:00 UTC
I'd like to add a use-case for deferred image loading: don't load unnecessary resources until the user actually needs them.

Consider cuevana.tv (scroll to the movies/series section): all the movies or shows that are off screen don't get loaded until the user scrolls them into view. This saves bandwidth for the site, as they don't need to transfer resources that the user will never see (until the user wants to see them). It also saves bandwidth for the user, as they don't need to download a bunch of images they will never see.
Comment 7 Odin Hørthe Omdal 2012-12-04 10:46:57 UTC
Ian: Yes, exactly; possibly delaying forever.
Comment 8 Ian 'Hixie' Hickson 2012-12-05 20:40:45 UTC
Josh, can you confirm that that is what you meant?
Comment 9 Josh Tumath 2012-12-05 22:21:45 UTC
(In reply to comment #8)
> Josh, can you confirm that that is what you meant?

Yes. I didn't have a specific way in mind of how it would work but giving control over when an image is loaded is what I meant.
Comment 10 Ian 'Hixie' Hickson 2012-12-05 22:40:35 UTC
Ok, that makes sense. Thanks for the clarification.
Comment 11 louisremi 2012-12-06 14:54:10 UTC
This "defer forever" mechanism would work very well with the ViewportObserver I proposed[1], allowing users to observe images entering the Viewport and loading them at that point.

[1] https://www.w3.org/Bugs/Public/show_bug.cgi?id=20246
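
A sketch of what such observer-driven loading could look like, written here with the IntersectionObserver API that browsers later shipped (the data-src attribute is an assumption for this illustration):

var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      // The image is entering the viewport: start the real load.
      entry.target.src = entry.target.getAttribute('data-src');
      observer.unobserve(entry.target);
    }
  });
});
Array.prototype.forEach.call(document.querySelectorAll('img[data-src]'),
                             function (img) { observer.observe(img); });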
Comment 12 Ian 'Hixie' Hickson 2013-02-09 22:11:44 UTC
Are any browser vendors interested in implementing this?
Comment 13 Mark Nottingham 2013-02-13 03:15:25 UTC
Another use case, just to make sure it's explicit: many sites will set display: none on images if they're viewed on small screens, etc. Downloading those images wastes bandwidth (which costs money for many users), and makes the page slower.
Comment 14 Ilya Grigorik 2013-02-13 10:17:23 UTC
Perhaps counterintuitively, I don't think the "defer forever" use case makes sense on mobile, which is the case everyone here is focusing on. But before we get there...

(a) Instead of "defer", I do think we would benefit from an "async" attribute on img tags. The difference is perhaps a bit subtle.. We don't want to delay the start of image loading until an arbitrary point in the document, but we *do* want to provide an additional hint to the browser to help prioritize assets in a better way. Marking an image as async would effectively tell us that this is a low-priority image (invisible, below the fold, etc), which would allow the browser to avoid starving other critical resources of bandwidth.

Looking forward, with HTTP 2.0, this will be a perfect use case: we can send the request to the server immediately, but give it a low(er) priority. If the server can saturate the pipe with higher priority resources, then it will do that.. but if higher priority resources are trickling out without making use of the full bandwidth of the pipe, then remaining capacity can be used for the lower priority image. (both optimal scenarios) 

On mobile, where the best practice is to burst and saturate the pipe.. this is the behavior you want.

(b) Defer forever doesn't make sense. First off, everyone has a different definition of what forever means. For me it may be on scroll, for you it's onload, for another user it's when a user clicks on a "next image" gallery button. 

Second, and perhaps even more important.. Providing this mechanism would likely cause more harm than good on mobile. On mobile, we want developers to burst as much data as possible, then shut off the radio, and not touch it (preferably until next page load :)). The 'on-demand' loading pattern is an anti-pattern: it introduces a ton of latency, and burns your battery like nothing else.

Long story short: +1 for an async attribute as a hint for a low-priority image, but async should not define any behaviors as to _when_ the image is loaded. This decision should be deferred to the browser+server, which can figure this out dynamically based on the user's connection. Finally, if you do want to defer the load, then you have plenty of existing JS solutions to do so.
Comment 15 Jake Archibald 2013-02-13 14:38:25 UTC
This attribute, defer/async/whatever, would be a hint to the browser that the image may not be needed during the lifetime of the page.

The browser MUST NOT start downloading the image before DOMContentLoaded, allowing JavaScript to change its src dynamically but providing a fallback if JavaScript does not execute. (worried about the performance of this, maybe there's a better way).

The browser MAY give this image a lower priority than images without the attribute, and other page resources.

The browser MAY defer image downloading until the image would be visible within the viewport. Browsers could go extreme, and avoid downloading images that are within the viewport but completely obscured by other elements. If the device is using a connection that benefits from bursts, it may load images that are outside the viewport to a certain degree, or load all images regardless (after DOMContentLoaded).

Ilya, does that fit in with the network stuff?
Comment 16 Ilya Grigorik 2013-02-19 18:58:34 UTC
Jake and I had an offline chat; to summarize the outcome...

Having a simple priority hint on <img>, which allows the author to explicitly mark which images have a lower relative priority (for whatever reason, ex: hidden gallery image) *would* be useful, with the following rule: "The browser MAY give this image a lower priority than images without the attribute, and other page resources."

There are no promises as to when the image would get loaded - this decision should be deferred to the client, which is in the best position to determine the optimal behavior based on the current connection. Similarly, this is not a mechanism for "defer forever" - if necessary, that is better tackled via a separate mechanism (see discussion in bug 20246).

Re: browser vendor interest. Assuming the above, I do think this is something we would like to have in Chrome.

Thoughts, comments?
Comment 17 Yoav Weiss 2013-04-03 18:01:30 UTC
I'd like to raise another use-case for a `defer` attribute - the "fallback" use-case.
When `img` elements are used as a fallback for other elements such as `video` or `picture`, their resources are downloaded even though they are never displayed. `img` triggers a resource download upon its creation, before it enters the DOM, so there's no elegant & simple way to define an "if img is inside video, don't load the resource" mechanism. (At least, I failed to find such a way in WebKit. Implementors: please correct me if I'm wrong.)
A `defer` attribute would enable developers to continue using `img` as a fallback, without the performance penalty.

In order to support that use-case, the User-Agent must avoid loading deferred image resources that weren't yet added to the DOM (since they may serve as fallbacks).

I'm not sure if this needs to be specified, or if it simply falls under "MAY defer".
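
For concreteness, the markup for this use-case would look something like the sketch below, where `defer` is the attribute proposed here, not anything specified today:

<video poster="poster.jpg" controls>
  <source src="movie.webm" type="video/webm">
  <!-- Downloaded today even when the video plays; with defer it would
       only load when actually shown as fallback content. -->
  <img src="fallback.jpg" defer alt="Description of the movie">
</video>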
Comment 18 Ilya Grigorik 2013-04-04 00:43:44 UTC
@Yoav: isn't this case covered by "poster" attributes on video/audio tags? 

---
poster = URL potentially surrounded by spaces
The address of an image file for the UA to show while no video data is available.
---
Comment 19 David Newton 2013-04-04 12:53:33 UTC
The poster attribute assumes that the browser understands the `video` element; if `video` is not supported, `poster` will be ignored. The `img` fallback as a child of `video` (or `picture` or whatever) is intended to be loaded only if `video` is not supported or there is no `poster` specified.

Right now, even if a poster is present and/or one of the `source` resources is chosen, the fallback `img` is still downloaded. A `defer` attribute/prop should help avoid this.
Comment 20 Ilya Grigorik 2013-04-05 03:07:57 UTC
(In reply to comment #19)
> The poster attribute assumes that the browser understands the `video`
> element; if `video` is not supported, `poster` will be ignored. The `img`
> fallback as a child of `video` (or `picture` or whatever) is intended to be
> loaded only if `video` is not supported or there is no `poster specified.

Gotcha.

> Right now, even if a poster is present and/or one of the `source` resources
> is chosen, the fallback `img` is still downloaded. A `defer` attribute/prop
> should help avoid this.

So, one alternative strategy is to rely on some JS test and then inject an image - correct? It appears that modernizr ships with an audio/video test.

Also, why would you ever want the image to not "defer" inside of this fallback case? That is.. isn't this a more generic rule: if inside <video> or <audio>, don't start the download unless all sources fail? The extra "defer" marker seems unnecessary and just muddies the waters.
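
A sketch of that alternative strategy (the container id is hypothetical, and the canPlayType check stands in for Modernizr's video test):

var container = document.getElementById('player'); // hypothetical id
if (!document.createElement('video').canPlayType) {
  // No video support: inject the fallback image instead.
  var img = document.createElement('img');
  img.src = 'fallback.jpg';
  container.appendChild(img);
}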
Comment 21 Jake Archibald 2013-04-05 09:37:51 UTC
Deferring until DOMContentLoaded is pretty nuts, hadn't really thought that through. Here's another stab at <img defer>:

Images with 'defer' MUST NOT download if they are not in a document, or their 'display' style property does not calculate to 'none'.

This should cover images inside video elements but also elements hidden by CSS, which could be set in a selector like ".has-js .delayed-image-load" which covers the BBC case.

The download state of images with 'defer' MUST NOT delay the 'load' event of the window.

The 'load' event of an image fires when the image has loaded and decoded (as currently specced), but of course this will never happen if the image has display:none for the life of the page.

Images with 'defer' MAY download once they are in the document, and their calculated 'display' style property is not 'none'.

Images with 'defer' MUST download once they are in the document, and their calculated 'display' style property is not 'none', and the image element might appear within the viewport.

This gives the browser a bit of leeway on image download. Eg, if the device is on a cell network it may download images that are out of view, to save waking the radio up constantly, although it may still defer any decoding. Devices on wifi may download more conservatively.

Potential issues:

var img = new Image();
img.src = url;
img.defer = true;

From what I can tell from the spec, the download is queued, synchronously, when the src is set. Can setting 'defer' easily unqueue that download? I'd rather not require devs to set these properties in a given order.

Working out if an image is in the viewport is tricky for image elements whose size is determined by the image (which we don't know, because we haven't downloaded it). In this case, I guess the browser could play it safe and download the image once scrollTop/scrollLeft is greater than the image's top/left position, as well as within the viewport.
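
In author-land terms, that safe check might look like the sketch below (a real implementation would live inside the engine):

function mightBeVisible(img) {
  var rect = img.getBoundingClientRect();
  // With no intrinsic size known yet, we can only rule the image out
  // when its top-left corner is beyond the viewport's bottom-right edge.
  return rect.top < window.innerHeight && rect.left < window.innerWidth;
}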
Comment 22 Yoav Weiss 2013-04-05 14:57:53 UTC
@Ilya: My initial thoughts were to do exactly that - prevent `img` from loading resources when it is inside `picture`/`video`.
However, since `img`'s src starts loading as soon as the element is created, before its addition to the DOM, the element is not aware of its future parent element when the loading starts. I found no easy solution around that which doesn't involve modifying the behavior of `img`, or ugly hacks that pass the parser's state ("inside picture/video/audio") to the `img` element.
I've talked to Robin Berjon about this, and he suggested making all the DOM inside `picture`/`video` inert, preventing internal images from loading their resources. That's a cleaner solution, but possibly more complicated to implement.
If anyone here has an idea of a simple way to implement this without relying on extra attributes, I'd love to hear it.

While searching for the ideal solution, I thought that a `defer`-based fallback simplifies the implementation, without (too many) drawbacks for devs.

@Jake: Sounds great to me.
The issue you raised with `img.src=url` is a real problem and even if you could easily dequeue the image request (which I'm not sure all browsers can), you'd have a race condition in this case.
One alternative would be adding a `deferred` parameter to the `Image()` constructor, which is false by default. So in order to create a dynamic deferred image, devs would write:
   var img = new Image(true);
   img.src=url

I'm not sure it's much better than forcing devs to set the attributes in a specific order. What do you think?
Comment 23 Marcos Caceres 2013-04-05 15:18:24 UTC
(In reply to comment #22)
> @Jake: Sounds great to me.
> The issue you raised with `img.src=url` is a real problem and even if you
> could easily dequeue the image request (which I'm not sure all browsers
> can), you'd have a race condition in this case.

FWIW, I don't think setting parameters in a particular order is a problem. Setting attributes on objects can have synchronous side effects so order matters... it's like arguing that a dev should be able to invoke xhr.send() before xhr.open(). 

I'll also add that dequeuing sounds very complicated.

> One alternative would be adding a `deferred` parameter to the `Image()`
> constructor, which is false by default. So in order to create a dynamic
> deferred image, devs would write:
>    var img = new Image(true);
>    img.src=url
> 
> I'm not sure it's much better than forcing devs to set the attributes in a
> specific order. What do you think?

It's certainly nicer to have it in the constructor, though for legibility and extensibility you probably want to pass in a dictionary:

new Image({defer: true})
Comment 24 Odin Hørthe Omdal 2013-04-05 16:52:35 UTC
(In reply to comment #21)
> Deferring until DOMContentLoaded is pretty nuts, hadn't really thought that
> through. Here's another stab at <img defer>:
> 
> Images with 'defer' MUST NOT download if they are not in a document, or
> their 'display' style property does not calculate to 'none'.

Shouldn't this be "'display' style property calculates to 'none'"?  The intention was to not download hidden images I assume?

> This should cover images inside video elements but also elements hidden by
> CSS, which could be set in a selector like ".has-js .delayed-image-load"
> which covers the BBC case.

How?  The .has-js would not be applied before the script that sets it has run, is it a requirement then for that to be run in a <script> tag before any css?  And this method, would it not have to wait until all CSS is loaded anyway to be able to check what the CSS evaluates to?

So you are deferring this even longer? I.e. always after DOMContentLoaded, because you need all the CSS, since you never know if at the end of the page you will get some crazy JavaScript that loads lots of new CSS?

So basically DOMContentLoaded would still be a good place to do all those site-specific hooks?

(In reply to comment #21)
> var img = new Image();
> img.src = url;
> img.defer = true;
> From what I can tell from the spec, the download is queued, synchronously,
> when the src is set. Can setting 'defer' easily unqueue that download? I'd
> rather not require devs to set these properties in a given order.

I do not know the restrictions for image loading at such a detailed level, but doing image queuing as a (micro?)task that starts right before the running JS returns to the event loop would only make the queuing happen mere CPU instructions later than it does now. Compared to any network fetch, that can't even be measured.

Or I might be right off the ball, because it sounds a bit too easy.

> Working out if an image is in the viewport is tricky for image elements
> whose size is determined by the image (which we don't know, because we
> haven't downloaded it). In this case, I guess the browser could play it safe
> and download the image once scrollTop/scrollLeft is greater than the image's
> top/left position, as well as within the viewport.

If you have 2000 pictures with no width/height set, they would in this case actually all be inside the viewport when nothing has loaded.  But I guess heuristics for that could be in browser engine land, and not at spec level (since you should be allowed to defer anyway).

(In reply to comment #14)
> Perhaps counterintuitively, I don't think "defer forever" use case makes
> sense on mobile, which is the case everyone here is focusing on.

I almost didn't mention mobile at all.  But it also makes sense in another way than the ones you've outlined along with others in your later comments: the way this is solved today (data-src) doesn't give any information to mobile browsers, which would like to get everything on a warmed-up connection.  So by actually leaving the smarts up to the browser for these common cases, it can choose to be smart however it wishes.  So network performance-wise it /is/ in fact a win for mobile, and does make sense.

And defer forever is not really for the client so much as for the site owner, who would like to save bandwidth; defer forever would have to be implemented with JavaScript, because you can't rely on the browser to do that.

So this is giving more power to both the user agent, and the web page author when they use it.

For the user agent:

- Allowing it (and its user) to make load decisions, because the image is
  there right away, instead of being a useless div.

    For a 3G UA this could mean it can stuff everything it needs into
    that warmed up connection.

    For an offline downloader it means it can fetch the images without
    having to fake scroll events through the whole page.

- It gets information to better prioritize its network loads; it will know
  these images are not as critical as, say, an icon without @defer.


For the web page author:

- Having the user agent wait just long enough for you to have your say.

    A mobile-first web site might have small images (and no srcset/...
    yet ;-)). Before the image loads start, some js can rewrite all
    the URLs from /images/small/ to /images/1600/ because it can see
    you have a huge viewport.

    A site wanting to save money can disable image loading but still
    use <img> (photo blogs, the 'defer forever')

    A carefully designed web page wants to wait for layout, and till
    all its media queries have come into effect, to see how the page
    lays out. Depending on the layout, it loads different types of
    images.

    A generic "load small images" script will be able to wait for
    layout, and only request the exact image to fit into a given
    space.


There are many examples of sites using <div>s instead of images to be able to get a hook point to control the images; this is the strongest point :-)  It's actually done atm, but in a very bad way.

This is in addition to all the rest.
Comment 25 Jake Archibald 2013-04-07 17:09:27 UTC
@Marcos @Yoav: Yeah, we already require users to set load events before src, so I don't think this is a problem.

@Odin:
(In reply to comment #24)
> Shouldn't this be "'display' style property calculates to 'none'"?  The
> intention was to not download hidden images I assume?

Yes, that's what I mean. Oops!

> > This should cover images inside video elements but also elements hidden by
> > CSS, which could be set in a selector like ".has-js .delayed-image-load"
> > which covers the BBC case.
> 
> How?  The .has-js would not be applied before the script that sets it has
> run, is it a requirement then for that to be run in a <script> tag before
> any css?

Yes.
document.documentElement.className += ' has-js';
(or similar) is a typical thing to run in the head of a page to hide elements that are to be enhanced by JS. Modernizr does something similar.

> And this method, would it not have to wait until all CSS is loaded
> anyway to be able to check what the CSS evaluates to?

Yes. The loading pattern becomes like CSS backgrounds, but allows the browser to defer further based on viewport visibility.

> So you are deferring this even longer? I.e. always after DOMContentLoaded,

No, CSS can download and apply before DOMContentLoaded, that's how we get progressive rendering. Think CSS backgrounds.

> > Working out if an image is in the viewport is tricky for image elements
> > whose size is determined by the image...
> 
> If you have 2000 pictures with no width/height set, they would in this case
> actually all be inside the viewport when nothing has loaded.

Assuming that's all that's on the page (as in, the x & y of each image is in the viewport) and their display isn't 'none', yeah, I'd expect them all to be queued for download, although the browser would be allowed to dequeue images if the first 50 images push the others out of the viewport.
Comment 26 Jake Archibald 2013-04-07 17:18:06 UTC
An updated proposal, following Odin's comments:

Images with 'defer' MUST NOT download while they are not in a document, or their calculated 'display' style property is 'none'.

The download state of images with 'defer' MUST NOT delay the 'load' event of the window.

Images with 'defer' MAY download once they are in the document, and their calculated 'display' style property is not 'none'.

Images with 'defer' MUST download once they are in the document, and their calculated 'display' style property is not 'none', and any part of the image would be visible within the viewport. If visibility within the viewport is uncertain (due to element size determined by image dimensions), the image MUST download.
Comment 27 Guy Podjarny 2013-04-07 22:46:29 UTC
Great conversation, I think this would be a powerful feature. 

A few additional thoughts, based on my experience and what we've learned providing on-demand image loading as part of Akamai's automated FEO solution (previously Blaze). 

1) While I understand the value to the user-agent, I believe website devs often intentionally want to separate deferred loading of images (meant to make the page load faster) from loading images only when they become visible (meant to reduce costs). Odin alluded to that, but unless I'm missing something, devs would still need to resort to JavaScript to achieve that here.

Also note that while downloading resources on-demand may be inefficient for a mobile device/network, downloading an image the user is likely not to see at all (e.g. far below the fold) is even more inefficient. Especially in responsive websites and news websites like The Sun and Daily Mail, scrolling can be very long, and most users don't go far down.

Perhaps we can allow defer to be "explicit" vs use the automated logic?
Ilya's original suggested split between async and defer may do the trick - have defer be explicitly "load image only when visible" while async would leave the decision to the browser.


2) Deferring image download until all CSS has completed probably doesn't delay them much more than today. On most web pages today, due to the mixing of CSS & JavaScript at the top, images don't start getting downloaded until after all CSS has been downloaded and processed anyway. 

However, this may be a serious penalty in the world of SPDY & HTTP 2, where browsers wouldn't need to block to prioritize. In a request multiplexing context, delaying image download till post CSS could hurt user experience. 

I actually don't have a better suggestion at the moment, just pointing out the concern. It may be that the only solution would be for devs to avoid "Defer" on primary above-the-fold images, losing some of the responsive flavor of it.


3) If the image's load state doesn't delay the window's load state, your RUM page load measurements may become useless, as you'll be "told" the page loaded before any of the major images are shown. Same goes for browser progress indicators. 

Since the browser already knows, by the time DOMContentLoaded fired, which images it intends to load, could *those* images still delay the window's load?


4) Our optimization also allows for a "buffer" space below the visible area, e.g. 100px "below the fold". Any image partially included within that space is considered visible for this purpose, and is downloaded immediately. This proved useful when trading off costs (don't download what users may not see) and speed (if the user scrolls down "a bit" or slowly, the image is already there).

This is again a user-agent decision, but it might be useful to define some meta tag or similar mechanism for the user to offer its guidance.
Comment 28 Ilya Grigorik 2013-04-08 03:40:02 UTC
> Also note that while downloading resources on-demand may be inefficient for
> a mobile device/network, downloading an image the user is likely not to see at
> all (e.g. far below the fold) is even more inefficient. Especially in
> responsive websites and news websites like The Sun and Daily Mail, scrolling
> can be very long, and most users don't go far down.

Right, and this is not a problem to solve with attributes on the img tag. This is an application problem: if your users are not consuming that content, don't put it there! We would only encourage the wrong behavior here by lowering the barrier.

> Perhaps we can allow defer to be "explicit" vs use the automated logic?
> Ilya's original suggested split between async and defer may do the trick -
> have defer be explicitly "load image only when visible" while async would
> leave the decision to the browser.

I think anything to do with visibility is application-specific. We shouldn't dictate this behavior in the spec, as it will only lead to more confusion. We know how to do this with JavaScript, and let's keep it there.
 
> 2) Deferring image download until all CSS has completed probably doesn't
> delay them much more than today. On most web pages today, due to the mixing
> of CSS & JavaScript at the top, images don't start getting downloaded until
> after all CSS has been downloaded and processed anyway. 

That's not entirely true. It is not uncommon for the preload scanner to scan the doc in its entirety way before CSS is available. If we explicitly dictate that we must wait for CSS.. that means we will underutilize the network link.

> However, this may be a serious penalty in the world of SPDY & HTTP 2, where
> browsers wouldn't need to block to prioritize. In a request multiplexing
> context, delaying image download till post CSS could hurt user experience. 

Exactly! Let the UA decide and optimize the download behavior. As a web-developer you *do not* have enough information to dictate a strict ordering of what/when should be downloaded -- let the UA do its job.

> 3) If the image's load state doesn't delay the window's load state, your RUM
> page load measurements may become useless, as you'll be "told" the page
> loaded before any of the major images are shown. Same goes for browser
> progress indicators. 

Yup, I share the same concern here. While I like the idea of it, consider this case: I append defer on all my images, my onload fires quickly, and yet all the images are still loading? Is that valid? Arguably, this is opt-in behavior, so perhaps that's fine.

---

I'm still of the mind that we (a) should not mix any "image is visible" semantics here, as this complicates matters enormously (and the implementation), and (b) same for display:none, as it ties UA download and prioritization logic to the CSSOM in the browser and creates a number of race conditions I would rather not even get into.

I'd propose the following:

- Images with 'defer' MUST NOT download while they are not in a document.
- Images with 'defer' MAY be downloaded at lower priority than images without the attribute once in the document.
- (maybe) Images with 'defer' MUST NOT delay the 'load' event. 

The above keeps the surface area simple, but still allows the UA to make different decisions: lower download priority for mobile, load on demand on tethered links, etc.
Comment 29 Guy Podjarny 2013-04-08 05:27:56 UTC
Ilya, IMO your suggestion would make implementation easier, but would defeat the application need. Yes, we can implement this in JavaScript, but the reason we're having this discussion is because we want to make it easier and faster.

While I get the implementation simplicity you get from ignoring on-demand loading and waiting for layout, I think you'll lose 99% of the value.

If you don't wait for layout (and thus CSS) to complete, you can't make good decisions on what to download. At best you can guess that you'll only fetch the first images in the HTML, which connection limits do anyway and will fail miserably on RWD sites. 

If the browser feature doesn't mean below-the-fold images would be deferred (or at least doesn't guarantee it), then people would just resort to JavaScript anyway - and again you'll lose all of your value. 

If you didn't wait for layout, the feature you're left with is simply a way to mark some images as important and some as not important - which IMO is not what websites actually need.
Comment 30 Ilya Grigorik 2013-04-08 07:28:06 UTC
> While I get the implementation simplicity you get from ignoring on-demand
> loading and waiting for layout, I think you'll lose 99% of the value.
> 
> If you don't wait for layout (and thus CSS) to complete, you can't make good
> decisions on what to download. At best you can guess that you'll only fetch
> the first images in the HTML, which connection limits do anyway and will
> fail miserably on RWD sites. 
> 
> If the browser feature doesn't mean below-the-fold images would be deferred
> (or at least doesn't guarantee it), then people would just resort to
> JavaScript anyway - and again you'll lose all of your value. 
> 
> If you didn't wait for layout, the feature you're left with is simply a way
> to mark some images as important and some as not important - which IMO is
> not what websites actually need.

So, mark the important images as the ones above the fold. RWD isn't "on-demand design", nor should it be... As we already discussed on this thread, the "now in viewport" approach is often an anti-pattern both for battery and latency on mobile. And delaying all download decisions until CSS is available will impose *a minimum* of another RTT on the load sequence - no thanks! <Insert more inline CSS hacks here>.

With HTTP 2.0 we will finally have the ability to pipeline all requests as soon as we discover them and then reprioritize them on the fly. This is a *huge* win, and it would suck to break that right as we're finally closing in on removing these bottlenecks from HTTP 1.1. 

If you want a "lazyload" image tag, stick to JS, or wrap it into a Web Component and stamp out as many as you need -- we have the tools to do this already.
Comment 31 Guy Podjarny 2013-04-08 08:50:38 UTC
I think the bottom line is that we're looking for two different features.
The feature I find most useful is one that supports lazy loading of images without JS. That feature will have to suffer waiting for CSS, but that's true for the current scripted solutions too. 
My understanding was that this is also the feature this bug was originally created for.

The other feature that I think you (Ilya) are looking for is a way to mark non-important images so that they wouldn't delay other resources, either through deferral or prioritization.
Perhaps it's worth logging that one as a separate bug?
Comment 32 Marcos Caceres 2013-04-08 09:20:57 UTC
(In reply to comment #31)
> I think the bottom line is that we're looking for two different features.
> The feature I find most useful is one that supports lazy loading of images
> without JS. That feature will have to suffer waiting for CSS, but that's
> true for the current scripted solutions too. 
> My understanding was that this is also the feature this bug was originally
> created for.
> 
> The other feature that I think you (Ilya) are looking for is a way to mark
> non-important images so that they wouldn't delay other resources, either
> through deferral or prioritization. 
> Perhaps it's worth logging that one as a separate bug?

Although the above clearly describes the two core features, I would personally not be in favor of splitting them into separate bugs. They appear to be intertwined.
Comment 33 Yoav Weiss 2013-04-08 09:28:21 UTC
*With my <picture>/RICG hat off*

I agree with Guy that there may be 2 different use cases here, with the main developer need being to "unhack" lazy loading of images when they enter (or are about to enter) the viewport.
I believe the best way to do lazy loading is in the browser, rather than in JS because:
* It'd enable browsers to ignore it and download the images anyway at low priority when downloading them on demand would be more expensive, when the user is likely to scroll all the way through (judging by this user's previous stats), etc.
* It will have better performance, and will be easier to develop.
* Pages will not break without JS, and it'd avoid the need to add verbose <noscript> fallbacks.

I'm sure we can agree that the applications for lazy loading today are numerous. Off the top of my head, I can see YouTube and PageSpeed's lazy loading module profiting from such a capability, as well as many other optimization modules and websites. I prefer that these optimizations be done in the browser using declarative markup, so that they can be turned off where they don't make sense.

Yes, lazy loaded images will have to wait for layout, but it's a price that Web devs will have to take into consideration.
Comment 34 Jake Archibald 2013-04-08 11:42:25 UTC
(In reply to comment #27)
> 1) While I understand the value to the user-agent, I believe website devs
> often intentionally want to separate deferred loading of images (meant to
> make the page load faster) from loading images only when they become visible
> (meant to reduce costs). Odin alluded to that, but unless I'm missing
> something, devs would still need to resort to JavaScript to achieve that
> here.

Yeah, this feature doesn't allow the cost-reducing option. This would be at the expense of performance on some platforms, so I'm happy not making that easy. Of course, 'defer' makes it easier to use JavaScript to implement this behaviour, by making images display:none by default.
 
> …downloading an image the user is likely not to see at
> all (e.g. far below the fold) is even more inefficient. Especially in
> responsive websites and news websites like The Sun and Daily Mail, scrolling
> can be very long, and most users don't go far down.

Yeah, I want to allow the browser to do what it thinks best with images below the fold. It's only required to load the image when it's visible in the viewport, but it can download earlier if it wants, as long as the image is in the document & isn't calculated display:none.

> I actually don't have a better suggestion at the moment, just pointing out
> the concern. It may be that the only solution would be for devs to avoid
> "Defer" on primary above-the-fold images, losing some of the responsive
> flavor of it.

Yeah, I don't think 'defer' would be used on all images; I don't think I'd put it on the main article image unless I needed JS to change the src. Without 'defer', the prescanner can do its thing, and you'll get the image quicker.

> 3) If the image's load state doesn't delay the window's load state, your RUM
> page load measurements may become useless, as you'll be "told" the page
> loaded before any of the major images are shown. Same goes for browser
> progress indicators. 
> 
> Since the browser already knows, by the time DOMContentLoaded fired, which
> images it intends to load, could *those* images still delay the window's
> load?

This sounds complicated, but doable. Aren't RUM page load measurements already misleading as they include images the user won't see unless they scroll? Is it worth complicating the spec for this case?

> 4) Our optimization also allows for a "buffer" space below the visible area,
> e.g. 100px "below the fold". Any image partially included within that space
> is considered visible for this purpose, and is downloaded immediately. This
> proved useful when trading off costs (don't download what users may not see)
> and speed (if the user scrolls down "a bit" or slowly, the image is already
> there).
> 
> This is again a user-agent decision, but it might be useful to define some
> meta tag or similar mechanism for the user to offer its guidance.

What kind of guidance would the user give, and what would the browser do with it? The browser knows more about the device it's running on, the connection it has, and what the user's doing; it feels like the browser is in a much better position to decide when to download the images here.
Comment 35 Jake Archibald 2013-04-08 11:49:43 UTC
(In reply to comment #30)
> If you want a "lazyload" image tag, stick to JS, or wrap it into a Web
> Component and stamp out as many as you need -- we have the tools to do this
> already.

I disagree. Developers are already recreating browser behaviour to achieve similar things and none of the solutions are great in terms of performance. They have heavy scroll listeners and don't take the device, connection type, tab visibility etc. into account. This is something we can handle much better natively.
Comment 36 Odin Hørthe Omdal 2013-04-08 13:32:25 UTC
(In reply to comment #28)
> > Also note that while downloading resources on-demand may be inefficient for
> > a mobile device/network, downloading an image the user is likely not to see at
> > all (e.g. far below the fold) is even more inefficient. Especially in
> > responsive websites and news websites like The Sun and Daily Mail, scrolling
> > can be very long, and most users don't go far down.
> 
> Right, and this is not a problem to solve with attributes on the img tag.
> This is an application problem: if your users are not consuming that
> content, don't put it there! We would only encourage the wrong behavior here
> by lowering the barrier.

Note that this is an extremely common pattern. And lots of huge web sites doing it wrong is worse for performance than having the browser do it correctly in one place.

Also, just because it is possible in the spec, doesn't mean your mobile user agent have to do viewport lazy load.  I know for a fact that we tested lazyload as a feature for some versions of Presto but it broke too much of the web (because sites were expecting images to load before onload, instead of checking the progress on each image).

There are places it makes sense, and places it makes no sense; that should really be up to the user agent.  It will better know what environment it is in and what restrictions it has.

That said, that /could/ be added later as defer='lazyload', as long as the browser ignores things it doesn't understand.  At that point it will *still* only be a hint, but it could decouple the «allow javascript to handle the picture» from «get really nice behavior from the browser for free».

If they were coupled (as I'd prefer at least one desktop user agent to do, because web developers will then understand they need loadend listeners on each individual image), you could still turn off the lazy-loading with javascript (for browsers that do in fact do that, which won't be most mobile browsers). You'd just remove all of the @defer's on 'onload'.

So in short, this is about saying:

  If it makes sense for your user agent, load @defer'red images
  when they are needed (visible in the viewport).

... and fade them in for sanity's sake, because it looks beautiful and not popup-ugly. ;-)

> > Perhaps we can allow defer to be "explicit" vs use the automated logic?
> > Ilya's original suggested split between async and defer may do the trick -
> > have defer be explicitly "load image only when visible" while async would
> > leave the decision to the browser.
> 
> I think anything to do with visibility is application-specific. We shouldn't
> dictate this behavior in the spec, as it will only lead to more confusion.
> We know how to do this with JavaScript, and let's keep it there

I disagree.  This has to do with user experience, and with what restrictions and environment the user agent currently operates under.  People doing visibility loading in JavaScript will still do that for mobile UAs on 3G - which you yourself say is a bad strategy.

This is not something the web site authors can know because they don't have that information.

However, if you wanted to *save money*, then you would actually _force_ this behaviour even for mobile UA's. The way you do that is basically to go through all the images (on DOMContentLoaded, or inside a blocking inline <script>) and exchange

  <img src=big.jpg>

with

  <img src='' data-original-src=big.jpg>

Then you can save money even on the mobile browsers, which would presumably only use @defer as a network priority/queueing hint. So you're basically overriding them again.
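
As a sketch, that forced override could look like this (the attribute name comes from the example above, and it relies on the proposed rule that @defer images don't start loading before DOMContentLoaded handlers finish):

document.addEventListener('DOMContentLoaded', function () {
  var imgs = document.querySelectorAll('img[defer]');
  for (var i = 0; i < imgs.length; i++) {
    // Park the URL so no user agent loads it until we decide to.
    imgs[i].setAttribute('data-original-src', imgs[i].getAttribute('src'));
    imgs[i].setAttribute('src', '');
  }
});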

> > 2) Deferring image download until all CSS has completed probably doesn't
> > delay them much more than today. On most web pages today, due to the mixing
> > of CSS & JavaScript at the top, images don't start getting downloaded until
> > after all CSS has been downloaded and processed anyway. 
> 
> That's not entirely true. It is not uncommon for the preload scanner to scan
> the doc in its entirety way before CSS is available. If we explicitly
> dictate that we must wait for CSS.. that means we will underutilize the
> network link.

I am not convinced about waiting for CSS.  All the cool stuff can be done at DOMContentLoaded using JavaScript.  It would still be possible to downprioritize, and totally defer stuff that is hidden, say, if you do have one CSS file already cached and can tell that this resource is both @defer'ed and hidden, in js.

It would surely be a race, but that's the whole reason behind this thing anyway: after DOMContentLoaded anything can happen, because the browser can choose to take those images and load them, or it can lazyload to the viewport.  Having some extra CSS could be an extra hint.

> > However, this may be a serious penalty in the world of SPDY & HTTP 2, where
> > browsers wouldn't need to block to prioritize. In a request multiplexing
> > context, delaying image download till post CSS could hurt user experience. 
> 
> Exactly! Let the UA decide and optimize the download behavior. As a
> web-developer you *do not* have enough information to dictate a strict
> ordering of what/when should be downloaded -- let the UA do its job.

We have to wait on JavaScript to do its thing, so that really application-specific code can run.  The most obvious example being to load the correct image asset.  Or move the @src to @data-original-src if you want to totally control all image loading yourself via JavaScript.

@defer is a strong hint that «yes, I want you to wait with this one, because I want to handle it».  It should **NOT** be a general "performance tip" to just put this attribute on all your images always because it goes "magically faster".  It's actually opting out of the preloader, so it's for images that can load later!

> > 3) If the image's load state doesn't delay the window's load state, your RUM
> > page load measurements may become useless, as you'll be "told" the page
> > loaded before any of the major images are shown. Same goes for browser
> > progress indicators. 
> 
> Yup, I share the same concern here. While I like the idea of it, consider
> this case: I append defer on all my images, my onload fires quickly, and yet
> all the images are still loading? Is that valid? Arguably, this is opt-in
> behavior, so perhaps that's fine.

It's perfectly valid.  The page should be totally usable at that point in time, and some pictures will fade nicely in while you are reading your article.

For horrible internet connections, like modems, it might be a much better user experience.

> I'd propose the following:
> 
> - Images with 'defer' MUST NOT download while they are not in a document.
> - Images with 'defer' MAY be downloaded at lower priority than images without
> the attribute once in the document.
> - (maybe) Images with 'defer' MUST NOT delay the 'load' event. 
> 
> The above keeps the surface area simple, but still allows the UA to make
> different decisions: lower download priority for mobile, load on demand on
> tethered links, etc.

Well, the central part of actually opting in to take control of the image is not in there, though.  So you'd need to have the page fully parsed and all the blocking JavaScript run.  Yes, if you load blocking JavaScript from a URL this can suck, but don't do that.  Have one cached js file at most.

So, plus:

 - Images with 'defer' MUST NOT download before DOMContentLoaded's
   handlers have finished running.


(In reply to comment #34)
> (In reply to comment #27)
> > 1) While I understand the value to the user-agent, I believe website devs
> > often intentionally want to separate deferred loading of images (meant to
> > make the page load faster) and loading images only when they become visible
> > (meant to reduce costs). Odin eluded to that, but unless I'm missing
> > something, devs would still need to resort to JavaScript to achieve that
> > here.
> 
> Yeah, this feature doesn't allow the cost-reducing option. This would be at
> the expense of performance on some platforms, so I'm happy not making that
> easy. Of course, 'defer' makes it easier to use JavaScript to implement this
> behaviour, by making images display:none by default.

Yes, an always-lazyload would have to be a special option to the attribute, like defer=always-lazyload or defer=javascript.  I think even an attribute is too easy, because you'd get people not really considering why it is bad.

But I'm (as always) very much interested in this being possible.  Letting JavaScript regain some image loading control :-)
Comment 37 Simon Pieters 2013-04-08 14:53:18 UTC
About setting attributes in a certain order, I think <img> has already solved that problem for src/srcset/crossorigin. Making defer use the same scheme is probably not a problem.
Comment 38 Ilya Grigorik 2013-04-09 05:30:22 UTC
> @defer is a strong hint that «yes, I want you to wait with this one, because
> I want to handle it».  It should **NOT** be a general "performance tip" to
> just put this attribute on all your images always because it goes "magically
> faster".  It's actually opting out of the preloader so it's for images that
> can load later!

I like that formulation. But why wouldn't I sprinkle defer over *all* of my below-the-fold images? If the goal is to get a fast first render, then isn't this the new performance pixie dust? And if that's the new default for most developers, then we've effectively opted the bulk of images out of the preloader...

> > I'd propose the following:
> > - Images with 'defer' MUST NOT download while they are not in a document.
> > - Images with 'defer' MAY be downloaded at lower priority than images without
> > the attribute once in the document.
> > - (maybe) Images with 'defer' MUST NOT delay the 'load' event. 
> > 
> So, plus:
> 
>  - Images with 'defer' MUST NOT download before DOMContentLoaded's
>    handlers have finished running.
> 
> ...
>
> Yes, an always-lazyload would have to be a special option to the attribute
> like defer=always-lazyload or javascript.  I think even an attribute is too
> easy, because you'd get people not really considering why it is bad.
> 
> But I'm (as always) very much interested and this being possible.  Letting
> Javascript regain some image loading control :-)

So, can we reconcile these use cases? I think we have conflicting requirements between stating that defer "is just a hint" and then turning around and mandating that images must wait until DCL.. and now we're talking about "always lazy load", entering viewport, etc. 

Assuming the starting point is:

--
> (A) @defer is a strong hint that «yes, I want you to wait with this one, because
> I want to handle it».

Now DCL fires, and the browser has a queue of images it wants to dispatch. How does it know when's a good time to do so? Your script may run anytime between DCL and onload - do we have a race condition? And what logic would the client run here? Cancel the download by removing the image from the DOM, vs. removing the defer attribute to trigger the download? If the client doesn't do either, do we defer to the UA to make a "good decision"?

--
(B) How does lazyload work here? Would defer=lazyload simply defer the decision to the UA?

<aside>What about cost of reflows? These can be very expensive and jarring to the user. Does this mean all images should now have an explicit w/h? (reasonable, but can be painful)</aside>

--
(C) How does display:none work in light of (B) - can we say (C) and (B) are functionally equivalent? Aka, defer to UA.
Comment 39 Jake Archibald 2013-04-11 09:55:25 UTC
(In reply to comment #38)
> I like that formulation. But why wouldn't I sprinkle defer over *all* of my
> below-the-fold images?

I think it's of benefit to all but primary article images that won't be altered by JS. What problem do you see?

> And if that's the new default for
> most developers, then we've effectively opted the bulk of images out of the
> preloader... 

I think we're putting too much weight on the preloader. Images are low priority to the preloader compared to css & js.

> So, can we reconcile these use cases? I think we have conflicting
> requirements between stating that defer "is just a hint" and then turning
> around and mandating that images must wait until DCL.. and now we're talking
> about "always lazy load", entering viewport, etc.

Just a hint doesn't cut it, and asking for something more than that isn't a turnaround: a hint wouldn't solve the problem as presented in the first post of this ticket.

I don't see the need to wait until DCL. We only need to wait until layout which is round-about when images start downloading anyway, so no big loss.

The proposal is that images shouldn't download until they have layout (as in, not calculated display:none). This makes sense from a performance perspective, and CSS backgrounds work like this already. It allows developers to modify particular image elements with js before downloads are queued by making them display:none.

Once the image has layout (not display:none), the browser is given freedom over downloading the image; it's only required to download it if it's likely to appear on the user's screen.
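
(For illustration, a minimal sketch of the authoring pattern this enables - the class name and file names are invented, and nothing here is specced yet:)

  <style>.swap-me { display: none; }</style>
  <img class="swap-me" src="photo-800.jpg" alt="" defer>
  <script>
    document.addEventListener('DOMContentLoaded', function () {
      // While the image's calculated display is none, the proposed rules
      // forbid the UA from fetching it, so the src can still be changed.
      var img = document.querySelector('img.swap-me');
      if (window.devicePixelRatio > 1.5) {
        img.src = 'photo-1600.jpg'; // hypothetical high-DPI variant
      }
      img.className = ''; // image now gets layout; the UA may fetch it
    });
  </script>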

> (B) How does lazyload work here? Would defer=lazyload simply defer the
> decision to the UA?

I don't think we need that. What does it give us over the "defer" behaviour I proposed?

> <aside>What about cost of reflows? These can be very expensive and jarring
> to the user. Does this mean all images should now have an explicit w/h?
> (reasonable, but can be painful)</aside>

It's down to the developer to do what's best for the design. Eg, an image with position:absolute but no w/h only impacts the layout of itself once it learns its natural w/h. This isn't a new issue, this happens today.

> (C) How does display:none work in light of (B) - can we say (C) and (B) are
> functionally equivalent? Aka, defer to UA.

Avoiding image downloads for display:none images is required to meet the use-case of the ticket. Also, if we had this today, making something like a webp polyfill would be trivial.
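
(A rough sketch of what such a polyfill could look like under the proposed display:none rule - the data URI is the usual 1x1 WebP support probe, and the .webp/.jpg naming convention is invented:)

  <style>img[defer] { display: none; }</style>
  <script>
    document.addEventListener('DOMContentLoaded', function () {
      var probe = new Image();
      probe.onload = probe.onerror = function () {
        var webpOK = probe.width === 1;
        var imgs = document.querySelectorAll('img[defer]');
        for (var i = 0; i < imgs.length; i++) {
          // Nothing has been fetched yet, so swapping the src is free.
          if (!webpOK) imgs[i].src = imgs[i].src.replace(/\.webp$/, '.jpg');
          imgs[i].style.display = 'inline'; // unhide; the UA may now fetch
        }
      };
      probe.src = 'data:image/webp;base64,' +
        'UklGRiQAAABXRUJQVlA4IBgAAAAwAQCdASoBAAEAAwA0JaQAA3AA/vuUAAA=';
    });
  </script>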
Comment 40 Ilya Grigorik 2013-04-12 22:46:37 UTC
(In reply to comment #39)
> > I like that formulation. But why wouldn't I sprinkle defer over *all* of my
> > below-the-fold images?
> 
> I think it's of benefit to all but primary article images that won't be
> altered by JS. What problem do you see?
> 
> > And if that's the new default for
> > most developers, then we've effectively opted out the bulk of images from
> > preloader... 
> 
> I think we're putting too much weight on the preloader. Images are low
> priority to the preloader compared to css & js.
>
> > So, can we reconcile these use cases? I think we have conflicting
> > requirements between stating that defer "is just a hint" and then turning
> > around and mandating that images must wait until DCL.. and now we're talking
> > about "always lazy load", entering viewport, etc.
> 
> Just a hint doesn't cut it, and asking for something more than that isn't a
> turnaround: a hint wouldn't solve the problem as presented in the first post
> of this ticket.

Right, and maybe therein is the root of our disagreement. I don't share the starry-eyed optimism that simply deferring all image downloads is a performance win. To the contrary, if abused, it'll lead to higher battery usage, and network and UX latencies. 

I think we can learn from our native counterparts in this respect. A few years back, if you read the iOS/Android documentation, you would indeed find that they recommended "loading data progressively". Since then, the guidance has done a full 180, and the new best practices are: prefetch data, prefer bursty transfers, and "decouple user transitions from data interactions".  

An excerpt from the Android docs: "By front loading your transfers, you reduce the number of radio activations required to download the data. As a result you not only conserve battery life, but also improve the latency, lower the required bandwidth, and reduce download times... Prefetching also provides an improved user experience by minimizing in-app latency caused by waiting for downloads to complete before performing an action or viewing data."

I encourage everyone to read:
- http://developer.android.com/training/efficient-downloads/efficient-network-access.html
- http://developer.apple.com/library/ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/PerformanceTuning/PerformanceTuning.html
- https://developer.att.com/home/develop/referencesandtutorials/whitepapers/BestPracticesFor3Gand4GAppDevelopment.pdf

To make this concrete, let's work through an example where a page with a few below-the-fold images is loaded. The user has been reading the AFT (above-the-fold) content for ~10s, and is now scrolling:

- defered image enters viewport and now needs to be fetched
- the radio is in a low or mid-power state: on 3G, to acquire a radio context you'll wait 200ms-2.5s; on 4G: <100ms.
- the request packet is dispatched: advertised RTTs are 100-250ms for 3G, and 50-150ms on 4G + transit time on the public internet
- (half a second+ has passed)
- image packets are streaming in.. we must wait to decode it 
- image is decoded (a non-negligible amount of ms, let's say 20 to be optimistic)
- potential layout reflow + paint..
- image is displayed to the user

In other words, we're looking at ~500ms+ between image entering viewport and image appearing on screen. Add to this the battery overhead, and you've got <2 FPS and no battery life left on the device. How is this good?

To be clear: there is a time and place for "lazyload when in viewport", but *it is not* the right default.

-- 

Further, while the above behavior is specific to 2G/3G/4G networks, the behavior and latencies can be different for Wi-Fi and tethered connections. In some cases, more aggressive use of lazyloading will be beneficial. In others, if the latencies are high, we fall back to the same concerns as above. Which is to say: the UA is in the best position to decide when the image is best loaded.

Apologies if we're going in circles here.. But I don't want to see us repeat the last two years of hard-won experience from the native world. 

> Avoiding image downloads for display:none images is required to meet the
> use-case of the ticket. Also, if we had this today, making something like a
> webp polyfill would be trivial.

Fair enough, then I guess my position is that this ticket should be closed / marked as invalid. It'll do more harm than good, as far as performance is concerned. If the intent is to proceed on the premise of "we want to make it simpler" - ok - but let's then drop all pretense of doing this as a way to improve performance (and warn people upfront that it's an antipattern).
Comment 41 Yoav Weiss 2013-04-13 06:18:26 UTC
@Ilya
I agree that waking up the radio on cellular networks is bad for battery life & latency, and therefore has a negative impact on the overall UX. That is an argument for downloading all the page's resources in advance.
OTOH, I think we can agree that downloading data that will *never* be used is also bad for battery life (and the user's data plan).
Real life scenarios are too complex to decide in advance that one of the above options is inherently better than the other.

The probability of the user scrolling all the way down in a certain page varies for each (page,user) combination.
If that probability is close to 0, there's no point in downloading the images in advance on a cellular network, and we can pay the price of waking up the radio in the odd case that the user does scroll.
If that probability is close to 1, we probably want below-the-fold images to be downloaded with the initial page, as low priority resources.

Who is in a better position to calculate these heuristics than the browser? 

I claim that the lazy load pattern (which is very commonly used) should move from JS hacks to the browser for several reasons. One of them is ease of development (it's easier to add an attribute than an entire lazy loading module), but more importantly because we need these kind of heuristics to get lazy loading working right. These heuristics can only happen *in the browser*. The browser can take page statistics, user behavior and the current network into account when making such heuristics. 

Regarding the latency impact and images popping out after the user has scrolled down, the browser can mitigate it by:
a. Smooth transitions (as Odin said) that will make that side-effect less dramatic.
b. Deploy smart heuristics once the user starts scrolling, to try and download the images for a certain viewport *before* the user reaches it.
Comment 42 Ilya Grigorik 2013-04-13 07:14:02 UTC
> The probability of the user scrolling all the way down in a certain page
> varies for each (page,user) combination.
> If that probability is close to 0, there's no point in downloading the
> images in advance on a cellular network, and we can pay the price of waking
> up the radio in the odd case that the user does scroll.
> If that probability is close to 1, we probably want below-the-fold images to
> be downloaded with the initial page, as low priority resources.
> 
> Who is in a better position to calculate these heuristics than the browser? 

I think this overlooks an obvious point: if the probability is low, then that content shouldn't be there - you'll save markup bytes, as well as resource bytes. Instead, provide a menu, and fetch the content when the user requests it.

Why would we want to encourage mile-long pages which nobody ever sees?

> Regarding the latency impact and images popping out after the user has
> scrolled down, the browser can mitigate it by:
> a. Smooth transitions (as Odin said) that will make that side-effect less
> dramatic.

Smooth transitions to somehow "fix" low single digit FPS? That seems unlikely.

> b. Deploy smart heuristics once the user starts scrolling, to try and
> download the images for a certain viewport *before* the user reaches it.

This does very little to address battery or network latency concerns. You will have to incur the control + user-plane latency costs (500ms), and burn the battery for the next 10-20s.

---

If we treat defer as a hint, then we can get the best of all worlds:
- in cases where it makes sense, on demand loading (viewport or otherwise)
- in cases where it does not, let the UA fallback to prefetching when necessary

Further, deferring this decision to the UA also allows user input: some users may opt-in to use aggressive lazy-loading to save bytes (at cost of battery), whereas others may want fast and smooth scrolling (at cost of bytes).

(in other words, I don't think I'm all that far from what you're looking for...)
Comment 43 Guy Podjarny 2013-04-13 10:10:39 UTC
(In reply to comment #42)
> > Who is in a better position to calculate these heuristics than the browser? 
> 
> I think this overlooks an obvious point: if the probability is low, then
> that content shouldn't be there - you'll save markup bytes, as well as
> resource bytes. Instead, provide a menu, and fetch the content when the user
> requests it.
> 
> Why would we want to encourage mile-long pages which nobody ever sees?

I think you're overestimating both the insight and the influence the UA has. 

Re insight, the browser has the best knowledge of network, battery, and even the local user, but the website owner has the best knowledge of their actual app. The website owner knows better than the UA what their users want from a design and content perspective, they know which actions users are likely to take on a page (like unhide or scroll), and they know what gives them the best bottom-line business results - a navigation menu or a long page. 

Re influence, I do not expect website owners to stop creating long pages because the UA doesn't want them to. Many websites prefer a long page (often with infinite scrolling), ranging from "old" websites like news sites with lots of ongoing content, through cutting-edge websites like Twitter's infinite scroll, to RWD sites that collapse horizontal columns vertically. 

They do so because they believe, and often KNOW (from A/B tests) that it gives them better results. I think it's a bit presumptuous to think that the UA and its developers know better, and regardless it's "starry eyed" to think that supporting image defer or not would make them change this behavior.

So bottom line, websites would keep doing on-demand image loading, as they are doing now. If we give them a clean and more efficient way to do so (like image defer), they'll gladly use it. If the image defer doesn't give them what they need, they'll stick to their JS. 

My understanding, from what I've seen and heard from our customers, is that the bare minimum they need is:
1. Do not load images that are hidden with display:none
2. Do not load images that are below the fold AT LEAST until the page is loaded 

If image defer doesn't commit to these two options, it'll not be used.
This statement is based on what websites are *actually doing* today, so I feel pretty comfortable making it.


Moving off what I see as "bare minimum", a common 3rd requirement is to commit to lazy-loading the images. IMO we should support a "defer=lazy-load" option that guarantees that as well, since without it people may still revert to JS.

As a side note, I also think website developers expect consistency in the end result. I expect many would have a hard time with not knowing whether an image would be downloaded at all or not (since we left the decision to the UA). 

Are there other examples for resources that may or may not get called?
Prefetch doesn't count, since it's a suggestion by its very nature.
Comment 44 Jake Archibald 2013-04-13 11:15:46 UTC
(In reply to comment #40)
> Right, and maybe therein is the root of our disagreement. I don't share the
> starry-eyed optimism that simply deferring all image downloads is a
> performance win.

I think you've misunderstood the proposal.

I understand the issues of waking the radio, I understand the latency. People are getting this wrong now with hacky JS lazy-load solutions. I agree this is bad.

The proposal gives the browser the choice, it may defer until in-viewport, that's the latest it can defer. However, it can load much earlier than that. The earliest it can load the image is when it's in the document & gets layout (isn't calculated display:none).

For low latency situations where there's no radio wakeup penalty, the browser may defer images as long as possible. For high latency situations where radio wakeup is costly, the browser may load images much earlier.
Comment 45 Odin Hørthe Omdal 2013-04-13 13:57:22 UTC
Ouch, sorry for the length.  I was crazy-busy during the week at work and don't have time to write short right now. So you'll get it unedited:

(In reply to comment #42)
> If we treat defer as a hint, then we can get the best of all worlds:
> - in cases where it makes sense, on demand loading (viewport or otherwise)
> - in cases where it does not, let the UA fallback to prefetching when
> necessary
> 
> Further, deferring this decision to the UA also allows user input: some users
> may opt-in to use aggressive lazy-loading to save bytes (at cost of
> battery), whereas others may want fast and smooth scrolling (at cost of
> bytes).

We're missing the crucial point here.  What this bug is about, and also the core reason why I started looking into this.

  *It is not about performance.*


It's about regaining control over image loading.  So yes, that will indeed suck _sometimes_, and might give us worse perf characteristics.  However, in the case where the connection is still warmed up when the DOMContentLoaded listeners have run, all the deferred images are in the user agent's total control.  It can then queue them up and load them all.

  Hence: A hint yes, but you are not allowed to act on it UNTIL
         you have run the DOMContentLoaded listeners.
         So maybe not so much a hint anymore ;-)

In essence, this is actually reversing the performance prefetch fix that user agents implemented.  It allows the people who want to create truly responsive sites that fall back gracefully to do that, instead of creating over-engineered solutions elsewhere.  Basically, if srcset is not enough for you, write your own responsive image implementation in javascript.

There are people out there who really want to wait for layout to be finished and drawn before making loading decisions.  We should let that be possible (by deactivating the images in js the first time you run, and bringing them back in later at your leisure).  This is custom code, for custom people on custom sites.  Experimentation.

Those people have to know that they're giving up some optimizations for others, and those might not be totally correct for all their users.  And might actually hurt the ones we seem to care the most about (mobile users in the western world, as opposed to both desktop and non-western).


HOWEVER: even for mobile western-world users, it is still much better than:

 <div data-src=huge-2k.jpg></div>

Which is how people mark up images to do this today, *that* is the competition here. Or for people who care about fallbacks:

  <div data-src=huge-2k.jpg></div>
  <noscript><img src=medium-640.jpg alt=''></noscript>

vs:

  <img src=huge-2k.jpg alt='' defer>

And it will still be possible to actually disregard the @defer here, just like legacy user agents will do.  Not waiting for DCL at all.  E.g. Opera Mini would do that, or an aggressive caching proxy or a spider etc.  However, that should not be done for general user agents, as the whole reason for the feature is to give web page authors some control.

A common thing would be for the javascript to see that the screen is only 320x2 wide and in portrait, so it will instead load the portrait-medium-640.jpg image.
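
(Concretely, such a script could look like this - the breakpoint and file names are made up for the example:)

  <img src="huge-2k.jpg" alt="" defer>
  <script>
    document.addEventListener('DOMContentLoaded', function () {
      // Under this model the swap must happen inside a DCL handler;
      // once all DCL handlers have run, the UA is free to start fetching.
      var img = document.querySelector('img[defer]');
      if (matchMedia('(orientation: portrait) and (max-width: 320px)').matches) {
        img.src = 'portrait-medium-640.jpg';
      }
    });
  </script>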


(In reply to comment #38)
> > Yes, an always-lazyload would have to be a special option to the attribute
> > like defer=always-lazyload or javascript.  I think even an attribute is too
> > easy, because you'd get people not really considering why it is bad.
> > 
> > But I'm (as always) very much interested in this being possible.  Letting
> > Javascript regain some image loading control :-)
> 
> So, can we reconcile these use cases? I think we have conflicting
> requirements between stating that defer "is just a hint" and then turning
> around and mandating that images must wait until DCL.. and now we're talking
> about "always lazy load", entering viewport, etc. 

Always lazy load is not something I want in.  I was just saying that could be possible as an attribute, but I shouldn't say it because people then think that I consider it a good idea.  So no; don't do that (however, it is possible to do in js, that's the nature of this, but that will require determination).

The browser itself should be allowed to implement lazy-load on entering viewport though.  As a differentiating feature.  And there you have the quality of implementation saga...

> > (A) @defer is a strong hint that «yes, I want you to wait with this one, because
> > I want to handle it».
> 
> Now DCL fires, and the browser has a queue of images it wants to dispatch.
> How does it know when's a good time to do so? Your script may run anytime
> between DCL and onload - we have a race condition? And what logic would the
> client run here? Cancel the download by removing it from the DOM vs. remove
> defer attribute to trigger download? If the client doesn't do either, do we
> defer to UA to make a "good decision"?

When the handlers for DCL have fired, the browser is immediately allowed to do whatever it chooses.  I don't know where the "your script may run anytime between DCL and onload" came from.  No, it won't; it will run either before DCL or as one of the handlers of DCL.  So once all DCL handlers have run, it's too late.

This is just like transactions in IndexedDB.

> (B) How does lazyload work here? Would defer=lazyload simply defer the
> decision to the UA?
> 
> <aside>What about cost of reflows? These can be very expensive and jarring
> to the user. Does this mean all images should now have an explicit w/h?
> (reasonable, but can be painful)</aside>

I don't think we need an attribute.  It's up to the user agent to choose how to deal with the images after DCL.  Reflows are a pain point.  A user agent can choose to not lazyload images without an explicit w/h, or it can calculate the CSS and load.  Or lazyload and take the reflow hit.  This is a quality of implementation issue.
 
> (C) How does display:none work in light of (B) - can we say (C) and (B) are
> functionally equivalent? Aka, defer to UA.

I'm sceptical about waiting for CSS as well.  Waiting longer than DCL is bad enough.  I think this too can be a quality of implementation issue.  If the browser can check the CSS, and sees that this picture won't be needed after it has fired DCL, then, yes, of course it can not load that image.  It is totally allowed to not do that.  That's the beauty of this: with defer it's possible to *DO* stuff with images again, because the page can't rely on them actually being loaded.

The problem with <img> today is that it is required to be there, because web pages expect that.

(In reply to comment #39)
> I don't see the need to wait until DCL. We only need to wait until layout
> which is round-about when images start downloading anyway, so no big loss.

I don't really understand how you get to this conclusion. «Only waiting for layout» is always *on* OR *after* DCL.  Because you can't know what CSS you have to load until after DCL.  Remember that nasty <script> at the bottom of the page.  So having to potentially wait longer or best-case just as long as it'd take to allow javascript to run is too much IMHO.

Waiting for layout will always be as slow, or slower than DOMContentLoaded. :-)

> > <aside>What about cost of reflows? These can be very expensive and jarring
> > to the user. Does this mean all images should now have an explicit w/h?
> > (reasonable, but can be painful)</aside>
> 
> It's down to the developer to do what's best for the design. Eg, an image
> with position:absolute but no w/h only impacts the layout of itself once it
> learns its natural w/h. This isn't a new issue, this happens today.

Yeah, good point.

> > (C) How does display:none work in light of (B) - can we say (C) and (B) are
> > functionally equivalent? Aka, defer to UA.
> 
> Avoiding image downloads for display:none images is required to meet the
> use-case of the ticket. Also, if we had this today, making something like a
> webp polyfill would be trivial.

I disagree with that.  Allowing Javascript to stop the load is enough.  CSS doesn't really have to be taken into account.  And the same goes for a WebP polyfill: you only need a way to give Javascript some control over loading.


(In reply to comment #40)
> > Just a hint doesn't cut it, and asking for something more than that isn't a
> > turnaround: a hint wouldn't solve the problem as presented in the first post
> > of this ticket.
> 
> Right, and maybe therein is the root of our disagreement. I don't share the
> starry-eyed optimism that simply deferring all image downloads is a
> performance win. To the contrary, if abused, it'll lead to higher battery
> usage, and network and UX latencies. 

Deferring all images is not a performance win; I don't think anyone claimed that.  That's a vast oversimplification.  It can /potentially/ have /some/ positive performance sides (given enough smarts on the user agent implementation side), but in many cases it will just be much worse.  Even always-worse, depending on the website (and UA implementation).  But bad websites and web developers who do crazy things are hard to guard against, and I don't think a select few manhandling power tools should ruin it for those who really need this control.

> I think we can learn from our native counterparts in this respect.
>
> [ ... lots of text ... ]
> 
> In other words, we're looking at ~500ms+ between image entering viewport and
> image appearing on screen. Add to this the battery overhead, and you've got
> <2 FPS and no battery life left on the device. How is this good?

This is common knowledge today.  I'm sure everyone in this discussion is well aware of that.  Reading the discussion, we've talked about this many times.

> To be clear: there is a time and place for "lazyload when in viewport", but
> *it is not* the right default.

This is about making it possible.  It is not possible today.  And it doesn't matter if it's the default; a phone on a high-latency network won't use the default anyway.  Yes, it'll have to wait for DCL (like *everyone* did in the old days) and that's bad and all, but it will open up a world of possibilities for Javascript (some client-side innovation) and also untie browser innovation on image loading.  Currently you *have* to load all images, or sites break.

> Further, while above behavior is specific to 2/3/4G networks, the behavior
> and latencies can/is different for Wi-Fi and tethered. In some cases, more
> aggressive use of lazyloading will be beneficial. In others, if the
> latencies are high, we fallback to same concerns as above. Which is to say:
> the UA is the one in best position to make the decision as to when the image
> is best loaded.

Yes, exactly.  That is not the case today with <div data-src=2k.jpg>.  And also, today the UA is not *allowed* to decide when the image is best loaded (it can't actually _not load_ an <img> image if it so chooses).

Even if there are 100 images in the page, you have to load them all.

Yes, web developers can misuse this, and some will.  I always try to think how to make the abuse least likely.  E.g. having a MQ-based responsive image solution is a worm hole, because the media queries webdevs write in good faith are a contract between UA and webdev, and they often do not take everything into consideration.  It's the wrong place to solve problems.  Also, putting that burden on webdevs is not good: they will not consider different circumstances and can't plan for the future.

Having more of the smarts in the browser is thus a good thing.  However, having a totally open way to do lots of image hacks is good because we can leverage webdevs to come up with more flexible responsive loaders as well.  It's a hook at a low level, so that we don't need higher-level features to do the same for narrow use cases.

The common thing should be easy, the special case should be possible.  This is about making the special cases possible.

Coinciding with that, this newly-found «ye don't have to load yer images» that @defer would bring allows the user agent itself to also innovate (which we agree on).

> > Avoiding image downloads for display:none images is required to meet the
> > use-case of the ticket. Also, if we had this today, making something like a
> > webp polyfill would be trivial.
> 
> Fair enough, then I guess my position is that this ticket should be closed /
> marked as invalid. It'll do more harm than good, as far as performance is
> concerned. If the intent is to proceed on the premise of "we want to make it
> simpler" - ok - but let's then drop all pretense of doing this as a way to
> improve performance (and warn people upfront that it's an antipattern).

Yeah, having a warning can be done.  I think that's a very good idea.  It can have a bigass warning around it, saying it will hurt performance, and you shouldn't use this unless you understand the consequences.

It is a feature for the special cases.  And if say, one js responsive image solution really sails up as a big winner that many use, then it could be possible to take that into the UA for real (and then without the added impact of waiting for DCL and running JS).

And ofc all the other narrow stuff we haven't thought about.


(In reply to comment #41)
> @Ilya
> I agree that waking up the radio on cellular networks is bad for battery
> life & latency,
>
> [ ... ]
>
> OTOH, I think we can agree that downloading data that will *never* be used
> is also bad for battery life (and the user's data plan).
>
> Who is in a better position to calculate these heuristics than the browser? 

With the UA in control, it's also not an "all or nothing" decision. It can burst-download the first 20 deferred images, and learn how far down the next 80 are. So it has, say, 4-5 scroll-lengths of images already loaded (because it made sense to use the warmed up connection).  On scroll it can then suddenly start preloading the rest, hopefully before they come into view.

Also, we seem to be talking an awful lot about 2G/3G/4G and mobile. But there's more to the web than those.  I think it makes lots of sense for desktop browsers.

> I claim that the lazy load pattern (which is very commonly used) should move
> from JS hacks to the browser for several reasons. One of them is ease of
> development (it's easier to add an attribute than an entire lazy loading
> module), but more importantly because we need these kind of heuristics to
> get lazy loading working right. These heuristics can only happen *in the
> browser*. The browser can take page statistics, user behavior and the
> current network into account when making such heuristics. 

Important, concise point.


(In reply to comment #42)
> I think this overlooks an obvious point: if the probability is low, then
> that content shouldn't be there - you'll save markup bytes, as well as
> resource bytes. Instead, provide a menu, and fetch the content when the user
> requests it.

That would give a worse experience for legacy browsers.  Also, just because few people scroll down doesn't automatically mean that those who *do* scroll down are worth less; maybe they're exactly the people you're looking for.  It's also about content, which I'll write about further down.

In the blog case, most people visit it to see if there's anything new.  I think that is much of the reason why blogg.no implemented their lazy load scheme.  Having lots of "next, next, next" links or menus is awfully old fashioned, and makes me think of the old Gmail I got on Opera Mobile before, where even the messages were cut into different pages.  It was horrible.

Markup bytes vs. image bytes is by the way not even in the same league.  You're talking about extreme optimization.  Yoav's point was shaving off the considerable part (and almost for free).

> Why would we want to encourage mile-long pages which nobody ever sees?

If the probability truly is 0, then of course, there's no use.  But it won't be 0, it will be somewhere in the range between 1 and 0.

And often the text is the main content.  If you scroll down a newspaper, you can still see all the articles laid out, but if the image fades in a second after you scrolled down there it won't be the end of the world.

So you'd still want to have a super quick content scroll.  And scrolling is much nicer than having to press buttons in a menu.

> Smooth transitions to somehow "fix" low single digit FPS? That seems
> unlikely.

This is a Quality of Implementation issue. I don't understand what you mean by FPS here; a network load of an image and the decoding of it should not block your rendering thread.  That's just crazy, no one does that.  Everything will be smooth and nice; you will be missing a picture, but it will pop up when it is ready.  If that is 2 seconds after you scrolled down, then that's just how this is.  As long as you've set w+h, it can just fade itself in.

To show that this is a QoI issue;

- Chrome on Android doesn't have to do it that way - it can preload
  everything (after DCL, that is).
- Chromium on Linux might load 50%, and then load the rest on scroll.
- Internet Explorer on a desktop might want to do a full defer and
  fade-in when needed based on the current scroll speed, and statistics
  on how long the previous images took - and load them if they get it
  before it hits the viewport. If they are a little bit late, they will
  also put in a 100ms fade-in on the picture.
  To reduce reflow problems, they can choose to only lazyload images
  where they know what the size would be (so either w/h explicit), or
  a reasonable guess based on the layout they already have rendered.

> > b. Deploy smart heuristics once the user starts scrolling, to try and
> > download the images for a certain viewport *before* the user reaches it.
> 
> This does very little to address battery or network latency concerns. You
> will have to incur the control + user-plane latency costs (500ms), and burn
> the battery for the next 10-20s.

What?  The implementation and the "smart heuristics" (if any) are up to the UA to do and figure out for their use.  If you are concerned about battery and network latency (so, being a mobile browser or knowing so from stats/Network Manager) the implementation could choose to load everything after it has run DCL.

The first simple implementation *will* do that, but it will be possible to do other things when you're on a super fast low-latency, but low-bandwidth connection on your desktop.  Or however you'd like to innovate.

If it makes no sense for your mobile browser, maybe Opera or Microsoft wants to experiment with an implementation that gives better user experience on their mobile browsers.  Maybe someone will find a good algorithm that does that, if not, there's still the desktop browsers left which often find themselves on lower latency connections.

Or even forgoing that (because it is not the primary problem); it will still, even if it does not help in any other way, make it possible to control image loading from javascript with a nice fallback (and simple code) and allow the browser to /potentially/ defer forever.  Which is the core problem here (with the rest being very nice added-on extras).

> If we treat defer as a hint, then we can get the best of all worlds:
> - in cases where it makes sense, on demand loading (viewport or otherwise)
> - in cases where it does not, let the UA fallback to prefetching when
> necessary

But we won't be able to use this as a RWD image solution.  Or a WebP fallback.  Or as a better method of doing cheapskate website-developer image super-lazy-loading (that they still will have to implement using Javascript, but the image tags can stay <img> instead of <div>).  Having JS turned off will give you the images, instead of the way it is now.

> Further, deferring this decision to the UA also allows user input: some users
> may opt-in to use aggressive lazy-loading to save bytes (at cost of
> battery), whereas others may want fast and smooth scrolling (at cost of
> bytes).

It would also be possible to have a preference overriding the DCL-wait for @defer images.  Saying «I don't care AT ALL about space savings, my own or the server owner's, and if some Javascript changes the image at DCL-time, just download that one _as well_ as the one I just downloaded».  But that should never ever be a default in a normal user agent.  And of course, the prefetch should be silent then as well, not exposing that it already has the image loaded before the DCL listeners have run.

> (in other words, I don't think I'm all that far from what you're looking
> for...)

Well, except for taking away the core point, no, not that different ;-) :D


So there are two points to this:

1) Allow user agent to defer <img>s possibly forever
2) Allow web author to change <img> source before UA does the loading (effectively don't prefetch)

Maybe they could indeed be separated.  @defer doing #1, and @defer='force' doing #2.  But number two has to be possible. :)  Also, it doesn't have to be called defer since we're going there already; lazy and lazy='force', lazy='defer' or lazy='wait'.  Or even @defer for #1 and @no-prefetch for #2.

I can see why you would want to have #1 without being restricted to #2.  I started with trying to find a solution to lots of cases that could be distilled to doing #2, and #1 just came as a result of what #2 would allow.
Comment 46 Ilya Grigorik 2013-04-13 23:37:05 UTC
(In reply to comment #43)
> My understanding, from what I've seen and heard from our customers, is that the
> bare minimum they need is:
> 1. Do not load images that are hidden with display:none
> 2. Do not load images that are below the fold AT LEAST until the page is
> loaded 

sgtm, with minor clarification: I assume #2 is implicitly referring to "deferred" images.

(In reply to comment #44)
> The proposal gives the browser the choice, it may defer until in-viewport,
> that's the latest it can defer. However, it can load much earlier than that.
> The earliest it can load the image is when it's in the document & gets
> layout (isn't calculated display:none).
> 
> For low latency situations where there's no radio wakeup penalty, the
> browser may defer images as long as possible. For high latency situations
> where radio wakeup is costly, the browser may load images much earlier.

sgtm.

(In reply to comment #45)
> A common thing would be for the javascript to see that the screen is only
> 320x2 wide and in portrait, so it will instead load the
> portrait-medium-640.jpg image.

Perhaps this is too far down into the weeds, but how is this any different from running JS on DCL and dynamically injecting an <img> tag based on screen dimensions? Or put differently, how would I go about canceling / swapping out a deferred image from JS land when my DCL handler is running? Aren't these two functionally equivalent?

I see the benefit of deferring the load/no-load decision until CSS is available (aka, taking into account display:none), but it seems like we don't win anything new from JS land?

> Always lazy load is not something I want in.  I was just saying that could
> be possible as an attribute, but I shouldn't say it because people then
> think that I consider it a good idea.  So no; don't do that (however, it is
> possible to do in js, that's the nature of this, but that will require
> determination).
> 
> The browser itself should be allowed to implement lazy-load on entering
> viewport though.  As a differentiating feature.  And there you have the
> quality of implementation saga...

+1

> > <aside>What about cost of reflows? These can be very expensive and jarring
> > to the user. Does this mean all images should now have an explicit w/h?
> > (reasonable, but can be painful)</aside>
> 
> I don't think we need an attribute.  It's up to the user agent to choose how
> to deal with the images after DCL.  Reflows are a pain point.  A user agent
> can choose to not lazyload images without an explicit w/h, or it can
> calculate the CSS and load.  Or lazyload and take the reflow hit.  This is a
> quality of implementation issue.

Fair enough.

> > (C) How does display:none work in light of (B) - can we say (C) and (B) are
> > functionally equivalent? Aka, defer to UA.
> 
> I'm sceptical about waiting for CSS as well.  Waiting longer than DCL is bad
> enough.  I think this too can be a quality of implementation issue.  If the
> browser can check the CSS, and sees that this picture won't be needed after
> it has fired DCL, then, yes, of course it can not load that image.  It is
> totally allowed to not do that.

+1

> To show that this is a QoI issue;
> 
> - Chrome on Android doesn't have to do it that way - it can preload
>   everything (after DCL, that is).
> - Chromium on Linux might load 50%, and then load the rest on scroll.
> - Internet Explorer on a desktop might want to do a full defer and
>   fade-in when needed based on the current scroll speed, and statistics
>   on how long the previous images took - and load them if they get it
>   before it hits the viewport. If they are a little bit late, they will
>   also put in a 100ms fade-in on the picture.
>   To reduce reflow problems, they can choose to only lazyload images
>   where they know what the size would be (so either w/h explicit), or
>   a reasonable guess based on the layout they already have rendered.

Yup.

> > If we treat defer as a hint, then we can get the best of all worlds:
> > - in cases where it makes sense, on demand loading (viewport or otherwise)
> > - in cases where it does not, let the UA fallback to prefetching when
> > necessary
> 
> But we won't be able to use this as a RWD image solution.  Or a WebP
> fallback.  Or as a better method of doing cheapskate website-developer image
> super-lazy-loading (that they still will have to implement using Javascript,
> but the image tags can stay <img> instead of <div>).  Having JS turned off
> will give you the images, instead of the way it is now.

Yes, fair points. I think we're actually on the same page - perhaps I should be more precise when I say "as a hint". What I mean is that we should let UAs design and decide their own adaptive strategies for when and how to load these deferred images, *after* the DCL event fires, etc.
 

> So there are two points to this:
> 1) Allow user agent to defer <img>s possibly forever
> 2) Allow web author to change <img> source before UA does the loading
> (effectively don't prefetch)
> 
> Maybe they could indeed be separated.  @defer doing #1, and @defer='force'
> doing #2.  But number two has to be possible. :)  Also, it doesn't have to
> be called defer since we're going there already; lazy and lazy='force',
> lazy='defer' or lazy='wait'.  Or even @defer for #1 and @no-prefetch for #2.
> 
> I can see why you would want to have #1 without being restricted to #2.  I
> started with trying to find a solution to lots of cases that could be
> distilled to doing #2, and #1 just came as a result of what #2 would allow.

Ok, in an attempt to reconcile all of this in my head.. this is how I understand it today: if an <img> is marked with "defer", then the resource request should be delayed until the IMG element is part of the DOM, and part of the page layout as calculated by CSS. As a result, defer'ed images must wait for CSS; images must not be display:none; defer'ed image requests must wait for DCL before being dispatched (due to having to wait for CSS).

(So, really, it just boils down: wait for, and don't download display: none; .. heh :-))

From there, QoI... Different UA's may end up with different strategies for how to optimally load the defer'ed images following the DCL event: load none, load as they come into the viewport, prefetch N screen-folds, fetch all, etc. Some flags could also be exposed to the user to specify which strategy they want.

(In reply to comment #43)
> My understanding, from what I've seen and heard from our customers, is that the
> bare minimum they need is:
> 1. Do not load images that are hidden with display:none
> 2. Do not load images that are below the fold AT LEAST until the page is
> loaded 

(coming back to this...) I believe both are satisfied. For #2, if below-the-fold images are marked with defer, then that gives implicit priority to AFT content, images included.
Comment 47 Odin Hørthe Omdal 2013-04-14 01:14:04 UTC
(In reply to comment #46)
> (In reply to comment #45)
> > A common thing would be for the javascript to see that the screen is only
> > 320x2 wide and in portrait, so it will instead load the
> > portrait-medium-640.jpg image.
> 
> Perhaps this is too far down into the weeds, but how is this any different
> from running JS on DCL and dynamically injecting an <img> tag based on
> screen dimensions? Or put differently, how would I go about canceling /
> swapping out a defered image from JS land when my DCL handler is running?
> Aren't these two functionally equivalent?

Not sure I understand the question, but trying:  The difference is that one will function on legacy browsers (and keeps the content <img> in the HTML), and the other one will only function with JS on.  This is about having a fallback (and not letting that fallback be loaded by supporting UA's).


> I see the benefit for defering the load/no-load decision until CSS is
> available (aka, taking into account display:none), but it seems like we
> don't win anything new from JS land?

Nah, nothing new.  It's what you _don't_ get by doing DCL instead of layout wait; potentially more waiting.  Take a look at this waterfall:

            DOMContentLoaded
                    v              Layout ready
[ --- index.htm --- ]                   v
    [ style.css ]
       [ menu.png ]
                   [ -- factbook.css -- ]

Since factbook.css was found right at the end of DOM processing, in this scenario we'd have to wait longer for layout than we would for DCL.

If factbook.css was /not/ loaded, we would still have to wait for DCL (and not style.css finishing) because we're not sure if a new CSS comes along before the document is loaded.


I think the "wait for layout" arguments are nice, so I'm not against it.  It's just that I think the sweet spot is when Javascript gets its say.  Because when you have control with Javascript, you can (on your own) wait for layout too if you would like that.

Also, saying DCL is less restrictive: it lets a UA either implement "wait till layout ready" or just go at DCL, while saying "wait till layout ready" doesn't allow you to go at DCL if you want to.

So my thinking is not more sinister than that ;-)

> Ok, in an attempt to reconcile all of this in my head.. this is how I
> understand it today: if an <img> is marked with "defer", then the resource
> request should be delayed until IMG element is part of the DOM, and part of
> the page layout as calculated by CSS. As a result, defer'ed images must wait
> for CSS; images must not be display:none; defer'ed image requests must wait
> for DCL before being dispatched (due to having to wait for CSS).
> 
> (So, really, it just boils down: wait for, and don't download display: none;
> .. heh :-))

Correct.  Here you said "part of page layout calculated" though, and that's more in the "only wait for DCL" camp than in the "wait for layout" camp (which would need all CSS files to be loaded).

> From there, QoI... Different UA's may end up with different strategies for
> how to optimally load the defer'ed images following the DCL event: load
> none, load as comes into viewport, prefetch N screen-folds, fetch all, etc..
> Some flags could also be exposed to the user to specify which strategy it
> wants.

Yup! :)
Comment 48 Jake Archibald 2013-04-14 16:40:57 UTC
(In reply to comment #45)
>   *It is not about performance.*

I want to solve both. My proposal allows the browser to lazy-load, but it does not force it to do so when loading all at once is better (to avoid waking the radio up).

It also allows JS to get at an image before the original src downloads, by hiding the image with CSS.

> (In reply to comment #39)
> > I don't see the need to wait until DCL. We only need to wait until layout
> > which is round-about when images start downloading anyway, so no big loss.
> 
> I don't really understand how you get to this conclusion. «Only waiting for
> layout» is always *on* OR *after* DCL.

This isn't true, progressive rendering is proof of this.

Go to http://www.whatwg.org/specs/web-apps/current-work/, note how you can see content while the browser is still downloading and parsing HTML. Naturally, the browser had to do a layout prior to painting. DCL will fire after all the HTML has been downloaded and parsed, there will have been many layouts by the time this happens.

See http://stevesouders.com/hpws/js-bottom.php - in this example DCL fires a good 8 seconds later than the first layout.

> Because you can't know what CSS you
> have to load until after DCL.

The browser knows it needs to download CSS as soon as it sees <link rel="stylesheet" href="...">

> Remember that nasty <script> at the bottom of
> the page.

Yes, it blocks DCL, it doesn't block layout and rendering prior to it. This is why layout is a better trigger than DCL.

We recommend putting scripts at the bottom of the page not to speed up DCL (it doesn't) but to allow layout & rendering to happen before the script loads.

> Waiting for layout will always be as slow, or slower than DOMContentLoaded.
> :-)

Nope, DCL firing before layout is an edge case.
Comment 49 Jake Archibald 2013-04-14 17:16:54 UTC
There's been a lot of railroading and poor assumptions in this thread so far. I'd like to reiterate the proposal and the valid pros & cons so we can get back on track:

Images with 'defer' MUST NOT download while they are not in a document, or their calculated 'display' style property is 'none'.

The download state of images with 'defer' MUST NOT delay the 'load' event of the window.

Images with 'defer' MAY download once they are in the document, and their calculated 'display' style property is not 'none'.

Images with 'defer' MUST download once they are in the document, and their calculated 'display' style property is not 'none', and any part of the image would be visible within the viewport. If visibility within the viewport is uncertain (due to element size determined by image dimensions), the image MUST download.
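
(To make the rules concrete - an invented example, with the applicable rule in comments; the .defer IDL property is an assumption:)

  <img src="chart.png" defer>                        <!-- in document, visible: MUST download once in the viewport -->
  <img src="hidden.png" defer style="display: none"> <!-- calculated display:none: MUST NOT download -->
  <script>
    var img = new Image();
    img.defer = true;         // hypothetical IDL reflection of the attribute
    img.src = 'detached.png'; // not in a document: MUST NOT download
  </script>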

PROS:
* Imagery in hidden elements that may never be seen by the user isn't downloaded
* JS can modify the src of images before their original src downloads, by initially hiding the image with CSS
* The browser gets more freedom in regards to image downloading, allowing it to do the best thing given the user's connection type & interaction. Eg:
** If making a connection is cheap & quick (eg, fast wifi), the browser may delay downloading imagery until it's in the viewport, which can avoid unnecessary downloads way below the fold
** If making a connection is expensive & slow (eg, 3g), the browser may download all imagery that isn't calculated display:none to avoid waking the radio up later and using up battery

CONS:
* When creating images with js, you'd need to set .defer to true before setting .src, else the image would download straight away

I don't see this as a big problem; devs know the ordering matters - you already have to set src after adding load event handlers.
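
(E.g. - again assuming 'defer' gets a reflecting .defer IDL property:)

  var img = new Image();
  img.defer = true;           // set this first...
  img.src = 'gallery-42.jpg'; // ...or this line would start the fetch
  document.body.appendChild(img);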

* Deferred images cannot be downloaded early by preloaders unless they calculate page styles, and images at the top of the page may download later, as they need to wait for style calculation to know whether they should download or not

Not a big deal. Images are low priority downloads anyway compared to JS & CSS so waiting for layout isn't a problem. Also, primary imagery shouldn't use this attribute.

* Firing the load event before all images download may skew metrics

By putting 'defer' on an image you're saying it isn't required to consider the page loaded. I don't think this is an issue.

* Lazy-loading isn't good on mobile connections

The browser isn't required to lazy-load. It can make the choice based on the connection, which is far better than the hacky JavaScript developers are using to currently perform lazy-loading.
Comment 50 Guy Podjarny 2013-04-14 18:21:30 UTC
(In reply to comment #49)
The summary looks good to me.

> The browser isn't required to lazy-load. It can make the choice based on the
> connection, which is far better than the hacky JavaScript developers are
> using to currently perform lazy-loading.

I may be nitpicking, but given this definition is the term "defer" the right one for the attribute? AFAIK defer always means "load later", as opposed to "may never load". Wouldn't the term "lazy" be more properly descriptive? 

I know some browsers may burst download, but I find it more intuitive to accept that some browser may "prefetch" images marked for lazy loading than to accept browsers may never download a deferred image.

To be perfectly clear - I'm not disagreeing with the defined behavior, just challenging the name.
Comment 51 Jake Archibald 2013-04-14 19:24:42 UTC
(In reply to comment #50)
> To be perfectly clear - I'm not disagreeing with the defined behavior, just
> challenging the name.

Hmm, agreed. Any suggestions? "lazy"?
Comment 52 Guy Podjarny 2013-04-14 20:41:18 UTC
(In reply to comment #51)
> (In reply to comment #50)
> > To be perfectly clear - I'm not disagreeing with the defined behavior, just
> > challenging the name.
> 
> Hmm, agreed. Any suggestions? "lazy"?

I considered "on-demand", "late" and "jit" (just in time), but I think "lazy" is the most descriptive, as the browser would do what's easiest (i.e. fastest) when loading the images (as long as it adheres to what it MUST do)
Comment 53 Ilya Grigorik 2013-04-14 21:44:02 UTC
(In reply to comment #47)
> Not sure I understand the question, but trying:  The difference is that one
> will function on legacy browsers (and keeps the content <img> in the HTML),
> and the other one will only function with JS on.  This is about having a
> fallback (and not letting that fallback be loaded by supporting UA's).

Ah, ok - thanks for the clarification. So, practically speaking, because I have to build my site for lowest common denominator (aka, browser that does not understand "defer"), I should use <img>'s which are optimized for those browsers (likely older hardware / DPI, etc), but in the newer browsers, the image load is delayed and I can intercept it and swap out the src's with high-DPI, etc? 

It's still a pretty involved page construction process... But I can see the benefit over inert divs and data-* attributes.

> Images with 'defer' MUST NOT download while they are not in a document, or
> their calculated 'display' style property is 'none'.
> 
> The download state of images with 'defer' MUST NOT delay the 'load' event of
> the window.
> 
> Images with 'defer' MAY download once they are in the document, and their
> calculated 'display' style property is not 'none'.
> 
> Images with 'defer' MUST download once they are in the document, and their
> calculated 'display' style property is not 'none', and any part of the image
> would be visible within the viewport. If visibility within the viewport is
> uncertain (due to element size determined by image dimensions), the image
> MUST download.

Yup, I think that's consistent with everything discussed - thanks for pulling it together. 

a) +1 for lazy. 
b) the delay of load is a good one to discuss on public-web-perf, as this could affect a lot of existing tools and assumptions. Not saying it shouldn't be done, but at the very least, everyone should be aware of this.
Comment 54 Ian 'Hixie' Hickson 2013-04-15 18:30:23 UTC
(I haven't yet read comments 13-53 — if the participants in that discussion could provide a summary of that discussion, that would be very convenient! Otherwise, I'll read it, don't worry, but it's like a novel by this point!)

The main thing blocking this bug now is a clear indication of implementor interest from browser vendors.
Comment 55 Odin Hørthe Omdal 2013-04-16 08:25:26 UTC
(In reply to comment #54)
> (I haven't yet read comments 13-53 — if the participants in that discussion
> could provide a summary of that discussion, that would be very convenient!
> Otherwise, I'll read it, don't worry, but it's like a novel by this point!)

I have clearly shown I'm not fit to write TL;DRs.  But I think we are close to a point of agreement about what this bug is about, and it would then be easy to write a summary.  Comment 49 <https://www.w3.org/Bugs/Public/show_bug.cgi?id=17842#c49> from Jake plus the answers to my questions below might be enough.  They are about how long the browser /must/ delay the image load.

That comment crossed in mid-flight with my reply to Ilya, and I didn't notice that I had missed it until now.




(In reply to comment #48)
> > (In reply to comment #39)
> > > I don't see the need to wait until DCL. We only need to wait until layout
> > > which is round-about when images start downloading anyway, so no big loss.
> > 
> > I don't really understand how you get to this conclusion. «Only waiting for
> > layout» is always *on* OR *after* DCL.
> 
> [...] DCL will fire
> after all the HTML has been downloaded and parsed, there will have been many
> layouts by the time this happens.

Okay, so the words we've used are too vague then, I've read «waiting for layout» as «waiting for final layout». I never thought that you meant only waiting until you have /some/ partial layout.  But that really feels dangerous.  The <style ...> that has the main layout might be the one to load first in almost every case, but then suddenly not.

I see this working well when developing and testing, but then suddenly not working all the time in production on other types of connections.  A path lined with gotchas, where the only real fix is to have the style in-page, or only a single linked CSS file.

Even with a single linked CSS file; on a bigger site someone adds a Social Media box which puts in another CSS file.  Since that one is already cached the normal CSS used for loading is not in the first partial layout anymore and the pictures are not hidden and thus load.  So suddenly the site is not working as it used to.  I think that behavior will be very surprising.


That said, I might be over-dramatizing.  I have not thought through loading on first partial layout, because the uncertainty and racy "randomness" of it put me off.  And for some reason I failed to understand that *that* was the wait that was suggested.

The JavaScript story is uglier because you'd need to detect JS, then add .has-js and hide images with .has-js .defered { display:none }, but I do see all the positive sides.  I'm just not convinced that we can so easily overlook the potential problems of doing it.

Ready to be convinced though :-)
Comment 56 Jake Archibald 2013-04-16 14:21:06 UTC
(In reply to comment #55)
> > [...] DCL will fire
> > after all the HTML has been downloaded and parsed, there will have been many
> > layouts by the time this happens.
> 
> Okay, so the words we've used are too vague then, I've read «waiting for
> layout» as «waiting for final layout».

I'm not sure there's any such thing. Hover effects, JavaScript, CSS animation, browser window resize and much more cause layouts through the life of the page.

> I never thought that you meant only
> waiting until you have /some/ partial layout.  But that really feels
> dangerous.  The <style ...> that has the main layout might be the one to
> load first in almost every case, but then suddenly not.

If you have 2 CSS files defined in your markup, subsequent layouts will be blocked on their requests.

The current system already works this way for CSS backgrounds; what I'm proposing is basically that plus viewport-sensitive download.
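
For readers less familiar with that behaviour: a background image referenced only by a rule that matches no element is generally not fetched until the rule applies, which is the existing pattern being alluded to (class names invented):

/* photo.jpg is not requested until something actually matches this rule */
.show-photos .photo {
  background-image: url(photo.jpg);
}

/* later, from script: document.documentElement.className += ' show-photos' */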

> I see this working well when developing and testing, but then suddenly not
> working all the time in production on other types of connections.

Do you still think this given how CSS downloads block layout? If so, can you provide an example?

> Even with a single linked CSS file; on a bigger site someone adds a Social
> Media box which puts in another CSS file.  Since that one is already cached
> the normal CSS used for loading is not in the first partial layout anymore
> and the pictures are not hidden and thus load.  So suddenly the site is not
> working as it used to.  I think that behavior wil be very surprising.

It'll only be surprising to people who are surprised by the current behaviour of CSS backgrounds. I don't see that as an issue.

> That said, I might be over-dramatizing.  I have not though through loading
> on first partial layout, because of the uncertainty and racy "randomness" of
> it put me off it.

It's well specced and relied upon behaviour. I don't think it's random. It can get racy with dynamically added CSS, but if you're doing that you already have to work around FOUC and other racy things.

> The Javascript story is uglier because you'd need to check for js, and then
> add .has-js and hide images with  .has-js .defered { display:none }

Not a big deal:
<script>document.documentElement.className+=' js'</script>

Many sites do this already & it's part of libraries like Modernizr.
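
Putting the two halves together, the no-JS-safe pattern being described would look roughly like this (a sketch; class and attribute names as used above):

<script>document.documentElement.className += ' js'</script>
<style>
  /* Hide deferred images only when script is around to reveal them,
     so users without JavaScript still get the images. */
  .js img[defer] { display: none; }
</style>
<img defer src="photo.jpg" alt="">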
Comment 57 ben 2013-04-19 01:12:58 UTC
I have two big problems with #49. I speak as someone who has wrestled hard with this issue, and also as someone who has actually read this entire thread :O

1) It requires the extra authoring step of hiding the image with "display:none" css in addition to adding the "defer" attribute. This feels redundant and error-prone. Here is the authoring pattern I see this giving rise to: developers will start their css with attribute selectors, just to block loading of deferred assets:

img[defer] {display: none}

and then add a class to the html element on dom ready, after some src-related script has executed, to kick off loading again:

.html-domready img[defer] {display: block}

Is this what we want? This has the weird effect of coupling css and loading MORE deeply, rather than LESS. One way to simplify this might be to somehow treat all "defer" elements as "display: none" automatically prior to page load. At a minimum, this needs refinement.
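
The dom-ready step that "kicks off loading again" would presumably be something like this (a sketch matching the selectors above):

document.addEventListener('DOMContentLoaded', function () {
  // Adding the class flips the deferred images to display:block,
  // which under the pattern being criticised triggers their download.
  document.documentElement.className += ' html-domready';
});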

2) Firing the "load" event when images IN MARKUP (and not altered or removed) aren't actually loaded seems liable to break lots of things that rely on the traditional semantics of "load". What other event do we have to listen for to tell us when images in markup have loaded? This gives me serious pause--I think you're opening up a whole new area of problems here.

I think the focus of the discussion here needs to be on the pre-parser, specifically on the fact that there is currently no way to conditionalize prefetching behavior. This thread, long as it is, desperately needs more real-world examples. As a demonstration of the nature of the problem, and as an additional gauge of developer interest, here are just three real world projects addressing this area but not discussed so far:

a) Matt Wilcox's Adaptive Images, which uses a cookie and PHP to control the resolution that the server supplies images at. Limitations are that it's PHP only, javascript-dependent, and won't work for other assets (like video). Matt has studied this problem in depth. He has banged his head against the pre-parser probably harder than anyone. Please get his thoughts before treating this issue as remotely settled.

b) Mobify's Capturing, which uses a plaintext hack at the very top of the page to force the browser not to parse the page at all until some conditional logic has executed. Problems are that it pollutes the very beginning of the page hideously, is javascript-dependent, and uses a deprecated plaintext element with an uncertain future. Lastly, as Steve Souders has recently noted, Capturing actually causes a slight performance penalty because the topmost script is blocking. It's brilliantly written, but for at least some users, it's actually likely to be a baby-step in the wrong direction.

c) David Mark Clements' Respondu, which uses noscript tags (with odd dashes in the closing tag) to hide content from the pre-parser. Problems are that the syntax is oddball, and that it verbosely requires a unique separate block and noscript tag for each element or set of elements the developer wants to hide from the pre-parser. It also doesn't seem to be maintained :( (To its credit, this solution does fall back elegantly when javascript is disabled.)

These use cases are all fundamentally different from the (to my eye tangential) "move lazy load into the browser" discussion that has taken off here. That's a goal of much larger scope. (I personally find typical lazy loading behaviors to be terrible UX. I would way rather load something I never look at than have the page stutter and look at empty placeholders every time I scroll.) Preloading is good, just not in every scenario across-the-board. In retrospect, the lookahead pre-parser was implemented without any consideration for the occasional but significant cases in which it is vastly more performant to have it turned off.

So here's my stab at a proposal:

Assets with the attribute prefetch="off" will be requested last, after all other assets in markup and after all images applied by css.

If the srcs of those elements are altered or removed by script, the requests will be cancelled immediately. This doesn't happen reliably now, or at least the overhead on the pipe has a real performance cost as far as I can tell; hence the need for the three plug-ins/libraries above.

If this doesn't work out for some reason, then I'm perfectly content to wait for DOMContentLoaded before requesting the "no prefetch" assets. Sorry, performance guys, but in some specific cases it's WAY better than what we have now. Sorry for the long post.
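
As a sketch, the markup for this proposal would presumably be (the attribute is the commenter's invention, not an existing one):

<!-- requested last, after all other markup assets and CSS images -->
<img prefetch="off" src="below-the-fold.jpg" alt="">

<script>
// Removing or changing the src before the deferred request fires
// would cancel the request outright under this proposal.
document.querySelector('img[prefetch="off"]').removeAttribute('src');
</script>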
Comment 58 Marcos Caceres 2013-09-06 14:31:00 UTC
In [1] you asked for implementer interest - Mozilla is interested in this. 

If you are going to put the feature into the WHATWG version, as you suggest in [1], then is it worth reviewing the WebPerf spec? 

[1] http://lists.w3.org/Archives/Public/public-web-perf/2013Apr/0076.html
Comment 59 Ian 'Hixie' Hickson 2014-04-10 22:03:52 UTC
I'm not really sure what the status of this stuff is in public-web-perf. I've been getting mixed messages about whether they want HTML to do it or not.
Comment 60 Michael[tm] Smith 2014-04-10 22:16:18 UTC
(In reply to Ian 'Hixie' Hickson from comment #59)
> I'm not really sure what the status of this stuff is in public-web-perf.
> I've been getting mixed messages about whether they want HTML to do it or
> not.

It's worth noting that the Resource Priorities spec in public-web-perf now covers a lot more than just <img>. IMHO it would be better to just constrain the solution to <img> first, since that's the only problem case that anybody has been describing so far. And it makes more sense for it to be specified in the HTML spec itself instead of in some other bolt-on spec somewhere else.
Comment 61 Aaron Gustafson 2014-12-08 16:20:35 UTC
> I claim that the lazy load pattern (which is very commonly used) should move from 
> JS hacks to the browser for several reasons. One of them is ease of development 
> (it's easier to add an attribute than an entire lazy loading module), but more
> importantly because we need these kind of heuristics to get lazy loading working
> right. These heuristics can only happen *in the browser*. The browser can take page
> statistics, user behavior and the current network into account when making such
> heuristics. 

100% agree with this. The UA is the smartest actor here. I also like the idea of declarative hints rather than JS hackery. It’s very similar to where we ended up with `srcset`.
Comment 62 Domenic Denicola 2017-07-01 19:07:02 UTC
Although there is good information here, GitHub is friendlier for new contributors, so let's move this to the duplicate issue opened over on GitHub: https://github.com/whatwg/html/issues/2806.