TPAC Breakout: HDR

13 September 2023


ada, Chris_Needham, dbaron, fserb, Nigel_Megitt, Riju, svgeesus, Xiaohan

Meeting minutes

ccameron: we want to solve two problems: images and video, make sure it looks good and predictable across browsers
… the second problem is enabling dynamic HDR content, with canvas
… Are there problems? Yes
… HDR media can look unpredictable and jarring
… For HDR canvas it doesn't exist yet
… Goal is to converge on the approach; some things are blocked, so can we unblock them?
… Slide shows the set of things we need to solve
… HDR definition: A display is HDR if it can draw brighter than white, #FFF in CSS
… Quantify the HDR headroom, how many times greater than white can it display?
… It varies across devices, and with the display brightness, ambient light
… It's queryable on all operating systems

<svgeesus> greater in absolute luminance, I assume; and also peak not full-screen luminance

ccameron: There is a dynamic range media query, returning "standard" or "high"
… In terms of output for HDR content, there's lots of discussion of nits, but devices don't do nits, they do pixel values
… SDR is range 0 to 1, then extended headroom

ChrisL: Does 2X mean in linear light?

ccameron: Yes
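The headroom definition above can be made concrete: with SDR white pinned to 1.0 in linear light, headroom is the ratio of the display's current peak luminance to its SDR white luminance. A minimal sketch; the 203-nit reference white is the BT.2408 convention, and real numbers come from the OS and vary with display brightness and ambient light.

```javascript
// Sketch: HDR headroom as "how many times brighter than SDR white",
// in linear light. Values are illustrative; real numbers come from the OS
// and change with display brightness and ambient light.
function hdrHeadroom(peakLuminanceNits, sdrWhiteNits) {
  return peakLuminanceNits / sdrWhiteNits;
}

// A display peaking at 812 nits with SDR white at 203 nits has 4x headroom:
// pixel values 0..1 are SDR, and extended values up to 4.0 are displayable.
console.log(hdrHeadroom(812, 203)); // 4
```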

??: As someone who encodes for a specific number of nits, how should I think about this?

ccameron: With PQ in particular, you can write a video frame at 400 nits, but that's on a reference monitor in a reference environment
… That doesn't apply so much on the web

ccameron: Convert to the pixel space, is the rough trajectory. Everything goes into the SDR relative floating point space
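The "convert to the pixel space" step can be sketched with the PQ EOTF from SMPTE ST 2084: decode the PQ signal to absolute nits, then divide by a reference white to land in the SDR-relative space where 1.0 is SDR white. The 203-nit reference white (BT.2408) is an assumption here, not something the proposal fixes.

```javascript
// Sketch: PQ signal (0..1) -> absolute luminance in nits (SMPTE ST 2084 EOTF),
// then -> SDR-relative value where 1.0 is SDR white.
const m1 = 2610 / 16384;        // 0.1593017578125
const m2 = (2523 / 4096) * 128; // 78.84375
const c1 = 3424 / 4096;         // 0.8359375
const c2 = (2413 / 4096) * 32;  // 18.8515625
const c3 = (2392 / 4096) * 32;  // 18.6875

function pqToNits(signal) {
  const e = Math.pow(signal, 1 / m2);
  const num = Math.max(e - c1, 0);
  const den = c2 - c3 * e;
  return 10000 * Math.pow(num / den, 1 / m1);
}

// Assumed reference white of 203 nits (BT.2408); 1.0 means "SDR white".
function pqToSdrRelative(signal, referenceWhiteNits = 203) {
  return pqToNits(signal) / referenceWhiteNits;
}

console.log(pqToNits(1.0));         // 10000 (PQ peak)
console.log(pqToSdrRelative(0.58)); // ~0.99 (about 203 nits, roughly SDR white)
```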

NigelM: Is this definition in the proposal, or is it scene setting?

ccameron: This is what I want to do for the web, a proposal, aligning with what's in the desktop computing environment, so the broadcast concepts are brought into this world

myles: I think it's the right terminology

Timo: So you're converting; you need a conversion that maintains the intent, which is difficult

ccameron: We'll need to address that.
… There's an ISO work item, on gainmap images
… They have a defined rendering into the SDR space
… HDR images, all browsers do something a bit different. What are the problems?
… First is the rendering is undefined, and the other is it's unclear whether to serve SDR or HDR
… There's a spec for rendering HLG and PQ, many specs... end result is some number of nits
… More concretely, if I have an HLG or PQ image, draw it to a canvas, and read back, we need to agree on the result; we don't want it to be non-deterministic

<svgeesus> whatwg/html#9112

ccameron: In ISO we're working on a non-normative recommendation for converting HLG and PQ to SDR pixels for use in SDR colour managed ecosystem
… Then we point web specs at it, and write WPT tests
… So you get the same thing on every browser
… Second thing: HLG/PQ image in an img tag on an HDR monitor, what do I see?
… Want that to be a function of HDR headroom as well
… In discussion on HDR spec, want to also define conversion to full range, and interpolate based on HDR headroom
… Second issue is HDR can be too bright
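One way to read the "interpolate based on HDR headroom" idea above: define both an SDR rendition and a full-range rendition of the image, then blend between them as headroom grows. The blend function below is purely illustrative; the actual interpolation would be defined by the spec work described here.

```javascript
// Illustrative only: blend between an SDR rendition and a full-HDR rendition
// of a pixel value as a function of available HDR headroom. The real
// interpolation curve would come from the HLG/PQ rendering spec.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

function renderForHeadroom(sdrValue, hdrValue, headroom, maxHeadroom = 4) {
  // No headroom -> pure SDR rendition; full headroom -> full HDR rendition.
  const t = Math.min(Math.max((headroom - 1) / (maxHeadroom - 1), 0), 1);
  return lerp(sdrValue, hdrValue, t);
}

console.log(renderForHeadroom(1.0, 3.0, 1)); // 1 (SDR-only display)
console.log(renderForHeadroom(1.0, 3.0, 4)); // 3 (full headroom)
```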

<svgeesus> w3c/csswg-drafts#9074

ccameron: CSS property to let you set a dynamic range: standard to not go beyond SDR white, constrained-high which goes to 2x, or 'high' which uses full capability
… what should the default be? Some browsers go full blast on video, others on images
… Existing dynamic range media query isn't expressive enough

<svgeesus> w3c/csswg-drafts#9306

ccameron: "standard" or "high"

ccameron: don't want to serve PQ to some devices with not enough headroom, use SDR instead
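The pieces above can be sketched in CSS. The `dynamic-range` media query is specified in Media Queries Level 5; the `dynamic-range-limit` property and its values are taken from the CSSWG proposal linked above (w3c/csswg-drafts#9074) and may change before shipping.

```css
/* Existing media query: coarse "standard" vs "high" capability. */
@media (dynamic-range: high) {
  /* Styles for displays that can exceed SDR white. */
}

/* Proposed property (names per the CSSWG issue; subject to change): */
video {
  dynamic-range-limit: standard;         /* never exceed SDR white */
}
img.hero {
  dynamic-range-limit: constrained-high; /* allow up to ~2x SDR white */
}
canvas {
  dynamic-range-limit: high;             /* use the display's full capability */
}
```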

<dbaron> Agenda: https://www.w3.org/events/meetings/009a5b81-0459-4ae4-9b33-f88dd9a9d89f/ and w3c/tpac2023-breakouts#74

ccameron: want to propose exposing something more about HDR headroom to the web. Value in having 1, 2, 4, 8; it's one more bit of information, but it solves the problem I see. That's a proposal
… The dynamic range limit is something implementers are excited about
… Next topic is HDR canvas, using WebGL, WebGPU and 2D canvas
… 2D canvas and WebGL can use sRGB or display-p3, but are limited to 8-bit back buffers, so can't do HDR
… WebGPU allows a floating point back buffer, but the spec says the buffer will be composited as SDR
… anything beyond that is clamped
… I have a plan I've developed with people here

<svgeesus> https://github.com/ccameron-chromium/webgpu-hdr/blob/main/EXPLAINER.md

ccameron: First part is WebGPU extended range. When you draw pixels, you can use the full capability of the display
… This unlocks building apps brighter than SDR
… Step 2 is WebGL with Canvas float support
… 2D canvas has a proposal that's stuck on what getImageData should do
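A minimal sketch of the compositing behaviour described above: under SDR compositing, back-buffer values beyond 1.0 are clamped away even if the buffer is floating point; an extended-range mode would pass them through up to the display's headroom. The function names are illustrative, not the WebGPU API.

```javascript
// Illustrative: why a float back buffer alone isn't enough for HDR.
// Under SDR compositing, anything beyond 1.0 is clamped; extended-range
// compositing preserves values up to the display's current HDR headroom.
function compositeSDR(value) {
  return Math.min(Math.max(value, 0), 1);
}

function compositeExtended(value, headroom) {
  return Math.min(Math.max(value, 0), headroom);
}

console.log(compositeSDR(2.5));         // 1 (HDR highlight lost)
console.log(compositeExtended(2.5, 4)); // 2.5 (preserved on a 4x display)
```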

<svgeesus> KhronosGroup/WebGL#3222

ccameron: Part 3 is apply the colorMetadata to WebGL and 2D canvas as well

<svgeesus> whatwg/html#8708

ccameron: Part 4 is tone mapped canvases, then specify HDR content with a defined tone map algorithm
… With this you can create a canvas and render an HLG or PQ image same as via img tag
… In summary, standardise rendering of HDR content - this is moving forward
… Dynamic range limit is discussed in CSSWG, converging
… 2D canvas is stalled
… once dependencies are solved, we can then get into tone mapping HLG/PQ
… Not sure what to do with HDR CSS color

<svgeesus> https://drafts.csswg.org/css-color-hdr/


ccameron: What are the points of interest or contention?

Ada: For WebXR you render to a different display with possibly different color space, so the canvas would need to be configured with the colorspace of the device
… Would this work or cause problems in this model?

ccameron: The canvases should all work as offscreen canvases, so if you can get info about the screen you could use it

Ada: We're discussing in the Immersive Web WG, but we're not HDR experts

ccameron: You'd need to know what kind of screen you're plugging into, media queries, don't know about multiscreen APIs

ChrisL: You mentioned a tone mapping thing

<svgeesus> ISO 22028-5:2023

ccameron: The main issue is it's for images, used in non-HDR contexts. We need a standard way of ending up with the same result for how images look
… Take available metadata, use a simple/predictable curve. I'm proposing a Reinhard curve; there are other options. No one curve is best for all content; it's a fool's errand to try
… For images you can include an ICC profile with defined mapping to SDR
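The Reinhard curve mentioned above is a simple, predictable tone map. A sketch of the basic form, mapping unbounded linear values into 0..1 (the ISO recommendation's exact parameterisation may differ):

```javascript
// Basic Reinhard tone mapping: x / (1 + x) maps [0, inf) into [0, 1),
// leaving dark values nearly untouched and rolling off highlights smoothly.
function reinhard(x) {
  return x / (1 + x);
}

console.log(reinhard(0));   // 0
console.log(reinhard(1));   // 0.5
console.log(reinhard(100)); // ~0.99 (bright highlights compress rather than clip)
```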

dbaron: one thought about HDR and CSS, it's not clear to me how much it matters where you do the conversion
… There's a point where you map the HDR thing to what the device's HDR headroom is
… If you're going to do HDR in CSS, is it desirable to control at what level in the DOM tree it happens, e.g., the whole compositing pipeline, different performance characteristics
… Could have a CSS property to identify a container for the conversion

ccameron: That's a reasonable idea, create a container to tell the OS to map to the display capabilities
… I like the idea of having CSS all assumed to be SDR, and opt-in to HDR
… Didn't have so many use cases for this, so no proposals, but I think that makes a lot of sense
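dbaron's container idea above could look something like the following. This is hypothetical syntax only; no such property exists or has been proposed, and the name is invented purely for illustration.

```css
/* Hypothetical, invented syntax: mark a subtree as the container where
   HDR-to-display mapping happens; everything outside stays SDR. */
.hdr-region {
  hdr-mapping: container; /* invented name, not a real or proposed property */
}
```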

mfoltz: You talked about addressing HDR, encoded vs dynamic media, but there are ways to go back and forth, capture from canvas etc
… How to maintain fidelity of HDR rendering with capture APIs?

ccameron: Video generally has to be HLG or PQ, no other options
… Need to get the maths sorted out. If you have a canvas and you're pulling in HDR video, you want to be able to choose between tonemapping to SDR or using full HDR
… and then when you take an element and go to convert it to SDR, tone map in the same way, or somehow specify you want to capture it as HDR with conversion to HLG or PQ
… with the extended linear space, I'd like it to be defined so it's round trippable, that's about defining the math
… I think canvas will be HLG or PQ, not the extended space

myles: on opt-in for CSS, could opt-in apply to canvas and img and everything else?

ccameron: my sense is make all video full HDR by default, and do the same for images
… canvas is opt-in
… Hard for me to say; it could be assume HDR unless said otherwise, but could go the other way

ChrisL: What problem are you seeing?

ccameron: Don't think we've defined how color 2,2,2 should be twice as bright as sRGB linear 1,1,1

ChrisL: You can specify the color now, but what happens is not defined

ccameron: It means changing the buffer from implementation point of view

ChrisL: Might make sense, effectively you've created an offscreen context

Timo: If the display is 5x or 10x more, if you change the relative value of the SDR light, do you ??
… could end up with signal representations at a few thousand nits

ccameron: Think that's OK. Rephrasing: if I darken the monitor, you could do 16x brightness; if you can do 50x brightness with 203 nits mapped to 1.0, that's now a 10,000 nit display
… Don't think it's a problem, HLG and PQ is mapped to that relative space
… and the max HLG on the display is always where 5x white is
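The "5x white" figure above follows from HLG's nominal 1000-nit peak relative to a 203-nit reference white; both numbers are conventions (BT.2100 and BT.2408 respectively) rather than anything the proposal fixes.

```javascript
// HLG nominal peak (BT.2100 reference display) over reference white (BT.2408):
const hlgPeakNits = 1000;
const referenceWhiteNits = 203;
console.log(hlgPeakNits / referenceWhiteNits); // ~4.93, i.e. roughly 5x white
```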

Timo: It's not wrong, but a different way of looking at the problem. You need a good perceptual colour volume mapper to avoid messing up the image

ccameron: The relative space is in the reference environment, not specifying colorimetry
… I expect monitors, as they change brightness, to adjust so things don't change how they look, but that's outside the remit of the UA

Timo: I'm concerned about making that someone else's problem

Nigel: I'm wondering about situations like overlaying semi-transparent elements in rgb 0..1 range colours. There could be surprises for the page author ... what do you think the biggest surprise may be?

<nigel> ccameron: Overlaying translucent elements is going to be one of the weirder behaviours.

Myles: Key is if all browsers implement this, the author will change their site

Nigel: Depending on the monitor they use

<nigel> annevk: That's true, and it could change over time.

ChrisL: Headroom, making display brighter, what about overall OOTF?
… In the context of display brightness changes

ccameron: I see it as responsibility of the display and OS
… Right now, OSes try to do that right

Anne: Action items from this meeting?

ccameron: goal was to see where there's disagreement. Any general concerns on the approach?

Timo: can work if we get the mapping part right

myles: You've deferred all the hard questions. Immediate next steps seem right

ccameron: We're not painting ourselves into a corner by doing that now

myles: I'm less sure on the specific media queries
… Not impossible to solve in a privacy preserving way

ccameron: I think we can make progress on all three of the green issues today

Eric: I think exposing another bit of information about the user will be a hard sell

<dbaron> (They're also bits that might change over time which makes them a *little* less problematic.)

Nigel: Is the alternative to keep colors in a standard range, carry other information, and let the OS figure out what to do?

ccameron: could be, but some apps don't want that, e.g., games renderers
… alternative is let the user agent ramp it down

Anne: Or ask the user

myles: Ask once and persist it, but it is more friction


Minutes manually created (not a transcript), formatted by scribe.perl version 221 (Fri Jul 21 14:01:30 2023 UTC).



Maybe present: ??, Anne, ccameron, ChrisL, Eric, mfoltz, myles, Nigel, NigelM, Timo

All speakers: ??, Ada, Anne, ccameron, ChrisL, dbaron, Eric, mfoltz, myles, Nigel, NigelM, Timo

Active on IRC: ada, annevk, cpn, dbaron, fserb, Ian, mfoltzgoogle, myles, nigel, Riju, svgeesus, Xiaohan