12:08pq: swick[m], as it's Friday afternoon here already, I'll get back to the white point discussion next week.
12:31swick[m]: I love that st2086 is even more vague than me
12:32JEEB: xD
12:32JEEB: I think I actually never looked into the definitions directly there
12:33JEEB: since the SEI contents are in H.274 and the over-the-wire stuff is in CTA-861-H
12:33swick[m]: "these parameters describe to color volume of the mastering display" EOF
12:34pq: swick[m], one thing on my mind is to check whether e.g. BT.2100 says anything about mastering, and maybe look if the ITU-T page for it has some additional notes. Or just search ITU for "mastering".
12:35JEEB: yea, so basically it's a display hint at how the mastering person saw it. so for example if actual content has like up to 4000 nits
12:35JEEB: and the mastering display only was able to show up until 1000 nits
12:35swick[m]: yeah, so far so good
12:35JEEB: then if you want to show it how it was visible to the mastering person, you can ignore >1000 nits
12:35JEEB: same for gamut
12:35pq: JEEB, this discussion started from https://gitlab.freedesktop.org/swick/wayland-protocols/-/merge_requests/26#note_1774283
12:36JEEB: in other words, if you have a UHD BD of Mad Max Fury Road with ~9000 nits max brightness in one scene
12:36JEEB: you can safely ignore it
12:36JEEB: but yes it's kind of dumb, it doesn't describe the actual content
12:36JEEB: but rather how it was visible to the mastering person
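A minimal numpy sketch of that clamping idea, assuming the decoded values are already linear light in cd/m²; the function name and sample values are made up for illustration:

    import numpy as np

    def clip_to_mastering_peak(linear_nits, mastering_peak_nits):
        # Anything brighter than the mastering display peak was never visible to
        # the person mastering the content, so it can be ignored (clamped away)
        # before mapping to the actual output display.
        return np.minimum(linear_nits, mastering_peak_nits)

    scene = np.array([50.0, 950.0, 4000.0])       # hypothetical pixel luminances
    print(clip_to_mastering_peak(scene, 1000.0))  # -> [  50.  950. 1000.]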
12:37JEEB: ah, white points
12:37pq: JEEB, the question is: when in production the movie is displayed on a mastering display, what kind of color mapping is happening from the ??? to the display?
12:37swick[m]: JEEB: we already don't completely understand how it should be used when you ignore the luminance stuff
12:37swick[m]: it's always white points, isn't it? :)
12:38swick[m]: pq: jup, I think we need to understand the mastering process to really get any further here
12:38pq: IOW, do we simply compute the colorimetry of the mastering display, given its primaries and white point, and assume that same colorimetry applied to the encoded video gives the color volume of interest, or does something happen in between?
12:40pq: I think the same question applies to the primaries as to the white point; they are all necessary parts of describing a color space.
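For reference, the colorimetry in question is usually derived from the signalled primaries and white point with the standard normalised-primary-matrix construction; a minimal numpy sketch, with BT.2020 primaries and a D65 white point picked purely as example values:

    import numpy as np

    def xy_to_XYZ(x, y, Y=1.0):
        # CIE xyY -> XYZ
        return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

    def rgb_to_xyz_matrix(primaries, white):
        # primaries: [(xr, yr), (xg, yg), (xb, yb)], white: (xw, yw).
        # Columns are the primaries' XYZ at unit luminance, scaled so that
        # RGB = (1, 1, 1) maps exactly onto the white point.
        P = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries])
        S = np.linalg.solve(P, xy_to_XYZ(*white))
        return P * S  # scale each column by its contribution to white

    bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
    d65 = (0.3127, 0.3290)
    print(rgb_to_xyz_matrix(bt2020, d65))  # linear RGB -> XYZ, white at Y = 1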
12:41JEEB: primaries describe the gamut limits
12:41JEEB: IIRC
12:41JEEB: so in that sense it is clearer for me than white point adjustment
12:42pq: if you do white point adjustment into the mastering display, why stop there and not also do something with the primaries?
12:43pq: or, if adjusting to the mastering display primaries is a no-no, then why is adjusting white point a good thing?
12:43JEEB: I think we're stepping a bit too far into it. I think these are just meant to be limiting values when doing gamut/tone mapping to the actual output
12:44JEEB: you don't have to do decoded YCbCr->ref display->output display
12:44pq: the only thing we want to know is which part of the encoded color space is meaningful, i.e. which part of it was displayed on the mastering display
12:44JEEB: yes
12:45pq: right, so how do you compute the volume in the encoded color space from the mastering display characteristics?
12:45JEEB: haasn is doing it already so clearly it is possible
12:46pq: of course, but what is the *right* way to do it? :-D
12:46JEEB: I just fed him specs so I'd have to read more about it to properly respond :D
12:46pq: haasn, comment?
12:47haasn: not sure what the question is
12:47pq: I assumed a direct colorimetry conversion from one trichromatic system to another, and swick[m] thinks it could be something else.
12:47haasn: primaries conversions commute
12:48haasn: encoding -> mastering -> display is the same as encoding -> display
12:48JEEB: yea
12:48haasn: unless you add some nonlinear step (like clipping) in between
12:48JEEB: I think they are just specifically asking how complex the limiting of gamut and tone mapping should be
12:48haasn: but we had this discussion before and I remember advising not to bother clipping in the mastering display volume
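A quick numpy check of the commuting claim: composing pure 3x3 conversions through a mastering space is exactly the same as converting directly, and only a nonlinear step such as clipping in between makes the route matter. The npm helper repeats the derivation sketched earlier and the colorspaces are arbitrary examples:

    import numpy as np

    def npm(prim, white):
        # RGB -> XYZ from xy primaries and white point (see earlier sketch)
        col = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
        P = np.column_stack([col(x, y) for x, y in prim])
        return P * np.linalg.solve(P, col(*white))

    bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
    bt709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
    p3     = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
    d65 = (0.3127, 0.3290)
    enc, mast, disp = npm(bt2020, d65), npm(p3, d65), npm(bt709, d65)

    direct = np.linalg.inv(disp) @ enc                            # encoding -> display
    via = np.linalg.inv(disp) @ mast @ np.linalg.inv(mast) @ enc  # via the mastering space
    print(np.allclose(direct, via))                               # True: the matrices compose away

    rgb = np.array([0.9, 0.1, 0.8])                               # some encoded colour
    clipped = np.clip(np.linalg.inv(mast) @ enc @ rgb, 0, 1)      # nonlinear step in between
    print(np.allclose(direct @ rgb, np.linalg.inv(disp) @ mast @ clipped))  # False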
12:48swick[m]: the mastering display color volume chromatically adapted to the encoded color space white point
12:48pq: haasn, but do you do chromatic adaptation when you map the mastering display color volume into the encoding color volume to find out the mastering display limits, or not?
12:49haasn: chromatic adaptation also commutes so going from encoding white point to display white point is the same as if you go via mastering white point first
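The same composition property holds for a von Kries style chromatic adaptation, sketched here with the Bradford matrix; the three white points are arbitrary example values:

    import numpy as np

    BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                         [-0.7502,  1.7135,  0.0367],
                         [ 0.0389, -0.0685,  1.0296]])

    def xy_to_XYZ(x, y):
        return np.array([x / y, 1.0, (1 - x - y) / y])

    def cat(src_xy, dst_xy):
        # Bradford chromatic adaptation matrix from source white to destination white
        lms_src = BRADFORD @ xy_to_XYZ(*src_xy)
        lms_dst = BRADFORD @ xy_to_XYZ(*dst_xy)
        return np.linalg.inv(BRADFORD) @ np.diag(lms_dst / lms_src) @ BRADFORD

    enc_wp, mast_wp, disp_wp = (0.3127, 0.3290), (0.3140, 0.3510), (0.3457, 0.3585)
    direct = cat(enc_wp, disp_wp)
    via_mastering = cat(mast_wp, disp_wp) @ cat(enc_wp, mast_wp)
    print(np.allclose(direct, via_mastering))  # True: the adaptations compose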
12:49haasn: oh
12:49swick[m]: :)
12:49pq: or actually does the production process do the inverse of that when displaying the product on a mastering display?
12:49haasn: no, you don't need to do chromatic adaptation on the mastering display primaries to find the mastering display color volume
12:49JEEB: ok, so that is what I assumed :)
12:49haasn: well, hmm
12:50swick[m]: why not?
12:50pq: haasn, would you happen to remember any links/docs related to that?
12:50haasn: because white point adaptation doesn't touch the primaries, I think
12:50haasn: by design
12:50haasn: so the gamut volume should stay the same during white point adaptation
12:50haasn: but I'm not 100% on this
12:50swick[m]: that doesn't sound right...
12:51pq: that makes sense to me
12:51swick[m]: if you move the white point, the color volume must change
12:51pq: no, white point is just the balance between the primaries' power
12:52pq: each primary is still ranging from 0 to 100%, defining the extremes of the color volume
12:52haasn: hmm
12:52haasn: no, I think swick[m] is right, it's just a 3x3 matrix in XYZ space
12:52pq: or wait...
12:52haasn: so it can only be linear, moving the white point while keeping the primaries the same is impossible
12:53pq: yeah, that makes sense
12:54pq: white point defines what the 100%, 100%, 100% color is, so naturally that changes if the white point changes, and it must be an extreme point in the color volume
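A small numpy check of that conclusion: keeping the primaries' chromaticities and changing only the white point changes the RGB-to-XYZ matrix, so the image of the RGB unit cube (the color volume) changes too, with (0,0,0) the only corner that stays put. npm is again the derivation sketched earlier and the white points are just examples:

    import numpy as np
    from itertools import product

    def npm(prim, white):
        # RGB -> XYZ from xy primaries and white point (see earlier sketch)
        col = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
        P = np.column_stack([col(x, y) for x, y in prim])
        return P * np.linalg.solve(P, col(*white))

    bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
    M_d65 = npm(bt2020, (0.3127, 0.3290))   # same primaries ...
    M_d50 = npm(bt2020, (0.3457, 0.3585))   # ... different white point

    corners = np.array(list(product([0.0, 1.0], repeat=3)))   # the 8 corners of the cube
    print(np.allclose(corners @ M_d65.T, corners @ M_d50.T))  # False: the volumes differ
    print(M_d65 @ [1, 1, 1], M_d50 @ [1, 1, 1])               # (1,1,1) lands on each white point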
12:54haasn: then I don't really know the answer to the question
12:54haasn: but I'm also not sure why it matters
12:54haasn: what are you doing with information about the mastering display volume (in encoded space)?
12:54haasn: is it to guide tone-mapping?
12:54pq: haasn, isn't that volume used to guide gamut and tone mapping?
12:54haasn: or gamut mapping
12:55haasn: in libplacebo currently no
12:55haasn: pq: you know, I would conservatively guess that mastering is probably done without adapting the white point
12:55haasn: realize in practice this doesn't matter because 99% of content is mastered on D65 displays with D65 encoding
12:55haasn: the only exceptions are, like, what, digital cinema XYZ raws?
12:56haasn: of course, that leaves the question open of why the mastering display white point is even signaled
12:56pq: right now we want to know what the mastering display primaries and white point mean and how they are used, so that we can put them in the protocol and figure out if the concept is compatible with what we want to express for scRGB as well, i.e. the relevant part of the container color volume.
12:57haasn: we don't use the mastering display white point for anything and I'm not sure what it would even be useful for, tbh
12:57haasn: all of this metadata was designed-by-committee in an ad-hoc "let's just add metadata that might be useful later" fashion
12:57pq: the mastering display white point tells us what color is the most luminous color possible on the mastering display.
12:58haasn: I guess it's useful if you want to clip the input to the mastering display gamut before clipping it to the target display gamut?
12:58haasn: to simulate that 1% distortion
12:59pq: that's not in my mind, no
13:00pq: if I want to optimize my gamut and tone mapping, surely knowing the most luminous color that could ever matter will help me?
13:00pq: and I can map that color to the most luminous color of my actual display, or something similar
13:01haasn: pq: to be clear, when I say "mastering display white point" I don't mean the luminance (brightness) but the CIE x/y chromaticity coordinates
13:01haasn: they're signalled separately
13:01pq: or to make sure that that color is relatively presentable on my actual display after my mapping
13:01haasn: the luminance information is of course very useful
13:01pq: yes, I do mean the chromaticity coordinates, not nits
13:02pq: maybe the peak luminance on the mastering display is red, and the peak luminance on my actual display is green, to give an extremely exaggerated example
13:03pq: or I should say pink and light greenish
13:03pq: surely that matters somehow to what the optimal mapping to the actual display is
13:07pq: maybe a mastering display is able to reach up to 1000 nits in a slightly pinkish tone as the highest possible peak nits
13:07pq: and maybe an actual display reaches the same 1000 nits peak, but in a slightly greenish tone
13:08pq: maybe the actual display can only do 800 nits of the same pinkish tone that the mastering display did at 1000 nits
13:09pq: swick[m], am I making sense? :-D
13:09swick[m]: yes, very much so
13:09pq: cool
13:10pq: The thing is, there is only one color (as in one coordinate pair in CIE 1931 xy) that can reach the panel peak nits, and that is when all sub-pixel elements are at 100% intensity each.
13:12pq: if you want to shift the chromaticity, then luminance must come down, because the only way to do that is to reduce one or two sub-pixel intensities.
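A numpy sketch of that constraint: mixing a target chromaticity at unit luminance and then scaling until the largest channel hits 100% gives the highest luminance a hypothetical panel can reach at that chromaticity, and only the native white point gets the full panel peak. npm is the derivation sketched earlier; the panel and the target chromaticities are example values:

    import numpy as np

    def npm(prim, white):
        # RGB -> XYZ from xy primaries and white point (see earlier sketch)
        col = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
        P = np.column_stack([col(x, y) for x, y in prim])
        return P * np.linalg.solve(P, col(*white))

    def peak_nits_at(xy, prim, white, panel_peak_nits):
        # RGB mixture producing the target chromaticity at Y = 1, then scale it
        # until the largest channel reaches 100%; that scale is the fraction of
        # the panel peak reachable at this chromaticity.
        xyz = np.array([xy[0] / xy[1], 1.0, (1 - xy[0] - xy[1]) / xy[1]])
        rgb = np.linalg.solve(npm(prim, white), xyz)
        if np.any(rgb < 0):
            raise ValueError("chromaticity outside the panel gamut")
        return panel_peak_nits / rgb.max()

    bt2020, d65 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)], (0.3127, 0.3290)
    print(peak_nits_at(d65, bt2020, d65, 1000.0))           # ~1000: the native white point
    print(peak_nits_at((0.29, 0.29), bt2020, d65, 1000.0))  # lower: a bluish off-white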
13:13swick[m]: practically, what are we going to do about this? My MR currently uses the wp of the encoding color space so it's clear that the primaries of the color volume are relative to that. In that sense we don't have the ambiguity the mastering display concept has.
13:14pq: swick[m], but standard HDR metadata has a separate white point, and we are not able to carry that.
13:14swick[m]: exactly...
13:14swick[m]: that would mean we force them to do chromatic adaptation
13:15pq: swick[m], I would just add the white point there. We can think more about what to do with it. And assume no chromatic adaptation as the simplest possible approach.
13:15swick[m]: meh
13:15swick[m]: we could also just reuse the mastering display concept and then let compositors figure out how to use it correctly
13:15swick[m]: same thing I guess
13:15pq: it's just a change of basis from linear algebra
13:16pq: yeah
13:16pq: I suppose there is no problem applying the mastering display concept to scRGB either?
13:17JEEB: yea it's the possible composition space so it wouldn't yet be affected anyways
13:17JEEB: then after compositing stuff you convert to output(s)
13:17JEEB: and at that point you might or might not want to apply it. afaik windows just ignores it and passes onto displays
13:17pq: why convert twice, when you can blend in the optical output space?
13:18pq: passes what to displays?
13:18JEEB: the metadata
13:19JEEB: in case of HDR output
13:19pq: but we have multiple windows each with different metadata on the same monitor
13:19JEEB: I think windows passes through when application == output, and otherwise goes through scRGB, would be my guesstimate. scRGB is just convenient since you map HDR at 203 nits to graphics white (1.0) and with SDR applications you just map 1.0 to 1.0
13:19JEEB: and same for output
13:20JEEB: pq: I think windows just goes lalala and either doesn't output metadata, or picks the widest on screen
13:21pq: heh
13:21JEEB: I think windows might not have output any metadata unless the application is fullscreen
13:21JEEB: need to check my mpv issues :P
13:22pq: then you also need to map scRGB to the output space, which may be more complicated than KMS can express, so you take the hit of a second pass on the GPU - at least Weston will try to avoid that by blending in the optical output space so that the after-blending operations are few and simple.
13:36pq: swick[m], btw. as Vitaly has been looking into tone mapping, maybe he has seen something about using the mastering display characteristics?
14:31haasn: 14:12 <pq> if you want to shift the chromaticity, then luminance must come down, because the only way to do that is to reduce one or two sub-pixel intensities. -> okay, it makes sense
14:32haasn: or paraphrased, if you know that mastering display white point = very blue, mastering display peak nits = 1000
14:32haasn: then the actual content might contain blue at 1000 nits
14:32haasn: but your display can only show blue at 200 nits
14:32haasn: even though your display can show white at 1000 nits
14:32haasn: so you might naively think "oh great, I don't need to tone-map", but it would be wrong
15:20pq: haasn, exactly :-)
15:22pq: and that matches your description of a unit cube mapped into XYZ or another space through a 3x3 matrix
15:24pq: white point affects where the peak goes, and I think it also affects where all the other corners go except the 0,0,0
15:25pq: white point even affects where the corners of the primaries go - not their chromaticity but their luminance
15:26pq: hmm, an interactive app would be a nice visualization for this
15:31swick[m]: yeah it's kind of weird that there is no interactive tool like that
15:32swick[m]: the best we have is python scripts which output some image
15:35pq: shouldn't be too hard to write that python to make it interactive with the help of colour, numpy and matplotlib.
15:35pq: but hard enough that I'd need to put a couple days aside to do that :-)
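For what it's worth, a bare-bones matplotlib sketch of two gamut triangles and a white point on the CIE xy plane is only a few lines; colour's plotting helpers could add the spectral locus behind it, and making it interactive would mostly mean swapping the hard-coded values for sliders. Everything here is an example, not an existing tool:

    import matplotlib.pyplot as plt

    def plot_gamut(ax, primaries, white, label):
        # Close the triangle of primaries and mark the white point with an x.
        xs, ys = zip(*(primaries + [primaries[0]]))
        ax.plot(xs, ys, label=label)
        ax.plot(*white, marker='x')

    bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
    bt709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
    d65 = (0.3127, 0.3290)

    fig, ax = plt.subplots()
    plot_gamut(ax, bt2020, d65, "BT.2020")
    plot_gamut(ax, bt709, d65, "BT.709")
    ax.set_xlabel("CIE x"); ax.set_ylabel("CIE y"); ax.legend()
    plt.show()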
16:45haasn: there's a tool for visualizing icc gamuts in argyllCMS
16:46haasn: even comes with a fancy WebGL HTML5 thing