09:10d_ed: Could I ask for some people to take a look at: https://gitlab.freedesktop.org/wayland/wayland/-/merge_requests/283 it would be good to get some movement on it.
09:27pq: d_ed, sorry, that's too complicated a topic for me to dig into this year, I think.
09:30d_ed: I'm happy to spend time going through anything. It's not too complicated after that
09:32pq: Sorry, I don't think I can invest the time for that.
09:34pq: things can change, but not any time I can see yet
13:57swick[m]: pq: I have the feeling you're overcomplicating chromatic adaptation for color space conversions
13:58pq: I don't know what it is.
13:59swick[m]: the appearance of a specific chromaticity is different when adapted to different white points
14:00swick[m]: so when you do a color space conversion where the color spaces have different white points you chromatically adapt the colorimetry to the new white point (with relative colorimetric and perceptual intents)
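(For reference, the adaptation swick[m] describes here is usually done with a von Kries-style transform such as Bradford; a minimal Python/numpy sketch, assuming the Bradford CAT and using D65/D50 purely as illustrative white points, not anything taken from the log:)

```python
import numpy as np

# Bradford cone-response matrix (the CAT commonly used in ICC-style workflows).
BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def chromatic_adaptation(src_white_XYZ, dst_white_XYZ):
    """3x3 matrix that re-expresses XYZ colorimetry relative to a new
    adapted white point (von Kries scaling in Bradford cone space)."""
    scale = np.diag((BRADFORD @ dst_white_XYZ) / (BRADFORD @ src_white_XYZ))
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD

# Illustrative white points (XYZ with Y = 1): adapting D65 material to D50.
D65 = np.array([0.95047, 1.0, 1.08883])
D50 = np.array([0.96422, 1.0, 0.82521])
adapted_xyz = chromatic_adaptation(D65, D50) @ np.array([0.3, 0.4, 0.5])
```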
14:00pq: but we are not talking about *those* rendering intents
14:01pq: why would we?
14:01swick[m]: all video signals always implicitly assume a perceptual intent
14:02pq: really? even for a mastering display?
14:02swick[m]: that's what I'm not entirely sure about
14:03swick[m]: someone must have specified this somewhere...
14:04pq: perceptual intent is arbitrary, and I cannot imagine a mastering display would do random arbitrary things to the signal
14:04swick[m]: I mean, to me it makes sense for it to be relative colorimetric
14:04swick[m]: because otherwise you lose information...
14:05swick[m]: if you assume the white point of the mastering display is different from the encoding white point and you do not chromatically adapt, then the appearance on a reference monitor will be different from that on the mastering display
14:06swick[m]: (assuming the viewer is mostly adapted to both the mastering and reference display each time)
14:07pq: I don't understand reference monitor.
14:08swick[m]: in this case it's only relevant that the reference monitor has the white point of the color space
14:08pq: so let's talk only about the colorspace and not a reference monitor?
14:09swick[m]: sure, but the argument I made was about appearance and for that you need some media
14:10swick[m]: let's try to make this a bit simpler...
14:11pq: it sounds like you are talking about adapted white, but then you say you are not talking about adapted white.
14:11pq: adapted white is connected to appearance
14:12pq: color space white point is not about appearance
14:12swick[m]: if you have two monitors, each with a different white point, and when the viewer is viewing one of the monitors and is adapted to the monitor white point, then the same colorimetry will look different on the different monitors, right?
14:13swick[m]: and yes, this is about adapted white
14:13pq: yeah, and my big gitlab comment was all about adapted white
14:14swick[m]: now you describe the monitors with a color space. the color spaces will have different white points.
14:14swick[m]: pick a color, encode it in one of the color spaces. encode the same colorimetry in the other color space. do the colors appear the same, when viewed on the monitors?
14:18pq: if the environment is the same, and both monitors can actually display that color, then yes, I think they do, because they both emit the same colorimetry.
14:19swick[m]: even if the viewer is adapted to the monitor?
14:19pq: adapted to *what*?
14:20pq: the viewer does not magically see the monitor white point, something needs to give it away
14:20pq: I'm assuming the chosen color is the only thing showing on the monitors
14:20swick[m]: ah... in that case you're probably right
14:21swick[m]: but let's assume there is/was enough content on there for the viewer to be adapted to the monitor white point
14:21pq: if you had monitor-white background on each monitor, then I'd say they do not appear the same, because the monitor-white is not the same and it is being displayed
14:21pq: however, where would that monitor-white content come from?
14:22pq: if you take an image instead of a single color, and do the same thing, and not show anything else on each monitor, then again there is nothing giving away the actual monitor white point
14:23swick[m]: true, but if you have an average scene then our HVS will adapt to the white point of the monitor to some degree
14:24swick[m]: I guess that's just implicitly assumed
14:24pq: no, because the whole image has gone through the same conversion to the monitor's trichromatic system
14:25swick[m]: mh, I see
14:25pq: if you had literally some other content on the monitor that did not go through that conversion, then yes, the adaptation would be, well, ruined
14:25swick[m]: yeah, I get it now
14:26pq: cool - are we any closer to an answer? :-)
14:27pq: btw. this is going to hurt floating window color appearance, as you always have something else on screen too :-p
14:28pq: but that's not our problem, and we probably shouldn't even try to fix it, because that would imply static screen content shifting in color when just one window is animating, which would probably be even more disturbing
14:29pq: and it certainly won't be on any mastering display
14:30swick[m]: I think there are two good reasons why chromatically adapting still makes sense
14:30swick[m]: 1. as you said, when you combine content they now have different white points and the adaptation will be screwed
14:31swick[m]: 2. when you chromatically adapt, the color volumes will match better and you'll be able to reproduce more colors in the other color space
14:32swick[m]: so I think you're technically correct but everyone just chromatically adapts when doing color space conversions, except for when you want something very specific like softproofing
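(An illustrative way to see swick[m]'s point 2: without adapting to the destination white, even the source white point itself can land outside the destination's [0, 1] range. A rough numpy sketch, assuming sRGB/D65 as the destination and D50 as the source white; the matrices are the standard published ones, not anything from the log:)

```python
import numpy as np

# Standard linear-sRGB matrix: XYZ (D65 white) -> linear RGB.
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

# Bradford cone-response matrix, for ICC-style chromatic adaptation.
BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

D50 = np.array([0.96422, 1.0, 0.82521])  # XYZ of D50 white (Y = 1)
D65 = np.array([0.95047, 1.0, 1.08883])  # XYZ of D65 white (Y = 1)

# Bradford chromatic adaptation matrix, D50 -> D65.
cat = np.linalg.inv(BRADFORD) @ np.diag((BRADFORD @ D65) / (BRADFORD @ D50)) @ BRADFORD

print(XYZ_TO_SRGB @ D50)        # ~ (1.18, 0.98, 0.72): unadapted source white already clips
print(XYZ_TO_SRGB @ cat @ D50)  # ~ (1.00, 1.00, 1.00): adapted white stays in range
```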
15:14pq: swick[m], in what context are we talking about now? Mastering displays specifically, or in general?
15:14swick[m]: in general
15:14pq: yes, I agree
15:15swick[m]: hence my question, do we chromatically adapt the mastering display volume to the encoding volume?
15:15swick[m]: I think so, but I'm not entirely sure
15:16pq: I'd say no, that's what the production should be doing already
15:16swick[m]: but it literally describes the mastering display, not a chromatically adapted mastering display
15:17pq: exactly
15:18pq: asking from another angle: if the mastering display did chromatic adaptation, why would we need HDR static metadata to describe the mastering display?
15:19pq: why does HDR static metadata carry a white point explicitly?
15:20pq: JoshuaAshton, do you know? :-)
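(The HDR static metadata pq refers to is, as far as I understand, the SMPTE ST 2086 mastering display colour volume carried alongside MaxCLL/MaxFALL in CTA-861; a rough sketch of its payload, with illustrative field names rather than any real API:)

```python
from dataclasses import dataclass

@dataclass
class MasteringDisplayColorVolume:
    """Sketch of SMPTE ST 2086 static metadata; field names are illustrative."""
    # CIE 1931 xy chromaticities of the mastering display primaries
    red_xy: tuple[float, float]
    green_xy: tuple[float, float]
    blue_xy: tuple[float, float]
    # The white point is carried explicitly: the metadata describes the
    # mastering display itself, not one pre-adapted to the encoding white.
    white_xy: tuple[float, float]
    # Mastering display luminance range, in cd/m^2
    max_luminance: float
    min_luminance: float
```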
15:20swick[m]: well, I'm not saying the display does any chromatic adaptation, I'm saying the mastering process does, when converting between the mastering display and the encoding color space
15:21pq: that's indistinguishable from the mastering display doing it
15:22swick[m]: the way I see it, they need the white point to describe the mastering display color space, but they could have also chosen to chromatically adapt the primaries of the mastering display white point and then only store that (but naming this is harder)
15:23pq: primaries of the white point?
15:23swick[m]: ehh, chromatically adapt the primaries of the mastering display to the encoding color space white point
15:23pq: but then it would not describe the colorimetry of the mastering display color volume anymore?
15:24swick[m]: yes, only the volume in the encoding color space which is relevant
15:24pq: I see
15:25pq: but then, what do you use as the "unit cube" ranges to actually find the volume?
15:26swick[m]: no idea. that's why I'm starting to really like the idea of the mastering display color space
15:26pq: which idea? just switch the name from color volume to colorspace?
15:27swick[m]: kind of...
15:27pq: what about scRGB?
15:28swick[m]: same deal, right? if you choose a white point that matches sRGB then it's exactly like the color volume concept right now...
15:29pq: scRGB kinda does not have the unit cube though
15:30pq: I mean, nothing forces one to pick sRGB white point for scRGB content colorspace
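(To make pq's point concrete: scRGB keeps the sRGB primaries and D65 white but allows values outside [0, 1], so a saturated BT.2020 color encodes with components outside the unit cube. A rough numpy sketch; the primaries and the derivation are the standard ones, nothing here comes from the log:)

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    """Build the linear RGB -> XYZ matrix from xy primaries and white point."""
    def xy_to_XYZ(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    cols = np.column_stack([xy_to_XYZ(*p) for p in primaries_xy])
    white = xy_to_XYZ(*white_xy)
    scale = np.linalg.solve(cols, white)  # so that RGB = (1, 1, 1) maps to the white point
    return cols * scale

D65 = (0.3127, 0.3290)
SRGB_PRIM   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020_PRIM = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

bt2020_to_xyz = rgb_to_xyz_matrix(BT2020_PRIM, D65)
xyz_to_scrgb  = np.linalg.inv(rgb_to_xyz_matrix(SRGB_PRIM, D65))

# A pure BT.2020 red expressed in scRGB: components fall outside [0, 1],
# which is why scRGB has no natural "unit cube" to define a volume with.
print(xyz_to_scrgb @ bt2020_to_xyz @ np.array([1.0, 0.0, 0.0]))
```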
15:31pq: I hope Troy can shed some light here.
15:31pq: till tomorrow \o.
15:35swick[m]: o/
16:24wlb: wayland-protocols Merge request !193 opened by Simon Ser (emersion) Add blender protocol https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/193 [In 30 day discussion period], [Needs acks], [Needs implementations], [Needs review], [New Protocol]