07:15wlb: weston Merge request !1225 opened by Marius Vlad (mvlad) tests/drm-writeback-screenshot-test: Grab the first available output https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1225 [Testing]
09:11MrCooper: zamundaaa[m]: I assume kwin uses a high priority EGL context via EGL_IMG_context_priority ?
11:32wlb: weston/main: Marius Vlad * CONTRIBUTING.md: Inform users that they'd need to ask for perms https://gitlab.freedesktop.org/wayland/weston/commit/837ebaf487c0 CONTRIBUTING.md
11:32wlb: weston/main: Marius Vlad * CONTRIBUTING.md: Fix link for patchwork https://gitlab.freedesktop.org/wayland/weston/commit/af4fb2b9f6cd CONTRIBUTING.md
11:32wlb: weston Merge request !1221 merged \o/ (CONTRIBUTING.md: Inform users that they'd need to ask for perms https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1221)
11:47wlb: weston Merge request !1199 merged \o/ (clients/simple-dmabuf-feedback: Fix surface size https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1199)
11:47wlb: weston/main: Sebastian Wick * clients/simple-dmabuf-feedback: fix dangling pointers https://gitlab.freedesktop.org/wayland/weston/commit/57cba6afb47d clients/simple-dmabuf-feedback.c
11:47wlb: weston/main: Sebastian Wick * clients/simple-dmabuf-feedback: fullscreen surface from the start https://gitlab.freedesktop.org/wayland/weston/commit/62d7a46ba249 clients/simple-dmabuf-feedback.c
11:47wlb: weston/main: Sebastian Wick * clients/simple-dmabuf-feedback: create buffers on demand https://gitlab.freedesktop.org/wayland/weston/commit/6c27f0b87ccd clients/simple-dmabuf-feedback.c
11:47wlb: weston/main: Sebastian Wick * clients/simple-dmabuf-feedback: get buffer size from configure events https://gitlab.freedesktop.org/wayland/weston/commit/34400d7d1686 clients/simple-dmabuf-feedback.c
13:34wlb: wayland Merge request !283 closed (Introduce API for clients to handoff between compositors)
13:53wlb: weston/main: Robert Mader * pipewire-[backend|plugin]: Add timestamps to buffers https://gitlab.freedesktop.org/wayland/weston/commit/445ff6728b9d libweston/backend-pipewire/pipewire.c pipewire/pipewire-plugin.c
13:54wlb: weston Merge request !1224 merged \o/ (pipewire-[backend|plugin]: Add timestamps to buffers https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1224)
14:10emersion: pq, i'm reading up the 3D LUT stuff from Weston, what is the curve offset/scale for?
14:10emersion: e.g. the offset is 0.5f / lut_len
14:11pq: emersion, ah, that's because of how OpenGL uses texture coordinates.
14:12pq: the offset points to the middle of the first texel, because 0.0 would point to the left edge of the first texel.
14:12pq: the range from 0.0 to offset will only yield the first texel's value and not the interpolation we'd want.
14:13pq: so the usable texture coordinate range is from the middle of the first texel to the middle of the last texel
14:13emersion: hm, why do we want to interpolate even the first one?
14:13pq: that gives us the linear interpolation we want
14:14pq: because this is not texturing, this is LUT interpolation and it just works that way
14:14emersion: if i have a 3D LUT, i'd expect (0, 0, 0) to not need interpolation
14:14pq: exactly!
14:14pq: and it won't
14:15pq: but if you sample (0.001, 0.0, 0.0), then you do want a bit of interpolation in the first dimension
14:15emersion: but that'll sample in-between two texels as you said?
14:15emersion: since the offset is always added?
14:15pq: no, it samples the middle of the first texel
14:15emersion: oh.
14:15emersion: damn
14:15emersion: i see what you mean
14:15emersion: fun fun
14:15pq: if you go off the middle of a texel, then you mix in the neighbour texel(s)
14:16emersion: do you have some test ICC profiles btw?
14:16emersion: i've been using a random one from a random website and it doesn't work great, but not sure whether it's my fault or not
14:17pq: the weston test suite generates some on the fly, so that we also have their ground truth in precise form
14:18pq: IOW, we have a list of test case parameters, for each we generate an ICC profile, shoot what Weston produces with the profile, and then compute manually what the result should have been, and compare.
14:19pq: and for each test case, we have a few tests like opaque colors and a semi-transparent gradient in a sub-surface (compositor-specific results, yes)
14:20pq: tests/color-icc-output-test.c is all that, but it's also the most complex test code so far in weston, I think
14:22pq: I have never really trusted my eyes for testing the correctness.
14:22pq: at least in this case there is a defined correct answer...
14:24pq: We also test two different types of ICC profiles: matrix-shaper and CLUT. Particularly with CLUT, we introduce error already by simply creating the ICC profile. So we check that too.
14:25emersion: hm
14:25pq: emersion, colord installs a bunch of ICC files that you could experiment manually with. SwappedRedAndGreen.icc sounds fun. :-)
14:26emersion: so there's no easy way to generate a crappy and unreliable ICC profile from EDID?
14:26emersion: ahah
14:26daniels: pq: we need SwappedRedAndBlue.icc so people can get the big-endian experience
14:26emersion: i already have that even without that ICC profile :)
14:26emersion: https://l.sr.ht/U2Fx.png
14:26pq: if you pick a library like LittleCMS that can create ICC files, generating a crappy profile from EDID should be quite easy.
14:26emersion: supposedly this is color-corrected weston-simple-egl
14:27emersion: now i don't trust my eyes but…
14:27emersion: yeah, was mostly wondering about a ready-made CLI tool for that
14:27emersion: but yeah i'm using lcms like everybody
14:27pq: you could load Rec709.icc from colord, and that should probably be a no-op/identity, that's a good initial check
14:28emersion: thanks for the colord hint!
14:28emersion: will try that
14:28pq: I'm just guessing by the ICC file name here, mind.
14:28pq: ah yes, that image does look a little off :-)
14:29pq: you definitely have some interpolation, but your result has only the red channel and forgets the other two
14:30pq: maybe confusion with indices when populating a 3D table?
14:31pq: that pattern is really curious, like it was telling us what the problem is
14:32pq: maybe your indices are correct, but you forgot each table element is 3 values and not one?
14:34pq: gotta go, good luck :-)
14:34emersion: indices look correct, and offset too
14:34emersion: offset = 3 * (r_index + dim_len * g_index + dim_len * dim_len * b_index)
14:34pq: yup
14:35pq: but somehow 2/3rds of the table is zeros
14:53emersion: oh.
14:53emersion: okay, it turns out i copied only dim_len³ * sizeof(float)
14:53emersion: instead of 3 * dim_len³ * sizeof(float)
15:01emersion: … and the driver doesn't support VK_FORMAT_R32G32B32_SFLOAT
15:15drakulix[m]: <pq> "if you pick a library like..." <- So for a crappy one one would probably pipe the color characteristics from the EDID into cmsCreateRGBProfile, right?
15:15drakulix[m]: How usable would something like that be? Assuming not everybody has a calibrated ICC profile for their monitor, is a profile created like this any improvement?
15:31emersion: ah, nice, it works on Intel!
15:31emersion: thanks for the hints pq :)
15:31JEEB: 'grats
15:31JEEB: LUT for output conversion in DRM/KMS?
15:32emersion: just in Vulkan for now
15:32JEEB: ah
15:32emersion: and it's a fat 3D LUT
15:34JEEB: if it's on the actual graphics hardware as opposed to some hard-wired ASIC that applies a LUT, I think haasn found out that some functions were actually faster as math vs a LUT. although now I think libplacebo uses LUTs for some of the tone mapping?
15:43swick[m]: shader math is cheap, memory bandwidth is not. if you can represent a transform analytically in a shader that will be better but a LUT works for every transform.
15:43emersion: yeah, that doesn't really work for ICC profiles though
15:44emersion: for well-known color spaces that's a good option
15:44swick[m]: depends on the ICC profile
15:44emersion: i'm assuming ICC profiles for outputs will be tricky in general
15:45emersion: maybe if it's just a matter of applying the EDID color management stuff it's fine?
15:45emersion: but for ICC profiles generated from sensor data it'll be tricky?
15:46JEEB: also IIRC ICC profiles tend to already have a LUT, right?
15:46JEEB: so if you have a thing available, you could just reutilize it
15:49swick[m]: a transform always connects two profiles. depending on how they are created the transform can contain LUTs, matrices and curves.
15:50swick[m]: matrices and curves can be implemented with shader math just fine
15:50swick[m]: but because you do not know when you will get a LUT you will always have to handle LUTs
15:50emersion: btw, i assume converting to cmsCreate_sRGBProfile() will result in optical values
15:51swick[m]: and because you can convert everything else to LUTs you can support LUTs and be done with it
15:51emersion: sorry, let me reword
15:52emersion: if i create a transform from cmsCreate_sRGBProfile() to an ICC profile, i assume the transform will take optical values
15:52emersion: is there a way to feed electrical values instead?
15:52emersion: bleh
15:52emersion: i think i mixed things up again
15:52emersion: let's use the normie terminology instead:
15:52swick[m]: the profile describes a transform from encoded values to XYZ
15:52emersion: if i create a transform from cmsCreate_sRGBProfile() to an ICC profile, i assume the transform will take gamma-encoded color values
15:53emersion: how can i create a transform which takes linear values?
15:53swick[m]: you can't
15:53emersion: the blending happens with linear values
15:53emersion: right?
15:53swick[m]: it should, yes
15:54emersion: so i need to convert the linear values to something that can be piped into the lcms color transform
15:54emersion: so the only way is to convert to sRGB?
15:54emersion: there's no "null" profile?
15:55swick[m]: you want to blend in the output color space on tristimulus values
15:56swick[m]: in other words the output color space with the inverse EOTF applied
15:56emersion: do i really want though?
15:56swick[m]: the problem is that ICC doesn't have a concept of EOTF. you only get a transform from encoded values to XYZ
15:56swick[m]: where else do you want to blend?
15:56emersion: scRGB maybe?
15:57swick[m]: scRGB is basically an infinite container
15:57emersion: a neutral space not tied to any output and big enough
15:57swick[m]: but even then scRGB is non-linear
15:57swick[m]: nothing about scRGB is neutral
15:58swick[m]: and you don't want big enough, you want the right size
15:58swick[m]: because otherwise you're going to have to compress that stuff down again
15:58emersion: okay, let's explain my goals a bit
15:58swick[m]: and scRGB in particular gives you an infinite space
15:59emersion: i'd like to start by touching only the post-blending pipeline
15:59emersion: all textures are sRGB, decoded before blending
15:59emersion: then i want to convert the blended, decoded sRGB into the output ICC profile
16:00emersion: there is no way in lcms to represent that color space where color values are decoded sRGB?
16:00swick[m]: I see
16:00swick[m]: you'll have to construct one yourself
16:01emersion: i will add the pre-blending pipeline too, but ideally later
16:01emersion: hm
16:02emersion: so i could basically create a custom color space with a linear TF, and otherwise identical to sRGB?
16:03swick[m]: yes
16:03emersion: via cmsCreateRGBProfileTHR most likely
16:04emersion: okay. then later on, the issue of taking linear color values remains
16:07drakulix[m]: emersion: But that function does take transfer functions as parameters. If you supply identity functions here, wouldn't that profile operate on linear values? Creating a transform to the sRGB profile or any other output profile should then do what you want, no?
16:07emersion: which function?
16:07emersion: i can call cmsCreateRGBProfileTHR() with the sRGB whitepoint/primaries, and linear TF
16:08drakulix[m]: That is what I meant.
16:11emersion: nice
16:12emersion: swick[m]: so, if i blend in output color space except linear, i basically still have the same issue right?
16:12emersion: so that's where the inverse EOTF comes in?
16:13swick[m]: yes, even worse of a problem because the output profile doesn't give you an EOTF
16:13swick[m]: by assuming everything is sRGB at the input you already know your EOTF
16:14emersion: so what's your plan?
16:14emersion: the issue is that the output profile is just an opaque transform right?
16:15swick[m]: yes
16:15swick[m]: clever maths trying to estimate the EOTF
16:15swick[m]: not sure if weston already implements that
16:16emersion: hm
16:16emersion: is blending in output color space except linear really our best option here?
16:20drakulix[m]: <swick[m]> "clever maths trying to estimate..." <- Isn't that exactly what this function is doing? https://gitlab.freedesktop.org/wayland/weston/-/blob/main/libweston/color-lcms/color-profile.c#L57
16:33emersion: is there a good name for "sRGB but linear"?
16:33emersion: or in general "<color space> but linear"?
16:43swick[m]: drakulix: jup, that looks right
16:44swick[m]: emersion: I don't think so. It's best to always say either encoded values or tristimulus values
16:45emersion: so linear in regular human wording == tristimulus in color management wizard speak
16:45emersion: ?
16:46emersion: == optical?
16:46emersion: and encoded == electrical? (i thought "electrical" was better?)
16:58drakulix[m]: Depends on the use-case? Electrical/Encoded makes better use of memory, as in it uses more bits where humans perceive more detail. Optical/Linear gives better (as in closer to real-world) results when blending. Probably more differences, but I don't think you can call either of them "better".
17:01emersion: optical/linear *is* the correct thing to use when blending AFAIU
17:02emersion: electrical/encoded for blending is what people are used to because everybody does blending incorrectly
17:02drakulix[m]: I am pretty sure some browsers currently expect compositors to do blending in encoded sRGB (like everyone is pretty much doing right now).
17:02drakulix[m]: So I would say it is at least debatable, if you should blend untagged surfaces in linear spaces.
17:04emersion: in the end it's compositor policy
17:05drakulix[m]: Which means an application can't use semi-transparent subsurfaces, if it wants to control blending. (Like browser probably have to, because I expect this is defined in some web standard.)
17:05drakulix[m]: I don't think everyone is aware of that.
17:11JEEB: emersion: apple uses kCGColorSpaceExtendedLinearSRGB, MS uses the name scRGB as well as the enum DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709
17:11JEEB: vulkan also has its own enum for it
17:11emersion: with NONE being in place of something else?
17:12emersion: i think i don't really understand what scRGB is
17:12JEEB: yea that is where with chroma subsampling you have the chroma location
17:12JEEB: they decided to have enum values defining the whole H.273 shebang
17:12emersion: hm, but nothing in that enum specifies that it's *not* encoded?
17:13JEEB: G10 for transfer
17:13JEEB: "gamma 1.0"
17:13emersion: oh.
17:13JEEB: compare with YCBCR_FULL_GHLG_TOPLEFT_P2020
17:13emersion: i've seen this in the lcms docs as well
17:13emersion: but "gamma" is a cursed word AFAIU
17:13JEEB: oh yes
17:14JEEB: not recommending it. scRGB seems to be in general utilized for BT.709 primaries RGB where it may or may not go under/over 1.0, in order to ease mapping into SDR while keeping HDR capabilities
17:15JEEB: generally utilized with linear transfer
17:15emersion: right, i see it has both linear and non-linear versions
17:16JEEB: vulkan has VK_COLOR_SPACE_EXTENDED_SRGB_LINEAR_EXT if I understand correctly
17:16emersion: why do they put "extended" in there?
17:16emersion: to mean that values can be outside [0, 1]?
17:17JEEB: yes, that was one of the reasons why scRGB was defined. [0,1] being SDR, and then HDR content may be delivered with same color space with values outside of that range
17:33emersion: swick[m]: so the reason you dislike linear scRGB for blending, is that it's "too large"?
18:21swick[m]: in theory you can blend wherever you want. usually you want to do it on tristimulus values. scRGB is horrible because there are negative values and it is infinitely big, so you need to carry some information about what part of scRGB you actually use. Then there is the issue of the whitepoint. If it is different from your output color space you have to chromatically adapt that as well.
18:21swick[m]: emersion: ^
18:24swick[m]: If you really do want a single space to composite in then at least use something like bt2020. You won't even need gamut mapping then because everything else is smaller, so you can just remap all the inputs to bt2020. The issue is that you then eventually have to compress the bt2020 to the output color space, and even if the input was sRGB you lost that information because the blended data is bt2020 now, and your gamut mapping will be kind of horrible.
18:24swick[m]: all in all, bad idea
18:25emersion: hm
18:26emersion: remapping bt709 to bt2020 is not the exact mathematical reverse of compressing bt2020 to bt709?
18:26swick[m]: if you want to stay in [0,1], no
18:26swick[m]: you lose information going from bt2020 to bt709
18:27emersion: even if the input was bt709?
18:27emersion: i guess i don't quite get what is involved in color space conversions
18:28swick[m]: it depends. if you just transform from one to the other it doesn't lose information but you will get out of the [0,1] range
18:28emersion: why do i get outside [0, 1]
18:28emersion: ?
18:28emersion: and why should i try to stay in [0, 1]?
18:29swick[m]: because one is bigger than the other
18:29swick[m]: and you should try to stay in [0,1] because that's usually what can be represented
18:30swick[m]: if a monitor supports a color space it usually means it supports encoded values between 0 and 1
18:31swick[m]: if you only do a temporary color space transform it is fine to work with values outside of [0,1] but handling negative values is usually really hard, so the easier way is to transform it to something like XYZ where all colors can be represented with values in [0,1]
18:33emersion: what makes negative values hard to deal with?
18:34emersion: bt2020 is bigger than bt709 but i don't get why going bt709→bt2020→bt709 would not give me back the exact same values?
18:35swick[m]: you need negative red and blue to get more green. it makes everything much harder to reason about...
18:36swick[m]: if your gamut mapping strategy for bt2020->bt709 is clipping then it does
18:36swick[m]: but you don't usually want that because it looks horrible
18:36swick[m]: so you even move colors inside bt709 to make space for colors in bt2020
18:37emersion: oh…
18:38emersion: so basically, when going from bt709 to bt2020 it will map the bt709 blue primary to something which isn't the bt2020 primary, something less blueish
18:38emersion: and then when doing bt2020→bt709 it will map the bt2020 blue primary to the bt709 blue primary?
18:38swick[m]: yes, so it can take bluer colors from bt2020 to move there
18:38emersion: i see
18:39emersion: thanks for all the explanations!
18:39swick[m]: otherwise you get banding effects
18:39emersion: i feel like i understand at least .1% of color management now :P
18:39swick[m]: there is an analog to brightness where it's easier to see
18:39emersion: banding?
18:40emersion: banding would be a precision issue no?
18:40swick[m]: any kind of color gradient which lies outside of bt709 for example would end up getting mapped to the same color in bt709
18:40emersion: right
18:40swick[m]: that also creates banding
18:41emersion: hm, does it?
18:41emersion: i would've expected… uniform color areas where there was a gradient before
18:41swick[m]: it depends on the direction of the color gradient but in general the gradient becomes less intense
18:42emersion: like, two points of a gradient taken out of the bt709 would end up with an identical color
18:42emersion: anyways, doesn't matter much, i see what you mean
18:43swick[m]: well, it projects the gradient curve to the gamut boundary which must be equal or less in length
18:44swick[m]: but yeah, depends on the gradient
18:45emersion: and so, "gamut mapping" is this trick of converting one color space to the other without damaging the colors too much?
19:13swick[m]: yeah
21:20JoshuaAshton: We have some gamut mapping code in Gamescope if you want to take a look emersion
21:21JoshuaAshton: In the short term we are planning on shipping our own "vibrant deck" type thing but done properly using the AMD DC HW's 3D LUT + Shaper LUT