05:16 i509vcb: Clarify what you mean by protocol fragmentation?
05:16 i509vcb: It's one of two things:
05:16 i509vcb: 1. Compositor-specific protocols / many protocols having varying levels of support
05:17 i509vcb: 2. The protocols having their features spread across many protocol definitions
10:28 ifreund: hmm, it looks like creating multiple xdg-decoration objects for a single xdg-toplevel isn't forbidden
10:28 ifreund: is that correct?
10:32 davidre: No
10:33 davidre: see already_constructed error
10:33 ifreund: ah, it's hidden
10:33 ifreund: thanks
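For context, a minimal client-side sketch of what davidre is pointing at, using the headers wayland-scanner generates from xdg-decoration-unstable-v1.xml (the header name and the surrounding setup are assumed, not taken from any particular project):

```c
/* Sketch only; assumes `manager` and `toplevel` were bound/created elsewhere
 * and that the build produces the usual generated client header name. */
#include "xdg-decoration-unstable-v1-client-protocol.h"

static void setup_decoration(struct zxdg_decoration_manager_v1 *manager,
                             struct xdg_toplevel *toplevel)
{
    struct zxdg_toplevel_decoration_v1 *deco =
        zxdg_decoration_manager_v1_get_toplevel_decoration(manager, toplevel);
    zxdg_toplevel_decoration_v1_set_mode(
        deco, ZXDG_TOPLEVEL_DECORATION_V1_MODE_SERVER_SIDE);

    /* Requesting a second decoration object for the same xdg_toplevel is a
     * protocol error: ZXDG_TOPLEVEL_DECORATION_V1_ERROR_ALREADY_CONSTRUCTED. */
}
```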
10:34 pq: argh
10:35 pq: astlep4 doing join/quit flooding
10:36 ifreund: I didn't even realize due to weechat's filter
10:36 ifreund: I can imagine that's super annoying in some clients though
11:36 wlb: weston Merge request !1187 opened by Loïc Molinari (molinari) weston-log-flight-rec: Map ring buffer using memset() https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1187
11:52 wlb: weston/main: Loïc Molinari * weston-log-flight-rec: Map ring buffer using memset() https://gitlab.freedesktop.org/wayland/weston/commit/ad141defcdd5 libweston/weston-log-flight-rec.c
11:52 wlb: weston Merge request !1187 merged \o/ (weston-log-flight-rec: Map ring buffer using memset() https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1187)
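A sketch of the general technique the MR title suggests (not the actual weston patch): touch every page of a freshly allocated flight-recorder ring buffer with memset() so it is mapped up front rather than faulted in lazily while the recorder is being written to.

```c
/* Hypothetical helper illustrating the idea: write the whole allocation once
 * so the kernel backs all of its pages with real memory immediately. */
#include <stdlib.h>
#include <string.h>

static char *alloc_flight_rec_ring(size_t size)
{
    char *ring = malloc(size);
    if (!ring)
        return NULL;

    memset(ring, 0, size); /* pre-fault every page of the ring buffer */
    return ring;
}
```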
11:57 pq: I think stale color-management protocol !14 threads have been cleaned up now. :-)
13:11 pq: swick[m], I'm happy we chose to use CICP; it seems to save a lot of work and offers a vocabulary.
13:12 pq: and dropping EDR dropped a lot of open questions from the protocol, but those still need to be solved in compositor implementations anyway, which will probably give us a good idea of how to interface with EDR.
13:13 JEEB: yeah, these wordings were in the video specs for ~20 years
13:13 JEEB: but now they're in a separate common coding points thing
13:13 JEEB: also what in this context is EDR?
13:14 JEEB: IIRC what Apple called EDR is just floaty linear scRGB
13:14 pq: the Apple invention of Extended Dynamic Range
13:15 JEEB: yeah, MS also lets you set that; I tested it with mpv back when, and I used it with non-HDR wide-gamut stuff in libplacebo
13:15 pq: and they specifically use "EDR value" as a luminance factor relative to the SDR peak white luminance
13:15 JEEB: yeah, that is just a hint regarding how far you can go, I think, with SDR graphics white being 1.0
13:16 JEEB: basically Apple then encodes that into whatever gets passed over the wire
13:16 pq: well, it's the equivalent to HDR static metadata max luminance
13:16 pq: just with a different reference point
13:16 JEEB: yes, just given to the application
13:16 pq: and from app to the display system
13:16 JEEB: that you can go this high
13:17 pq: the concept is not difficult, but getting all the reference points right is confusing
13:17 JEEB: didn't notice that bit, but it makes sense. it keeps it simple for people used to SDR, and as such it can be handled in CICP
13:18 JEEB: since you just set primaries to BT.709, transfer to linear, and your sample format to float16
13:18 pq: EDR has nothing to do with CICP
13:18 JEEB: I mean, what gets passed to the compositor can be advertised by means of CICP :P
13:19 JEEB: the specific 1.0-based metadata can't, of course
13:19 JEEB: but as such, scRGB can be handled just fine
13:19 pq: sure
13:19 pq: EDR also does not seem to consider the black point at all
13:23 pq: scRGB can be implied through CICP and a floating-point pixel format, but that's also useless without the context of gamut and dynamic range. The SDR range can be the unit range, but you also need to know the actual bounds for gamut and tone mapping.
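To make the reference-point bookkeeping concrete, a tiny sketch of the convention being discussed: linear values where 1.0 is SDR graphics (reference) white and anything above 1.0 is HDR headroom. The SDR white luminance is a free parameter here; 203 cd/m² is just the common BT.2408 choice, not something mandated by any of the above.

```c
/* Convert between absolute luminance and scRGB/EDR-style linear values
 * anchored at SDR graphics white = 1.0. */
static double nits_to_scrgb_linear(double nits, double sdr_white_nits)
{
    return nits / sdr_white_nits;
}

static double scrgb_linear_to_nits(double value, double sdr_white_nits)
{
    return value * sdr_white_nits;
}

/* Example: with SDR white at 203 cd/m², an EDR-style headroom of 4.0 means
 * highlights may go up to 4.0 * 203 = 812 cd/m². */
```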
13:24 JEEB: that I do agree with :)
13:24 pq: or, you need the assumption that the app already did the gamut and tone mapping into the monitor capabilities, so the compositor does not need to.
13:24 JEEB: yup
13:24 pq: that's the part we're still having a wee bit of a problem with
13:24 JEEB: the idea just being that it seems to have been defined as a simple way to bolt HDR onto projects already doing SDR graphics, since if you keep within the 0-1 range
13:25 JEEB: and then if you go over, that's then wider gamut/range
13:26 pq: right
13:28 pq: HDR static metadata uses nits for the luminance values, but we also need to define what they are relative to, since you practically never display nit-for-nit.
13:29 JEEB: they're relative to the reference mastering environment as far as I can tell. so if you are in a brighter state than that, it's OK to raise brightness
13:29 JEEB: meanwhile apple went the other way
13:29 JEEB: they define "ambient viewing environment" metadata
13:29 JEEB: and then you look at your ambient environment and adjust either way
13:30 JEEB: iphone videos have this at 500 nits or so :D
13:30 pq: exactly, so how do we encode a reference mastering environment in the protocol? :-)
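For reference, roughly what "HDR static metadata" carries today: the SMPTE ST 2086 mastering display colour volume plus the CTA-861.3 content light levels, all in absolute cd/m². The struct below is purely illustrative (not any protocol's wire format); what those absolute values should be interpreted relative to is exactly the open question above.

```c
/* Illustrative layout only. All luminance values are absolute cd/m² (nits). */
struct hdr_static_metadata {
    /* Mastering display colour volume (SMPTE ST 2086) */
    double primary_r_x, primary_r_y;   /* CIE 1931 xy chromaticities */
    double primary_g_x, primary_g_y;
    double primary_b_x, primary_b_y;
    double white_point_x, white_point_y;
    double max_mastering_luminance;    /* cd/m² */
    double min_mastering_luminance;    /* cd/m² */

    /* Content light levels (CTA-861.3) */
    double max_cll;                    /* maximum content light level, cd/m² */
    double max_fall;                   /* maximum frame-average light level, cd/m² */
};
```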
13:30 pq: HLG also went the other way. Or should we say sideways?
13:30 JEEB: and now the static image people are defining their own way of presenting HLG ;)
13:31 JEEB: which is not BT.2100
13:31 JEEB: https://jvet-experts.org/doc_end_user/documents/29_Teleconference/wg11/JVET-AC0316-v1.zip
13:31 JEEB: noticed this while looking at the jvet document register :)
13:31 pq: ugh, can't they just deliver a pdf...
13:32 JEEB: thank goodness for libreoffice
13:32 pq: so much for the brand new absolute luminance signalling...
13:34 JEEB: also you can check the test result diff in https://git.videolan.org/?p=ffmpeg.git;a=commit;h=5de565107a32414aec9f079940c76812265157c5 for how the ambient viewing env metadata looks
13:34 JEEB: "implemented for the shits and giggles"
13:34 pq: JEEB, https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/23 btw.
13:36 JEEB: but yeah, it seems like people, except for Apple (or maybe even Apple for non-metadata'd content), just expect things to be relative to the reference mastering environment. and then if that ambient viewing environment metadata is available, they utilize that as the content's baseline
13:38 JEEB: the ambient viewing environment metadata block comes straight out of H.274, which, yes, is one off from H.273 xD
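For reference, a sketch of the payload the H.274 ambient viewing environment SEI carries (the same data the FFmpeg commit linked above exposes as side data). The units are as recalled from the spec family; verify against H.274 itself before relying on them.

```c
#include <stdint.h>

struct ambient_viewing_environment {
    uint32_t ambient_illuminance; /* nominal ambient illuminance, in 0.0001 lux   */
    uint16_t ambient_light_x;     /* CIE 1931 x of the ambient light, in 0.00002 */
    uint16_t ambient_light_y;     /* CIE 1931 y of the ambient light, in 0.00002 */
};
```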
13:39 JEEB: pq: I think you have noticed by now that, due to various headroom reasons, I'm quite lazy about actually registering at various places :)
13:39 JEEB: while I'm quite available on IRC, which is low-effort
13:40 pq: it's not low-effort for me :-p
13:40 JEEB: :)
13:42 pq: gosh, the title of H.274 is so... descriptive
13:45 pq: and only 110 pages, but I guess it's more than just viewing environment
13:53 JEEB: yea
13:53 JEEB: it contains all of the metadata blocks that have been found useful in H.264+
13:54 JEEB: and for whatever reason it also then contains
13:54 JEEB: > Parsing process for k-th order Exp-Golomb codes
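For the curious, that clause describes the usual k-th order Exp-Golomb parsing from the H.26x family; a small sketch, with `read_bit`/`read_bits` standing in for a hypothetical bit reader.

```c
#include <stdint.h>

/* Hypothetical bit reader; read_bits(br, 0) is assumed to return 0. */
struct bitreader;
unsigned read_bit(struct bitreader *br);
uint32_t read_bits(struct bitreader *br, unsigned n);

uint32_t parse_exp_golomb_k(struct bitreader *br, unsigned k)
{
    unsigned leading_zero_bits = 0;

    while (read_bit(br) == 0)
        leading_zero_bits++;

    /* codeNum = (2^leadingZeroBits - 1) * 2^k + next (leadingZeroBits + k) bits */
    return (((uint32_t)1 << leading_zero_bits) - 1) * ((uint32_t)1 << k) +
           read_bits(br, leading_zero_bits + k);
}
```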
13:54 pq: I was just thinking "all the parameters we could ever think of"
13:54 JEEB: and yea, if you have a seat at JVET/ISO/etc then as long as your proposal is not too insane it will probably get included
13:55 JEEB: I saw them adding stuff like neural network post-processing
13:55 pq: oh poor video codec devs
13:57 JEEB: yes, and esp. open source ones like me, who need to hyena through draft document archives to figure things out
13:57 JEEB: or gop-stop students who might have PDFs
14:00 pq: The decision to have compositors advertise the CICP they accept was blatantly correct.
14:01 JEEB: yea
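As a sketch of what that model looks like from a client's point of view: the enum values below are H.273 code points as recalled (verify against the spec), and the "supported list" plumbing is hypothetical, not the actual color-management protocol.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* A few H.273 (CICP) code points */
enum cicp_primaries { CICP_PRI_BT709 = 1, CICP_PRI_BT2020 = 9 };
enum cicp_transfer  { CICP_TF_SRGB = 13, CICP_TF_PQ = 16, CICP_TF_HLG = 18 };

/* Filled in from the compositor's "supported" advertisements
 * (hypothetical event names omitted). */
struct supported_cicp {
    uint32_t primaries[16], transfers[16];
    size_t n_primaries, n_transfers;
};

static bool client_can_use(const struct supported_cicp *s,
                           uint32_t primaries, uint32_t transfer)
{
    bool pri_ok = false, tf_ok = false;
    for (size_t i = 0; i < s->n_primaries; i++)
        pri_ok |= (s->primaries[i] == primaries);
    for (size_t i = 0; i < s->n_transfers; i++)
        tf_ok |= (s->transfers[i] == transfer);
    return pri_ok && tf_ok;
}
```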
14:02 pq: swick[m], scene-referred HLG and PQ are coming. No, we should not support them. :-P
14:02 JEEB: :DD
14:03 swick[m]: uhm...
14:03 swick[m]: HLG is scene-referred already?
14:03 JEEB: yea
14:03 JEEB: which is why you currently use the BT.2100 OOTF with it
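For context, the BT.2100 reference OOTF being referred to, applied per colour component, with Y_S the normalized scene luminance (0.2627 R + 0.6780 G + 0.0593 B) and L_W the nominal peak display luminance; a small sketch assuming display black at zero.

```c
/* F_D = alpha * Y_S^(gamma - 1) * E, with E the normalized scene-linear
 * component and alpha ~ the nominal peak display luminance L_W. */
#include <math.h>

static double hlg_ootf(double E, double Y_s, double L_w)
{
    /* BT.2100 system gamma: 1.2 at L_W = 1000 cd/m², extended as
     * gamma = 1.2 + 0.42 * log10(L_W / 1000). */
    double gamma = 1.2 + 0.42 * log10(L_w / 1000.0);
    return L_w * pow(Y_s, gamma - 1.0) * E;
}
```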
14:04 JEEB: swick[m]: he is referring to the zip I linked, which contains a doc raising the question of whether the static image folks' new way of presenting it on screen will affect CICP
14:04 JEEB: since they are defining it as part of a new spec
14:05 swick[m]: oh boy
14:05 pq: a different kind of scene-referred, one without the display OOTF or the reference OOTF.
14:05 pq: a.k.a. scene-referred-harder HLG
14:07 pq: a.k.a. useless to us since it's not made for display
14:08 pq: oh, there's also the provision to use reference display metadata with the "regular" HLG
14:09 pq: I *knew* it! HLG did seem to be missing pieces related to color gamut mapping.
14:11 pq: so two new HLG variants and one new PQ variant, whee
14:11 JEEB: yeah, we'll see if it ends up as new CICP values or the same ones
14:11 JEEB: kinda like the two new YCgCo reversible things
14:12 JEEB: (latest draft for CICP @ https://jvet-experts.org/doc_end_user/documents/29_Teleconference/wg11/JVET-AC1008-v2.zip )
14:12 pq: I think the weekend is looking pretty attractive.
14:12 JEEB: indeed :)
14:13 JEEB: too bad I do multimedia as a hobby, so for me the less boring stuff starts now (I need to get HDR/ICC profile stuff etc. passed into FFmpeg encoders)
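The first, simple half of that job is tagging the encoder context with the CICP-style enums libavutil already has; a minimal sketch (mastering-display metadata and ICC profiles are separate, messier side-data plumbing and omitted here).

```c
#include <libavcodec/avcodec.h>

static void tag_pq_bt2020(AVCodecContext *enc)
{
    enc->color_primaries = AVCOL_PRI_BT2020;     /* H.273 primaries 9     */
    enc->color_trc       = AVCOL_TRC_SMPTE2084;  /* PQ, H.273 transfer 16 */
    enc->colorspace      = AVCOL_SPC_BT2020_NCL; /* H.273 matrix 9        */
    enc->color_range     = AVCOL_RANGE_MPEG;     /* limited range         */
}
```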
14:14 pq: If I ever end up with the original HLG guys in the same conference call again, I should ask what they think of these new things.
14:14 JEEB: :)
14:18 swick[m]: we learned a lot since then
14:19 swick[m]: still think that someone should add CICPs for reference viewing environment and reference display characteristics
14:19 JEEB: that's in the H.273+1 spec :)
14:19 JEEB: ahh, right. enums for the standard values. gotcha
14:20 swick[m]: jup
14:20 JEEB: for the record I don't even recall where the definition is
14:20 JEEB: ITU-R BT.2408, maybe?
14:21 swick[m]: for what exactly?
14:21 JEEB: reference viewing environment
14:22 JEEB: I just know it's probably something very dark
14:22 JEEB: thus in general you only have to adjust upwards if you know that your viewing environment is bright
14:22 swick[m]: well, the point here is that there are different recommendations for the reference viewing environment
14:23 swick[m]: if there was a single definition that all content uses we wouldn't have a problem
14:23 swick[m]: reading ITU-T H.274 is exactly what I wanted to do today :(
14:24 pq: still beats rpm packaging? :-)
14:24 swick[m]: SMPTE ST 2084 for example just specifies everything iirc
14:24 swick[m]: heh, maybe, not sure yet
14:26 swick[m]: mh, SMPTE ST 2084 lists 3 common reference viewing environments
14:27 swick[m]: this is going to be fun
14:30 JEEB: and I think that is partially why Apple (?) came up with the H.274 ambient viewing environment metadata that I implemented in FFmpeg: because iPhone videos are generally captured in a manner meant to also be viewed in such an environment. (so instead of making them less bright and then just boosting, they instead record that they are meant to be viewed at 500+ nits)
18:18 useless: hi all. i'm fighting with the infamous disappearing mouse pointer trick. i'm out of ideas. it vanishes if i cross a window that would alter the pointer and it never comes back. seems limited to wlroots, as weston does not have the bug
18:20 ifreund: useless: it's more likely an issue with your wlroots-based compositor than wlroots itself
18:20 ifreund: perhaps you should ask them for support?
18:20 useless: ok, but i've tried several with the same results
18:21 useless: wlroots seems to be the common thread
18:21 ifreund: do you have some xcursor theme installed?
18:21 useless: i do now, it was part of my troubleshooting
18:21 ifreund: and have you somehow told your compositor to use it?
18:22 useless: i've told everything (i think) to use Adwaita
18:33 kennylevinsen: Is it a specific window? Do you use scaling?
18:35 useless: no, it's all windows with all toolkits... except alacritty
18:39 useless: also, i'm using nvidia-drivers
18:40 useless: alacritty shows no issues but dmesg shows that it is segfaulting...a lot.
19:22 useless: i was able to capture this as the cursor vanished:
19:22 useless: info: wayland.c:1624: cursor theme: (null), size: 24, scale: 1
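For what it's worth, the "(null)" there is the theme name handed to wl_cursor_theme_load(); a NULL name falls back to the default theme, and if no usable Xcursor theme can be found the client has no image to attach to the pointer. A minimal sketch of the typical client-side loading path with libwayland-cursor, honoring XCURSOR_THEME/XCURSOR_SIZE as many clients do (whether this particular client does exactly this is an assumption).

```c
#include <stdlib.h>
#include <wayland-cursor.h>

static struct wl_cursor_image *load_default_cursor(struct wl_shm *shm)
{
    const char *theme_name = getenv("XCURSOR_THEME"); /* may be NULL */
    const char *size_str = getenv("XCURSOR_SIZE");
    int size = size_str ? atoi(size_str) : 24;

    /* A NULL theme_name loads the "default" theme. */
    struct wl_cursor_theme *theme = wl_cursor_theme_load(theme_name, size, shm);
    if (!theme)
        return NULL;

    struct wl_cursor *cursor = wl_cursor_theme_get_cursor(theme, "left_ptr");
    return (cursor && cursor->image_count > 0) ? cursor->images[0] : NULL;
}
```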