03:44JoshuaAshton: zamundaaa[m]: Do you even know if your BT_2020 packet is being respected?
03:47JoshuaAshton: What are you sending content-wise with 2020 selected? Are you doing a 709->2020 CTM before sending, or just sending 709 raw as 2020?
03:48JoshuaAshton: If you have 709 selected and are sending 709, most monitors don't use 709 primaries but their widest possible gamut for the content. There is usually a "Wide Gamut" vs "709" option
03:48JoshuaAshton: If you have 2020 selected and are sending stuff after doing 709->2020, then yes, it will look more washed out in comparison to that
03:50JoshuaAshton: In Gamescope there is some code for doing 709 -> native inside of 2020 so we can send HDR10 PQ data that looks the same as it would in SDR for a typical user
03:51JoshuaAshton: i.e. content colorimetry = 709, display colorimetry = native, but the output encoding colorimetry = 2020
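(For reference, a minimal sketch of the kind of 709->2020 CTM being discussed, using the standard ITU-R BT.2087 matrix on linear-light RGB. This is not Gamescope's actual code; its "709 -> native inside 2020" path would presumably swap the 2020 target for the display's measured native primaries.)

```c
#include <stdio.h>

/* Standard BT.709 -> BT.2020 primary conversion (ITU-R BT.2087),
 * applied to linear-light RGB, i.e. before any PQ/gamma encoding. */
static const double m709_to_2020[3][3] = {
    { 0.6274, 0.3293, 0.0433 },
    { 0.0691, 0.9195, 0.0114 },
    { 0.0164, 0.0880, 0.8956 },
};

static void apply_ctm(const double in[3], double out[3])
{
    for (int r = 0; r < 3; r++)
        out[r] = m709_to_2020[r][0] * in[0] +
                 m709_to_2020[r][1] * in[1] +
                 m709_to_2020[r][2] * in[2];
}

int main(void)
{
    /* pure BT.709 red lands well inside the BT.2020 gamut */
    double red709[3] = { 1.0, 0.0, 0.0 }, out[3];
    apply_ctm(red709, out);
    printf("%.4f %.4f %.4f\n", out[0], out[1], out[2]);
    return 0;
}
```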
03:54JoshuaAshton: pq: Most desktop displays don't use the gamut in the HDR metadata at all. I've tried setting that data to rand() across the selection of random gaming/desktop displays I tried when I was in the office and saw literally nothing ever change.
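(The "gamut in the HDR metadata" here is the primaries/white point in the CTA-861 static metadata that the kernel exposes as the HDR_OUTPUT_METADATA connector blob property. A rough sketch of filling it follows; the R/G/B ordering and the mastering/luminance figures are illustrative assumptions, and property-ID lookup and error handling are omitted.)

```c
#include <stdint.h>
#include <xf86drm.h>      /* pulls in drm_mode.h: struct hdr_output_metadata */
#include <xf86drmMode.h>  /* drmModeCreatePropertyBlob() */

/* Chromaticities are in CTA-861 units of 0.00002 (coordinate * 50000). */
static uint32_t make_hdr_metadata_blob(int drm_fd)
{
    struct hdr_output_metadata meta = {0};
    uint32_t blob_id = 0;

    meta.metadata_type = 0;                 /* HDMI_STATIC_METADATA_TYPE1 */
    meta.hdmi_metadata_type1.eotf = 2;      /* SMPTE ST 2084 (PQ) */
    meta.hdmi_metadata_type1.metadata_type = 0;

    /* BT.2020 primaries + D65 white point -- the part most desktop
     * monitors appear to ignore (ordering shown as R, G, B purely
     * for illustration) */
    meta.hdmi_metadata_type1.display_primaries[0].x = 35400; /* 0.708 */
    meta.hdmi_metadata_type1.display_primaries[0].y = 14600; /* 0.292 */
    meta.hdmi_metadata_type1.display_primaries[1].x =  8500; /* 0.170 */
    meta.hdmi_metadata_type1.display_primaries[1].y = 39850; /* 0.797 */
    meta.hdmi_metadata_type1.display_primaries[2].x =  6550; /* 0.131 */
    meta.hdmi_metadata_type1.display_primaries[2].y =  2300; /* 0.046 */
    meta.hdmi_metadata_type1.white_point.x = 15635;          /* 0.3127 */
    meta.hdmi_metadata_type1.white_point.y = 16450;          /* 0.3290 */

    /* example mastering/luminance values, not from any real display */
    meta.hdmi_metadata_type1.max_display_mastering_luminance = 1000; /* cd/m^2 */
    meta.hdmi_metadata_type1.min_display_mastering_luminance = 50;   /* 0.0001 cd/m^2 units */
    meta.hdmi_metadata_type1.max_cll  = 1000;
    meta.hdmi_metadata_type1.max_fall = 400;

    drmModeCreatePropertyBlob(drm_fd, &meta, sizeof(meta), &blob_id);
    return blob_id; /* to be set as the connector's HDR_OUTPUT_METADATA property */
}
```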
03:55JoshuaAshton: I should have tested it on the big LG TV we have but I forgot
03:57JoshuaAshton: ohhh
03:57JoshuaAshton: also interesting that you are seeing BT.2020 extremes being mapped to the native gamut... that's a bit different from what I have seen here. Can you let me know what display you have?
04:12JoshuaAshton: zamundaaa[m]: I am also curious as to what bpc you are outputting at. FWIU, some displays only support it properly when in 10-bit+ mode...
06:11wlb: wayland Issue #384 closed \o/ (wayland backend: wl_display@1.error(wl_display@1, 1, "invalid arguments for xdg_toplevel@17.resize") in GTK3.0 https://gitlab.freedesktop.org/wayland/wayland/-/issues/384)
07:28MrCooper: JoshuaAshton: from the scrollback, it's a Samsung C49RG9/CRG9
07:41pq: JoshuaAshton, re: desktop displays don't use gamut from HDR metadata; right... I'm not surprised, thanks for confirming.
07:42pq: JoshuaAshton, did you get any feeling about how those displays map BT.2020 signal to their native gamut?
09:59zamundaaa[m]: JoshuaAshton: I'm sending content encoded as BT2020 with `Colorspace = BT2020_RGB`
10:01zamundaaa[m]: That works correctly with my Samsung TV, but with the CRG9 49" it's too desaturated. Not in comparison to the monitor in wcg mode, but in comparison to sRGB mode and other displays too
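(For context, a rough sketch of what `Colorspace = BT2020_RGB` means at the KMS level: looking up the connector's "Colorspace" enum property and setting it with libdrm. Not the compositor's actual code; a real compositor would set this as part of an atomic commit, and error handling is trimmed.)

```c
#include <string.h>
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

static int set_colorspace_bt2020_rgb(int fd, uint32_t connector_id)
{
    drmModeObjectProperties *props =
        drmModeObjectGetProperties(fd, connector_id, DRM_MODE_OBJECT_CONNECTOR);
    int ret = -1;

    for (uint32_t i = 0; props && i < props->count_props; i++) {
        drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[i]);
        if (!prop)
            continue;
        if (strcmp(prop->name, "Colorspace") == 0) {
            /* find the enum entry named "BT2020_RGB" */
            for (int j = 0; j < prop->count_enums; j++) {
                if (strcmp(prop->enums[j].name, "BT2020_RGB") == 0) {
                    ret = drmModeObjectSetProperty(fd, connector_id,
                                                   DRM_MODE_OBJECT_CONNECTOR,
                                                   prop->prop_id,
                                                   prop->enums[j].value);
                    break;
                }
            }
        }
        drmModeFreeProperty(prop);
    }
    drmModeFreeObjectProperties(props);
    return ret;
}
```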
10:18pq: I would guess the TV might actually use the HDR static metadata.
10:21pq: That's why the metadata exists in the first place, and I don't understand how monitors intended to get away with not implementing it...
10:23zamundaaa[m]: The only reasonable explanation I'd have is that the HDR stuff in most monitors is built around gaming only
10:24pq: very plausible
10:24zamundaaa[m]: As games don't send usable metadata, might as well ignore it completely
10:25pq: it's probably a cycle: first HDR monitors did it like this, then games followed, then more monitors followed what games did
10:27pq: it's not enough to not send metadata; games also need to pretend to use the full BT.2020 color gamut, if that's what monitors map to their native gamut.
10:27pq: otherwise it would look more or less de-saturated
10:29pq: nevermind that game artists have not been able to actually view the full BT.2020 color gamut, and I'm not sure they could even today. It would require monochromatic primaries, i.e. lasers.
10:31pq: but the closer to monochromatic primaries you get, the more likely it is that people with perfect vision start seeing the colors differently.
10:32zamundaaa[m]: I guess that's half the reason why the FreeSync HDR stuff exists
10:49swick[m]: it's why I really want SBTM
10:51swick[m]: but in general there needs to be way more rigorous testing and certification in place
10:52swick[m]: most displays are a bad joke
16:27JoshuaAshton: pq: I have seen a laser backlit display before at least :D
16:27JoshuaAshton: I know some people will frown at this or whatever, but I guess we should probably start making a repo for display errata?
16:29JoshuaAshton: zamundaaa[m]: I would be doubtful of that. From what we can tell, most HDR games are typically just ""mastered"" in a dark room with some big HDR TV until it looks good.
16:31JoshuaAshton: We found a decent number of HDR games that look completely different from their SDR counterparts, which is obviously not the intent.
16:32JoshuaAshton: But at least I can validate that the games look equally... bad on both Windows and Linux/Gamescope in HDR :S
16:32JoshuaAshton: There are some that genuinely do it really really well though. Most first-party titles are great.
16:33JoshuaAshton: Forza Horizon 5 looks great and it doesn't decide to just go overboard with everything being stupidly bright or over saturated. Someone actually went and mastered that game for HDR and it looks incredible.
16:34JoshuaAshton: But you can easily tell games where HDR was an afterthought or just enabled by some random toggle in the engine and not really tested aside from "wow this looks bright and cool on my TV at home!" :c
16:37JoshuaAshton: swick[m]: I agree with more rigorous testing... Even stuff with DisplayHDR branding is completely fucked sometimes. Which is interesting because they actually have a full conformance suite for that...
16:38JoshuaAshton: It makes me wonder if what zamundaaa[m] is seeing is due to not outputting in 10-bit. The DisplayHDR CTS does test the behaviour here, but it only tests this in 10-bit mode.
16:40JoshuaAshton: (To clarify: it tests that colors which encode the display's native primaries in the 2020 output encoding actually come out as the native primaries)
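(On the 10-bit point: the knob on the KMS side is the connector's "max bpc" range property, which caps the bits per component the driver may use on the link. A sketch, assuming the property ID has already been looked up by name as in the "Colorspace" snippet above; legacy SetProperty shown for brevity.)

```c
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Allow the driver to pick up to 10 bpc on this connector; it will
 * still clamp to whatever the link and the monitor actually support. */
static int allow_10bpc(int fd, uint32_t connector_id, uint32_t max_bpc_prop_id)
{
    return drmModeObjectSetProperty(fd, connector_id,
                                    DRM_MODE_OBJECT_CONNECTOR,
                                    max_bpc_prop_id, 10);
}
```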