09:20colinmarc: what's the correct timestamp value for wp-presentation for my remote compositor? it seems like I need to support this protocol for gpu accelerated apps, but I'm not sure what to put in the callback. I have a virtual refresh timer in my compositor, should I just treat that instant as "the pixels turn to light"? or try to estimate the display latency all the way on the client side, including network?
09:20colinmarc: my goal is just the best compatibility with existing apps
09:22kennylevinsen: The purpose of the presentation timestamp is to communicate, in the local clock, when the image hits your eyeball
09:23kennylevinsen: well, at least when the image arrives at the display
09:24colinmarc: Right, but this is a remote-desktop situation. even if I were able to add up network + remote client display latency, it's not clear that that would be useful to the wayland client. For the purposes of audio sync, it seems like it would just want to match up when the video frame is encoded to when the audio frame is encoded
09:25kennylevinsen: I am not sure how the audio latency would be dealt with in this case. In theory, you'd want the presentation time for both a video frame and audio frame so that the client could sync up the two...
09:26kennylevinsen: (Or one could solve the problem the opposite way by attaching the intended time of presentation for both frames and then let the respective servers deal with it)
09:27kennylevinsen: In any case, I'd start out with just doing it the way the protocol intended, then if that's terrible use that as a point of discussion :)
09:28kennylevinsen: in which case, a requested presentation feedback should emit the timestamp of when the display server presented the content, which would include transfer
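For reference, this is roughly what a client sees on the other end of that feedback. A minimal client-side sketch, assuming the wayland-scanner-generated presentation-time-client-protocol.h; the printf is just illustrative:

```c
/* Minimal sketch of a wp_presentation feedback listener, assuming the
 * wayland-scanner-generated presentation-time-client-protocol.h. */
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include "presentation-time-client-protocol.h"

static void feedback_sync_output(void *data,
		struct wp_presentation_feedback *fb, struct wl_output *output)
{
	/* Tells the client which output the presentation was synced to. */
}

static void feedback_presented(void *data,
		struct wp_presentation_feedback *fb,
		uint32_t tv_sec_hi, uint32_t tv_sec_lo, uint32_t tv_nsec,
		uint32_t refresh, uint32_t seq_hi, uint32_t seq_lo,
		uint32_t flags)
{
	/* The compositor's claim of when the frame was presented, in the
	 * clock domain advertised by wp_presentation's clock_id event. */
	uint64_t sec = ((uint64_t)tv_sec_hi << 32) | tv_sec_lo;
	printf("presented at %" PRIu64 ".%09" PRIu32 ", refresh %" PRIu32 " ns\n",
	       sec, tv_nsec, refresh);
	wp_presentation_feedback_destroy(fb);
}

static void feedback_discarded(void *data,
		struct wp_presentation_feedback *fb)
{
	/* The committed content was never shown. */
	wp_presentation_feedback_destroy(fb);
}

static const struct wp_presentation_feedback_listener feedback_listener = {
	.sync_output = feedback_sync_output,
	.presented = feedback_presented,
	.discarded = feedback_discarded,
};
```

A client requests one feedback object per commit via wp_presentation_feedback(presentation, surface) before wl_surface_commit, and the compositor answers with exactly one of presented or discarded.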
09:28colinmarc: Right, it seems like the wayland protocol is assuming that the audio latency is measured out of band and the client deals with matching those up. Since I control the pulseaudio side, I could indeed give realistic values for the audio latency as well. it seems like it would be more effective to fudge both up to the network point, though
09:29kennylevinsen: That might be good enough for syncing up video, but some apps might want to know the actual system latency - say, a rhythm game
09:29kennylevinsen: (ignoring the fact that playing osu! over remote desktop would be cursed)
09:32colinmarc: The remote desktop part has its own sync mechanism, because the encoded audio and video frames have a PTS attached
09:32colinmarc: including the network latency in the information given to the game just seems like extra complexity to me
09:33colinmarc: I guess it could try to include input latency, to your point. but most games like that allow you to configure it anyway
09:35colinmarc: One thing I'm confused about is how apps use presentation time, in practice. I get the mpv audio/video sync case. But it seems like the mesa/xwayland/etc pipeline uses it for games as well?
09:37MrCooper: FWIW, Xwayland doesn't use wp-presentation yet, it uses frame events only so far
09:39colinmarc: ah, that's good to know, thank you!
09:40colinmarc: seems like I had a false impression somewhere
09:47any1: presentation time is not only used for audio/video sync but also for pacing during playback. If you have a jittery network, your frames will not arrive at a steady rate, so a jitter buffer is needed for smooth playback.
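A sketch of that pacing idea, under the assumption that each incoming frame carries the sender's PTS; all names and the fixed 20 ms margin here are hypothetical:

```c
/* Minimal jitter-buffer sketch (names hypothetical): establish a single
 * mapping from the sender's PTS clock to the local clock, then display
 * each frame at its PTS plus a fixed safety margin that absorbs network
 * jitter. A real implementation would adapt the margin to observed jitter. */
#include <stdint.h>

#define JITTER_MARGIN_NS 20000000 /* 20 ms, assumed; tune to the network */

struct jitter_buf {
	int64_t clock_offset_ns; /* local_now - sender_pts at first frame */
	int have_offset;
};

/* Returns the local CLOCK_MONOTONIC time at which to display the frame. */
static int64_t display_deadline(struct jitter_buf *jb,
				int64_t pts_ns, int64_t now_ns)
{
	if (!jb->have_offset) {
		/* First frame pins the PTS -> local-clock mapping. */
		jb->clock_offset_ns = now_ns - pts_ns;
		jb->have_offset = 1;
	}
	return pts_ns + jb->clock_offset_ns + JITTER_MARGIN_NS;
}
```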
09:56YaLTeR[m]: I feel like if you include network latency in presentation time, mpv timing will just completely screw itself up, but idk
09:56YaLTeR[m]: Since presumably it was written with the assumption that the times are mostly steady
09:57MrCooper: I had similar thoughts
10:15daniels: ++
10:15daniels: what you want to do is keep as consistent a cadence as possible for sending the events from your local time, e.g. a faked 60Hz timer
10:17daniels: keep that lined up with Pulse so the app sees one consistent set of timestamps, taken shortly before you push the streams to the network
10:17daniels: then correlate both streams on the far side so you can keep them in sync there
10:18daniels: the fact that you might be +250ms from the time reported to the client is neither here nor there; what's important is that you get a consistently-timed set of streams
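A compositor-side sketch of that advice, assuming the wayland-scanner-generated presentation-time-server-protocol.h and a hypothetical pending_feedbacks list of wp_presentation_feedback resources queued at commit time and linked via wl_resource_get_link():

```c
/* On each tick of a faked 60 Hz vblank timer, complete every pending
 * presentation feedback with the tick's CLOCK_MONOTONIC timestamp.
 * pending_feedbacks and its management are hypothetical compositor state. */
#include <stdint.h>
#include <time.h>
#include <wayland-server-core.h>
#include "presentation-time-server-protocol.h"

#define FAKE_REFRESH_NS 16666666 /* ~60 Hz */

static void on_fake_vblank(struct wl_list *pending_feedbacks)
{
	struct timespec now;
	clock_gettime(CLOCK_MONOTONIC, &now);
	uint64_t secs = (uint64_t)now.tv_sec;

	struct wl_resource *res, *tmp;
	wl_resource_for_each_safe(res, tmp, pending_feedbacks) {
		wp_presentation_feedback_send_presented(res,
			(uint32_t)(secs >> 32), (uint32_t)secs,
			(uint32_t)now.tv_nsec,
			FAKE_REFRESH_NS, /* nominal refresh in ns */
			0, 0,            /* no hardware vblank counter */
			WP_PRESENTATION_FEEDBACK_KIND_VSYNC);
		/* Each feedback delivers exactly one event, then goes away. */
		wl_resource_destroy(res);
	}
}
```

Clients then see a steady ~16.7 ms cadence regardless of network conditions, which is the consistency being argued for here.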
10:18colinmarc: cool, yeah, I'm already doing the latter with video/audio PTS. that seems much simpler to implement, thanks.
10:19colinmarc: I already have a 60mhz timer for encoding anyway
10:19kennylevinsen: 60 millihertz might be a bit too slow
10:19colinmarc: heh
10:19colinmarc: oops
10:20colinmarc: actually maybe I will skip this protocol for now, if xwayland/mesa aren't using it. not sure why I thought they were
10:20colinmarc: is there maybe unmerged work where they are using it? or is that just the new wp-commit-timing thing?
10:21kennylevinsen: mesa uses it for some wait-for-present functionality in the Vulkan WSI
10:24daniels: yeah, it's VK_KHR_present_id / VK_KHR_present_wait, which Proton really wants to have
10:24daniels: (SDL uses it as well IIRC)
10:24colinmarc: SDL wayland?
10:32daniels: mm, I'm mistaken, it doesn't actually use it
10:33colinmarc: just searching online it does look like DXVK uses that extension
10:36colinmarc: https://github.com/doitsujin/dxvk/releases/tag/v2.3
10:37colinmarc: ok, seems like it's plausible I do want the presentation time protocol
10:37colinmarc: and not hard to implement that way (based on the fake timer) so I'll just do that
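For context, this is the app-side shape of VK_KHR_present_id / VK_KHR_present_wait that Mesa's Wayland WSI backs with presentation feedback. A sketch with all device, queue, and swapchain setup omitted, assuming both extensions were enabled at device creation:

```c
/* App-side sketch of VK_KHR_present_id / VK_KHR_present_wait; setup
 * omitted, extensions assumed enabled at device creation. */
#include <vulkan/vulkan.h>

void present_and_wait(VkDevice device, VkQueue queue,
		      VkSwapchainKHR swapchain, uint32_t image_index,
		      uint64_t present_id)
{
	VkPresentIdKHR id_info = {
		.sType = VK_STRUCTURE_TYPE_PRESENT_ID_KHR,
		.swapchainCount = 1,
		.pPresentIds = &present_id,
	};
	VkPresentInfoKHR present = {
		.sType = VK_STRUCTURE_TYPE_PRESENT_INFO_KHR,
		.pNext = &id_info,
		.swapchainCount = 1,
		.pSwapchains = &swapchain,
		.pImageIndices = &image_index,
	};
	vkQueuePresentKHR(queue, &present);

	/* Blocks until the compositor reports this frame presented, which
	 * the Wayland WSI implements on top of wp_presentation feedback
	 * (1 second timeout here). */
	vkWaitForPresentKHR(device, swapchain, present_id, 1000000000ull);
}
```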
13:46wlb: weston Merge request !1588 opened by Joan Torres (joantolo) Add a new client: weston-color https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1588
17:10FreeFull: I'm trying to share a wayland socket between two different users. So far I've only managed it by using the same XDG_RUNTIME_DIR for both users; for some reason just symlinking the socket doesn't work
17:12FreeFull: Do clients explicitly check if `$XDG_RUNTIME_DIR/$WAYLAND_DISPLAY` is a socket file?
17:19FreeFull: Ah, I figured it out, for some reason XDG_RUNTIME_DIR wasn't set at all for the second user by default
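One related detail that can help with cross-user sharing: since libwayland 1.15, WAYLAND_DISPLAY (and the name passed to wl_display_connect) may be an absolute path, in which case XDG_RUNTIME_DIR is not consulted at all. A minimal sketch; the socket path below is hypothetical, and the second user still needs filesystem permission on the socket:

```c
/* Connect to a Wayland socket by absolute path (libwayland >= 1.15),
 * bypassing the $XDG_RUNTIME_DIR/$WAYLAND_DISPLAY lookup entirely.
 * The path below is hypothetical. */
#include <stdio.h>
#include <wayland-client.h>

int main(void)
{
	struct wl_display *display =
		wl_display_connect("/run/user/1000/wayland-1");
	if (!display) {
		perror("wl_display_connect");
		return 1;
	}
	wl_display_disconnect(display);
	return 0;
}
```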