2025 Multimedia in WebKit
- GitHub issue: https://github.com/Igalia/webengineshackfest/issues/58
- URL: https://meet.jit.si/WEH2025-Multimedia
- Slides: https://github.com/user-attachments/files/20566744/multimedia_webkit_2025.pdf
Audience probing: many attendees are familiar with Multimedia-related web specs.
Round of introductions.
Philipp quickly walks through the separation between the WebCore/Modules and WebCore/platform folders, using the MediaCapabilities spec as a simple use-case.
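As a concrete illustration of that use-case, a MediaCapabilities query from a page could look like the sketch below (the codec string and numbers are invented example values, and `canPlaySmoothly` is a hypothetical helper, not something from the session):

```typescript
// Hypothetical helper: ask the MediaCapabilities API whether a decoding
// configuration is supported, smooth, and power-efficient. `nav` is passed
// in explicitly so the function does not assume a browser global.
async function canPlaySmoothly(nav: any, config: any): Promise<boolean> {
  const info = await nav.mediaCapabilities.decodingInfo(config);
  return Boolean(info.supported && info.smooth && info.powerEfficient);
}

// Example decoding configuration (invented values).
const exampleConfig = {
  type: "file",
  video: {
    contentType: 'video/mp4; codecs="avc1.64001f"',
    width: 1280,
    height: 720,
    bitrate: 2_000_000,
    framerate: 30,
  },
};

// In a browser: canPlaySmoothly(navigator, exampleConfig).then(console.log);
```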
Philipp notes that some multimedia capabilities are being migrated to the GPUProcess.
Younnen explains that WebAudio also uses the GPUProcess for its high-priority threads, with the rendered output shared with the WebProcess via an IPC and shared-memory mechanism.
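The shape of that shared-memory handoff can be pictured as a single-producer/single-consumer ring buffer; the toy sketch below is an analogy only, not WebKit's actual implementation:

```typescript
// Toy SPSC ring buffer: the "GPUProcess" side pushes rendered samples and
// the "WebProcess" side pops them. Real cross-process code additionally
// needs atomics/synchronization; this sketch is single-threaded.
class AudioRing {
  private read = 0;  // total samples consumed
  private write = 0; // total samples produced
  constructor(private data: Float32Array) {}

  push(samples: Float32Array): number {
    const free = this.data.length - (this.write - this.read);
    const n = Math.min(free, samples.length);
    for (let i = 0; i < n; i++) {
      this.data[this.write % this.data.length] = samples[i];
      this.write++;
    }
    return n; // samples actually written
  }

  pop(out: Float32Array): number {
    const avail = this.write - this.read;
    const n = Math.min(avail, out.length);
    for (let i = 0; i < n; i++) {
      out[i] = this.data[this.read % this.data.length];
      this.read++;
    }
    return n; // samples actually read
  }
}
```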
Philipp mentions that DMABuf support paves the way to an easier transition to the GPUProcess.
Alex asks what our limitations with GStreamer are. Philipp and Younnen mention that only part of the multimedia processing should be moved to the GPUProcess. One example is WebRTC, where RTP processing/parsing would still be done in the WebProcess while the actual encoding/decoding is done in the GPUProcess. In the GStreamer backend that would mean having some pipelines in the GPUProcess while others remain in the WebProcess.
Alex asks whether the IPC mechanisms used by the Apple port are all defined in WebKit. Younnen answers that there might be some IPC in lower-level components within the stack, but those are fully hidden and outside WebKit's control; the IPC logic WebKit relies on is all implemented in WebKit itself.
Enrique asks about (de)muxing and whether it should be done in the GPU or Web Process. Younnen answers that ideally such operations should be done in the WebProcess, which is the most sandboxed process.
An audience member (name?) asks whether, with the hardware decoding/encoding transition to the GPUProcess, it would still be possible to use custom hardware elements. Philipp answers that it should still be possible, and that this behavior can be configured through the rank of plugin elements, which can be changed dynamically, e.g. via the GST_PLUGIN_FEATURE_RANK environment variable.
Alex asks about the situation of GStreamer on Android. Philipp mentions that Adrian would know more, but as far as he knows it works there: WPE uses Cerbero, which properly packages GStreamer for Android and uses the Android multimedia framework for hardware acceleration.
Younnen explains the current status of Managed MSE and that it is currently working well on Apple platforms. Enrique asks what the difference between MSE and Managed MSE is. Younnen explains that the managed version allows the User Agent to control buffer levels more optimally, e.g. on 5G mobile connections.
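A page can opt into Managed MSE with a feature check along these lines (a sketch assuming the `ManagedMediaSource` global exposed where Managed MSE ships, falling back to classic MSE otherwise):

```typescript
// Prefer Managed MSE when the UA exposes it, so the UA can manage buffer
// levels itself; otherwise fall back to classic MSE, where the page does.
// The globals object is a parameter so the check is easy to exercise.
function createMediaSource(g: any = globalThis): any {
  const Ctor = typeof g.ManagedMediaSource === "function"
    ? g.ManagedMediaSource // UA-managed buffering (Managed MSE)
    : g.MediaSource;       // classic MSE
  return new Ctor();
}
```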
Discussion about resource usage on pages that create hundreds of video elements (e.g. scrolling on Reddit). Philipp mentions that with the GStreamer backend many threads can be created, slowing down the machine. Younnen mentions that the Apple port had this issue and it was fixed, but he can't remember exactly how. It is brought up that this should perhaps be discussed in the Media WG so that behavior converges between browser engines.
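One mitigation available to pages themselves (an illustration, not something proposed in the session) is to pause videos that scroll out of view, e.g. with an IntersectionObserver:

```typescript
// Pause <video> elements that leave the viewport and resume those that
// enter it, so the engine can release resources behind offscreen players.
// Kept DOM-free (entries are passed in) so the callback is easy to test.
function syncPlaybackToVisibility(
  entries: Array<{ isIntersecting: boolean; target: any }>,
): void {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      entry.target.play?.();
    } else {
      entry.target.pause?.();
    }
  }
}

// In a browser:
// const observer = new IntersectionObserver(syncPlaybackToVisibility);
// document.querySelectorAll("video").forEach(v => observer.observe(v));
```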