Display driver enhancement topics
Revision as of 15:39, 7 March 2011 by Robclark
- Configurability: Add ability to handle multiple paths between overlays/managers/panels
- Scalability: Make it easier to add new overlays / managers in DSS
- Add writeback support
- FB / V4L2 interoperability
- Support access to HDMI via both Audio and Video devices
- Add ability to switch the channel-out of overlays per frame, such that e.g. one iteration goes to the writeback pipeline, the next iteration to a manager, and so on.
- HDMI: Add a standard HPD (hot-plug detect) user-space notification mechanism.
- Power management: in the current design, if any one panel is enabled, some minimum clocks are kept running. If the panel is a smart panel, the clocks could be cut between updates, but this is not supported currently.
- Support DRM
- Support a security model for direct rendering, i.e. even though rendering may be direct rather than indirect through a display server, permission to put pixels on the screen must still come through the display server. (This comes as part of DRM, but any non-DRM approach would have to re-invent this mechanism.)
- Separation of buffers (i.e. drm_framebuffer objects) from the display path, so that at runtime we can dynamically switch between rendering via overlay/pipe(s) and/or the GPU without reallocating buffers.
- Building on top of this, we can implement virtual overlays to handle use-cases where more videos are being rendered than there are pipes available in DSS. A virtual overlay means using the GPU (or the writeback pipe, or the 2D hw in 4460/omap5 and later) to do a YUV->RGB scale/blit of multiple different video streams into a shared overlay layer which is the same size as the framebuffer. This preserves the semantics of a non-destructive overlay and per-pixel ARGB blending, and bypasses the windowmanager/compositing step for video, so it is still more efficient than YUV->RGB blitting into the framebuffer layer.
- I'm a bit undecided at this point whether virtual overlays should/could be hidden entirely in the driver, or whether there should be a user-space component in the display server. Typically, if you look at VA-API (libva), it uses a sort of extension to DRI to render frames of video, in a similar way to how flipping is handled for GL apps. If we took this approach, it could be the display server that decides when to fall back to using virtual overlays. I guess there must be some analogy to DRI for android.
- Separation of buffer and pipe is also needed in the case of a virtual display spanning multiple monitors, so that a video window can also span multiple monitors.
- Support MCF