Display driver enhancement topics

* Support DRM
** Support a security model for direct rendering, i.e. even though rendering may be direct rather than indirect through the display server, permission to put pixels on the screen must still come through the display server. (This comes as part of DRM, but any non-DRM approach would have to re-invent this mechanism.)
*** See slide 7: http://www.slideshare.net/moriyoshi/x-architectural-overview
** Separation of buffers (i.e. drm_framebuffer objects) from the display path, so that at runtime we can dynamically switch between rendering via overlay/pipe(s) and/or the GPU without reallocating buffers (see the first sketch after this list).
*** Building on top of this, we can implement virtual overlays to handle use cases where more videos are being rendered than there are pipes available in DSS (see the second sketch after this list). A virtual overlay means using the GPU (or the writeback pipe, or the 2D hw in 4460/OMAP5 and later) to do a YUV->RGB scale/blit of multiple different video streams into a shared overlay layer that is the same size as the framebuffer. This preserves the semantics of a non-destructive overlay and per-pixel ARGB blending, and it bypasses the window manager/compositing step for video, so it is still more efficient than YUV->RGB blitting into the framebuffer layer.
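
A minimal libdrm/KMS sketch of the buffer/display-path separation described above. This is generic KMS rather than anything omapdrm-specific: the buffer and its drm_framebuffer are created once, and can then be attached to or detached from an overlay plane at runtime without reallocating memory. The plane and CRTC ids are assumed to have been discovered already (e.g. via drmModeGetPlaneResources()/drmModeGetResources()), and error handling is omitted.

<pre>
#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>
#include <drm_fourcc.h>

/* Allocate a dumb buffer once and wrap it in a drm_framebuffer. */
static uint32_t create_fb(int fd, uint32_t w, uint32_t h, uint32_t *handle)
{
    struct drm_mode_create_dumb creq;
    uint32_t handles[4] = { 0 }, pitches[4] = { 0 }, offsets[4] = { 0 };
    uint32_t fb_id = 0;

    memset(&creq, 0, sizeof(creq));
    creq.width = w;
    creq.height = h;
    creq.bpp = 32;
    drmIoctl(fd, DRM_IOCTL_MODE_CREATE_DUMB, &creq);   /* buffer allocated once */

    handles[0] = creq.handle;
    pitches[0] = creq.pitch;
    drmModeAddFB2(fd, w, h, DRM_FORMAT_XRGB8888,
                  handles, pitches, offsets, &fb_id, 0); /* fb object wraps the buffer */
    *handle = creq.handle;
    return fb_id;
}

/* Route the existing framebuffer to an overlay pipe (src_* are 16.16 fixed point). */
static int show_on_overlay(int fd, uint32_t plane_id, uint32_t crtc_id,
                           uint32_t fb_id, uint32_t w, uint32_t h)
{
    return drmModeSetPlane(fd, plane_id, crtc_id, fb_id, 0,
                           0, 0, w, h,
                           0, 0, w << 16, h << 16);
}

/* ... or detach it again (fb_id == 0 disables the plane).  The buffer and its
 * drm_framebuffer survive, so the GPU can take over compositing it instead. */
static int hide_overlay(int fd, uint32_t plane_id, uint32_t crtc_id)
{
    return drmModeSetPlane(fd, plane_id, crtc_id, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0);
}
</pre>

The point of the sketch is only that the drm_framebuffer id stays valid across show_on_overlay()/hide_overlay() calls, which is what lets the display path be switched without reallocation.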
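A second sketch, for the virtual-overlay idea. The accelerator call gpu_yuv2rgb_blit() is a hypothetical placeholder for whatever does the YUV->RGB conversion, scaling and blit (GPU, DSS writeback pipe, or 2D hw); the real OMAP plumbing is not shown. What the sketch illustrates is the structure: N video streams are composed into one shared ARGB layer sized like the framebuffer, which then occupies a single real pipe.

<pre>
#include <stdint.h>

struct rect { int x, y, w, h; };

struct video_stream {
    const void *yuv_frame;   /* latest decoded YUV frame */
    int src_w, src_h;
    struct rect dst;         /* destination window on screen */
};

struct argb_buffer {
    uint32_t *pixels;        /* shared overlay layer, same size as the framebuffer */
    int w, h, stride;
};

/* Placeholder for the accelerator call (GPU / writeback pipe / 2D blitter):
 * YUV->RGB colour conversion + scale + blit into dst_rect. */
static void gpu_yuv2rgb_blit(struct argb_buffer *dst, const struct rect *dst_rect,
                             const void *src_yuv, int src_w, int src_h)
{
    /* real implementation omitted on purpose */
    (void)dst; (void)dst_rect; (void)src_yuv; (void)src_w; (void)src_h;
}

/* Compose every stream into the shared overlay layer.  Untouched pixels stay
 * transparent (alpha 0), so per-pixel ARGB blending with the graphics layer is
 * preserved and the window manager never touches the video pixels. */
static void compose_virtual_overlay(struct argb_buffer *overlay,
                                    struct video_stream *streams, int n)
{
    for (int i = 0; i < n; i++)
        gpu_yuv2rgb_blit(overlay, &streams[i].dst,
                         streams[i].yuv_frame,
                         streams[i].src_w, streams[i].src_h);
    /* The finished overlay buffer is then queued on one real DSS video pipe,
     * e.g. via drmModeSetPlane() as in the previous sketch. */
}
</pre>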
