
Running Android on a mainline graphics stack


Posted Sep 13, 2017 5:43 UTC (Wed) by linusw (subscriber, #40300)
In reply to: Running Android on a mainline graphics stack by Tara_Li
Parent article: Running Android on a mainline graphics stack

There are special hardware engines for composing, also called 2D accelerators.
On the ill-fated ST-Ericsson U8500 we had a hardware block called "B2R2", which stands for "blit, blend, rotate and rescale", which is exactly what compositors need. I vaguely recall that the TI OMAP had something similar. (Maybe someone can fill in?)
Whether there is a mainline kernel-to-userspace abstraction for these engines is another question. I think at the time it was exposed as a custom character device and used directly from what is now HWC2.
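For readers unfamiliar with what such an engine does, here is a toy software model of the four B2R2 operations. The function names and the pixel representation are my own invention for illustration, not anything from the actual driver; real engines do this in fixed-function hardware on whole buffers.

```python
# Toy software model of the "blit, blend, rotate and rescale" operations
# a 2D engine like B2R2 performs in hardware. Images are lists of rows;
# pixels are (r, g, b, a) tuples with 8-bit channels. All names here are
# hypothetical.

def rotate90(img):
    """Rotate an image 90 degrees clockwise by coordinate remapping."""
    h, w = len(img), len(img[0])
    return [[img[h - 1 - y][x] for y in range(h)] for x in range(w)]

def rescale_nearest(img, new_w, new_h):
    """Nearest-neighbour rescale, the cheapest filter such engines offer."""
    h, w = len(img), len(img[0])
    return [[img[y * h // new_h][x * w // new_w] for x in range(new_w)]
            for y in range(new_h)]

def blend_over(dst_px, src_px):
    """Source-over alpha blend of one pixel."""
    sr, sg, sb, sa = src_px
    dr, dg, db, da = dst_px
    mix = lambda s, d: (s * sa + d * (255 - sa)) // 255
    return (mix(sr, dr), mix(sg, dg), mix(sb, db), max(sa, da))
```

A blit is then just a bulk copy of a (possibly rotated and rescaled) source into a destination, with `blend_over` applied per pixel.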



Running Android on a mainline graphics stack

Posted Sep 13, 2017 6:26 UTC (Wed) by zyga (subscriber, #81533) [Link]

My kernel knowledge is too inadequate to comment on potential APIs, but looking at various chipsets designed for STBs, there were very power-efficient hardware video mixers that did color conversion, scaling, and alpha blending (with limits). The idea was that a very slow CPU could offload video decoding (e.g. from the antenna or an IP feed) to one block, render a simple UI onto a buffer, and then blend those all together for free, every frame, perfectly. The video buffer was in YUV and the UI was in RGB565. There were also some specialized layers for subtitles and some other niche applications. Essentially each layer had a limited set of ways in which it could be used, with restrictions on buffer format, blending order, etc.
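To make the "each layer had a limited set of ways it could be used" point concrete, here is a hypothetical sketch of the assignment problem a compositor faces with such fixed-function mixers: each incoming layer either fits one of the restricted hardware planes or must fall back to software composition. The capability table and all names are invented for illustration.

```python
# Invented capability table for a fixed-function STB video mixer:
# each hardware plane accepts only certain buffer formats and may or
# may not be able to scale. Nothing here describes a real chip.

MIXER_PLANES = [
    {"name": "video",    "formats": {"YUV420"},       "scaling": True},
    {"name": "graphics", "formats": {"RGB565"},       "scaling": False},
    {"name": "subtitle", "formats": {"RGB565", "A8"}, "scaling": False},
]

def assign_layers(layers):
    """Greedily map layers (bottom to top) onto mixer planes.

    Returns (assignments, fallback), where fallback holds the layers
    the hardware cannot composite and software must blend instead.
    """
    free = list(MIXER_PLANES)
    assignments, fallback = {}, []
    for layer in layers:
        for plane in free:
            if layer["format"] in plane["formats"] and (
                    not layer["scaled"] or plane["scaling"]):
                assignments[layer["name"]] = plane["name"]
                free.remove(plane)
                break
        else:
            fallback.append(layer["name"])
    return assignments, fallback
```

This is roughly the validate-and-fall-back dance that HWC-style compositors perform every frame: offer all layers to the hardware, and composite whatever it rejects on the CPU or GPU.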

The hardware I used to deal with ~15 years ago could handle one video and one bitmap layer. Later on we got more and more features: two video layers (one full-featured, with better de-interlacing and scaling, and one limited, for picture-in-picture), plus additional layers for arbitrary graphics for some nicer blending possibilities. All of this was on hardware that could not do any OpenGL.

Unfortunately none of that had sane drivers. At the time each vendor provided their own libraries to configure and use the video stack. Nowadays the problem is less visible because we have speedy CPUs and even integrated graphics has a lot to offer, but I suspect that, if available and used correctly, such hardware could save some power in idle-desktop and watching-video use cases.

OMAP DSS

Posted Sep 13, 2017 20:24 UTC (Wed) by rvfh (guest, #31018) [Link]

OMAP had a display subsystem (DSS) with several overlays (4 IIRC), at least one of which could be made "secure" (inaccessible to Linux, only to the TEE, for DRM). It stored pixels in a specially crafted way that enabled cheap (free?) rotation.
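I don't know the actual DSS pixel layout, but the general trick behind cheap rotation can be sketched: store the image in small square tiles, so that reading out a column (which is what a 90° rotation does) touches nearby addresses instead of striding through the whole framebuffer. The tile size below is an invented parameter, not the real hardware's.

```python
# Sketch of linear vs. tiled pixel addressing. Addresses are pixel
# indices rather than bytes; TILE is a made-up tile size, not what any
# real display subsystem uses.

TILE = 4  # hypothetical 4x4-pixel tiles

def linear_address(x, y, width):
    """Pixel index of (x, y) in an ordinary row-major framebuffer."""
    return y * width + x

def tiled_address(x, y, width):
    """Pixel index of (x, y) when the buffer is stored as square tiles."""
    tiles_per_row = width // TILE
    tx, ty = x // TILE, y // TILE      # which tile
    ox, oy = x % TILE, y % TILE        # offset within the tile
    return (ty * tiles_per_row + tx) * TILE * TILE + oy * TILE + ox
```

Walking down a column in the linear layout strides by `width` pixels per step; in the tiled layout, `TILE` consecutive column pixels all land inside one small tile, which keeps rotated scanout friendly to caches and memory burst accesses.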

I am no expert so if you know better feel free to correct me!


Copyright © 2026, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds