https://tech.slashdot.org/story/13/04/11/2147208/jolla-ports-wayland-to-android-gpu-drivers
ODMs don't know how to write software, so you're better off not asking them to; the result would just be garbage anyway. All the GPU drivers are actually written by the GPU IP vendors (Qualcomm, Imagination, ARM, etc.), and they only provide Android drivers. You could try to pay them to write KMS/DRM drivers, but they'd probably quote you a price in the millions, which minority-platform wannabes like Jolla could not afford anyway.
--------------------------------------------------------
The Phoronix article is quite light on information, and even the original post at http://mer-project.blogspot.fi/2013/04/wayland-utilizing-android-gpu-drivers.html assumes some technical knowledge of the graphics stack. The basic idea is actually pretty simple, so I'll try to break it down:
- The SoC vendors are willing to target only Android
- Android GPU drivers are built against Bionic libc
- The GPU drivers talk to hardware, and expose themselves via EGL and GLESv2
- EGL is basically a common API for GPU memory management, buffer allocation (a buffer being a region of memory used for rendering) and display updates (see the sketch after this list)
- GLESv2 stands in for the functionality we commonly associate with OpenGL
GPU drivers thus come as a combination of EGL and GLESv2 libraries, each GPU vendor providing their own.
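To make the EGL part concrete, here is a minimal sketch of the setup an application does against whatever EGL/GLESv2 pair the vendor ships. The native display and window handles are placeholders for whatever the platform (Android, Wayland, X11, ...) actually hands out; everything else is plain EGL 1.4.

    /* Minimal sketch of the EGL side an application sees.
     * native_display / native_window are platform-specific placeholders;
     * buffer allocation and display updates both go through EGL. */
    #include <EGL/egl.h>

    int init_gles2(EGLNativeDisplayType native_display,
                   EGLNativeWindowType  native_window)
    {
        EGLDisplay dpy = eglGetDisplay(native_display);
        if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, NULL, NULL))
            return -1;

        /* Ask the driver for a config capable of GLESv2 window rendering. */
        static const EGLint cfg_attribs[] = {
            EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
            EGL_NONE
        };
        EGLConfig cfg;
        EGLint n = 0;
        if (!eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &n) || n == 0)
            return -1;

        /* The window surface owns the buffers; eglSwapBuffers() would
         * later post them to the display. */
        EGLSurface surf = eglCreateWindowSurface(dpy, cfg, native_window, NULL);

        static const EGLint ctx_attribs[] = {
            EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE
        };
        EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs);

        if (surf == EGL_NO_SURFACE || ctx == EGL_NO_CONTEXT)
            return -1;

        return eglMakeCurrent(dpy, surf, surf, ctx) ? 0 : -1;
    }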
This is where libhybris comes into play. The GPU driver libraries don't work without Bionic libc, so libhybris, while running on top of regular Linux (and thus [e]glibc), keeps a private Bionic runtime loaded for the GPU drivers' use and redirects all the EGL/GLESv2 calls to the GPU driver libraries. These libraries run in their own Bionic universe and tell the actual display hardware what to do.
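A rough sketch of the forwarding idea, just to fix the mental model: a glibc-linked libEGL.so exposes the normal EGL symbols and hands each call over to the Bionic-linked vendor blob, which libhybris loads through its own Android-compatible loader. The hybris_dlopen/hybris_dlsym names and the driver path below are illustrative stand-ins, not the actual libhybris API.

    /* Sketch of a glibc-side EGL wrapper forwarding into the Bionic-linked
     * vendor driver.  hybris_dlopen()/hybris_dlsym() stand in for an
     * Android-compatible loader; they are NOT real libhybris symbol names,
     * and the driver path is only an example. */
    #include <EGL/egl.h>
    #include <stddef.h>

    void *hybris_dlopen(const char *path, int flags);   /* hypothetical */
    void *hybris_dlsym(void *handle, const char *name); /* hypothetical */

    static EGLDisplay (*drv_eglGetDisplay)(EGLNativeDisplayType);

    static void ensure_driver(void)
    {
        static void *drv;
        if (!drv) {
            /* The vendor blob lives wherever Android keeps it. */
            drv = hybris_dlopen("/system/lib/egl/libEGL_vendor.so", 0);
            drv_eglGetDisplay = (EGLDisplay (*)(EGLNativeDisplayType))
                hybris_dlsym(drv, "eglGetDisplay");
        }
    }

    /* The symbol the glibc world calls; the Bionic world does the work. */
    EGLDisplay eglGetDisplay(EGLNativeDisplayType native_display)
    {
        ensure_driver();
        return drv_eglGetDisplay(native_display);
    }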
The new part about Wayland support is just a logical extension of the same behaviour. Wayland already depends on EGL for buffer management, so "all" it really needs is a native display handler. As it happens, the native Android display structure can be mapped to the Wayland-EGL display structure. It's not trivial, but it's certainly doable. Thanks to libhybris, the Wayland libraries see a correct native display type and operate on that, while the Android GPU libraries see their respective native display type and can thus drive the hardware just as before. After all, it's still the SAME hardware regardless of what operating system we happen to be running: registers are registers and memory is still memory. From the GPU drivers' point of view nothing has changed.
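From a Wayland client's point of view the mapping looks roughly like this: the wl_display, and a wl_egl_window wrapper around a wl_surface, are what get handed to EGL as the "native" display and window types, and these are the objects the Android driver's own notion of a native display has to be translated to. This is only the client-side view, sketched with the standard wayland-client/wayland-egl APIs; the compositor side that the Jolla/Mer work actually targets is more involved.

    /* Client-side view of the native display mapping.  Assumes the
     * wl_display and wl_surface have already been obtained through the
     * usual Wayland registry dance, and cfg via eglChooseConfig(). */
    #include <wayland-client.h>
    #include <wayland-egl.h>
    #include <EGL/egl.h>

    EGLSurface make_egl_surface(struct wl_display *display,
                                struct wl_surface *surface,
                                EGLDisplay *out_dpy, EGLConfig cfg,
                                int width, int height)
    {
        /* wl_display doubles as the EGL native display type on Wayland. */
        EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)display);
        eglInitialize(dpy, NULL, NULL);

        /* wl_egl_window wraps the wl_surface so the driver can allocate
         * and attach buffers to it; it becomes the EGL native window. */
        struct wl_egl_window *win = wl_egl_window_create(surface, width, height);

        *out_dpy = dpy;
        return eglCreateWindowSurface(dpy, cfg, (EGLNativeWindowType)win, NULL);
    }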
So what has happened? In addition to just redirecting graphics stack calls to Android drivers, we are now also translating the display subsystem between two somewhat different systems.
If all of the above sounds eerily familiar, you are correct. In networking this kind of design is called a proxy, or, if we're talking about the link layer, multiprotocol label switching. Logically there's not much difference.