drm/ssd130x: Change pixel format used to compute the buffer size
The commit e254b58 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
ssd130x_buf_alloc()") used the pixel format info rather than a hardcoded bpp
to calculate the size of the buffer allocated to store the native pixels.
But it wrongly used the DRM_FORMAT_C1 fourcc pixel format. That is for
color-indexed frame buffer formats, while the ssd130x controllers support
neither different single-channel colors nor a Color Lookup Table (CLUT).
So the correct pixel format to use in this case is DRM_FORMAT_R1 instead.
Since both formats pack eight pixels per byte, this patch makes no
functional change in practice. Still, the correct pixel format should be used.
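
As an illustration only, a minimal sketch of deriving a native-pixel buffer
size from the format info instead of a hardcoded bpp. drm_format_info(),
drm_format_info_min_pitch() and DRM_FORMAT_R1 are existing DRM core symbols;
the helper name ssd130x_buf_size() and its width/height parameters are
hypothetical stand-ins, not the driver's actual code.

#include <linux/errno.h>
#include <drm/drm_fourcc.h>

/*
 * Hypothetical example: compute the size in bytes of a buffer holding
 * width x height native pixels in DRM_FORMAT_R1 (single channel,
 * 1 bit per pixel, no CLUT).
 */
static int ssd130x_buf_size(unsigned int width, unsigned int height)
{
	const struct drm_format_info *fi = drm_format_info(DRM_FORMAT_R1);
	unsigned int pitch;

	if (!fi)
		return -EINVAL;

	/* Minimum bytes per line; eight 1-bit pixels are packed per byte. */
	pitch = drm_format_info_min_pitch(fi, 0, width);

	return pitch * height;
}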
Suggested-by: Geert Uytterhoeven <[email protected]>
Signed-off-by: Javier Martinez Canillas <[email protected]>
Reviewed-by: Geert Uytterhoeven <[email protected]>
Reviewed-by: Thomas Zimmermann <[email protected]>
Link: https://patchwork.freedesktop.org/patch/msgid/[email protected]