drm/ssd130x: Change pixel format used to compute the buffer size

The commit e254b584dbc0 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
ssd130x_buf_alloc()") used a pixel format info rather than a hardcoded bpp
to calculate the size of the buffer allocated to store the native pixels.

But it wrongly used the DRM_FORMAT_C1 fourcc pixel format. That is for
color-indexed frame buffer formats, while the ssd130x controllers support
neither multiple single-channel colors nor a Color Lookup Table (CLUT).

So the correct pixel format to use in this case is DRM_FORMAT_R1 instead.

Since both formats pack eight pixels per byte, this patch makes no functional
change in practice. Still, the correct pixel format should be used.

Suggested-by: Geert Uytterhoeven <geert@linux-m68k.org>
Signed-off-by: Javier Martinez Canillas <javierm@redhat.com>
Reviewed-by: Geert Uytterhoeven <geert@linux-m68k.org>
Reviewed-by: Thomas Zimmermann <tzimmermann@suse.de>
Link: https://patchwork.freedesktop.org/patch/msgid/20230713085859.907127-1-javierm@redhat.com

@@ -153,7 +153,7 @@ static int ssd130x_buf_alloc(struct ssd130x_device *ssd130x)
 	const struct drm_format_info *fi;
 	unsigned int pitch;
 
-	fi = drm_format_info(DRM_FORMAT_C1);
+	fi = drm_format_info(DRM_FORMAT_R1);
 	if (!fi)
 		return -EINVAL;
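
For context, the hunk above sits in the driver's buffer allocation path, where
the format info is used to derive the line pitch and the size of the native
pixel buffer. Below is a minimal sketch of that computation;
drm_format_info() and drm_format_info_min_pitch() are DRM core helpers, while
the surrounding structure fields and error handling are simplified and only
illustrative, not the exact driver code:

/*
 * Sketch of the buffer size computation around the hunk above.
 * drm_format_info() and drm_format_info_min_pitch() are DRM core APIs;
 * the rest is a simplified illustration of the allocation path.
 */
#include <drm/drm_fourcc.h>
#include <linux/slab.h>

static int ssd130x_buf_alloc_sketch(struct ssd130x_device *ssd130x)
{
	const struct drm_format_info *fi;
	unsigned int pitch;

	/* R1 is a single-channel 1 bpp format: eight pixels per byte */
	fi = drm_format_info(DRM_FORMAT_R1);
	if (!fi)
		return -EINVAL;

	/* Minimum bytes per scanline for the native format */
	pitch = drm_format_info_min_pitch(fi, 0, ssd130x->width);

	/* One native buffer covering the whole display */
	ssd130x->buffer = kcalloc(pitch, ssd130x->height, GFP_KERNEL);
	if (!ssd130x->buffer)
		return -ENOMEM;

	return 0;
}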