I'm running LibreELEC 13 nightly on a Pi 4B connected to a Samsung S95B TV. I would like to customise the HDMI output to my TV, instead of getting a fixed 12-bit RGB with 10-bit content and 8-bit RGB with regular 8-bit content and the GUI. It looks like LibreELEC takes the highest bit depth reported by the EDID, which is 12-bit for my TV. However, since I play 10-bit files at most, I would like to output as close to native depth as possible. I don't trust what my TV does with 12-bit: its panel is 10-bit, so it might just truncate the last two bits, and I'd rather output 10-bit instead.
I would also like to control the dithering, if any is applied. When playing 10-bit HDR white ramps and color gradients I see flickering 'snow' or 'noise' on my display, suggesting temporal dithering is taking place. I would like to control temporal/spatial dithering and favour rounding. Similarly, I would like to control the hardware resizing algorithm and debanding, if possible. Other settings would be custom YCC chroma subsampling and output range (limited/full); the Kodi option for the latter doesn't seem to affect the video output.
I've tried using hdmi_ options in config.txt, such as hdmi_pixel_encoding, hdmi_force_mode, hdmi_deep color and hdmi_drive, to force KMS, but they don't seem to affect the output of cat /sys/kernel/debug/dri/0/state. I've also tried adding parameters to cmdline.txt such as video=HDMI-A-1:1920x1080-10@60eD,color_format=RGB, but that doesn't affect the output either; it seems hardwired to 12-bit RGB with my TV.
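For anyone checking what KMS has actually negotiated, the DRM debugfs state dump and libdrm's modetest are the usual tools. Below is a minimal sketch; the excerpt of the state dump is hypothetical (field names vary by kernel version), and it is only an assumption that the vc4 connector exposes a writable "max bpc" property.

```shell
# Hypothetical excerpt of /sys/kernel/debug/dri/0/state on a Pi 4
# driving 12-bit RGB; real field names vary by kernel version.
sample_state='crtc-0:
	active=1
	bpc=12
plane-1:
	format=XR24 little-endian (0x34325258)'

# On a live box you would run instead:
#   sudo grep -E "bpc=|format=" /sys/kernel/debug/dri/0/state
printf '%s\n' "$sample_state" | grep -E 'bpc=|format='
```

If `modetest -M vc4 -c` lists a "max bpc" property on the HDMI connector, it can in principle be lowered with `modetest -M vc4 -w "32:max bpc:10"`, where 32 is a hypothetical connector id taken from the `-c` listing; whether the driver and firmware actually honour the lowered value is a separate question.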
Is there a way to pass kernel options or modify filesystem node parameters for the KMS to customise these settings?
Customising output bit depth, color format, dither, deband, hw resizer
wyup - April 5, 2026 at 4:53 PM - Thread is Unresolved
I have a fuzzy recollection that the RPi5 uses 10-bit internally, padded to 12-bit for output in some circumstances, as the SoC doesn't natively support the required 10-bit output. There is some upstream work being done to improve output from Kodi which might indirectly influence things, but the direct and short answer to the question above is "no".
-
I think you are thinking of the YUV422 4kp60 HDMI output.
It's the HDMI spec that doesn't support 10-bit explicitly; it just uses the 12-bit timings with two padding bits.
Even if it did, it would look identical: there's no logical reason why sending 10 bits would look different from sending 10 valid bits in a 12-bit word.
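The padding argument is easy to verify with shell arithmetic: placing a 10-bit sample in a 12-bit word is a left shift by two zero bits, and the round trip is lossless.

```shell
# A 10-bit code value padded into a 12-bit word is just shifted left by
# two zero bits; shifting back recovers the original value exactly, which
# is why padded 12-bit output of 10-bit content carries identical information.
ten_bit=1023                      # maximum 10-bit code value (0x3FF)
twelve_bit=$(( ten_bit << 2 ))    # padded 12-bit word
echo "$twelve_bit"                # 4092 (0xFFC)
echo "$(( twelve_bit >> 2 ))"     # 1023: round trip is lossless
```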
-
What is the output color format, range and bit depth that LibreELEC outputs for a 10-bit HDR 24p video, with 4kp60 HDMI enabled, at 2160p, to an Enhanced HDMI port UHD TV on a RPi4?
I've tried a 23.976 fps 4:4:4 2160p 10-bit HDR chroma subsampling test clip and my TV does not show 1:1 pixel accuracy. In theory the Pi 4 supports 4:4:4 23.976p 2160p HDMI out if my TV supports it; does it? I have the input configured as HDMI-PC to allow RGB.
It seems the Pi 4 always outputs full range, no matter what I select in the 'Force 16-236 output' GUI option. It only affects the GUI, not video playback.
Are there any parameters to choose for the DRM PRIME hardware decoder and video processor, such as scaling, dithering or debanding?
Is dithering or debanding being applied when going from 8-bit to 10-bit?
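For reference, a correct limited-range output should remap 8-bit full-range codes 0..255 into the 16..235 video range. A quick sketch of that mapping (the function name is my own, for illustration), handy when checking with a test pattern whether the range option actually changes anything:

```shell
# Map an 8-bit full-range code value (0..255) into limited/video range:
# lim = 16 + round(full * 219 / 255), done in integer arithmetic.
to_limited() {
    echo $(( ($1 * 219 + 127) / 255 + 16 ))
}
to_limited 0      # 16  (black)
to_limited 128    # 126 (mid grey)
to_limited 255    # 235 (white)
```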