[TESTING][S905(X)] 10bit/HDR/Dithering Test Builds & Discussion

  • However, I do have a very fundamental question

    I tried playing back a 10-bit video, which the AVR reported as 8 bit.

    Subsequently, I enabled 10bit on the box by setting the class attribute as follows (on the original kszaq 8.2 commit)

    Code
    echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr
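
    Reading the node back should confirm whether the value was accepted (it should simply echo the requested attribute, e.g. 444,10bit); note this only shows what was requested, not what the sink actually negotiated:

    Code
    cat /sys/class/amhdmitx/amhdmitx0/attr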

    However, now the AVR reports 10bit > 10bit at all times, i.e. not only while playing a 10-bit video but also while playing regular 8-bit videos.

    Short of copying said files to a USB drive and running a visual comparison between playback from the USB drive directly on the TV and playback through Kodi, is there any way to conclusively determine what bit depth Kodi is actually outputting?
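
    One thing worth trying short of that: on these Amlogic kernels there is usually a config dump next to attr that reports the colour format and depth the transmitter is currently sending. The exact contents vary by kernel, so treat this as an assumption about this particular build rather than a guaranteed interface:

    Code
    # dump the current HDMI TX configuration and look for the colour depth / format line
    cat /sys/class/amhdmitx/amhdmitx0/config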

    My guess is that the echo command above instructs the HDMI TX subsystem to force the HDMI output to 10 bit permanently.

    This has no massive downside for 8-bit content: there should be no visible difference between sending 8-bit content as 8 bit and sending it as 10 bit with 00s in the two least significant bits (unless your display dithers 8-bit content up to 10 bit). It does mean, however, that you can no longer tell 8-bit from 10-bit content just by looking at the output format.
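
    Purely to illustrate that padding argument: shifting an 8-bit code value left by two bits (i.e. appending 00 as the least significant bits) gives the equivalent 10-bit level, so nothing is gained or lost for native 8-bit content:

    Code
    # hypothetical illustration only, not something the box itself runs
    for v in 0 16 128 235 255; do echo "8-bit $v -> 10-bit $(( v << 2 ))"; done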