Posts by 2re

    2re

    That doesn't mean anything. What matters is the type of LCD (or OLED) panel used.

    All TVs that support DV have 12-bit pathways. I think some high-end Sony TVs are even 14-bit in that respect. I have one that supports DV, but the panel is 8-bit + dithering, which helps a bit with banding, but it's still an 8-bit+ TV. As long as the panel itself is limited to 10 bits, all that is pretty negligible.
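
    As a side note, here is a minimal Python sketch of why "8 bit + dithering" (FRC) helps with banding, assuming simple random temporal dithering; the frc_frames helper is purely illustrative and not how any particular panel firmware works:

        import random

        def frc_frames(code10, n_frames=8):
            """Approximate a 10-bit code (0..1023) on an 8-bit panel over n frames."""
            lo = code10 // 4                  # nearest lower 8-bit code
            frac = (code10 % 4) / 4.0         # remainder the panel cannot show directly
            # Show the next-higher 8-bit code in roughly `frac` of the frames,
            # so the time average lands close to the intended 10-bit level.
            return [min(lo + (1 if random.random() < frac else 0), 255)
                    for _ in range(n_frames)]

        frames = frc_frames(513)              # 513 / 4 = 128.25
        print(frames, "-> time-averaged as 10-bit:", sum(frames) / len(frames) * 4)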

    I'm not saying you're wrong, but as long as the manufacturers don't advertise the actual bit depth of the panel, only the supported bit depth, it's hard to tell. I did find some reports on tests done on newer high-end TVs, and several of them actually turned out to be 10- and 12-bit displays. Unfortunately I can't find the source :( But if I do, I'll post them :)

    But do you see it switching from 8 bits in SDR to 10 bits in HDR?

    HDR10 is just 10 bits, and all TVs are only 10 bits as well.

    Only Dolby Vision is currently 12 bits, but since displays are physically still 10-bit, there's no real benefit to it.
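
    For context, the step counts behind those bit depths (per colour channel) are just powers of two; a quick sketch:

        # Levels per colour channel for the bit depths discussed above.
        for bits in (8, 10, 12):
            print(f"{bits:>2}-bit: {2 ** bits} levels per channel")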

    Most of Samsung's newer displays are 12-bit, but it's never specified in the specs. ;)

    One mustn't mix up bits and HDR. If you set your GUI to 720p and disable rate change, then play a 4K HDR movie, your TV will report 720p HDR. But this has nothing to do with the number of bits used for colors. ;)

    Does your AVR/TV actually say YCbCr 4:4:4 30-bit (3×10 bit), or is it just the HD/UHD (HDR) indication that changes?

    Nope, no auto-switching of bit depth on my boxes. It sticks to what's in "/sys/class/amhdmitx/amhdmitx0/attr".

    But it does switch between 4:4:4, 4:2:2 and 4:2:0 depending on the values in disp_cap. So the trick question here is: can one also define bit depth in disp_cap?
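
    For reference, a minimal Python sketch that just dumps both nodes mentioned above, assuming an AMLogic box where the amhdmitx sysfs tree exists (the attr path is the one quoted earlier). The exact contents of attr are firmware dependent, the read_node helper is my own name, and the reading of a trailing "420" as a 4:2:0-only mode is an assumption based on the 444/422/420 values discussed here; whether a bit-depth token can go into disp_cap is exactly the open question:

        SYS = "/sys/class/amhdmitx/amhdmitx0"

        def read_node(name):
            """Read one amhdmitx sysfs node and return its stripped contents."""
            with open(f"{SYS}/{name}") as f:
                return f.read().strip()

        # The colour-format / bit-depth string the HDMI TX is currently pinned to.
        print("attr    :", read_node("attr"))

        # Modes the sink advertises; a trailing "420" marks 4:2:0-only modes.
        for mode in read_node("disp_cap").splitlines():
            subsampling = "4:2:0 only" if mode.endswith("420") else "4:4:4 / 4:2:2 capable"
            print(f"disp_cap: {mode:<16} {subsampling}")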

    Anyone??