Bits are also reported by the AVR - but I never saw 12 bits
And what version of LE are you running?
But do you see it switching from 8 bits in SDR to 10 bits in HDR?
HDR10 is just 10 bits, and all TVs are only 10 bits anyway.
Only Dolby Vision is currently 12 bits, but since displays are physically still 10 bits, there's no real benefit to it.
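For context on why those bit-depth numbers matter: each extra bit doubles the number of brightness levels per color channel, which is what reduces visible banding in gradients. A quick arithmetic sketch (purely illustrative, not tied to any particular TV):

```python
# Levels per channel and total colors for the bit depths discussed above.
for bits in (8, 10, 12):
    levels = 2 ** bits          # distinct values per color channel
    colors = levels ** 3        # combinations across the three channels
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")
```

So a 10-bit panel has four times as many steps per channel as an 8-bit one, which is the jump HDR10 relies on.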
Most of Samsung's newer displays are 12-bit, but it's never specified in the spec sheet.
Brilliant, thanks bud. Much simpler than I thought.
Matt
... and simplicity turned to complexity
I'd like to see a source for that.
I normally keep tabs on new display tech, and I have heard nothing about 12-bit TV panels (be it 10-bit + dithering or 12-bit native), especially from Samsung.
Well, as I said, normally it's not in the spec. But if you insist...
Norwegian user manual:
That doesn't mean anything. What matters is the type of LCD (or OLED) panel used.
All TVs that support DV have 12-bit pathways. I think some high-end Sony TVs are even 14-bit in that respect. I have one that supports DV, but the panel is 8-bit + dithering, which helps a bit with banding, but it's still an 8-bit+ TV. As long as the panel itself is limited to 10 bits, all that is pretty negligible.
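The "8-bit + dithering" trick mentioned above can be sketched in a few lines: quantizing a smooth gradient to few levels produces flat bands, while adding noise of about one quantization step before rounding (random dither, the simplest variant) breaks those bands up. This is a toy illustration, not how any specific panel implements it:

```python
import random

def quantize(value, levels):
    """Map a 0.0-1.0 value to the nearest of `levels` equally spaced steps."""
    return round(value * (levels - 1)) / (levels - 1)

def quantize_dithered(value, levels):
    """Add noise of roughly one quantization step before rounding."""
    step = 1.0 / (levels - 1)
    noisy = value + random.uniform(-step / 2, step / 2)
    return quantize(min(max(noisy, 0.0), 1.0), levels)

# A smooth gradient quantized to 4 levels shows banding: long runs of
# identical output values. Dithering trades those runs for fine noise,
# which the eye averages back into a smoother ramp.
gradient = [i / 99 for i in range(100)]
banded = [quantize(v, 4) for v in gradient]
dithered = [quantize_dithered(v, 4) for v in gradient]
print(len(set(banded)), "distinct values without dither")  # 4 plateaus
```

The same idea, applied per pixel at video rates, is why an 8-bit panel fed a 10-bit signal can look less banded than a plain 8-bit one.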
I'm not saying you're wrong, but as long as the manufacturers don't advertise the actual bit depth of the display, only the supported bit depth, it's hard to tell. I did find some reports on tests done on newer high-end TVs, and they stated several of them actually were 10- and 12-bit displays. Unfortunately I can't find the source, but if I do, I'll post it.
... and simplicity turned to complexity
lol
I'm a bit lost on this whole colour space thing. Maybe someone in this thread can help clear this up for me?
I have an Odroid C2 and I'm currently using Raybuntu's last Krypton build. One of the problems I have is that I cannot play back video with chroma subsampling other than 4:2:0. With 4:2:2 and 4:4:4, audio plays back, but I get no video.
Would these new Leia Alpha builds potentially resolve that problem? I have a Samsung UHD TV that claims to be HDR, but I'm not sure it's true 10-bit HDR. Mostly, I would just like to be able to play back some of these 4:4:4 videos I have, even if I might not be getting true 10-bit HDR.
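For anyone else confused by the 4:2:0 / 4:2:2 / 4:4:4 labels above: they describe how many chroma (color) samples are stored relative to luma. These are the standard definitions, nothing device-specific; a small sketch of the stored sample counts per 2x2 pixel block:

```python
# (luma samples, chroma samples per plane) stored for each 2x2 pixel block.
schemes = {
    "4:4:4": (4, 4),   # full chroma resolution
    "4:2:2": (4, 2),   # chroma halved horizontally
    "4:2:0": (4, 1),   # chroma halved in both directions
}
for name, (luma, chroma) in schemes.items():
    total = luma + 2 * chroma          # one luma plane + Cb + Cr
    ratio = total / (4 + 2 * 4)        # relative to 4:4:4
    print(f"{name}: {total} samples per 2x2 block ({ratio:.0%} of 4:4:4)")
```

That halved data rate is why 4:2:0 is the default for consumer video, and why a decoder/display pipeline may support it but not the heavier 4:2:2 and 4:4:4 formats.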
The Odroid C2 uses the older S905 SoC, which doesn't support HDR. But that should not result in a black screen.
Without information on the TV model, how it's connected (do you have an AVR?) and how it's configured (is HDMI 2.0 enabled? What resolution is the Kodi GUI set to? Do you have an HDMI cable capable of 4K resolutions?), it's hard to give an accurate estimate of what's wrong.