My Yamaha RX-V6A AVR reports this compatibility table. The default setting is “4K Mode 1”, and in this mode the best format listed across the 4K 60-24 Hz range is YCbCr 4:2:2.
Yes.
4K Mode 2 = 'Enhanced HDMI Mode support' on some platforms. This enables the higher-bandwidth HDMI 2.0 modes, which allow HDR with chroma subsampling better than 4:2:0 at bit depths above 8 bit.
4K Mode 1 only supports the lower-bandwidth HDMI 2.0 modes, which at 2160p50 and above are limited to 4:2:0 for bit depths above 8 bit.
My Sony UHD TV, with the enhanced modes enabled, is equivalent to 4K Mode 2 on HDMI inputs 2 and 3, but 4K Mode 1 on inputs 1 and 4. My Denon AVR also required its enhanced modes to be enabled to pass through 2160p50-and-above 4:2:2 12-bit inputs.
(4:2:0 was initially added to the HDMI 2.0 spec to let HDMI 1.4-era physical hardware in TVs etc. support 2160p50-and-above modes, since it squeezes the signal into HDMI 1.4 bandwidth. ISTR nVidia and Sony both added support for 4:2:0 2160p50-and-above modes as upgrades to hardware that was otherwise limited to HDMI 1.4.)
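As a rough sanity check on that bandwidth claim (my own back-of-envelope figures, not from the posts above): 2160p60 RGB/4:4:4 needs a 594 MHz TMDS clock, while 4:2:0 carries two pixels per clock period and so halves it, which is what lets the signal fit under HDMI 1.4's 340 MHz / 10.2 Gbit/s ceiling:

```c
/* Back-of-envelope TMDS bandwidth for 2160p60 (assumed figures):
 * 4:4:4 runs the link at 594 MHz; 4:2:0 packs two pixels per TMDS
 * character period, halving the clock to 297 MHz. Three data channels,
 * 10 bits on the wire per 8-bit symbol (8b/10b TMDS encoding). */
#include <stdio.h>

int main(void)
{
    const double clk_444 = 594e6;          /* Hz, 2160p60 4:4:4 */
    const double clk_420 = clk_444 / 2.0;  /* Hz, 2160p60 4:2:0 */
    printf("4:4:4: %5.2f Gbit/s\n", clk_444 * 3 * 10 / 1e9);  /* 17.82 */
    printf("4:2:0: %5.2f Gbit/s\n", clk_420 * 3 * 10 / 1e9);  /*  8.91 */
    return 0;
}
```

8.91 Gbit/s sits inside HDMI 1.4's 10.2 Gbit/s aggregate limit, which is why firmware updates could enable these modes on 1.4-era silicon.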
The summary you posted is correct.
While there's some support for YCC 4:4:4 in the video driver (I noticed a function checking whether it'd be valid), it doesn't make use of it.
The Linux drm subsystem is a bit limited in that regard: there's no connector property that would let you force e.g. RGB or YCC - it's all up to the driver.
The only drm connector property we can play with is max_bpc, which lets us limit component depth to a maximum of 8/10/12 bits per channel.
max_bpc defaults to 8 (so that on desktops you'll get RGB instead of falling back to YCC 4:2:2 - sharp text is usually more important there than 12 bit, which the software running on the desktop might not even use). In LE/kodi on the RPi we lift that to 12 bit for 10-bit video content, so we can make use of the higher output bit depth - which is more important here than the potential loss in chroma resolution, since subtitles and the GUI are usually up-scaled from 1080p anyway.
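For anyone who wants to poke at this outside kodi, a minimal sketch of raising that property through libdrm could look like the following. The device path, the missing error handling, and the hard-coded value of 12 are my assumptions, and setting connector properties generally requires being DRM master, so it can fail while a compositor or kodi holds the device:

```c
/* Hypothetical sketch: raise the "max bpc" DRM connector property to 12
 * via libdrm. Assumes /dev/dri/card0; most error handling omitted.
 * Build: gcc max_bpc.c $(pkg-config --cflags --libs libdrm) */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    drmModeRes *res = drmModeGetResources(fd);
    for (int i = 0; res && i < res->count_connectors; i++) {
        drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
        if (!conn)
            continue;
        if (conn->connection == DRM_MODE_CONNECTED) {
            drmModeObjectProperties *props = drmModeObjectGetProperties(
                fd, conn->connector_id, DRM_MODE_OBJECT_CONNECTOR);
            for (uint32_t j = 0; props && j < props->count_props; j++) {
                drmModePropertyRes *prop =
                    drmModeGetProperty(fd, props->props[j]);
                if (prop && strcmp(prop->name, "max bpc") == 0) {
                    /* Needs DRM master; fails with EACCES/EPERM otherwise */
                    int ret = drmModeObjectSetProperty(
                        fd, conn->connector_id, DRM_MODE_OBJECT_CONNECTOR,
                        prop->prop_id, 12);
                    printf("connector %u: max bpc -> 12 (%s)\n",
                           conn->connector_id, ret ? "failed" : "ok");
                }
                if (prop)
                    drmModeFreeProperty(prop);
            }
            drmModeFreeObjectProperties(props);
        }
        drmModeFreeConnector(conn);
    }
    drmModeFreeResources(res);
    close(fd);
    return 0;
}
```

The modetest tool that ships with libdrm can do the same from a shell, with something like `modetest -w '<connector_id>:max bpc:12'` after finding the connector id via `modetest -c`.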
so long,
Hias
Thanks - that all makes sense - and I totally understand why desktops would favour RGB over a subsampled 4:2:2 or 4:2:0 mode to avoid chroma smearing on fine detail (whereas video is usually already encoded in 4:2:0, so it's a moot point for video playback).
Out of interest, is bit depth checked irrespective of EOTF - i.e. does a 1080p50 10-bit HEVC SDR Rec 709 file get output in a 10-bit or 12-bit mode?