Posts by noggin

    Also worth adding that Kodi has never really been optimised for 'interlaced in, interlaced out' replay - as preserving field order, field dominance etc. is tricky and now a pretty niche requirement.

    If you configure Kodi on any platform for 480i output, and play 480i source files, chances are you'll be deinterlacing 480i to 480p and then reinterlacing to 480i for output - rather than preserving 480i. This may be fine for your use case - but a 480i 'straight through' workflow is not straightforward.

    This may be pointing out the obvious - but you are removing the # sign before each one, aren't you? A leading # sign means the OS ignores everything after it on that line (it's designed for human-readable comments that the OS skips)
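    For example, in config.txt a setting only takes effect once the leading # is removed. A quick sketch (hdmi_enable_4kp60 is one real option discussed later in this thread; the comment text is illustrative):

```ini
# This whole line is a comment and is ignored by the firmware
#hdmi_enable_4kp60=1   <- still disabled: the leading # makes it a comment
hdmi_enable_4kp60=1
```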

    That's now made me question how Rec 601 output is handled (given the very different RGB<->YCbCr matrices that Rec 709 and Rec 601 have) wrt 'default'?

    My Yamaha RX-V6A AVR reports this compatibility table. The default setting is mode “4K Mode 1” and for this mode the best compatibility in the range 4K 60-24Hz is YCbCr 4:2:2

    https://ibb.co/vYWb7RQ

    Yes.

    4K Mode 2 = 'Enhanced HDMI Mode support' on some platforms. This allows the higher bandwidth HDMI 2.0 modes to be used that allow HDR support at >4:2:0 subsampling with >8 bit depth.

    4K Mode 1 only supports the lower bandwidth HDMI 2.0 modes that are based around 4:2:0 at 2160p50 and above for >8 bit depth.

    My Sony UHD TV, with the enhanced modes enabled, is equivalent to 4K Mode 2 on HDMIs 2 and 3, but 4K Mode 1 on HDMIs 1 and 4. My Denon AVR required enhanced modes to be enabled too, to pass through 2160p50 and above 4:2:2 12 bit inputs.

    (4:2:0 was initially added to the HDMI 2.0 spec to allow HDMI 1.4 physical hardware in TVs etc. to support 2160p50 and above modes as it squeezed into an HDMI 1.4 bandwidth signal. ISTR nVidia and Sony both added support for 4:2:0 2160p50 and above modes as upgrades to hardware that was otherwise limited to HDMI 1.4)


    Thanks - that all makes sense - and totally understand why desktops would favour RGB over a subsampled 4:2:2 or 4:2:0 mode to avoid chroma smearing on fine detail (whereas video is already encoded in 4:2:0 usually so it's a moot point for video replay)

    Out of interest is bit-depth checked irrespective of EOTFs - i.e. does a 1080p50 10-bit HEVC SDR Rec 709 file get output in a 10-bit or 12-bit mode?

    For 10-bit content the driver will check, in the order RGB 4:4:4 12-bit -> YCbCr 4:2:2 12-bit -> RGB 4:4:4 10-bit -> RGB 4:4:4 8-bit, whether both the sink and the RPi support that format.

    It takes max TMDS bandwidth, YCbCr 4:2:2 and 10/12 (30/36) bit flags from EDID and also max TMDS rate from RPi into account (4kp60 / 600MHz TMDS is opt-in in config.txt) and picks the first supported format.
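    The selection logic Hias describes can be sketched roughly like this - the candidate list and helper names here are illustrative, not the actual driver code:

```python
# Rough sketch of the 10-bit output format selection order described above.
# Preference order the driver walks for 10-bit content:
CANDIDATES = [
    ("RGB 4:4:4", 12),
    ("YCbCr 4:2:2", 12),
    ("RGB 4:4:4", 10),
    ("RGB 4:4:4", 8),
]

def pick_output_format(sink_supports, rpi_supports):
    """Return the first candidate supported by both the sink (per its EDID
    flags) and the RPi (including its configured max TMDS rate)."""
    for fmt in CANDIDATES:
        if fmt in sink_supports and fmt in rpi_supports:
            return fmt
    return None

# Example: a display that advertises 4:2:2 12-bit but whose RGB deep-colour
# modes would exceed the Pi's TMDS limit at this resolution.
sink = {("YCbCr 4:2:2", 12), ("RGB 4:4:4", 8)}
rpi = {("YCbCr 4:2:2", 12), ("RGB 4:4:4", 8)}
print(pick_output_format(sink, rpi))  # -> ('YCbCr 4:2:2', 12)
```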

    so long,

    Hias

    Ah - so on every format change it checks for support of the various flavours - so on most modern UHD HDR displays it will be :

    2160p30 and below - RGB 12 bit

    2160p50 and above - YCbCr 4:2:2 12 bit, or it has to fall back to RGB 8-bit (as there is no support for RGB 10-bit at 2160p50 and above, and the Pi doesn't support YCbCr 4:2:0)?

    Does the Pi ever output YCbCr 4:4:4 ?

    mathmath51 can you test with hdmi_enable_4kp60=1 in config.txt (you also need to enable HDMI Ultra HD Deep Color support in your TV's settings - this is usually HDMI-port specific) on LE 10.0.2?

    LE10.0.2 supports outputting at 10 and 12bit, LE10.0.1 only sent 8bit HDMI.

    So on LE10.0.1 your 4kp23.97 file is transmitted as RGB 4:4:4 8bit, 10.0.2 without 4kp60 enabled transmits at YCbCr 4:2:2 12bit, with 4kp60 enabled it transmits RGB 4:4:4 12bit - all that could make a difference.

    so long,

    Hias

    What is the logic in the latest LE for 2160p 10-bit HEVC SDR/HDR10/HLG replay and YCbCr 4:2:2 vs RGB / YCbCr 4:4:4 at 30fps and lower and 50fps and higher?

    • 4:2:2 12-bit is the only >8-bit format supported at all 2160p frame rates isn't it? This format often requires additional settings to be enabled in TVs and/or AVRs. (There is no 4:2:2 10-bit option in HDMI 2.0)
    • RGB/4:4:4 YCbCr >8-bit is only supported at 30fps and below (and 8-bit at 2160p50 and above) but normally works OOTB on TVs and AVRs.
    • 4:2:0 is supported only at 2160p50 and greater and on some TVs is the only >8-bit format supported at 2160p50 and above (and worked OOTB). However it's not supported by the Pi 4B?
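    The bandwidth arithmetic behind those bullet points is easy to check. This is back-of-envelope maths based on the HDMI 2.0 rules (RGB/4:4:4 deep colour scales the TMDS rate by bits-per-component / 8, 4:2:2 always runs at the 8-bit TMDS rate while carrying up to 12 bits, and 4:2:0 halves the rate), using the ~594 MHz pixel clock of 2160p60:

```python
# TMDS character rates for 2160p60 under HDMI 2.0 - illustrative arithmetic only.
PIXEL_CLOCK_MHZ = 594.0
HDMI20_MAX_TMDS_MHZ = 600.0

def tmds_rate(subsampling, bits):
    if subsampling in ("RGB", "4:4:4"):
        return PIXEL_CLOCK_MHZ * bits / 8   # deep colour scales the clock
    if subsampling == "4:2:2":
        return PIXEL_CLOCK_MHZ              # fixed rate, regardless of 8/10/12-bit
    if subsampling == "4:2:0":
        return PIXEL_CLOCK_MHZ / 2 * bits / 8  # half rate - fits HDMI 1.4-class links
    raise ValueError(subsampling)

for fmt, bits in [("RGB", 8), ("RGB", 10), ("RGB", 12), ("4:2:2", 12), ("4:2:0", 12)]:
    rate = tmds_rate(fmt, bits)
    verdict = "fits" if rate <= HDMI20_MAX_TMDS_MHZ else "exceeds HDMI 2.0"
    print(f"{fmt} {bits}-bit: {rate:.1f} MHz ({verdict})")
```

    This shows why 4:2:2 12-bit is the only >8-bit option at 2160p50/60 without 4:2:0: RGB 10-bit lands at 742.5 MHz, over the 600 MHz ceiling, while 4:2:2 12-bit stays at 594 MHz.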

    The “Samsung Travel With My Pet HDR UHD 4K Demo.ts” is darker on the columns at 1:19 compared with the same video played on an Nvidia Shield using the same version of Kodi. The LG TV info says both playbacks are 4K UHD BT.2020.

    I'll see if my HD Fury Vertex reports the same static HDR 10 metadata being sent via both platforms - can you link to the content if it's in the public domain and not copyright?

    I have an RPi 4B and an nVidia Shield TV Pro 2019.

    It's important also to remember that some TVs will do an internal tone map to reflect the capabilities of the display - and this will be informed by the MaxCLL (Maximum Content Light Level) and MaxFALL (Maximum Frame Average Light Level) static metadata sent, which is part of the HDR10 standard (and carried in the HEVC bitstream, and in some cases additionally by the wrapper, I believe). This is assuming no HDR10+ is in play.

    Some platforms don't passthrough MaxCLL and MaxFALL metadata - and this situation is informally referred to as PQ10 rather than HDR10 (as are TVs that ignore the static metadata - though that wouldn't be relevant to this discussion)

    It's really important to separate HDR and SDR in this discussion - as the same titles mastered/graded for the two different dynamic ranges can appear very different, and any HDR->SDR tone mapping (squeezing an HDR source into an SDR signal) conversion will always have side-effects.

    Most HDR displays also have different black level, contrast, brightness etc. settings for SDR and HDR10/HLG formats and thus you need to calibrate separately for both - often on an input-by-input basis. (However Dolby Vision content often inhibits/overrides some of these controls)

    I've seen no evidence on my Pi 4B LibreELEC install that AVC/h.264 and HEVC/h.265 Rec 709 HD SDR content is replayed with any difference in black levels, white levels etc. (For personal use I master content in both formats at standard video levels (aka 'Limited') from broadcast-quality masters, which are limited range by default.)

    There is clearly a difference between the same title mastered in HD Rec 709 SDR h.264 and UHD Rec 2020 HDR10 h.265 - but this is to do with the colour gamut (Rec 709 vs Rec 2020) and the SDR or HDR EOTF (SDR/BT.1886 vs HDR10's PQ ST.2084), not the codec. In some cases where the mastering has taken place with different colourists or the same colourist doing two separate grades, rather than a tone mapped down conversion, different decisions will be taken artistically between the SDR and HDR 'looks'.

    It's important to remember that HDR10's PQ EOTF (the bit that makes it 'HDR') explicitly defines an absolute 1:1 mapping between video levels and output light level from the display, whereas the SDR standard used for regular HD (and some non-HDR UHD) content has no such explicit link and is a relative standard. This difference can often make UHD HDR content look 'dim' or 'dark' compared to the same title mastered for HD viewing in SDR.

    (PQ = Perceptual Quantisation. Colour Gamut = the definition of what actual colours the red, green and blue primaries are in the real world. EOTF = Electro Optical Transfer Function = the relationship between the video levels in the signal and the output levels you see on a display. Tone mapping is the conversion from a wider colour gamut and/or dynamic range to a narrower one - it's the process that decides what you throw away, and whether you make the SDR signal reflect the original scene or the experience of viewing the HDR master on an SDR display - which are two very different approaches.)
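    The "absolute 1:1 mapping" of PQ mentioned above can be written down directly. This sketch implements the ST 2084 EOTF (the constants are from the spec), mapping a normalised 0-1 code value to display luminance in nits:

```python
# ST.2084 (PQ) EOTF: normalised signal value E' (0..1) -> luminance in cd/m² (nits).
# Constants come straight from the SMPTE ST 2084 specification.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e):
    """Absolute luminance (nits) for a normalised PQ code value."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# The mapping is absolute: code value 1.0 is always 10,000 nits,
# unlike SDR where peak brightness is whatever the display is set to.
print(pq_eotf(0.0))   # 0.0 nits
print(pq_eotf(1.0))   # 10000.0 nits
```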

    Thank you for looking at the manual.

    Yes HDMI Ultra HD Deep colour is set to ON.

    For the Black Level option I can report best results setting it to Low in the SDR profile and setting it to High in the HDR profile, both profiles are set to Cinema as base settings.

    One interesting thing about the black level option: used on HDR it changes only the details in the blacks, leaving the colours of the picture unchanged; used on SDR it washes out the colours of the entire picture, as if the backlight luminance had been changed. So from my experience it should only be used on HDR content.

    Ah - it's a black level control, not a video range control. That's a very different setting. That's like 'Colour' and 'Contrast' - a user control.

    (Back in the days of NTSC composite/S-video there was a 'technical' black level control that handled NTSC's optional +7.5IRE black level pedestal that was used in North America, but not used in Japan. This ceased to be an issue once component and HDMI came into use as they have no black level offset and use the same standards in all versions of 525 and 625 video - though ISTR some North American component devices may have had an out-of-spec +7.5 IRE black level so the adjustment was allowed there too)

    The content is the same and HDR10 (and tv supports only hdr10). Model is LG UM7450PLA, the panel is LCD.

    OK - reading the manual :

    for best performance with the Raspberry Pi you need

    [Picture] [Additional Settings] [HDMI ULTRA HD Deep Colour]

    to be set to ON

    That will enable the input formats on your TV that the Pi supports HDR in for 50/60Hz output (the Pi doesn't output 4:2:0)

    However the manual doesn't seem to include any description about Low and High HDMI levels... So no idea what LG mean by those.

    My guess is that Low = Limited = Standard 16-235 (which is the standard used for video) and High = Full = 0/1-254/255 'PC' range.

    The only other thing to be aware of is that the Apple TV doesn't always pass on HDR10 metadata (it's effectively PQ10 with some media replay solutions) whereas the Pi does - so if your TV pays attention to the HDR10 MaxCLL / MaxFALL metadata then the rendering results could differ because of that.

    Coming from an Apple TV 4K I noticed that on my RPi 4 the blacks are darker, especially on 4K HDR content. If I pause playback on a black scene, on the Apple TV 4K it is possible to read more detail in the picture than on the RPi 4. To correct this I set the Black Level option on my LG to High instead of the default Low.

    Reading this forum I understood it is not an issue with the limited RGB colour range, but I still can't explain why on the Apple TV the details in blacks in 4K HDR are more readable with the LG default profile settings.

    Are you playing the same HDR10 content on both platforms (and not DV on the Apple TV)?

    DV sends dynamic metadata that will trigger your TV to do more tone mapping than HDR10 does.

    What's the model number of your TV - if I can find the manual online I'll see if it sheds any light on the various HDMI input mode options.

    I'd recommend sticking with 16-235 (or the 12-bit equivalent which will be 256-3760 - best thought of as 16.00-235.00) as that is the level space that consumer video (SD, HD and UHD, and SDR and HDR10/10+/HLG) is distributed in, and it's the core standard for video throughout the production and distribution chain.

    If you scale 16-235 content to 0-255/1-254 you will clip <16 and >235 (or the 12-bit equivalent) content when you scale to full-range.

    Given that broadcast video can often go >235 within the broadcast specs ( https://tech.ebu.ch/publications/r103 details production signal range and why clipping Limited range video and scaling to Full isn't a good idea ) - it's best if you can keep as much processing as possible in the YCbCr 16-235 (or 10-bit or 12-bit equivalents).

    A lot of people assume that 16=0% and a hard clip and 235=100% and video is 0% to 100% and so it's fine and there's no clipping - with nothing going above and below. That's not the case. Content is allowed to go >235 in broadcast specs (and whilst it's less useful in HDR10 as in PQ that content is VERY bright, it can - and often does - carry very useful highlight content in HLG), and whilst you shouldn't see <16 on a properly calibrated display, its presence actually helps you calibrate a display with standard test signals (where using test signals requires that you can see <16 content).

    Scaling to Full range makes using PLUGE (a standard test signal used to set black level on a display) impossible, for instance. PLUGE has bars at limited range 12 (sub-black), 16 (black) and 20 (just above black). You adjust a display fed with PLUGE so that you can't see the difference between the 12 and 16 bars to set black level - but to do that you need to be able to see the difference between the 12, 16 and 20 bars while you are altering the display's black level/brightness. If your signal has been scaled to Full range already, your 12 and 16 bars will both be clipped at 0 or 1 - rendering the signal pointless for setting black level.
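    The clipping is easy to demonstrate with the 8-bit maths (this is illustrative arithmetic, not any particular player's code):

```python
# Why scaling Limited (16-235) to Full (0-255) destroys PLUGE.
def limited_to_full(y):
    """Scale a Limited-range 8-bit luma value to Full range, clipping the result."""
    scaled = round((y - 16) * 255 / 219)
    return max(0, min(255, scaled))

# PLUGE bars sit at 12 (sub-black), 16 (black) and 20 (just above black).
for bar in (12, 16, 20):
    print(bar, "->", limited_to_full(bar))
# 12 and 16 both map to 0 - sub-black and black become indistinguishable,
# and anything >235 in the source is likewise clipped at 255.
```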

    NB if something else in your Kodi playback chain already processes using a Limited->Full scaling - then it's a moot point what you output in... Hopefully most platforms these days don't do that - but I know x86 stuff often does.

    Here endeth the lesson...

    Depends what 'Auto' you are talking about.

    On my Sony FALD UHD HDR TV using 'Auto' for Limited/Full, Rec 709/Rec 2020, and SDR/HDR10/HLG works OK on almost everything. Some platforms don't signal HLG properly - but there are fewer of those now.

    Most platforms are correctly flagging video range using HDMI InfoFrames etc (Limited vs Full), gamut (Regular Rec 709 vs Wide Color Gamut Rec 2020), EOTF (i.e. HDR flavour) and most TVs correctly interpret these now.

    When it comes to other 'Auto' stuff like noise reduction, 'picture mode', contrast enhancement, black level etc. I disable all of those and set my TV up using calibration discs (and if I have time a probe - though that's easier for SDR than HDR)

    Yes - as standard when plugged into a regular HDMI TV that supports Limited Range video (and tells the Pi it does via EDID - which they all do) - the Pi 4B will output Limited Range (16-235 in 8-bit, 256-3760 in 12-bit - and 64-940 in 10-bit if that applies) which is what is expected as standard by consumer HDMI displays.
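    The 10-bit and 12-bit figures quoted above are just the 8-bit 16-235 limits shifted up by the extra bits - a quick sanity check:

```python
# Limited-range black/white points at different bit depths: the 8-bit
# 16-235 values scaled by 2^(n-8).
def limited_range(bits):
    scale = 2 ** (bits - 8)
    return 16 * scale, 235 * scale

for bits in (8, 10, 12):
    print(f"{bits}-bit limited range: {limited_range(bits)}")
# -> (16, 235), (64, 940), (256, 3760)
```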

    The Kodi 'Limited' menu option is confusing - I believe it's designed for x86 boxes with GPU drivers that won't output 16-235, and only output 0-255, and it allows you to output 16-235-within-0-255 - allowing displays that only expect 16-235 to work correctly.

    (These days most displays and sources will support InfoFrames that mean the source can tell the display whether it's Limited or Full range video, and most consumer TVs will accept both, and flag that they do. As all consumer video - DVD, HD and UHD Blu-ray, OTT streaming, DVB/ATSC/ISDB TV is Limited - whether SDR or HDR - keeping things Limited makes sense)

    Worth being aware of the various flavours of 4K/UHD at 50Hz and above.

    Some displays and/or AVRs will only accept the lower bandwidth 4:2:0 chroma-subsampling format that can be used for 2160p50/59.94/60 output - or need 4:2:2 support to be explicitly enabled.

    Some displays only accept 4:2:2 on certain HDMI inputs (my Sony is 4:2:0-only on HDMI 1 and HDMI 4, but accepts 4:2:0 and 4:2:2 2160p50-60 on HDMI 2 and 3 IF I enabled 'Enhanced HDMI support')

    The Pi 4B will only output RGB/4:4:4/4:2:2 and WON'T output 4:2:0. If you have a 4:2:0-only display, or are using a 4:2:0-only input, or are using an AVR or TV that requires 4:2:2 2160p50-60 support to be explicitly enabled and you haven't - you could have issues.

    (4:2:0 was a kludge added to the HDMI 2.0 spec to allow 2160p50-60 support over HDMI 1.4 bandwidth connections)

    Also NB 4:2:0 is only used at 2160p50 and higher - it's not used for 2160p30 and lower.