Posts by noggin

    I've attempted to get audio passthrough working (with the latest nightly build) on the second HDMI port (HDMI 2) without success. The AVR (Denon AVR-4308) is detected in LibreELEC, but no audio is sent to the receiver (video is sent to HDMI 1, which is working). Have I misunderstood the passthrough feature? Thank you

    I don't think there is any suggestion that you can take video from HDMI 1 and audio from HDMI 2?


    The HD Audio passthrough option allows 1080p modes (but not 2160p?) to bitstream HD Audio codecs like Dolby TrueHD, DTS-HD Master Audio and DTS-HD High Resolution Audio (sending the compressed bitstream to your AVR for the AVR to decode to PCM). Previously on the Pi these were decoded in software within Kodi and output as PCM 5.1/7.1 (which, apart from Atmos, should be lossless - other than losing metadata that some downstream AVR processing may use for loudness control etc.)
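
    For reference, if anyone wants to script this rather than click through the GUI, this is roughly how I'd toggle the relevant passthrough settings over Kodi's JSON-RPC interface - a rough sketch that assumes the web server/JSON-RPC is enabled on port 8080, and that the setting ids below match your Kodi version (check your guisettings.xml if unsure):

    import json
    import urllib.request

    KODI_RPC = "http://localhost:8080/jsonrpc"   # assumes Kodi's web server is enabled here

    def set_setting(setting, value):
        """Set one Kodi GUI setting via the JSON-RPC Settings.SetSettingValue method."""
        payload = {
            "jsonrpc": "2.0", "id": 1,
            "method": "Settings.SetSettingValue",
            "params": {"setting": setting, "value": value},
        }
        req = urllib.request.Request(KODI_RPC, json.dumps(payload).encode(),
                                     {"Content-Type": "application/json"})
        return json.load(urllib.request.urlopen(req))

    # Enable bitstream passthrough generally, then the HD codecs specifically.
    for setting in ("audiooutput.passthrough",
                    "audiooutput.truehdpassthrough",
                    "audiooutput.dtshdpassthrough"):
        print(setting, set_setting(setting, True))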

    Also, some TVs aren't quick enough at syncing to the video mode of the BIOS splash screen for you to see the BIOS POST menu. PC monitors are often a bit better in this regard in my experience. Cabled keyboards are also often better than RF remote keyboards, as BIOSes sometimes don't recognise the RF receivers quickly enough.

    Just a question ... will HDR come to the Pi 4 in the future? I'm aware nothing is final, but is it technically possible, and will we get to enjoy this feature?

    The Pi is capable of decoding 2160p 10-bit HEVC, and we're told the hardware supports output of 10-bit video with Infoframes to flag HDR and switch TVs into HDR mode. (These are the three things necessary for 'standard' HDR10 replay)
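
    For anyone curious what the HDR signalling actually involves, this is a rough sketch (my own field names, not any particular API) of the HDR10 static metadata that gets carried to the TV in the HDMI Dynamic Range and Mastering InfoFrame - SMPTE ST 2086 mastering display info plus MaxCLL/MaxFALL - with typical values for a P3-D65-mastered, 1000-nit title:

    from dataclasses import dataclass

    @dataclass
    class HDR10StaticMetadata:
        eotf: int                       # 2 = SMPTE ST 2084 (PQ), 3 = HLG
        display_primaries: tuple        # ((Rx, Ry), (Gx, Gy), (Bx, By)) chromaticities
        white_point: tuple              # (Wx, Wy) chromaticity
        max_mastering_luminance: int    # cd/m2 (whole nits on the wire)
        min_mastering_luminance: float  # cd/m2 (0.0001 nit steps on the wire)
        max_cll: int                    # Maximum Content Light Level, cd/m2
        max_fall: int                   # Maximum Frame-Average Light Level, cd/m2

    # Typical values for a P3-D65 mastered, 1000-nit HDR10 title:
    meta = HDR10StaticMetadata(
        eotf=2,
        display_primaries=((0.680, 0.320), (0.265, 0.690), (0.150, 0.060)),
        white_point=(0.3127, 0.3290),
        max_mastering_luminance=1000,
        min_mastering_luminance=0.005,
        max_cll=1000,
        max_fall=400,
    )
    print(meta)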


    HD Audio bitstreaming (for HD video output) has recently been added to the Pi 4B's capabilities - so stuff is happening to develop the platform.


    If you want HDR now, then look at some other platforms (though these are using custom Kodi builds to enable HDR). If you are prepared to wait, then HDR support for the Pi 4B should happen at some point - but there is no real timeline in the public domain for when.


    One thing to be aware of is that if your TV only supports HDR UHD at 50/60fps at 4:2:0, the Pi 4B is not going to be suitable, as it can't output 4:2:0 (which some TVs require). There isn't that much 50/60fps HDR content, but BBC iPlayer uses 50fps for UHD HDR streaming, for instance.

    noggin How's the Herobox working out for you with HDR content?


    The price has just dropped to £152 on Amazon and I'm tempted to try one out.

    I haven't been using it much for LibreELEC recently. I've currently repurposed it as a DVB-T2 archive box: it permanently records the UK PSB3 Freeview HD mux (BBC One HD, BBC Two HD, ITV HD, C4 HD and C5 HD) from a DVB-T2 USB tuner, giving me instant access to roughly the last 14 days of broadcast TV from the UK's main channels.

    Yes.

    Not an issue for me though. The panel is 8-bit with FRC. I will probably get a better result with RGB 8-bit + GPU dithering than with 4:2:0 12-bit.

    If your panel is 120Hz, won't feeding your TV 10-bit or more and letting the FRC do the work do a better job? The FRC can work at 120fps and thus can dither across the two panel refreshes per 60Hz frame, or the five refreshes per 24Hz frame, whereas with GPU dithering you'll be displaying the same dither pattern twice at 60fps or five times at 24fps.


    I thought the whole reason HDR 24p works pretty well with 8-bit+FRC was that you get a lot of panel refreshes to do the FRC dithering on for every source frame? If you dither in the GPU then you don't get the same effect?
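
    To illustrate what I mean, here's a toy sketch (my own simplification, nothing to do with how any real panel's FRC is implemented) of temporal dithering: a 10-bit level is approximated on an 8-bit panel by alternating two adjacent 8-bit codes, and the more panel refreshes you get per source frame, the closer the average gets to the target:

    # Toy FRC: show a 10-bit grey level on an 8-bit panel by alternating two
    # adjacent 8-bit codes across the available panel refreshes.
    def frc_frames(level_10bit, refreshes):
        lo, frac = divmod(level_10bit, 4)            # 8-bit code + 2-bit remainder
        high = round(refreshes * frac / 4)           # refreshes that show lo+1
        return [lo + 1] * high + [lo] * (refreshes - high)

    level = 515                                      # 10-bit level, i.e. 128.75 in 8-bit terms
    for n in (2, 5):                                 # 2 refreshes per 60Hz frame, 5 per 24Hz frame
        seq = frc_frames(level, n)
        print(f"{n} refreshes: {seq}, average {sum(seq)/n:.2f} (target {level/4})")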

    That was actually an issue with my older Samsung TV.

    I just tested with another TV (Samsung UE43TU8000) and 4K 60Hz HDR videos play just fine, no video distortion.


    [Attached image: MVIMG-20200729-235626.jpg]


    Is that still 8-bit output though?


    You can send 2160p60 8-bit 4:2:0, 4:4:4 or RGB with an HDR EOTF flag and the TV will happily switch into HDR10 or HLG mode - but because you've lost 2 bits of video data you will get banding?


    Some (often early) displays would only accept 2160p60 with 4:2:0, as they had HDMI 1.4b bandwidth-limited "HDMI 2.0" inputs, whilst others have different functionality on different inputs (some Sony sets are not 'full HDMI 2.0' on HDMI 1 and 4, and are only 'Enhanced' on inputs 2 and 3 - input 3 is usually the ARC HDMI too).

    Nothing yet. The latest #0717 build from Milhouse did play the 3D video (which is an upgrade from LE 9.2.3 which crashed), but the display is still showing two images.


    So that sounds as if 3D MVC decode is working, but not being output as frame packed 1080p24 (i.e. not outputting it as 1920x2205 with 45 lines of blanking between each eye feed).


    Does the Pi 4B output frame-packed 3D when converting SBS and TAB content for output?
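
    For clarity, the frame-packed geometry I'm talking about is just the standard HDMI 1.4a layout - both eye views stacked in one tall active frame with 45 lines of active space between them:

    # HDMI 1.4a frame packing for 1080p24 3D: left eye, 45 lines of active space,
    # then right eye, inside one tall 1920x2205 active frame.
    EYE_W, EYE_H, ACTIVE_SPACE = 1920, 1080, 45

    total_height = EYE_H + ACTIVE_SPACE + EYE_H                          # 1080 + 45 + 1080
    print(f"Frame-packed active frame: {EYE_W}x{total_height}")          # 1920x2205
    print(f"Left eye:  rows 0-{EYE_H - 1}")                              # 0-1079
    print(f"Right eye: rows {EYE_H + ACTIVE_SPACE}-{total_height - 1}")  # 1125-2204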

    The OSMC remote has the advantage that it is RF-based and likely to be slightly snappier in response terms. I've got one with my Vero 4K and like it for simple use cases.


    However, I use Kodi on most of my devices for TV, so I need numeric keys and also like to have transport controls (FF/REW/STOP/PLAY/PAUSE and, importantly, RECORD), so I have a huge number of other solutions! (The PS3 Blu-ray Remote is quite nice and uses Bluetooth; the TiVo Slider Pro is nice because of its internal QWERTY keyboard, though you have to get the version with the RF receiver dongle; and there are lots of RF remotes that work quite neatly and won't break the bank.)


    The OSMC RF remote is very neat if the buttons do what you need though.

    Not in my case :-( My old remote is a Chinese IR MCE copy which worked OK with my old x86-based setup. It basically doesn't work at all (well, almost) with my new shiny RPi4 system. Pause seems to work if I press it long enough; the other buttons don't do anything.

    Are there any IR remotes you can recommend which are 'plug and play' on a LE RPi setup?


    As chewitt says - a Flirc is often a great solution. It's just a USB IR receiver that you can use with almost any IR remote you already own (including your Chinese MCE remote - the Flirc would simply replace that remote's IR receiver). You program the Flirc on a PC or Mac and then plug it into a USB port on any Kodi device you like. The programmability means you can teach it the IR codes from pretty much any remote you own, and define which key press each IR command is associated with. It has pre-defined layouts for MCE etc., but you can program pretty much any USB HID command with it.


    Flirc also has nice functionality like separate Long Press configs (outside of Kodi) and the ability to define Macros.

    If you say so.


    I've heard people going on about HDR for years, but I've never quite understood what the big deal was.


    I figured it was just a silly fad, like those 3DTV glasses. :p

    HDR goes hand-in-hand with UHD/4K. Lots of people want to watch UHD content on their UHD TVs, and HDR->SDR downconversion is average at best, and terrible at worst. Watching HDR content (which is the bulk of UHD content) really needs HDR output.


    HDR, to me, is far more compelling than 3D. When handled properly in post-production it adds a lot to a programme in subtle ways. For live sport it adds even more, because you get a less compromised view (a football pitch, or the grass at Wimbledon, half in shade and half in bright sunshine looks a LOT better in HDR than a constantly re-irised SDR feed).

    I think the advice at the moment is not to switch to Intel for 4K HDR stuff in Linux as your main platform. This stuff is still experimental, there is no >30fps UHD HDR support (in the UK that's needed for live UHD HDR iPlayer for instance) and by the time the work is in a reliable and mainstream state there may be newer (better value?) Intel platforms to consider.


    I was interested in the status of Intel HDR on Linux, and have some diagnostic kit (and a bit of video standard awareness) so when a low cost Gemini Lake box appeared on my radar I was intrigued enough to buy one. It's not my daily driver (that's a combination of Apple TV 4K with Netflix/Prime and MrMC, and AMLogic boxes running CoreElec)


    At the moment I think HDR10 output support for Intel under Linux is limited to Gemini Lake processors, as they are the ones with native HDMI 2.0 output. The issue with other Intel devices is that they use DisplayPort outputs from the CPU+GPU SoC and convert this to HDMI 2.0 using an LSPCon chip on the motherboard, and full HDR support for that path is currently lacking in Linux. That said, there are more and more patches appearing (LSPCon devices are now able to trigger HDR10 mode on connected TVs, but don't display video in that mode at the moment).

    Short term, would using a cheap Linux-supported USB 3.0-to-Gigabit Ethernet adaptor be a solution until the drivers appear?


    (I always have a couple of those, and their older USB 2.0 models, lying around to add Ethernet ports to systems like my Intel m3 Compute Stick, or a second port to a Raspberry Pi or similar)

    smp & noggin, love your conversations even if they go over my head. Just wanted to ask a quick question: as it stands right now, even IF my TV flips to HDR mode with Intel hardware, I'm not really getting "proper" HDR? Sorry, but I'm not fully versed in all the 4:2:2 & 4:2:0 and such. I do remember reading somewhere that 10-bit is the minimum I'm supposed to expect for real HDR.


    If you are playing movies, most of which are 2160p24 or 2160p23.976 (i.e. 4K at 24fps), then I believe you are getting correct HDR (because you can output RGB 10/12-bit over HDMI at 2160p at 30fps and below).


    The topic of discussion here is 2160p HDR content at 50fps and 59.94/60fps - which is more for live TV (BBC iPlayer 2160p50 10-bit HLG HDR) and the occasional Ang Lee HFR title released at 59.94fps (Gemini Man, Billy Lynn's Long Halftime Walk etc.).


    HDMI 2.0/2.0a doesn't support 2160p50 10/12-bit RGB output, and it seems Intel's drivers prefer 8-bit RGB over the 10/12-bit 4:2:0 modes - so getting 2160p50/60 10-bit HDR content output correctly is proving problematic.
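
    To put rough numbers on that, here's a back-of-envelope check (my own figures, assuming the standard 4400x2250 total raster for 2160p60 and HDMI 2.0's 600MHz TMDS character-rate ceiling) of which 2160p60 formats fit:

    # Back-of-envelope check of which 2160p60 formats fit HDMI 2.0's 600MHz
    # TMDS character-rate limit, assuming the standard 4400x2250 total raster.
    PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6       # 594.0 MHz for 2160p60
    TMDS_LIMIT_MHZ = 600.0

    def tmds_rate_mhz(sampling, bits):
        if sampling == "4:2:0":
            return PIXEL_CLOCK_MHZ / 2 * bits / 8  # two pixels share each character period
        if sampling == "4:2:2":
            return PIXEL_CLOCK_MHZ                 # carried in a 12-bit container at the base rate
        return PIXEL_CLOCK_MHZ * bits / 8          # RGB / 4:4:4 scales with bit depth

    for sampling, bits in [("RGB/4:4:4", 8), ("RGB/4:4:4", 10), ("RGB/4:4:4", 12),
                           ("4:2:2", 12), ("4:2:0", 10), ("4:2:0", 12)]:
        rate = tmds_rate_mhz(sampling, bits)
        verdict = "fits" if rate <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
        print(f"2160p60 {sampling} {bits:2d}-bit: {rate:6.1f} MHz -> {verdict}")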


    At the moment it's not possible to get proper 2160p50/60 10-bit content output correctly (whether it's HDR or SDR)

    The thing with the Intel driver is that it does not allow deep color 4:2:0/4:2:2 2160p50/60 modes. It prefers RGB 8-bit and I'm not sure this will ever change. I think the driver can be hacked to allow 4:2:0, but I didn't test it.


    10-bit SDR and HDR sources don't look very good when downconverted to 8-bit without dithering (e.g. skies almost always look terrible, with tons of banding). With dithering enabled in the driver, the banding is not really an issue anymore.


    If it's a choice of 8-bit with or without dithering, then sure, add the dither - it will mask the banding by adding extra noise to the picture. But I would always choose clean 10-bit video carried in 10-bit 4:2:0 or 12-bit 4:2:2 over 8-bit RGB/4:4:4 with dither noise.
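
    As a crude illustration of the trade-off (a toy example, not how the Intel driver actually does it): a shallow 10-bit ramp truncated to 8-bit collapses into visible steps, while adding dither before the truncation keeps the local average correct at the cost of noise:

    # Toy comparison: a shallow 10-bit ramp reduced to 8-bit with and without dither.
    import random
    random.seed(0)

    ramp_10bit = list(range(512, 544))                   # smooth 10-bit gradient

    truncated = [v // 4 for v in ramp_10bit]             # runs of identical values = bands
    dithered  = [(v + random.randint(0, 3)) // 4 for v in ramp_10bit]  # unbiased, but noisy

    print("truncated:", truncated)
    print("dithered: ", dithered)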


    So Intel are basically limiting their Linux drivers to HDR at 2160p up to 30fps only, and only allowing 8-bit output above 30fps at 2160p? That's a crazy limitation. It means that any HDR live TV shot at 2160p50 or 2160p60 isn't viewable on their platform under Linux? (BBC iPlayer 2160p50 live sport in HDR, for instance?)


    When you say it 'prefers RGB 8-bit' - what happens if RGB isn't available as an option? Does it then flip to 4:4:4 YCbCr 8-bit? (I'm thinking of a custom EDID that said 'YCbCr only'?) I wonder if it's possible to create an EDID that says 4:2:2 and 4:2:0 only at 2160p50/60?
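
    In case it's useful, this is roughly how I'd check what 4:2:0 support a display actually advertises before trying to hack an EDID - a sketch that assumes your connector shows up at a path like /sys/class/drm/card0-HDMI-A-1/edid and just scans the CTA-861 extension for the YCbCr 4:2:0 data blocks (extended tags 0x0E/0x0F):

    # Read a connector's EDID from sysfs and look for the CTA-861 YCbCr 4:2:0
    # data blocks (extended tag 0x0E = 4:2:0 Video Data Block, 0x0F = 4:2:0
    # Capability Map). The connector path below is an assumption - adjust to suit.
    EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"

    def y420_blocks(edid):
        for ext in range(128, len(edid), 128):           # walk EDID extension blocks
            block = edid[ext:ext + 128]
            if len(block) < 4 or block[0] != 0x02:       # 0x02 = CTA-861 extension tag
                continue
            dtd_offset = block[2]
            i = 4
            while i < dtd_offset:                        # walk the data block collection
                tag, length = block[i] >> 5, block[i] & 0x1F
                if tag == 7 and length >= 1 and block[i + 1] in (0x0E, 0x0F):
                    yield block[i + 1], block[i + 2:i + 1 + length]
                i += 1 + length

    with open(EDID_PATH, "rb") as f:
        edid = f.read()

    found = list(y420_blocks(edid))
    print("YCbCr 4:2:0 blocks advertised:" if found else "No YCbCr 4:2:0 blocks advertised.")
    for ext_tag, payload in found:
        name = "4:2:0 Video Data Block" if ext_tag == 0x0E else "4:2:0 Capability Map"
        print(f"  {name}: {payload.hex()}")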


    4:2:0 10-bit YCbCr output would be fine for HDR video content (most of it is delivered 4:2:0 after all - that's what's used for streaming and UHD Blu-ray). The only compromise is that a 4K UI might suffer slightly, and 4K artwork and photos would have 1920x1080 chroma resolution (4:2:2 would give 1920x2160 chroma resolution).


    I can understand Intel wanting RGB/4:4:4 to be preferred for desktop use (it avoids the smeary chroma that 4:2:2 and 4:2:0 introduce on very fine coloured picture detail, such as pixel-wide coloured text, which would not be nice on a UI monitor) - but you don't really need 10-bit or higher bit depth on a UI display, unless you are grading video or watching Netflix in a window.


    However, as there is no >8-bit HDMI 2.0 mode that supports 2160p50 and above in RGB/4:4:4, you have to accept reduced chroma resolution to get the bit depth needed for clean HDR video output (the bulk of which will be sourced 4:2:2 or 4:2:0 anyway). I wonder if Intel don't 'get' this issue yet?


    Is this discussion here relevant to this issue [GLK] no signal - with samsung 4k TV - HDMI UHD Color (ENABLED) (#271) · Issues · drm / intel · GitLab ?

    What do you think about 4:2:0 12-bit vs. RGB 8-bit with dithering?


    The GBM version of Kodi (which is used in the HDR builds) does not support dithering due to the lack of OpenGL. However, hardware dithering can be enabled for 8-bit modes in the Intel driver.

    By default the Intel driver does not dither and I see some banding with some of my sample videos @ 2160p60 8-bit output.

    With dithering enabled those videos look much better, with much less or no visible banding.


    For what it's worth - my view is that dithering isn't a great idea. Effectively it adds high-frequency noise to a picture to mitigate the lack of bit depth. Plasmas used to do it to make their subfield processing - which delivered something like 6- or 7-bit equivalent grey scale - look 'good enough' with 8-bit content, but I was always pretty sensitive to the noise it introduced. (I could also see the subfields when I was chewing...)


    Many HDR displays on sale use 8-bit panels with FRC (which I think is a kind of dither) to achieve the perceived bit depth required for banding-free HDR. I wonder how that interacts with 8-bit sources that have already been dithered (which the TV won't know about).


    Personally I think 4:2:0 10-bit/12-bit or 4:2:2 12-bit are the only routes really worth following for 10-bit SDR and HDR content at 2160p50 and above.


    If the industry had thought it could continue to get away with 8-bit video paths with dither - I'm sure they'd have kept them!


    Of course, displaying a 10-bit SDR source at 8-bit with dither is likely to be better than displaying it at 8-bit without dither... (so it does have some merit).

    Unfortunately I have never seen any TV backend app that could record multiple sub-channels with an ATSC tuner. I know it can be done with DVB tuners, but I don't think it can be done with ATSC tuners.

    Does TV Headend not do it? That's really surprising if that's the case. Someone needs to get coding!


    AIUI there's no fundamental difference between ATSC and DVB in that regard (both use a single MPEG2 transport stream per multiplex/RF channel).


    If you can't achieve it by tuning two services in TV Headend, you should be able to by capturing the entire RF mux (which will be 19.2Mb/s) and recording all of its video and audio services?
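
    And if you do capture the whole mux to a .ts file, pulling one service back out afterwards is simple enough, because every sub-channel is just a set of PIDs inside the same MPEG-2 transport stream. A rough sketch (the PID-to-service mapping really comes from the PAT/PMT, which I've skipped here - the filenames and PIDs are placeholders):

    # Rough sketch: filter one service's packets out of a whole-mux MPEG-2 TS capture.
    # A real tool would read the PAT/PMT to find each service's PIDs; here they are
    # hard-coded placeholders, as is the file name.
    PACKET_SIZE = 188
    WANTED_PIDS = {0x0000, 0x0100, 0x0101, 0x0102}   # e.g. PAT + one service's PMT/video/audio

    with open("whole_mux.ts", "rb") as src, open("one_service.ts", "wb") as dst:
        while True:
            pkt = src.read(PACKET_SIZE)
            if len(pkt) < PACKET_SIZE:
                break
            if pkt[0] != 0x47:                       # 0x47 sync byte - stop if alignment is lost
                break
            pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
            if pid in WANTED_PIDS:
                dst.write(pkt)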