Posts by noggin

    If you say so.

    I've heard people going on about HDR for years, but I've never quite understood what the big deal was.

    I figured it was just a silly fad, like those 3DTV glasses. :p

    HDR goes hand-in-hand with UHD/4K. Lots of people want to watch UHD content on their UHD TVs, and HDR->SDR downconversion is average at best, and terrible at worst. Watching HDR content (which is the bulk of UHD content) really needs HDR output.

    HDR, to me, is far more compelling than 3D. When handled properly in post production it adds a lot to a programme in subtle ways. For live sport it adds even more, because you get a less compromised view: watching football on a pitch, or Wimbledon grass, that's half in shade and half in bright sunshine looks a LOT better in HDR than a constantly re-irised SDR feed.

    I think the advice at the moment is not to switch to Intel for 4K HDR stuff in Linux as your main platform. This stuff is still experimental - there is no >30fps UHD HDR support (in the UK that's needed for live UHD HDR iPlayer, for instance) - and by the time the work is in a reliable and mainstream state there may be newer (better value?) Intel platforms to consider.

    I was interested in the status of Intel HDR on Linux, and have some diagnostic kit (and a bit of video standards awareness), so when a low cost Gemini Lake box appeared on my radar I was intrigued enough to buy one. It's not my daily driver (that's a combination of an Apple TV 4K with Netflix/Prime and MrMC, and AMLogic boxes running CoreELEC).

    At the moment I think HDR10 output support for Intel under Linux is limited to Gemini Lake processors, as they are the ones with native HDMI 2.0 output. The issue with other Intel devices is that they use DisplayPort outputs from the CPU+GPU SoC and convert these to HDMI 2.0 using an LSPCon chip on the motherboard, and full HDR support for LSPCon is currently lacking in Linux. That said, more and more patches are appearing (LSPCon devices can now trigger HDR10 mode on connected TVs, but don't display video in that mode at the moment).

    Short term would using a cheap Linux-supported USB 3.0->Gigabit Ethernet adaptor be a solution until the drivers appear?

    (I always have a couple of those, and their older USB 2.0 models, lying around to add Ethernet ports to systems like my Intel m3 Compute Stick, or a second port to a Raspberry Pi or similar)

    smp & noggin, love your conversations even if they go over my head. Just wanted to ask a quick question: as it stands right now, even IF my TV flips to HDR mode with Intel hardware, I'm not really getting "proper" HDR? Sorry, but I'm not fully versed in all the 4:2:2 & 4:2:0 stuff. I do remember reading somewhere that 10-bit is the minimum I'm supposed to expect for real HDR.

    If you are playing most movies that are 2160p24 or 2160p23.976 (i.e. 4K at 24fps) then I believe you are getting correct HDR (because you can output RGB 10/12-bit over HDMI at 2160p at 30fps and below)

    The topic of discussion here is 2160p HDR content at 50fps and 59.94/60fps - which is more for live TV (BBC iPlayer 2160p50 10-bit HLG HDR) and the occasional Ang Lee HFR title released at 59.94fps (Gemini Man, Billy Lynn's Long Halftime Walk etc.).

    HDMI 2.0/2.0a doesn't support 2160p50 10/12-bit RGB output, but it seems Intel's drivers prefer 8-bit RGB over 10/12-bit 4:2:0 modes - so getting 2160p50/60 10-bit HDR content output is proving problematic.

    At the moment it's not possible to get proper 2160p50/60 10-bit content output correctly (whether it's HDR or SDR)
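
    For anyone who wants to see the numbers behind that, here's a rough back-of-envelope sketch in Python. It assumes the standard 4400x2250 total raster for 2160p60 and simplified HDMI clocking rules (deep colour scales the TMDS clock for RGB/4:4:4, 4:2:2 rides in a 12-bit container at the 8-bit clock, 4:2:0 halves the clock), so treat it as an illustration rather than a spec quote.

    # Which 2160p60 pixel formats fit under HDMI 2.0's 600 MHz TMDS character-rate ceiling?
    TMDS_LIMIT_MHZ = 600.0          # HDMI 2.0 maximum TMDS character rate
    H_TOTAL, V_TOTAL = 4400, 2250   # total 2160p60 raster (3840x2160 active + blanking);
                                    # 2160p50 uses 5280x2250 for the same 594 MHz clock

    def tmds_clock_mhz(fps, sampling, bpc):
        pixel_clock = H_TOTAL * V_TOTAL * fps / 1e6   # MHz
        if sampling in ("RGB", "4:4:4"):
            return pixel_clock * bpc / 8.0            # deep colour raises the clock
        if sampling == "4:2:2":
            return pixel_clock                        # 12-bit container at the 8-bit clock
        if sampling == "4:2:0":
            return pixel_clock * bpc / 8.0 / 2.0      # half-rate chroma halves the clock
        raise ValueError(sampling)

    for sampling, bpc in [("RGB", 8), ("RGB", 10), ("4:2:2", 12), ("4:2:0", 10)]:
        clock = tmds_clock_mhz(60, sampling, bpc)
        verdict = "fits" if clock <= TMDS_LIMIT_MHZ else "does NOT fit"
        print(f"2160p60 {sampling} {bpc}-bit: {clock:6.1f} MHz -> {verdict}")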

    The thing with the Intel driver is that it does not allow deep color 4:2:0/4:2:2 2160p50/60 modes. It prefers RGB 8-bit and I'm not sure this will ever change. I think the driver can be hacked to allow 4:2:0 but I didn't test it.

    10-bit SDR and HDR sources don't look very good when downconverted to 8-bit without dithering (e.g. skies almost always look terrible, with tons of banding). With dithering enabled in the driver the banding is not really an issue anymore.

    If it's a choice of 8-bit with or without dithering, then sure, add the dither - it will mask the banding by adding extra noise to the picture. But I would always choose clean 10-bit video carried in 10-bit 4:2:0 or 12-bit 4:2:2 over 8-bit RGB/4:4:4 with dither noise.
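
    To show what that trade-off looks like in practice, here's a toy Python sketch: it takes a gentle 10-bit 'sky' ramp, truncates it to 8-bit (which bands) and dithers it to 8-bit (which turns the bands into noise). The ramp and the dither amplitude are made-up test values, not what the Intel driver actually does.

    # Toy illustration: why a gentle 10-bit gradient bands when truncated to 8-bit,
    # and how dither trades the bands for noise.
    import random

    WIDTH = 3840
    # A gentle 10-bit luma ramp across a 4K frame: only ~64 codes over 3840 px,
    # roughly what a clear sky gradient looks like.
    ramp10 = [512 + (64 * x) // (WIDTH - 1) for x in range(WIDTH)]

    def truncate_to_8bit(v10):
        return v10 >> 2                          # just drop the two extra bits

    def dither_to_8bit(v10):
        noisy = v10 + random.randint(-2, 2)      # roughly half an 8-bit code of noise
        return max(0, min(255, (noisy + 2) >> 2))

    def longest_flat_run(samples):
        best = run = 1
        for a, b in zip(samples, samples[1:]):
            run = run + 1 if a == b else 1
            best = max(best, run)
        return best

    banded = [truncate_to_8bit(v) for v in ramp10]
    dithered = [dither_to_8bit(v) for v in ramp10]

    # Truncation collapses the ramp into flat steps well over 200 px wide (visible
    # bands); dither breaks the steps up into noise whose local average still
    # tracks the original ramp.
    print("truncated: longest flat run", longest_flat_run(banded), "px,",
          len(set(banded)), "distinct levels")
    print("dithered : longest flat run", longest_flat_run(dithered), "px,",
          len(set(dithered)), "distinct levels")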

    So Intel are basically limiting their Linux drivers to HDR at 2160p 30fps and below, and only allowing 8-bit output above 30fps at 2160p? That's a crazy limitation. It means that any HDR live TV shot at 2160p50 or 2160p60 isn't viewable on their platform under Linux? (BBC iPlayer 2160p50 live sport in HDR for instance?)

    When you say it 'prefers RGB 8-bit' - what happens if RGB isn't available as an option? Does it then flip to 4:4:4 YCbCr 8-bit? (I'm thinking of a custom EDID that says 'YCbCr only'?) I wonder if it's possible to create an EDID that only offers 4:2:2 and 4:2:0 at 2160p50/60?
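
    Before going down the custom EDID route it's worth checking what the TV's EDID already advertises. Here's a quick Python sketch that reads the raw EDID the kernel exposes in sysfs and reports the YCbCr flags and the 4:2:0 data blocks - the connector path is just an example, so adjust it for whatever your box calls the HDMI output.

    # Quick look at the colour formats a connected display's EDID advertises.
    # The connector path is only an example - check /sys/class/drm/ for the right one.
    EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"

    with open(EDID_PATH, "rb") as f:
        edid = f.read()

    # The YCbCr capability flags live in the CTA-861 extension block(s).
    for offset in range(128, len(edid), 128):
        ext = edid[offset:offset + 128]
        if len(ext) < 128 or ext[0] != 0x02:          # 0x02 = CTA-861 extension tag
            continue
        flags = ext[3]
        print("YCbCr 4:4:4 supported:", bool(flags & 0x20))
        print("YCbCr 4:2:2 supported:", bool(flags & 0x10))

        # Walk the data block collection looking for the 4:2:0 blocks
        # (extended tag 0x0E = 4:2:0-only video modes, 0x0F = 4:2:0 capability map).
        dtd_offset = ext[2]
        i = 4
        while i < dtd_offset:
            tag, length = ext[i] >> 5, ext[i] & 0x1F
            if tag == 7 and length >= 1:              # "use extended tag" block
                if ext[i + 1] == 0x0E:
                    print("Has a YCbCr 4:2:0-only video data block")
                elif ext[i + 1] == 0x0F:
                    print("Has a YCbCr 4:2:0 capability map data block")
            i += 1 + length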

    4:2:0 10-bit YCbCr output would be fine for HDR video content (most of it will be delivered 4:2:0 after all - that's what's used for streaming and UHD Blu-ray). The only compromise is that a 4K UI might suffer slightly, and 4K artwork and photos would have 1920x1080 chroma resolution (4:2:2 would have 1920x2160 chroma resolution).
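
    In case anyone wonders where those chroma figures come from, here's the arithmetic as a tiny Python sketch for a 3840x2160 frame:

    # Chroma plane sizes for a 3840x2160 frame at each subsampling scheme.
    LUMA_W, LUMA_H = 3840, 2160
    SUBSAMPLING = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}  # (H, V) divisors

    for name, (dx, dy) in SUBSAMPLING.items():
        print(f"{name}: luma {LUMA_W}x{LUMA_H}, chroma {LUMA_W // dx}x{LUMA_H // dy}")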

    I can understand Intel wanting RGB/4:4:4 to be preferred for desktop use (it avoids the smeary chroma that 4:2:2 and 4:2:0 introduce on very fine coloured picture detail - such as pixel-wide coloured text which would not be nice on a UI monitor - but you don't really need 10-bit or higher bit depth on a UI display, unless you are grading video, or watching Netflix in a window)

    However as there is no >8-bit HDMI 2.0 mode that supports 2160p50 and above in RGB/4:4:4 you have to accept reduced chroma resolution to get the right bit depth to allow clean HDR output of video (the bulk of which will be sourced 4:2:2 or 4:2:0 anyway). I wonder if Intel don't 'get' this issue yet?

    Is this discussion here relevant to this issue [GLK] no signal - with samsung 4k TV - HDMI UHD Color (ENABLED) (#271) · Issues · drm / intel · GitLab ?

    What do you think about 4:2:0 12-bit vs. RGB 8-bit with dithering?

    The GBM version of Kodi (which is used in the HDR builds) does not support dithering due to the lack of OpenGL. However, hardware dithering can be enabled for 8-bit modes in the Intel driver.

    By default the Intel driver does not dither and I see some banding with some of my sample videos @ 2160p60 8-bit output.

    With dithering enabled those videos look much better - much less, or no, visible banding.

    For what it's worth - my view is that dithering isn't a great idea. Effectively it adds high frequency noise to a picture to mitigate the lack of bit depth. Plasmas used to do it to make their subfield processing - which delivered something like 6- or 7-bit equivalent greyscale - look 'good enough' with 8-bit content, but I was always pretty sensitive to the noise it introduced. (I could also see the subfields when I was chewing...)

    Many HDR displays on sale use 8-bit panels with FRC (which I think is a kind of dither) to achieve the perceived bit depth required for banding-free HDR. I wonder if that will also be applied to 8-bit sources that have already been dithered (which the TV won't know about).

    Personally I think 4:2:0 10-bit/12-bit or 4:2:2 12-bit are the only routes really worth following for 10-bit SDR and HDR content at 2160p50 and above.

    If the industry had thought it could continue to get away with 8-bit video paths with dither - I'm sure they'd have kept them!

    Of course, displaying a 10-bit SDR source at 8-bit with dither is likely to be better than displaying it at 8-bit without dither... (so it does have some merit)

    Unfortunately I have never seen any TV backend app that could record multiple sub-channels with an ATSC tuner. I know it can be done with DVB tuners, but I don't think it can be done with ATSC tuners.

    Does TV Headend not do it? That's really surprising if that's the case. Someone needs to get coding!

    AIUI there's no fundamental difference between ATSC and DVB in that regard (both use a single MPEG2 transport stream per multiplex/RF channel).

    If you can't achieve it by tuning two services in TV Headend, you should be able to stream the entire RF mux (which will be around 19.2Mb/s) and record all the video and audio services it carries?
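
    As a rough illustration of the 'one transport stream per mux' point, here's a Python sketch that lists the services in a whole-mux capture by parsing the PAT on PID 0. The filename is just an example and it ignores multi-section PATs - it's only meant to show that ATSC and DVB captures carry their whole service line-up the same way.

    # Sketch: list the services in a whole-mux transport stream capture by parsing
    # the PAT (PID 0). ATSC and DVB muxes are both plain MPEG-2 transport streams,
    # so the same parsing works for either.
    import sys

    PACKET = 188
    path = sys.argv[1] if len(sys.argv) > 1 else "full_mux_capture.ts"

    programs = {}
    with open(path, "rb") as f:
        while not programs:
            pkt = f.read(PACKET)
            if len(pkt) < PACKET or pkt[0] != 0x47:     # EOF or lost sync - give up (it's a sketch)
                break
            pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
            if pid != 0 or not (pkt[1] & 0x40):         # we want the start of a PAT section
                continue
            payload = pkt[4:]
            if (pkt[3] >> 4) & 0x2:                     # skip the adaptation field if present
                payload = payload[1 + payload[0]:]
            payload = payload[1 + payload[0]:]          # skip the pointer_field
            section_length = ((payload[1] & 0x0F) << 8) | payload[2]
            entries = payload[8:3 + section_length - 4] # program loop, minus the CRC
            for i in range(0, len(entries) - 3, 4):
                prog = (entries[i] << 8) | entries[i + 1]
                pmt_pid = ((entries[i + 2] & 0x1F) << 8) | entries[i + 3]
                if prog != 0:                           # program 0 just points at the NIT
                    programs[prog] = pmt_pid

    for prog, pmt_pid in sorted(programs.items()):
        print(f"service/program {prog}: PMT on PID 0x{pmt_pid:04X}")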

    Since my WinTV QuadHD-USB tuner setup was a failure (I didn't notice it was just the PCIe version that was supported in Linux) I've decided to go with SiliconDust HDHomeRun Connects and LibreELEC on a Pi 4. I already have a Duo I used to use with WMC, and would like to purchase a Quatro. I'll be using a USB 3 external hard drive. Will the Pi be able to handle the recording? I don't plan to use the Pi for viewing - I'll use a Linux PC for that. Will the USB 3 HD handle 6 recordings and 1 viewing? Will I have to upgrade my network switch to a gig switch?

    I'd probably go for a GigE switch between the two tuners and the Pi 4B just to cover yourself with a bit of headroom.

    Also - be aware 6 tuners can mean you can record and stream far more than 6 channels. Subchannels on the same RF frequency as the main station should only use one tuner (at least that's how DVB tuners work in TV Headend with an HDHomeRun).

    If you use a USB 3.0 Hard Drive for recording then you should be able to record many services simultaneously without hitting issues.
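
    Rough numbers behind that advice (Python back-of-envelope; the per-mux rate, the 10% overhead and the worst-case assumptions are mine, not measurements):

    # Back-of-envelope for six worst-case recordings plus one playback stream.
    MUX_MBPS     = 19.4     # one full ATSC 1.0 RF channel payload (~19.39 Mb/s)
    RECORDINGS   = 6        # worst case: every tuner delivering a full-rate service
    VIEW_STREAMS = 1        # playback going back out to the Linux PC
    OVERHEAD     = 1.10     # ~10% allowance for IP and filesystem overhead

    network_mbps = (RECORDINGS + VIEW_STREAMS) * MUX_MBPS * OVERHEAD
    disk_write_mbytes = RECORDINGS * MUX_MBPS / 8       # Mb/s -> MB/s

    print(f"network in+out: {network_mbps:5.1f} Mb/s "
          "(Fast Ethernet ~94 Mb/s usable, GigE ~940 Mb/s)")
    print(f"disk writes   : {disk_write_mbytes:5.1f} MB/s "
          "(a USB 3.0 hard drive comfortably exceeds this)")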

    The RPi does not recognize cards above 32GB. To fix this you need to use a tool which will convert the exFAT to FAT32.

    That's not the case - I regularly use 64GB and 128GB microSD cards in all of my Raspberry Pis, from the Zero to a 4B 8GB RAM model.


    And using Etcher or the Raspberry Pi imaging tool to flash a .img or similar file will completely zap the microSD card, repartitioning it and formatting it with the required formats for each partition.

    (You may instead be thinking of the fact that, if you use uSD cards in a USB card reader with a Pi, Raspbian (or Raspberry Pi OS) won't mount exFAT-formatted cards without installing the FUSE exFAT stuff - but LibreELEC reads exFAT fine on a Pi.)


    In my last build I forgot to include this patch, so it did not switch to BT.2020 color gamut when playing HDR video.

    I uploaded a fixed build. Kodi is also updated to a current master.

    Thanks for this.

    I hadn't expected it to work with Rec 2020 HLG HDR (so many other things just do HDR10) - but it seems to. My Rec 2020 HLG HDR stuff is output with HLG flagged and my TV switches into HLG HDR mode. This is great news for BBC UHD HDR iPlayer stuff which uses HLG.

    However, for some reason my TV's sound device is not detected with this build. I'll have a reboot of both my TV and my Gemini Lake box to check it's not an HDMI funny.

    ** Aah - I solved it with this, from the LibreELEC Testbuilds for x86_64 (Kodi 19.0) thread, as I didn't want to alter my BIOS in case I go back to Windows: **

    echo "blacklist snd_soc_skl" > /storage/.config/modprobe.d/blacklist-snd-soc-skl.conf

    Installed LibreELEC on a friend's Raspberry Pi 4 and ran into an issue. I copied a lot of multichannel 5.1 music in OGG format for him that plays properly on my Pi 3 with OSMC, but the channels are all mixed up when played back on his. If a log file is needed I'll get one, but in the meantime is this a known issue with a quick fix? BTW, the channel mapping is FL / C / FR / SL / SR / LFE. Thanks!

    There are reports of 5.1/7.1 PCM audio channel mapping issues elsewhere too.

    • H.265 (4Kp60 decode), H.264 (1080p60 decode, 1080p30 encode) might replace the outdated H.264 (1080p30)

    Just one thing - the Pi Zero to Pi 3B+ all decode h.264 1080p50 at high bitrates with no major issues in my experience. Whilst 1080p30 decode is the formal limit in the spec, the reality seems to be a bit different.

    I've remastered a lot of 1080i25 4:2:2 high bitrate masters to 1080p50 4:2:0 H.264 at around 40Mb/s and they've played back on every Pi I've thrown them at (including a Pi Zero). I've not tried 1080p60 though...