Posts by The Coolest

    The Coolest, 'Tim_Taylor':


    When I manually set 444 10bit, the output shows this, but I will see what the "auto detect" sets.

    After setting 444,10bit and changing refresh rate/resolution, my LG B7 switches into BT.2020 SDR, which looks terrible, so I don't understand why you want it so badly. In HDR content, colors look fine in 8.2.2.2 whether you have 444,10bit enabled or not. The only problem I had was with colors in HDR content being undersaturated on older versions of LE.


    kszaq

    Today I noticed sound crackles and video stutters in my IPTV live streaming (IPTV Simple Client). I'm using a Sunvell T95N 2G/8G box and have LE 8.2.2.2 on it. Never had a problem with LE7. I don't recall having this problem with 8.2.1.1, but I haven't used it that much before updating to the latest build.

    The streams don't run out of buffer, but this problem still occurs. Is there any more data you need? (Sadly I can't provide a playlist, as it's a paid service)

    The Coolest : How do you check whether 8-bit or 10-bit HDR is active in Kodi?


    How can I set Kodi to 444 10bit? I can only set the resolution to 2160p.

    You can either SSH into LE, and run the following command:

    echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr

    Or you can simply add it to the autostart.sh file, so that it's applied every time LE boots up.

    If you run it from SSH, this will take effect once Kodi switches refresh rate or resolution.
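    If you go the autostart.sh route, a complete file might look like the sketch below. The sysfs path is the one from the command above; wrapping it in a backgrounded subshell with a short sleep is my own assumption (to avoid blocking boot and give the HDMI driver time to come up), not something the thread prescribes:

    ```shell
    #!/bin/sh
    # /storage/.config/autostart.sh
    # Force 4:4:4 10-bit HDMI output on every boot.
    (
      # Short delay so the amhdmitx node is ready (assumption; some
      # boxes may not need it at all).
      sleep 5
      echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr
    ) &
    ```

    As noted above, the new mode only kicks in on the next resolution or refresh rate change.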

    In my case it caused issues with SDR, as my TV went into BT.2020 mode, and all colors were oversaturated.

    But it seems that it's not needed anymore in the latest builds for correct colors in HDR.

    kszaq


    Just updated my Sunvell T95N to 8.2.2.2.

    Wifi is now back and working. It only has 2.4GHz and no BT, which LE displays correctly. Awesome!


    I also checked HDR10 content with and without setting 4:4:4 10bit. Color saturation now seems identical in 4:4:4 10bit mode, 4:4:4 8bit mode and in the 'default' mode (after a reboot) on my LG TV. Compared to the TV's built-in player, it now looks the same, which is great news to me.

    So I just wanted to ask whether it's something that was fixed after 8.2.1.1? I saw that rgb_output was brought back in 8.2.1.2, could that be what fixed the issue I had before? (Before, I had to enable 4:4:4 10bit, otherwise colors in HDR content were undersaturated, even though the TV would kick into HDR and BT.2020. And now it seems fine.)


    Thanks again for the work you and the other devs do for everyone :)

    kszaq And one more request: it looks like GDPR-2 has added support for the LCD display on the TX3 Mini. I have a T95M with the same display.

    I was wondering if it would be possible to add this to your builds as well, or install it later (I assume there needs to be a driver for the LCD + updated tree?)


    Happy Christmas everyone!

    When I press the "Info" button on the remote of my UH6100, it indicates BT.2020 for my HDR content and nothing for my SDR content. So I presume that if BT.2020 isn't indicated, it's BT.709.


    If I put the "echo 444,10bit ..." command in autostart, 4K SDR and HDR content is output as BT.2020. Personally I don't like the BT.2020 colorspace on non-BT.2020 content, colors are too saturated... I've tested it many times.

    Exactly. I will have to test this on my Vizio and see what it does, but LG does seem to behave this way when you enable 10 bit 444 output.

    It's a bit frustrating. When 10bit is not enabled and I play HDR10 content, even though the TV switches into HDR/BT.2020 mode, color saturation is lacking. I made a side-by-side comparison by playing the same content with the internal player. Once I switched 10bit 444 on, the color saturation matched in HDR, but then SDR was oversaturated. So the only workaround I came up with is to enable 10bit 444 when I want to watch an HDR movie, and then switch back to 8bit 444. A bit of a pain.
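    For anyone stuck with the same manual toggling, a tiny helper can make it less painful. This is only a sketch: the `set_depth` function name and the `hdr`/`sdr` mode names are made up, and the optional second argument (defaulting to the sysfs node used earlier in the thread) exists mainly so you can try it against a scratch file before pointing it at the real node:

    ```shell
    #!/bin/sh
    # Toggle HDMI output between 10-bit (for HDR10) and 8-bit (for SDR).
    # Usage: set_depth hdr|sdr [attr-node]
    set_depth() {
      # Default to the amhdmitx sysfs node; allow an override for dry runs.
      attr="${2:-/sys/class/amhdmitx/amhdmitx0/attr}"
      case "$1" in
        hdr) echo '444,10bit' > "$attr" ;;  # 10-bit before HDR playback
        sdr) echo '444,8bit'  > "$attr" ;;  # back to 8-bit afterwards
        *)   echo "usage: set_depth hdr|sdr [attr-node]" >&2; return 1 ;;
      esac
    }
    ```

    You'd run `set_depth hdr` before starting an HDR movie and `set_depth sdr` when done; as with the plain echo, the change only takes effect on the next refresh rate or resolution switch.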


    By running Kodi in 2160p and 444 10bit, I have the same picture settings on my TV for HDR and SDR, except for the color saturation, which is about 10 clicks higher on HDR. And if your TV is adjusted right, you'll see that HDR is overrated ;)


    But if I set Kodi to 1080p, there is no way I can get the same picture on 1080p content as with 2160p, but this might have something to do with the way my TV handles UHD and HD.


    And since my TV has different profiles for HDR and SDR, the settings are applied automatically when the source changes between HDR and SDR :)

    The only thing you achieve by doing that is screwing with how the TV displays normal SDR content.

    1. You can't do that by eye with accurate results.

    2. This will screw up SDR for other sources if you use the same picture setting across sources.

    3. Why would I want to do that?

    4. HDR is not overrated at all. It highly depends on the content you watch, and the quality and the capabilities of your display.

    Some movies are just not that much different to their SDR counterparts.


    If you ask me personally - I have no idea. ;) Use what looks better to your eyes.

    Is there any way to determine the type of content being played (i.e. SDR/BT.709 or HDR/BT.2020) and automatically switch 10bit 444 on and off during the automatic resolution/refresh rate switch? Not being able to do this is probably a low priority right now, but I think that in the long run it's an important one. Please see my reply to mike in this post.


    And another, unrelated, question. I use live TV with the IPTV PVR addon. Now this addon only supports a single playlist.

    There's a fork of the IPTV addon, which allows using up to ten playlists. This would be a very useful addition (perhaps optional? I don't know) to your already great builds. Here's what I'm talking about: pvr.iptvsimple/pvr.iptvsimple at master · AndreyPavlenko/pvr.iptvsimple · GitHub

    I have two 4K HDR TVs, a Samsung KS9000 and an LG UH6100.

    Thanks. Perhaps the Samsung behaves differently.

    Your LG doesn't have WCG, so it's possible that you don't see a difference whether it's in BT2020 color mode or not. Not sure if it's possible in WebOS 3.0:

    Press up on the remote, select the HDMI input and press OK. For me it shows the resolution, audio output format, and BT2020 when I'm in 10 bit mode.

    If you ask me personally - I have no idea. ;) Use what looks better to your eyes.

    What I meant is "Is it feasible to make it happen automatically during the resolution/refresh rate change?" :) A manual solution would be an acceptable workaround, though. Maybe some sort of addon? Guess I should start catching up on how to make a Kodi addon.


    In my case if the box stays in 8-bit mode, the TV switches into HDR but doesn't actually go into wide color gamut mode, not to mention that HDR content should be displayed in 10-bit.

    "10-bit" SDR looks terribly oversaturated, so it looks bad.

    "8-bit" HDR looks undersaturated, so it doesn't look right either.

    So I'm pretty much stuck. I can use the TV's built-in player for these movies, but while the TV can properly pass through "everything" Dolby, on the DTS side it can only handle plain DTS and not the more advanced audio formats.


    On my side I do not put the "echo ..." command in autostart, and LibreELEC switches to the correct colorspace automatically with my HDR content.

    May I ask what TV make/model you have?
    I've got an LG B7, and I guess my problem has more to do with how the TV handles a 10-bit signal than with LE.

    Hi kszaq


    I upgraded to this build today for its 10-bit support. I noticed that if I execute "echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr", my TV will automatically switch into non-HDR BT.2020 colorspace.

    Now I understand that this isn't a problem with anything you did, but I would like to know whether it's feasible to switch to 10-bit mode only for HDR10 content and then switch out of it when you stop playback?

    kszaq


    My TV was updated to support 50Hz input signal, but it's not exposing it in its EDID right now, so I have the following in my autostart.sh file:

    Code
    mount -o bind /storage/.config/disp_cap /sys/class/amhdmitx/amhdmitx0/disp_cap
    echo 4 > /sys/module/amvdec_h264/parameters/dec_control

    These two settings cancel each other out: if I use the custom disp_cap file, it cancels out the dec_control, and if I use the dec_control, it cancels out the custom disp_cap file.

    Is there a way to make it work without these two settings cancelling out each other?

    kszaq

    Alright, from a few minutes of looking at it I think that did it, thanks!!!

    I'll check it more thoroughly tomorrow, and if it indeed solves the problem I'll add that command to autostart.sh.

    It shouldn't negatively affect normal content like 720p/1080p/4k encodes, right?

    And one more question if I may: what does this setting do exactly? I don't know much about Linux and decoders, so I tried to search for it but didn't really come up with anything informative.

    @kszaq


    I appreciate the updates, thanks.

    Since K isn't really working for me and L is back to the MM kernel, some live channels are glitchy.

    I found that disabling amcodec fixes this, as ffmpeg is then used.

    I'm trying to figure out whether it's possible to only use amcodec for 720p and up, since the glitchy channels are SD and the S905X can handle decoding them in software.