Posts by johngalt

    jd17 To clarify, 8 bit content is unchanged whether output is at 10 bit or 8 bit, and no conversion occurs while keeping 10 bit output.

    New build:

    Please note IR remotes still won't work.

    Changes from dev build in OP:

    • 10-8 dithering support on 8bit output (fixes the color banding issue that's been posted often for the nougat kernel).
    • Audio effectively at the previous kernel state (fixes surround sound and passthrough).
    • CEC effectively reverted to the MM state (fixes the double keypress issue).
    • Resolution switching sets the same mode as current (should fix colorspace not being passed on the same mode, as well as a possible 10bit output bug on first playback). Edit: fixed, will upload this build: aml/hdmi: if setting attr, set color depth early · amillogical/linux-amlogic@361e832 · GitHub

    One of the benefits of kszaq's excellent work on this kernel is true HDR capability. If you'd like to test, look below.

    10bit output:

    • On a new boot, before any other playback has been performed, run the following: echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr
    • This only needs to be set once each boot, so it can be put in /storage/.config/autostart.sh
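
    For reference, a minimal autostart.sh sketch (assuming the standard LibreELEC location and a display that accepts 4:4:4 10bit):

    Code
    #!/bin/sh
    # /storage/.config/autostart.sh: set 10bit output once per boot, before any playback
    echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr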


    Known 10bit output bugs:

    • First playback will still have dithering. In some other cases there may still be 10 bit dithering (checking colorspace alone in the kernel has issues due to LE resolution switching). This will be fixed. Edit: fixed in new build :)
    • Unsure, but I would be surprised if no others exist.

    Download - build updated for dithering issue

    Going top down:

    - Yes, we've got 10 bit output here.

    - It's possible, but requires a bit of work (because of resolution switching on the libreelec-patched Kodi side).

    - It can coexist, but we really don't want it to.

    - No output whatsoever when "420,10bit" is set on that sysfs interface. However, I get 4:2:0 10bit at >50hz after setting "444,10bit". I haven't looked into that issue yet, and am not sure of its priority either.

    - It can (and does) benefit, and no dithering is needed. I've watched various media on the box since, and have kept "444,10bit" set. This is something that could be set in autostart.sh for now for those of us on capable hardware. Unfortunately the dc_cap route I wanted to take (where we'd check on boot and set accordingly) has issues, as dc_cap still reports 10bit support on hardware that isn't capable of 10bit input. This also means that, as of now, the "what is available" route you mentioned isn't available. This will be looked into. There's nothing wrong with setting 4:4:4 manually for now, however.
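
    For illustration only, the boot-time check could look something like this sketch; it assumes dc_cap lists supported modes as plain text, and it isn't usable until the misreporting above is sorted out:

    Code
    #!/bin/sh
    # sketch: enable 10bit output only if the sink claims support
    # (dc_cap currently misreports on some hardware, so don't rely on this yet)
    if grep -q '444,10bit' /sys/class/amhdmitx/amhdmitx0/dc_cap; then
        echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr
    fi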

    --- Wait a bit and test a new build (uploaded soon) before you decide on this. This is something that wouldn't be too difficult, but it would require patching that probably wouldn't be liked by most users. As I understand it, your reasoning is that you're making or using 10bit HEVC rips of your Blu-rays? In that case dithering is hardcoded into the rips, and you shouldn't need additional dithering with 10 bit output of them.

    - We must define it in that interface, but there are many other ways we could go about this. With that said, I don't see a reason to avoid defining it.

    Without:

    Pe II: Not sure about the vertical banding, but even on my display's internal player, and on Android with proper 10bit output, I get some banding in this intro, so I don't think it's a great benchmark. However, when I was enabling/disabling dithering and going back to 8bit output, there was noticeably less banding in this intro, and no banding whatsoever in the banding_test.mkv I was using (will reupload). I also recommend you set a 1080 GUI to use your display's upscaling on 1080 media. It will change to 2160 on 2160 media playback. The 2160 GUI is just poorly upscaled from 1080 anyway.

    1080p24 SDR: Wasn't able to recreate after ~15 seconds; I would need logs. Most of the stuff I've played back took around 15 seconds to "settle in" and play back smoothly. I'm also not sure whether these changes affect it at all compared to kszaq's original dev build.

    After:

    Pe II: I'm assuming your GUI was set to 50hz? I have a workaround on its way to go back to always setting the mode. Long term this could potentially go back into the kernel, to do so on colorspace change only.

    Passengers: Also can't recreate on what I'm assuming is the same rip. I suspect you're right. First, try a different rip, and set that interface immediately on boot, prior to any playback. If you still have issues and haven't played proper HDR through this cable previously, try a different cable.

    1080p24 SDR: same as "without"

    Thank you all for testing. I also noticed an issue when a video is played first and "444,10bit" isn't set immediately on boot. So for those testing in the future, could you please test after a fresh boot so we're all on the same page?

    I've started a build based on your latest (as of this morning, KERNEL_VERSION="ea2a014") and will report back after testing it later today.

    I got a chance to better test current HEAD and saw that dithering was still enabled on 10 bit output. Amlogic had it enabled on 10 bit output as well, even without the changes (which seems strange), but it may have been unset elsewhere in a path that wasn't affecting us due to resolution switching. This time it's tested as disabled by default for 10 bit output (8140a4c).

    For the most part it won't affect your testing, but I wanted to make people aware.

    I copied your command and pasted it into PuTTY over SSH. If the GUI is 1080p, the "answer" is 16. If I play a 1080p video, it's 32. If I play a 4k video and get no signal (black screen), it's 93.

    Did I do it right? Is the "answer" supposed to be that short?

    My guess for the issue was incorrect unfortunately. Could you post the output from running "dmesg" after playing 4k as well as the kodi.log?

    I recommend running the following two commands and then posting the two links you get:

    Code
    dmesg | curl -F 'sprunge=<-' http://sprunge.us
    
    cat /storage/.kodi/temp/kodi.log | curl -F 'sprunge=<-' http://sprunge.us

    I tried the new build again. If the GUI setting is 1080p, then the GUI and SD, 720p, and 1080p films are OK, but 4k goes to a black screen (lost signal). If I set the GUI to 4k, then after boot the screen is black (lost signal). The box is still alive; SSH and Samba work.

    8.0.1l-mm and 8.0.2a are perfect: SD, 720p, 1080p, and 4k are all good.

    Does anybody have an idea?

    Please post dmesg output on a paste site (like bpaste.net), and also post the result of: cat /sys/class/amhdmitx/amhdmitx0/vic

    Do this after the black screen has occurred.
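
    If it helps, these are the exact commands to run (the sprunge upload is the same trick from earlier in the thread):

    Code
    dmesg | curl -F 'sprunge=<-' http://sprunge.us
    cat /sys/class/amhdmitx/amhdmitx0/vic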

    Good news edit:

    wesk05 PM'd me (due to having a new account) and confirmed that 10 bit output is good :) . One note from him, however: the command actually sets YCbCr 4:4:4 for <50Hz and 4:2:0 for >50Hz.

    From glancing at the source, this is because of bandwidth constraints.
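
    Rough numbers, assuming an HDMI 2.0 sink capped at a 600 MHz TMDS clock (2160p60 8bit 4:4:4 already needs 594 MHz, and 10bit adds 25%):

    Code
    2160p60 4:4:4 10bit: 594 MHz x 1.25 = 742.5 MHz   (over the limit)
    2160p60 4:2:0 10bit: 297 MHz x 1.25 = 371.25 MHz  (fits)
    2160p24 4:4:4 10bit: 297 MHz x 1.25 = 371.25 MHz  (fits)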

    I also misread something earlier while exhausted and re-enabled dithering on 10 bit output (oops). I just force pushed those branches again.

    I'm sorry to make it seem immediate -- it wasn't. I'll stop tagging you until you're back.

    I've made a bit more progress in testing:

    I just force pushed the kernel and LE fork repos to disable dithering if 10bit color depth is set (aml/hdmitx: use 10-8 dithering by default on 8bit · amillogical/linux-amlogic@aa97ed1 · GitHub). This should accommodate those who don't want 10->8bit banding, as well as those who want 10bit output.

    In other words, I can finally say "goodbye" to Android :D

    I don't see anything wrong with "standard" video while having 4:4:4 10bit output set unconditionally on capable hardware. From my understanding, this shouldn't make a difference at all with "standard" video so long as no subsampling is used, so perhaps we could just check dc_cap on phy bringup and then set it if the display is capable?

    Testing 10 bit output:

    To test 10 bit output on capable hardware, run this ONCE prior to playback (which triggers a resolution switch). It will stick as long as the device is on:

    Code
    echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr

    For now this can be set in autostart.sh if you like the results.
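
    If you want to confirm it stuck, the attribute should read back the value you set (assuming the node is readable on your build):

    Code
    cat /sys/class/amhdmitx/amhdmitx0/attr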

    Why 4:4:4?

    4:2:0 10bit has an output issue, and with 4:2:2 10bit I got some flickering in parts of the video; the flickering was fixed by setting 4:4:4 10bit on the same 4:2:0 10bit source.

    koenkooi and others, would you mind testing this and reporting your findings?

    P.S. There are situations with compression banding where I prefer having dithering enabled on low bitrate 10bit sources. Unfortunately this dithering noticeably reduces PQ in optimal situations with 10bit output. Perhaps a solution could be to expose it via sysfs and allow 10bit output users to decide whether they want dithering enabled unconditionally? This could potentially be decided later in a "tuning" thread.
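
    If that route were taken, usage could be as simple as the following; the node name here is made up for illustration, nothing like it exists yet:

    Code
    # hypothetical sysfs toggle, not a real node
    echo 1 > /sys/class/amhdmitx/amhdmitx0/dith_force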

    Edit2: I may have unintentionally found the bug that was causing some 4k users to get stuck with no video output. I bookmarked it and will take a look later on. Could someone with the bug post a dmesg and check the result of: cat /sys/class/amhdmitx/amhdmitx0/vic

    Edit3: realized the last commit revision was unnecessary, and found the bug that made me add it: on first playback with 10bit output set there will still be dithering, but on all video playback after there will not be. This is because of how LibreELEC switches resolution. The last revision does nothing new, so if you built with it you don't need to rebuild with the new force push. I was tired and couldn't read, apparently. I was wrong here and the revision was needed.

    Edit4: here's a build with the dithering commit and audio at the previous state (working surround): LibreELEC-S905.arm-8.0-devel-20170601123601-r26033-g579da302e.img.gz | openload

    I've been messing with manually setting color depth and can confirm it really does work correctly. Color depth is set correctly by this kernel originally, when video info is set, but on the mode change at playback, color depth gets reset.

    As far as I can see, there are two options:

    1) Expose the current color depth in a cleaner way from the kernel (along with all the other parameters that go with color depth), so that from Kodi we can check it and set it again on the initial resolution switch to maintain it.

    2) Set the previous depth on mode change after the initial video info is set. A problem with this solution is that the video color depth will persist to the GUI until the next video playback. I'm not sure if this has real world impact or not.

    kszaq, any feedback?

    Edit: or a third option: before the resolution is set in Kodi, read the color depth of the video (not sure how to do this part from Kodi), and set the depth prior to the resolution switch, the same way I've been setting it manually.
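
    To make option 1 concrete, here's a rough userspace-level sketch of the flow, assuming attr reads back the current setting (the real fix would live on the kernel/Kodi side):

    Code
    # sketch of option 1's flow, assuming attr is readable
    depth=$(cat /sys/class/amhdmitx/amhdmitx0/attr)     # e.g. "444,10bit"
    # ...Kodi's initial resolution switch resets color depth here...
    echo "$depth" > /sys/class/amhdmitx/amhdmitx0/attr  # re-apply to maintain it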

    I've taken kszaq's branch and cherry-picked in 3 commits:

    Code
    koen@libreelec:~/LibreELEC.tv-8.0$ git cherry -v | tail -3
    + 50fc937b8d863c9e50fe73fcd3aef2d49a0fcc7a S905: update audio config and patches
    + 44eebb8c7a27a3d2a23299315823582589e33893 S905: use custom test kernel
    + a1031f5ec821245cd2f1edc207cd31fc62643f3f S905: bump kernel

    Most HDR videos play fine, but Planet Earth II is giving me problems: the colours are weird and the picture is full of artefacts. Below is a photo of the TV with the AVR overlay on the bottom and the TV overlay on the top.

    One of the differences between the working movies and this one is that the working ones are 24/1.001 fps and this one is 25 fps, so no change is needed from the 50Hz GUI.

    For testing you can use this sample.

    I have the same PE II videos already and was able to test.

    My display was at 60hz, so at first I didn't see the issue, because the mode was being set again. When I set my display to 50hz, I was able to replicate your issue. Try setting your display to a different refresh rate prior to playback and see how it goes. This will also cause issues for me with 60fps HDR videos.

    Also, I realized I had been testing manually set depth with the dithering fix already applied. That fix is also needed right now on libreelec, and is separate because it can be reverted when output is correctly set: aml/hdmitx: enable 10-8 dithering by default · amillogical/linux-amlogic@95f9dae · GitHub . I amended and force pushed the LE fork kernel bump for people like koenkooi.

    When output is correctly set, I think this issue will go away, so I won't try to work on a fix before working on that (when I get more time).

    ty wesk05. I suppose the black level changes on some displays are a mystery, then? On my Sony, when properly configured, there is no change between the two either, but on the Sony I was able to somewhat replicate the change I see between the two on my Samsung. Neither is forcing RGB. After this thread, to make sure I wasn't going crazy, I even did a quick check with my ColorHug to make sure my eyes weren't tricking me (I know it's not great, but I just wanted to be sure).

    2) Nougat builds do not have noise in 10-8-bit dithering, whereas there is noise in Marshmallow builds.

    I suspect this is the result of a typo in an amlogic commit updating dithering for GXM and newer boards, which disabled the old 10-8 dithering for <=GXL in the process. I have a fix for it: aml/hdmitx: fix 10->8 dithering for <=GXL & cleanup · amillogical/linux-amlogic@3304772 · GitHub and aml/hdmitx: enable 10-8 dithering by default · amillogical/linux-amlogic@95f9dae · GitHub

    re: hdr/sdr conversions, Amlogic "optimizes" their conversions, with very different results. I suspect that if you reverted their "optimizations," you'd get close to the standards. Amlogic also has a static saturation boost on the sdr -> hdr conversion that was added separately from their conversion "optimizations." When I played with sdr -> hdr, I had to remove the saturation boost to get decent output.

    I've done very little...just messed around with kszaq's awesome work.

    re: dithering, I was curious about having dithering for all content, but I'm not sure about getting the quality on par with other implementations where it's on by default, so let's just forget about that for now, I guess.

    I'm not sure if it is being properly passed (I know very little about the overall architecture). I was able to get 10 bit output, but am not sure if it's 10bit video on the 10bit output. From what I've seen, I think it should be possible. With minimal hardware, I think I can revert a fix to test whether it's real 10 bit output or not. I ran out of time for testing, and have too much I need to get done to look into it quickly. With that said, I'd LOVE to drop the android/second box and just use libreelec, and kszaq has got us closer than ever to that reality :) .

    johngalt Thank you for the patches! I think I can revert sound to the current MM kernel state, as it was working acceptably. I think I'll do this later this week; more important non-LE things to do.

    No, thank you!

    I just force pushed to that fork again (Commits · amillogical/linux-amlogic · GitHub), and audio is now in line with your amlogic-3.14.y-new branch (I included your revert of the experimental audio changes). I wasn't able to do extensive testing, but everything I threw at it with my surround system (passthrough + non-passthrough) seemed to work as well as with the experimental audio commits.

    I also included a less hacky dithering fix for <=gxl boards (only tested on gxl: aml/hdmitx: fix 10->8 dithering for <=GXL & cleanup · amillogical/linux-amlogic@3304772 · GitHub). I actually think this may have been an amlogic typo they never noticed, because no s905x devices formally have nougat other than the khadas, but I'm not sure. I played around with forcing dithering on everything and with different noise values, but ultimately decided PQ tuning should be left for later, possibly in a different thread.

    I tested hdr_banding.mkv, posted by a member of this forum a while back (the light intro to an HDR movie), and the dithering looked quite good to me. I then tested a very small/highly compressed 10bit rip that had compression banding even with correct 10 bit output in android, and found I preferred the 8 bit output in libreelec.

    In addition, this partial HDR support (on correct hardware) might even be good enough for some people since colorspace is now passed.

    Edit: oh wow, manual 10 bit output can work in this kernel. I think I can do a kernel solution that manually sets colorspace and depth for the source based on RXCap, if I assume there's always going to be a resolution change for playback. It actually looks like we can have full HDR10 on libreelec after all :).

    kszaq I'm working on a better solution for 10bit output, but here's a dithering hack for 10bit->8bit: [HACK] aml/hdmitx: always force dithering · amillogical/linux-amlogic@5c0e548 · GitHub

    I noticed that (possibly because of my display) the kernel also thinks deep color has been triggered during some playback (though with libreelec nothing will happen).

    Also, makakam was right about bt2020 being passed :D. It threw me off at first because there's now colorspace correction on the OSD from bt2020, so I didn't think it was set. In Android, without that additional CSC, the OSD becomes oversaturated.

    Edit: sorry for the double post, I didn't realize.

    Edit2: that commit works, but I made a mistake. I'm also testing out higher quality dithering, and setting it only for 10bit video now.

    Would it be recommended to always have dithering enabled if we could get the quality high enough? On some other platforms it's a Kodi default.

    kszaq, got surround working:

    Commits · amillogical/linux-amlogic · GitHub from 24cbbb3 to HEAD (this is your nougat-wip branch, but with the earlier audio commits removed and replaced toward HEAD).

    You'll also want S905: update audio config and patches · amillogical/LibreELEC.tv@d4e39d0 · GitHub (reverting your audio changes in the nougat-wip libreelec branch).

    The audio state is with the experimental commits. To get to your current le8 nougat state, you'd want to revert 86ff57a and 9c6e4b5 (haven't tested, but it should work).
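
    In git terms that would be something like (untested, as noted):

    Code
    # revert the newer of the two first if they conflict
    git revert 86ff57a 9c6e4b5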

    johngalt I have noticed that you forked my kernel and that you are working on some changes. I think I will be able to create some test builds with the full Nougat kernel later this week. If you have any improvements to share, I welcome every PR! :)

    Both those branches are very broken in different ways; I recommend holding off a bit before you play with either :). I accidentally learned a bit about the amlogic drivers, though. On one test build I accidentally forced bt2020 and HDR mode on everything, but it gave me a chance to play around with the SDR to HDR conversion. After removing the saturation boost in that mode, it actually looked quite decent.

    Since we know how to set bt2020 colorspace (still 8 bit, but with a decent 10bit->8bit conversion that can be worked on later), I've been thinking a good solution for (near) HDR output would be to read the playback's colorspace and set the output colorspace accordingly (similar to your patch for nougat-style framerate automation). I've yet to look into how the colorspace can be read at the Kodi level, however.

    Given the work required for a full nougat kernel, IMO it would be best to start off slowly with your current branches + a few commits/reverts, and go from there. A full nougat kernel could be developed in tandem.

    Unfortunately I can't replicate the other nougat kernel bugs, such as black screens, though I've seen some unmerged commits that mention similar issues.