I actually noticed a problem with the 10-Bit to 8-Bit conversion in the MM build (8.0.2a) today.
A certain range in the lower grayscale flickers.
It can easily be reproduced by setting the screensaver to 5% dim and pausing a 10-Bit video during a scene that covers various brightness levels.
Some areas in that frame will most likely show that flicker once the dim kicks in.
Therefore, I am now looking forward to a 10-Bit testbuild even more.
EDIT:
I have to adjust that statement...
Apparently the flicker is also present on 8-Bit sources.
Since I have been using MM Jarvis for a very long time and never saw anything like that, it must have something to do with MM+Krypton.
Posts by jd17
-
Quote: It's possible, but requires a bit of work (because of resolution switching on the libreelec-patched kodi side).
That sounds promising, awesome!
Quote: No output whatsoever when "420,10bit" is set on that sysfs interface. However, I get 4:2:0 10-bit at >50Hz after setting "444,10bit". I haven't looked into that issue, and am not sure of the priority of the issue either.
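(Side note for anyone who wants to try this themselves: I assume the interface in question is the hdmitx "attr" node, so something like the sketch below should drive it. The path and the "colorspace,depth" value format are my assumptions based on the quote, and writing to it needs root.)

# Sketch only - sysfs path and value format are assumptions that may differ
# between kernel builds. Run as root.
ATTR = "/sys/class/amhdmitx/amhdmitx0/attr"

def set_hdmi_output(colorspace, depth):
    # e.g. set_hdmi_output("444", "10bit") - the combination reported to work
    # above; "420,10bit" reportedly produced no output at all.
    with open(ATTR, "w") as f:
        f.write(colorspace + "," + depth)

set_hdmi_output("444", "10bit")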
I asked this question because we need 4:2:0 @ 10-Bit in cases where the HDMI bandwidth is a limitation.
So I thought it might be an issue...
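(To put rough numbers on that limit - a back-of-the-envelope sketch using the standard 4K60 timing, given that HDMI 2.0 caps the TMDS character rate at 600 MHz:)

# Why 2160p60 10-bit cannot fit as 4:4:4 over HDMI 2.0 (rough sketch).
h_total, v_total, fps = 4400, 2250, 60       # CTA-861 total timing for 3840x2160p60
pixel_clock = h_total * v_total * fps        # 594 MHz
tmds_444_10bit = pixel_clock * 10 / 8        # 742.5 MHz -> over the 600 MHz cap
tmds_420_10bit = pixel_clock / 2 * 10 / 8    # 371.25 MHz -> fits; 4:2:0 halves the rate
print(tmds_444_10bit / 1e6, tmds_420_10bit / 1e6)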
However, if you say that it goes into that mode automatically when the bandwidth limitation is reached and everything is fine, then - great.
Quote: Wait a bit and test a new build (uploaded soon) before you decide on this. This is something that wouldn't be too difficult, but would require patching that probably wouldn't be liked by most users. As I understand it, your reasoning is because you're doing or using 10bit hevc rips of your blurays? In this case dithering is hardcoded into the rips, and you shouldn't need additional dithering with 10 bit output of them.
Oh I think you might have misunderstood me here.
My goal would always be to stay as close to the source as possible (like the "best match" setting for audio).
I was asking this because I assumed the color bit depth alone can't be used as a trigger, so if an external trigger is needed, 2160p made sense.
I know that my Blu-ray encodes will look awesome either way - native 10-Bit output or proper 10-Bit to 8-Bit dithering.
Quote: We must define it in that interface, but there are many other ways we could go about this. With that said, I don't see a reason to avoid defining it.
Yes, I agree - that was also a "conditional" question.
Those boxes were always outputting YCbCr 4:4:4 for 1080p anyhow, in 8-Bit of course.
I don't see a reason to change that either, as long as all resolutions, color spaces and color depths are working fine.
Generally speaking, less conversion is always better of course - but the box is already performing quite well on the "Spears&Munsil scoreboard" with the MM build.
I am looking forward to your build, just let me know when and what to test.
-
johngalt:
Thanks for all your work to bring us closer to proper UHD playback.
Maybe it is just me, but I am a bit confused regarding your 10-Bit tests and statements.
I'll try to recap what I understood, so please help me out if any (or all) of the following is wrong...
- Full Nougat is generally capable of 10-Bit output. (?)
- "Detecting" 10-Bit in the source video and just switching to 10-Bit when necessary is not possible. (?)
- 10-Bit to 8-Bit dithering and 10-Bit output cannot coexist. (?)
I am also unsure about the following:
- What kind of output issue do you see on YCbCr 4:2:0 at 10-Bit? You mentioned this before...
- Are your tests limited to 10-Bit at 2160p + HDR10 + BT.2020? I would also be interested in whether 1080p / BT.709 / 10-Bit can benefit from 10-Bit output.
-- However, if you can use the resolution as a "trigger" to enable 10-Bit output, I'd prefer to keep the (good) 10-Bit to 8-Bit dithering for 1080p sources and have proper 1:1 10-Bit output for 2160p (see the sketch after this list)...
- Do you even have to define the color subsampling when you enable 10-Bit? Can you not tell the box to output 10-Bit independent of 4:4:4 or 4:2:0 so it automatically chooses what works and what doesn't?
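(To make the resolution "trigger" idea above concrete, this is the decision logic I have in mind - purely illustrative, the names are made up and this is not actual Kodi or kernel code:)

def choose_output(source_bit_depth, source_height):
    # Hypothetical trigger: only 2160p 10-bit sources get 1:1 passthrough;
    # everything else keeps the (good) 10-bit -> 8-bit dithering.
    if source_bit_depth == 10 and source_height >= 2160:
        return "10bit passthrough"
    return "8bit dithered"

print(choose_output(10, 2160))  # -> 10bit passthrough
print(choose_output(10, 1080))  # -> 8bit dithered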
-
Quote: As mentioned above, dynamic range (black level) setting will have no effect on YCbCr input on almost all displays.
It definitely makes a difference on my LG TV (OLED65B6V).
I do get the washed out colors when I set black level to high, both on Marshmallow and Nougat Frankenstein.
That also goes for the RPi2. I always had that set to YCbCr limited, but I could still mess up the range when I set the TV to high... -
Thank you for helping us out here, wesk05.
Quote: On the other builds, it is set only when refresh rate switching is disabled or the frame rate matches the set desktop refresh rate.
That explains my initial impression of proper HDR / BT.2020 playback, because I originally tested two demo videos with a 59.94fps frame rate, which did not trigger any switch.
The color difference only became obvious to me when I tested a 23.976fps video.
Quote: When Cb=Cr, luminance (Y) is simply truncated from 10 to 8 bits. When Cb and Cr are different, there is some sort of conversion happening along with 10-to-8-bit dithering. The conversion ends up decreasing the color saturation and luminosity.
In Marshmallow Kernel builds, I have not witnessed any decrease in color saturation or luminosity when I play back 1080p / BT.709 / 10-Bit sources (upsampled to 10-Bit from an 8-Bit original).
I did quite a bit of comparing on that because I encode my Blu-rays in x265 10-Bit.
Quote: Nougat builds do not have noise in 10-to-8-bit dithering, whereas there is noise in Marshmallow builds.
You are referring to obvious banding in Nougat Frankenstein builds vs. smooth gradations in Marshmallow builds, correct?
We discussed this here:
[BUG] S905X Krypton: Horrible banding in 10 Bit videos
What you call noise looks like proper dithering to me and works quite well to make 10-Bit look close to how it is supposed to.
It actually looks just like the 10-Bit to 8-Bit conversion I see in the Windows Kodi builds.
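(For anyone curious about the difference, here is a minimal sketch of truncation vs. noise dithering on a shallow 10-bit gradient. The random dither is just an illustration of the principle, not necessarily what the driver actually does.)

import random

def truncate_10_to_8(v):
    # Plain truncation: drop the two low bits. Four adjacent 10-bit codes
    # collapse into one 8-bit code -> visible banding on gradients.
    return v >> 2

def dither_10_to_8(v):
    # Noise dithering: add sub-LSB noise before truncating, so the average
    # over an area preserves the 10-bit value -> smooth gradient, slight noise.
    return min(255, (v + random.randint(0, 3)) >> 2)

ramp = range(400, 424)  # a shallow 10-bit gradient
print([truncate_10_to_8(v) for v in ramp])  # runs of identical values = bands
print([dither_10_to_8(v) for v in ramp])    # values flicker between the steps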
-
Wow, that really sounds amazing, thanks for your great work kszaq & johngalt!
I'm really excited to hear that BT.2020 is being passed - I will do some testing tonight too.
I know this is aimed at devs only, but maybe there is still something I can contribute.
Quote: Would it be recommended to always have dithering enabled if we could get quality high enough? On some other platforms it's a Kodi default.
Do you mean additional dithering for regular 8-Bit content too?
Or just dithering that is being used on 10-Bit to 8-Bit conversion?
Either way - I'd be happy to do some comparisons if that helps, just send me links to builds with and without it.
Quote: Edit: oh wow, manual 10 bit output can work in this kernel. I think I can do a kernel solution to manually setting colorspace and depth for source based on RXCap if I assume there's always going to be a resolution change for playback. It actually looks like we can have full HDR10 on libreelec after all :).
So you mean 10-Bit could actually be properly passed 1:1?
That would blow my mind!
Thank you so much for looking into this! -
I get that - no worries.
As long as I can get rid of the pink noise by just stopping and retrying, or opening a video with a different audio codec in between, it's fine.
However, can you answer this? I'd just like to know if that is possible:
Quote: Is there maybe some way I can define a delay for FLAC only myself?
I.e. is there a way to define separate delays for different audio codecs?
-
Quote: Pelican - Just give up now. It has been determined that any issues you are having are purely in your head and unless you can provide evidence to the contrary, that is reproducible on every variant of s905/s905x board in production, then you can just frig off back to Android.
I get the sarcasm and I did not intend to be rude to you or Pelican.
However, kszaq has been busting his ass to provide stable, high quality builds and we finally have a very good Krypton build.
Just try to be excited about that.
Until recently we had frame skips even on regular HD content. 10-Bit banding. 2160p black screens.
All gone now.
We also finally know the main root cause of the differing views on picture quality -> the TV's RGB range.
I would like to have proper HDR / BT.2020 / 10-Bit playback as much as everyone else, believe me.
But afaik there is no LibreELEC with that capability out there yet, AMLogic or other. Intel maybe on the way?
Why force it and moan over which kernel or Kodi version is better at something that is still not right in either case?
It just does not make sense.
Since there is practically no proper UHD/HDR content out there anyhow, we can afford to be patient in that regard.
That is just my opinion of course... -
Nothing is OK in either case!
BT.2020 is not passed through, but output as BT.709. The colors are not properly mapped.
HDR10 is not properly converted to SDR.
10-Bit is only being converted to 8-Bit.
Jarvis might look better than Krypton for you, but it is far from how things are supposed to look!
So why do you bother with a dubious HDR capture on a non-HDR display, when there is a perfectly fine Blu-ray out there?
What is the benefit?!
The Blu-ray will be displayed 1:1 as it is supposed to be.
However, nothing in the UHD file is being displayed as it is supposed to be. -
But as was already explained to you, the difference does not come from BT.2020 but mainly from the (lack of) HDR to SDR conversion.
You should just not watch HDR content on a non-HDR display.
A normal Blu-ray, mastered to BT.709, will always look better on a non-HDR display than an unnecessary conversion (which is not intended).
Especially considering that this can only be a capture, there is no way to circumvent AACS 2.0, at least not yet. -
Please post MediaInfo video information for those videos in question.
Latest Jarvis and latest Krypton are currently both running on the Marshmallow Kernel - there should be no difference whatsoever. -
Quote: Known issues:
- If you use an S905X device, the screen will be darker than normal on boot. It goes back to normal after you start/stop a video.
kszaq, maybe you can consider adding the following line to the first post below the one in my quote:
- Make sure that your TV's HDMI input for the AMLogic box is set to limited RGB range for proper image reproduction. This setting is often called "HDMI black level" or "black level" and should be set to "low" or "limited" or "16-235".
This might help to avoid future questions/complaints. -
Quote: ok as far as I know are u responsible of all this activit?..U r using abusing language "bitch" what is this man...
You are being very annoying, rude and you do nothing whatsoever to actually support the devs to get where you want them to be.
You are unreasonable and you don't listen to constructive advice.
I understand this reaction towards you and I am very close to simply blocking/ignoring your posts.
Try to be more constructive and consider if your post brings any added value to the table before you post it.
Until now, all you have done is complain and demand. -
It does not have to be changed for every movie.
The video output is always limited-range YCbCr.
It might change from YCbCr 4:4:4 at 1080p to YCbCr 4:2:0 at 2160p, but it is always limited range (16-235).
No need to change the TV all the time.
Just set it to limited input once and keep it that way, whether on the Marshmallow or Nougat kernel.
The recommendation goes beyond those AMLogic boxes.
Video is always natively limited range, so converting to full range should also always be avoided, independent of the source device.
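(The numbers behind that, for anyone curious: video black and white live at 16 and 235, full-range displays expect 0 and 255, and the expansion is a simple scale-and-round.)

def limited_to_full(y):
    # The expansion a correctly configured TV applies to limited-range input.
    return max(0, min(255, round((y - 16) * 255 / 219)))

print(limited_to_full(16), limited_to_full(235))  # -> 0 255 (correct mapping)
# If the TV is set to full/"high" instead, no expansion happens: 16 shows as
# dark grey and 235 as a dim white - the washed-out look described above.
# And since 220 input codes get spread over 256 output codes, every extra
# range conversion also invites banding - another reason to avoid it.

-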
Quote: I think it's a safe bet that whatever is breaking bright/cont settings may also be affecting other things.
A safe bet I have not seen the smallest shred of evidence to support.
The only real world drawback is the minor overscan on the left edge.
-
Quote: It is more than just color range issues for my s905x box. There is something fundamentally wrong with the driver in MM kernel builds. The best evidence I have is that the brightness and contrast adjustments while using hardware acceleration are totally fubar on MM, but work exactly as expected on the Nougat-y builds.
I can confirm that both brightness and contrast settings are broken, at least in the current MM build.
Thankfully, there is a very simple solution - don't use those sliders!
Those broken sliders do not mean that "there is something fundamentally wrong with the driver in MM kernel builds", because the output is perfectly fine when both sliders are left untouched at 50%. -
Quote: if range_control stuff is reverted, the 10bit=>8bit banding is also "fixed" in nougat kernel. I'll do more testing, cleanup, and submit a PR.
Sounds great!
Quote: Is this the only bug that's been widely reported in the nougat kernel?
Unfortunately, it's not.
2160p video has "black screens" (like signal losses) in Nougat Frankenstein, even on SDR / 8-Bit / BT.709 content. -
Quote: Because it boots into full, I can't set the range to limited on my newer samsung set (stuck on auto due to broken firmware on the entire samsung lineup).
I am very sorry to hear that... Is there no way to set the RGB range to limited before you boot up the box?
Quote: I was able to test on my sony and can confirm.
So we finally have an explanation, but I still don't know why people kept claiming everything is darker in MM builds.
That makes no sense in any RGB range scenario.
The only logical explanation is that those users never made it past the initial menu screen, despite all the warnings...