will i be able to install LibreELEC on it ????
No.
Better question: why does Hardkernel sell something that can't be used at release?
It can be used with the out-of-tree Realtek driver. Not to mention Windows 10.
Yeah, looks like you will have to wait for the updated r8169 in-kernel driver.
It's there... It's a hidden .update folder.
You can also use LibreELEC USB-SD Creator tool.
Use a nightly build. RTL8125 network should work with it.
LibreELEC-Generic.x86_64-9.80-nightly-20200625-cc6e86c.img.gz
You can also use one of my builds with experimental HDR support for Intel Gemini Lake that I posted in this thread. RTL8125 should also work.
LibreELEC-Generic.x86_64-9.80-devel-20200624071923-38764c8.img.gz
Just wanted to ask a quick question: as it stands right now, even if my TV flips to HDR mode with Intel hardware, I'm not really getting "proper" HDR?
We were only talking about the optimal HDMI output for 50/60hz modes. Don't worry too much about it.
When you say it 'prefers RGB 8-bit' - what happens if RGB isn't available as an option? Does it then flip to 4:4:4 YCrCb 8-bit? (I'm thinking if you did a custom EDID that said 'YCrCb only'?) I wonder if it's possible to create an EDID that says 4:2:2 and 4:2:0 only at 2160p50/60?
I don't think RGB can be removed from the EDID; it looks like it is mandatory.
YCbCr 4:4:4 can be forced with a driver hack but I suppose there is not much use for this mode.
4:2:2 is not even present in the driver source code, so it can't be forced.
4:2:0 12-bit can probably be forced with a driver hack but I'm not sure. The driver assumes that RGB 8-bit is the highest quality mode for 2160p50/60 and uses it instead of 4:2:0 12-bit.
Is this discussion here relevant to this issue [GLK] no signal - with samsung 4k TV - HDMI UHD Color (ENABLED) (#271) · Issues · drm / intel · GitLab ?
It appears to be some sort of compatibility issue between Gemini Lake's HDMI 2.0 implementation and some Samsung TVs.
I wonder if Intel don't 'get' this issue yet?
HDR on Linux with Intel hardware is not a thing yet. I'm sure things will change.
Personally I think 4:2:0 10-bit/12-bit or 4:2:2 12-bit are the only routes really worth following for 10-bit SDR and HDR content at 2160p50 and above.
The thing with the Intel driver is that it does not allow deep color 4:2:0/4:2:2 2160p50/60 modes. It prefers RGB 8-bit and I'm not sure this will ever change. I think the driver can be hacked to allow 4:2:0 but I didn't test it.
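As a back-of-the-envelope illustration of why 4:2:0 is the only route to >8-bit at 2160p50/60: HDMI 2.0 tops out at a 600 MHz TMDS clock, and the clock-scaling rules below are assumptions based on HDMI 2.0's published clocking (they are not taken from the i915 driver):

```python
# Rough HDMI 2.0 bandwidth check for 3840x2160 @ 60 Hz pixel formats.
PIXEL_CLOCK_MHZ = 594.0   # 2160p60 CTA-861 timing
TMDS_LIMIT_MHZ = 600.0    # HDMI 2.0 maximum TMDS character rate

def tmds_clock(bits_per_component, chroma="4:4:4"):
    """TMDS clock needed for a given bit depth and chroma subsampling.

    4:2:0 halves the TMDS clock; deep color scales it by
    bits_per_component / 8. 4:2:2 is a special case: HDMI carries it
    in a fixed 12-bit container at the 8-bit clock rate.
    """
    clock = PIXEL_CLOCK_MHZ
    if chroma == "4:2:0":
        clock /= 2
    if chroma != "4:2:2":  # 4:2:2 packing is depth-independent
        clock *= bits_per_component / 8
    return clock

for depth, chroma in [(8, "4:4:4"), (10, "4:4:4"), (12, "4:2:0")]:
    clock = tmds_clock(depth, chroma)
    fits = "fits" if clock <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
    print(f"{chroma} {depth}-bit: {clock:.1f} MHz ({fits})")
```

RGB/4:4:4 8-bit needs 594 MHz (just fits), 4:4:4 10-bit needs 742.5 MHz (too much), while 4:2:0 12-bit only needs 445.5 MHz, which is why forcing 4:2:0 is the interesting hack here.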
10-bit SDR and HDR sources don't look very good when downconverted to 8-bit without dithering (e.g. skies almost always look terrible, with tons of banding). With dithering enabled in the driver, banding is not really an issue anymore.
Some new patches for LSPCON HDR just showed up.
[v5,08/11] drm/i915/lspcon: Create separate infoframe_enabled helper - Patchwork
[v5,09/11] drm/i915/lspcon: Do not send infoframes to non-HDMI sinks - Patchwork
[v5,10/11] drm/i915/lspcon: Do not send DRM infoframes to non-HDMI sinks - Patchwork
If you feel like trying them here's an image.
Okay so Nvidia seems a nogo. What about AMD?
Another option is to wait for the Intel discrete cards. They should be out soon. Intel's Linux drivers are excellent. AMD is OK, but its drivers are not nearly as good as Intel's.
At the Arch Linux and Kodi forums.
By "quality of video is not as good as on X11" he means that some features like dithering and color management are not (yet) implemented in Kodi-GBM because they would require OpenGL, and Kodi-GBM uses OpenGL ES. Normally those features shouldn't be needed anyway, so no big loss.
I don't know where you read that nonsense, the "image quality" is identical.
With X11 you can forget about HDR. HDR is only possible with GBM+DRM PRIME.
The fact that Nvidia will support 10-bit HEVC decode via VDPAU doesn't really make them any less useless.
looks enabled (almost) everywhere in Linux 5.7 configs
Those are RTL8152...
RTL8125 is using the r8169 driver.
If it works on a broad range of cards it might result in a stay of execution
Maybe it would be a good idea to make a special LE branch for Nvidia? It doesn't make sense that Intel/AMD builds are still not using Kodi-GBM because of Nvidia.
LE has nothing to do with Ubuntu...
Current LE master branch is based on kernel 5.7 for Generic x86.
Stable 9.2 branch is based on kernel 5.1.16.
It should be already supported by the current mainline kernel (I believe since kernel 5.4).
So do Intel just support 4:2:0, 4:4:4 and RGB output?
If so then 4:2:0 is the only format that will support 2160p50/60 with >8-bit for HDR
What do you think about 4:2:0 12-bit vs. RGB 8-bit with dithering?
The GBM version of Kodi (which is used in the HDR builds) does not support dithering due to the lack of OpenGL. However, hardware dithering can be enabled for 8-bit modes in the Intel driver.
By default the Intel driver does not dither, and I see some banding with some of my sample videos at 2160p60 8-bit output.
With dithering enabled those videos look much better, with much less or no visible banding.
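The effect is easy to demonstrate in isolation. A minimal sketch in plain Python (random sub-LSB noise, not the Intel hardware's actual dithering algorithm): truncating 10-bit values to 8 bits collapses neighbouring levels into one band, while adding noise before truncation lets the average output track the original in-between level.

```python
import random

def quantize(v10):
    """Truncate a 10-bit value (0-1023) to 8 bits (0-255)."""
    return v10 >> 2

def quantize_dithered(v10, rng):
    """Add sub-LSB noise before truncating so the average 8-bit
    output tracks the original 10-bit level."""
    return min(255, (v10 + rng.randrange(4)) >> 2)

rng = random.Random(0)
level = 514  # a 10-bit grey with no exact 8-bit representation

# Reconstruct the level from many samples (scale 8-bit back up by 4).
plain = sum(quantize(level) * 4 for _ in range(10000)) / 10000
dither = sum(quantize_dithered(level, rng) * 4 for _ in range(10000)) / 10000

print(round(plain))   # 512: every sample snaps to the same band
print(round(dither))  # ~514: dithering recovers the in-between level
```

On a gradient, the undithered version produces visible 4-level-wide bands; the dithered version trades them for fine noise the eye averages out, which matches what the driver's dithering does for 10-bit sources on an 8-bit link.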
I suspect that a .tar file will not do him any good.
LibreELEC-Generic.x86_64-9.80-nightly-20200617-89db20a.img.gz