Posts by smp
-
1080p GUI rendering with 4K upscaling is a feature specific to CE/amcodec/the Amlogic BSP kernel. Kodi on x86 doesn't have such a feature to begin with, so there is nothing to "force".
-
That GUI scaling stuff has nothing to do with x86 LE.
-
will PRIME decoding come to the Generic build?
I don't think so. It's not needed for x86 and was never finished for that platform.
-
Old versions, e.g. LE 9.2.8, should work.
-
GL is generally better for x86. Kodi's GL renderer has improved HQ upscaling compared to GLES. It also allows using high-quality BWDIF software deinterlacing when VAAPI hardware decoding is used, which is important for devices that lack advanced VAAPI deinterlacing capabilities (e.g. Intel Jasper Lake). In the GLES renderer this is broken/not implemented.
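If you want to see what the VAAPI driver on a given box actually offers, the video-processing filter caps can be queried through libva. Below is a minimal standalone probe, a sketch of mine rather than anything from Kodi; the render node path and the build command are assumptions, adjust them for your system.

/* probe-deint.c - list the deinterlacing methods a VAAPI driver exposes.
 * Assumed build command: gcc probe-deint.c -lva -lva-drm -o probe-deint */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <va/va.h>
#include <va/va_drm.h>
#include <va/va_vpp.h>

int main(void)
{
    /* Assumption: the GPU is the first render node. */
    int fd = open("/dev/dri/renderD128", O_RDWR);
    if (fd < 0) { perror("open render node"); return 1; }

    VADisplay dpy = vaGetDisplayDRM(fd);
    int major, minor;
    if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
        fprintf(stderr, "vaInitialize failed\n");
        return 1;
    }

    /* Filter caps have to be queried against a video-processing context. */
    VAConfigID cfg;
    VAContextID ctx;
    vaCreateConfig(dpy, VAProfileNone, VAEntrypointVideoProc, NULL, 0, &cfg);
    vaCreateContext(dpy, cfg, 0, 0, 0, NULL, 0, &ctx);

    VAProcFilterCapDeinterlacing caps[VAProcDeinterlacingCount];
    unsigned int num = VAProcDeinterlacingCount;
    vaQueryVideoProcFilterCaps(dpy, ctx, VAProcFilterDeinterlacing, caps, &num);

    /* Index matches the VAProcDeinterlacingType enum. */
    const char *names[] = { "none", "bob", "weave",
                            "motion-adaptive", "motion-compensated" };
    for (unsigned int i = 0; i < num; i++)
        printf("deinterlacer: %s\n", names[caps[i].type]);

    vaDestroyContext(dpy, ctx);
    vaDestroyConfig(dpy, cfg);
    vaTerminate(dpy);
    close(fd);
    return 0;
}

On hardware like Jasper Lake you would typically only see bob listed, which is exactly the case where falling back to BWDIF software deinterlacing (GL renderer only) matters.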
Five years ago LE had to switch to GLES because HDR passthrough is not possible when the x11 windowing system is used, so LE needed to go with Kodi-GBM, and back then it was only possible to build Kodi-GBM with GLES.
Since then there have been changes in Mesa that allow using the libglvnd GL library instead of the x11 GL libraries, and changes in Kodi's GL renderer that made HDR passthrough possible.
-
Generic images for LE10 and older, and all Generic-legacy images, use GL + x11.
LE11/12/13 use GLES + GBM.
It is currently possible to build a GL + GBM image (Generic-gl). LE14 will probably use that.
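For the curious, the core difference in a GL + GBM build is which client API gets bound on the EGL display that sits on top of GBM. A minimal sketch of that setup (assumed render node path and build command; this is not Kodi's real windowing code):

/* gl-on-gbm.c - create a desktop GL context on GBM, no x11 involved.
 * Assumed build command: gcc gl-on-gbm.c -lgbm -lEGL -o gl-on-gbm */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <gbm.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

int main(void)
{
    int fd = open("/dev/dri/renderD128", O_RDWR); /* assumption: first render node */
    struct gbm_device *gbm = gbm_create_device(fd);

    EGLDisplay dpy = eglGetPlatformDisplay(EGL_PLATFORM_GBM_KHR, gbm, NULL);
    eglInitialize(dpy, NULL, NULL);

    /* The decisive step: bind desktop OpenGL instead of OpenGL ES. With a
     * libglvnd-based Mesa this works without any x11 GL libraries. */
    if (!eglBindAPI(EGL_OPENGL_API)) {
        fprintf(stderr, "desktop GL not available on this EGL display\n");
        return 1;
    }

    static const EGLint cfg_attrs[] = {
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT, /* a GLES build would use EGL_OPENGL_ES2_BIT */
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint n = 0;
    eglChooseConfig(dpy, cfg_attrs, &cfg, 1, &n);

    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
    printf("GL-on-GBM context: %s\n", ctx == EGL_NO_CONTEXT ? "failed" : "created");

    eglTerminate(dpy);
    gbm_device_destroy(gbm);
    close(fd);
    return 0;
}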
-
even 4K 24 fps HDR videos play back at 4:4:4 8 bits
This means that for whatever reason it refuses to use high TMDS frequencies for HDMI. Also, the fact that you get black screens after switching inputs indicates a possible compatibility issue. Try testing without the AVR.
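To put numbers on the TMDS point (my own back-of-the-envelope math, not from any driver): 4K 24 fps at 4:4:4 8-bit needs only a 297 MHz TMDS clock, which fits under the old 340 MHz HDMI 1.4 limit, while higher bit depths or 50/60 Hz do not. The little calculator below uses the HDMI rule that extra bit depth scales the clock by bpc/8 and 4:2:0 halves it:

/* tmds.c - required TMDS character rate for common 4K formats.
 * 4:4:4 scales the pixel clock by bpc/8; 4:2:0 additionally halves it.
 * HDMI 1.4 tops out at 340 MHz, HDMI 2.0 at 600 MHz. */
#include <stdio.h>

static double tmds_mhz(double pixclk_mhz, int bpc, int is_420)
{
    double clk = is_420 ? pixclk_mhz / 2.0 : pixclk_mhz;
    return clk * bpc / 8.0;
}

int main(void)
{
    const double clk_4k24 = 297.0; /* CTA timing: 5500 x 2250 x 24 Hz */
    const double clk_4k60 = 594.0; /* CTA timing: 4400 x 2250 x 60 Hz */

    printf("4K24 4:4:4  8-bit: %6.2f MHz\n", tmds_mhz(clk_4k24,  8, 0)); /* 297.00 */
    printf("4K24 4:4:4 10-bit: %6.2f MHz\n", tmds_mhz(clk_4k24, 10, 0)); /* 371.25 */
    printf("4K60 4:4:4  8-bit: %6.2f MHz\n", tmds_mhz(clk_4k60,  8, 0)); /* 594.00 */
    printf("4K60 4:4:4 10-bit: %6.2f MHz\n", tmds_mhz(clk_4k60, 10, 0)); /* 742.50 */
    printf("4K60 4:2:0 12-bit: %6.2f MHz\n", tmds_mhz(clk_4k60, 12, 1)); /* 445.50 */
    return 0;
}

Getting 8-bit 4:4:4 at 24 fps is exactly what a link capped at 340 MHz would produce: 297 MHz fits, but 371.25 MHz for 10-bit does not. That is why testing without the AVR in the chain is worth doing.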
-
Is there a fix someone can think of or that is already known for the HDR being always in 8 bits on Intel Arc?
Diff:

diff --git a/intel_hdmi.c b/intel_hdmi.c
index ed29dd0..ef293a3 100644
--- a/drivers/gpu/drm/i915/display/intel_hdmi.c
+++ b/drivers/gpu/drm/i915/display/intel_hdmi.c
@@ -2060,7 +2060,7 @@ intel_hdmi_mode_valid(struct drm_connector *connector,
 	if (clock > 600000)
 		return MODE_CLOCK_HIGH;
 
-	ycbcr_420_only = drm_mode_is_420_only(&connector->display_info, mode);
+	ycbcr_420_only = drm_mode_is_420_only(&connector->display_info, mode) || clock > 500000;
 
 	if (ycbcr_420_only)
 		sink_format = INTEL_OUTPUT_FORMAT_YCBCR420;
@@ -2265,8 +2265,9 @@ static int intel_hdmi_compute_output_format(struct intel_encoder *encoder,
 	struct intel_connector *connector = to_intel_connector(conn_state->connector);
 	const struct drm_display_mode *adjusted_mode = &crtc_state->hw.adjusted_mode;
 	const struct drm_display_info *info = &connector->base.display_info;
-	bool ycbcr_420_only = drm_mode_is_420_only(info, adjusted_mode);
-	int ret;
+	int clock = adjusted_mode->clock;
+	bool ycbcr_420_only = drm_mode_is_420_only(info, adjusted_mode) || clock > 500000;
+	int ret;
 
 	crtc_state->sink_format =
 		intel_hdmi_sink_format(crtc_state, connector, ycbcr_420_only);
^ This kernel patch would force 4:2:0 12-bit for 4K 50/60 Hz. You'll have to build your own custom LE image to apply it.
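As for applying it: as far as I remember, the LE build system automatically applies any *.patch file it finds in the project's patch directory, so dropping this diff into projects/Generic/patches/linux/ in a LibreELEC.tv checkout before building should be enough. Treat that path as an assumption and double-check it against the current build documentation.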
-
my Beelink has HDMI 2.1 and will have the bandwidth for that if I am not mistaken?
Maybe it has something to do with this.
-
with 2 processors and 4 GB RAM
Fewer CPU threads = lower RAM requirements. On a 20-thread machine, 32 GB is barely enough.
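As a rough rule of thumb (my estimate, not a measured figure): budget around 1.5-2 GB of RAM per parallel compiler job. Two threads fit comfortably into 4 GB, while a -j20 build wants 30-40 GB, which is why 32 GB gets tight.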