
    Hi everyone.

    The Intel HD Graphics 500 and 600 series seem able to decode HEVC and generate 4K images.
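
    For what it's worth, on Linux you can at least check the decode side with VA-API. Below is just a minimal sketch, assuming libva is installed and the iGPU's render node is at /dev/dri/renderD128 (adjust for your machine); it only tells you whether the driver advertises HEVC Main 10 decode, not whether the display path can actually push 10-bit HDR out.

        /* Minimal sketch: does the iGPU advertise HEVC Main 10 decode via VA-API?
         * Assumptions: Linux, libva + libva-drm installed, render node at
         * /dev/dri/renderD128. Build with: gcc check_hevc10.c -lva -lva-drm */
        #include <fcntl.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>
        #include <va/va.h>
        #include <va/va_drm.h>

        int main(void)
        {
            int fd = open("/dev/dri/renderD128", O_RDWR);
            if (fd < 0) { perror("open render node"); return 1; }

            VADisplay dpy = vaGetDisplayDRM(fd);
            int major, minor;
            if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
                fprintf(stderr, "vaInitialize failed\n");
                return 1;
            }

            /* Ask the driver which codec profiles it exposes. */
            int max = vaMaxNumProfiles(dpy);
            VAProfile *profiles = malloc(max * sizeof(*profiles));
            int num = 0;
            vaQueryConfigProfiles(dpy, profiles, &num);

            int found = 0;
            for (int i = 0; i < num; i++)
                if (profiles[i] == VAProfileHEVCMain10)
                    found = 1;

            printf("HEVC Main 10 decode: %s\n", found ? "advertised" : "not advertised");

            free(profiles);
            vaTerminate(dpy);
            close(fd);
            return 0;
        }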

    But I have serious doubts about how many of them are truly able to deliver 10-bit HDR through their display interfaces.

    Intel only seems to guarantee HEVC / 10-bit / 4K / HDR on their HD 620, and only when DisplayPort 1.3 / HDMI 2.0b is correctly implemented, so probably not all NUCs, motherboards, etc. support the full feature set in their firmware specs...
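
    On the output side, a rough way to see what the link is at least allowed to do: recent Linux kernels expose a "max bpc" property on each DRM connector. Again just a hedged sketch with libdrm, assuming the card node is /dev/dri/card0 and the i915 driver is new enough to expose that property; seeing 10 or more there doesn't prove HDR works end to end, but being capped at 8 pretty much rules it out.

        /* Minimal sketch: print the "max bpc" property of every DRM connector.
         * Assumptions: Linux, libdrm installed, card node at /dev/dri/card0,
         * a driver recent enough to expose "max bpc".
         * Build with: gcc max_bpc.c $(pkg-config --cflags --libs libdrm) */
        #include <fcntl.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>
        #include <xf86drm.h>
        #include <xf86drmMode.h>

        int main(void)
        {
            int fd = open("/dev/dri/card0", O_RDWR);
            if (fd < 0) { perror("open card node"); return 1; }

            drmModeRes *res = drmModeGetResources(fd);
            if (!res) { fprintf(stderr, "drmModeGetResources failed\n"); return 1; }

            for (int i = 0; i < res->count_connectors; i++) {
                drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
                if (!conn)
                    continue;

                /* Walk this connector's properties and report "max bpc" if present. */
                drmModeObjectProperties *props =
                    drmModeObjectGetProperties(fd, conn->connector_id,
                                               DRM_MODE_OBJECT_CONNECTOR);
                for (uint32_t j = 0; props && j < props->count_props; j++) {
                    drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[j]);
                    if (prop && strcmp(prop->name, "max bpc") == 0)
                        printf("connector %u: max bpc = %llu\n",
                               conn->connector_id,
                               (unsigned long long)props->prop_values[j]);
                    drmModeFreeProperty(prop);
                }
                drmModeFreeObjectProperties(props);
                drmModeFreeConnector(conn);
            }

            drmModeFreeResources(res);
            close(fd);
            return 0;
        }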

    Is there any list of "confirmed platforms" able to deliver 10-bit HDR based entirely on Intel integrated graphics?

    Are there, on the contrary, any known combinations/platforms that can't deliver true 10-bit HDR?

    Maybe the question sounds a bit stupid, but I'm truly confused about this issue.

    :cool:

    Thanks in advance!