Posts by mattlach

    In the last 2.5 years Team Kodi has been slowly and successfully driving the Linux codebase towards common standards (GBM/V4L2) and away from vendor-proprietary interfaces (VDPAU, Amcodec, iMX6, OMXplayer, etc.) so there is low interest in adding another nvidia vendor proprietary interface (NVDEC) to Kodi. It's probably not hard to do, but someone has to do it, and even if it's done it's unlikely to be accepted into the codebase.

    Ahh.

    My assumption was that since it is supported by ffmpeg, provided the correct driver is in place (Nvidia binary blob?) it would just be a matter of changing the command line options passed to ffmpeg.

    I'm probably horribly oversimplifying things though.

    I know my way around *nix systems, but I am not a programmer or a developer.
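For what it's worth, the flags ffmpeg itself takes are simple enough; here is a minimal sketch of the sort of command line involved ("-hwaccel nvdec" is a real ffmpeg option, but the helper function is purely illustrative, not anything Kodi does internally):

```python
# Illustrative only: the sort of command line that asks ffmpeg to use
# NVDEC for decoding. The helper function is hypothetical.
def nvdec_cmdline(infile):
    return [
        "ffmpeg",
        "-hwaccel", "nvdec",   # request NVDEC hardware decode
        "-i", infile,
        "-f", "null", "-",     # decode only, discard the output
    ]

print(" ".join(nvdec_cmdline("movie.mkv")))
```

Of course, getting Kodi's internal player to hand frames back and forth with the GPU is presumably where the real work is, which is probably why "ffmpeg supports it" doesn't translate directly into Kodi support.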

    ...or are they just a series of tradeoffs?

Conventional wisdom is that x86 has the raw power to keep the interface and everything else running smoothly, but unless you have a Gemini Lake or Gemini Lake Refresh, that means no HEVC 10bit HDR due to the limitations of older HDMI standards.

Some ARM boxes and single-board systems have HEVC 10bit HDR decoding, but it is difficult to keep track of what is actually working and what is temporarily broken and falling back to software decoding (which seems to be a lot of the Amlogic stuff at the moment, due to the move to new kernels). I don't know how people keep up.

If everything on it were decoding properly (not sure if it is), the Odroid N2+ looks like it would be pretty awesome: quite a powerful ARM chip with HEVC and HDR decoding support (once implemented).

Then there are the little Gemini Lake x86 boxes. They may use small Atom-based cores rather than the big Core cores, but they do have more advanced decoding.

Big x86 cores can probably brute-force most things, but then we are talking heat and fan noise...

    If you were shopping for a LibreElec box today, and wanted the best possible experience, price - within reason - not being a limiting factor, what would YOUR choice be?

Coming with the NVIDIA 450 Linux driver series, besides CUDA 11.0 RC compatibility, are:

    - HEVC 10/12-bit "decode only" support has been added to the VDPAU driver

    Well, that was certainly unexpected.

They don't happen to say what hardware will be required? It might make a DDR3 version of the GT 1030 a worthwhile investment for those with older Intel hardware who just want to add the newer decode support. They can't be too expensive used on eBay...
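For what it's worth, once a driver lands it should be possible to check what a given card actually exposes by running `vdpauinfo`; a hypothetical sketch that filters its decoder table (the sample output below is illustrative, not captured from a real card):

```python
# Hypothetical sketch: check whether a card's VDPAU driver reports
# HEVC 10-bit decode by filtering `vdpauinfo` decoder lines. The
# sample output below is illustrative, not from real hardware.
SAMPLE_VDPAUINFO = """\
Decoder capabilities:
H264_HIGH                      41  8192  2048  2048
HEVC_MAIN                     186  8192  4096  4096
HEVC_MAIN_10                  186  8192  4096  4096
"""

def supported_decoders(text):
    # Skip the header line; the first column is the decoder profile name.
    return [line.split()[0] for line in text.splitlines()[1:] if line.strip()]

print("HEVC_MAIN_10" in supported_decoders(SAMPLE_VDPAUINFO))
```

If a card's table lists an HEVC_MAIN_10 profile, the new 10-bit decode should in principle apply to it.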

    Hmm, the VDPAU announcement of 10/12-bit support is interesting. If it works on a broad range of cards it might result in a stay of execution. If it only works on the latest cards it might not be so appealing (our stats show nvidia is mostly legacy users). It might also require someone to tweak Kodi in some areas; and there's low desire to work on nvidia things among the current core Kodi devs.

    I wonder if there is a bit of a chicken and egg effect there.

The community has been saying for so long that VDPAU is dead and that Nvidia is stubborn and uncooperative, and suggesting Intel hardware instead, that I wonder if that has simply driven people to move away from Nvidia.

    If Nvidia support is coming back with this driver, maybe the userbase will come back too?

I wonder what it would take to add this to LibreElec. Just a drop-in Nvidia driver update?

I've also never quite understood the difficulty of adding NVDec, considering Kodi uses FFMPEG for playback and FFMPEG supports NVDec. But that may be a question for another thread...

Most of the team still regard x86_64 as the "true" Kodi experience; it is still the best for GUI (due to raw CPU performance) and will be the first to properly crack HDR support. That said, ARM devices are now approx 85% of our userbase (75% is RPi of some kind) and while ARM usually means there is some kind of compromise somewhere, they're increasingly capable devices.

    Good to know my investments in hardware are not completely wasted :p

I get frustrated though. An x86 CPU/motherboard/RAM setup lasts for many years these days. For HTPC use, what drives upgrades more than anything else is support for hardware-accelerated decode. You used to be able to get this just by buying a cheap low-end video card of the latest generation, but low-end video cards are rarer and rarer (the GeForce GT 720 was really the last good one).

Neither AMD nor Nvidia make true low-end GPUs anymore, and Nvidia's GPUs are pointless for this purpose due to the end of VDPAU and the lack of NVDec support.

    (And yes, I understand that is Nvidia's fault for pushing their stupid way of doing things, not the Kodi or LibreElec projects, but still, annoying)

So that means those of us on x86 are going to have to buy new motherboards and CPUs every few years when a new codec comes out, which is a shame.

    Maybe I'll transition the bedrooms to ARM when they next need an upgrade, but keep my main HTPC up to date on x86. That's really the only place I care about the flagship experience.

    Controversial topic maybe?

    I have been using Kodi in one way or another for several years. First my own builds on top of Ubuntu, later OpenElec, and then migrated to LibreElec.

For all that time I have used x86 hardware. First on my desktop, but later in purpose-built HTPCs.

My first dedicated build used on-board graphics: an AMD A10-7850K. Most things worked, but for whatever reason, when the cable signal quality was weak, the on-board GPU would crash the MythTV plugin of the time, so I moved on.

Next I tried an Intel Haswell era Celeron G1820. Again, back in 2014 the MythTV plugin kept crashing with the Intel on-board graphics as well. Then I tried adding a cheap Nvidia GeForce GT 720, and everything worked beautifully. I built three boxes with this configuration: the main one in the living room, one for the bedroom, and one for the guest room.

Then Nvidia discontinued VDPAU. In order to get newer H.265 decode, I needed to move away from Nvidia. I tried dropping all the G1820s back down to integrated graphics last year. It turns out that in the 5 years since I first tested, integrated graphics now works fine with the MythTV plugin. One of the motherboards didn't have an HDMI port, so for shits and giggles last year I took advantage of a MicroCenter combo deal and got a motherboard and a completely overkill Coffee Lake i5-9400 for my main home theater HTPC. I was shooting for something lower end, but all the small CPUs were sold out at the time. It has been pretty good, but I have paid the early adopter penalty on the hardware compatibility front, with occasional passed-through sound channels getting mixed up (center channel going to rear right, etc.), requiring reboots. Shame on me for forgetting that Linux rarely works right with new hardware :p

Anyway, this was just a long and roundabout way of getting to the point that I like x86 hardware. I've enjoyed building systems, but I am starting to wonder if it makes sense anymore, or if it ever really made sense in the last 5 years.

Is there anything I would be giving up (display quality, sound quality, playback capabilities, etc.) by going with - say - a Raspberry Pi 4 or some other cheap ARM-based media box compared to my x86 boxes? The next time I am forced to upgrade hardware (probably due to H.266/VVC), maybe I should just give up on all this stuff and transition to ARM?

    Appreciate any thoughts.

Assuming you've tried the typical things like cables and connections, I would start with trying any old external video card in the system to see if the issue is actually the embedded HDMI interface or not... any old card will do as long as it's got HDMI out, as the exercise is to see whether the issue goes away when using an external card... if it does, then you know it's actually the embedded interface on the motherboard...

    HDMI may be a digital interface but there are still ways of introducing noise into the signal streams its being used to transmit...


Motherboard quality these days wanders around quite a bit, and it's not uncommon to find boards that appear to work OK in most respects until at some point you find some small quirk that seems not right...

If it's proven to be the motherboard interface, it could just be a bad board (bad from a noise point of view) that otherwise functions fine, or maybe it just needs a vcc or timing tweak...

    Appreciate the suggestions.

Since I had a working setup without issues prior to the hardware upgrade I did not try swapping out the cables, but I have checked the connections.

Interesting that VCC may impact noise. I'm used to overclocking having unpredictable results, but this thing is running at stock...

    I will play around with it and see if anything changes.

    Hey all,

I have a setup in my living room where an HTPC running Kodi (on top of LibreElec) passes both video and audio via HDMI to my surround receiver, usually as either passthrough or multichannel PCM.

    For the longest time the system was running on an old dual core Haswell system, utilizing an Nvidia GPU for video output. When this was the case my system was working fine.

However, the douchebags over at Nvidia decided they were going to discontinue VDPAU (their video decode hardware acceleration API under Linux), and its replacement, NVDEC, has pissed off all of the open source projects, so they are refusing to use it. (AMD and Intel use VAAPI, but Nvidia sees a need to be proprietary and different, as they always do...)

So, with VDPAU discontinued and stagnating, it is not getting the latest features and formats anymore. At the same time, the Kodi and LibreElec devs are saying that on x86 they primarily develop on and for Intel integrated graphics these days, so I decided to switch things up.

A couple of months ago I picked up an ASRock B365M Pro4 motherboard, and because all the low-end CPUs for the platform were sold out for some reason, I wound up with an Intel i5-9400, total overkill for simple video playback duty.

It's a great little platform, but it introduced a new problem: audio over HDMI straight off the motherboard has noise problems at low volumes that were never there before. I usually can't hear it over loud scenes, but when nothing is playing it is super obvious, and it is sometimes audible during very quiet scenes.

    It's bugging the hell out of me.

    Since HDMI is a digital format, it shouldn't have any analog noise going out over it, which is why I am suspecting some sort of odd ground loop. I've been looking for HDMI ground loop isolators without success.

    Does anyone have any suggestions?

    Why Coffee Lake? A Gemini Lake ITX board would be a more obvious choice for a HTPC.

    Eww, Atom cores.

    That and I don't trust anything embedded.

    I want to be able to assemble everything myself.

    I understand that big desktop cores may be a little bit overkill for this application, but I like building things myself, I get frustrated with integrated stuff, and this way, if I change my mind, I can re-purpose the board elsewhere in one of my other projects.

It would just be a drop-in replacement, keeping the same drive, case, PSU, etc. I already have. The bundle deals at Microcenter are good enough that it's pretty cheap.

    I'll always choose something I can build myself over something pre-built, and I'll always choose big cores over Atom cores.

    That, and I bought this nice fanless all aluminum Streacom case years ago, that I just can't bring myself to stop using.

    That all depends on your definition of "fully".

    For example, HDR is still a work-in-progress AFAIK.

    Ah, fair.

    I don't care much about HDR yet. All of the screens in my house are still pre-HDR sets, so I don't need that feature just yet.

    As long as I can hook it up, get display output, audio passthrough and hardware decode for popular formats I'm very happy

By the time I upgrade the old Panasonic plasma in my living room to something with HDR, maybe they will have sorted the HDR side of things out.

    Well,

It turns out I had no choice but to stick with the GT 1030 for now. I never noticed this before, but apparently the Asus H81M-K motherboard in my HTPC does not have an HDMI port for the on-board GPU, only DVI and VGA, and without HDMI I can't get sound in my setup.

    I guess a motherboard and CPU upgrade is in my future, but for now I'm staying on the Nvidia GPU.

    I guess it was too difficult for Georgia Tech to indicate which infection(s) specifically was encountered?

    Or do they block the download of every file unknown to them?

    Your guess is as good as mine...

My best guess is that one of the mirrors for the project resides on a gatech server, and it got caught up in some overly zealous gatech IT department security program.

    I was eventually able to get the system to pick a different mirror for me, and download the file.

    I've been trying to download the latest version of LibreElec to upgrade the old version on my HTPC, but the download keeps failing. Instead of downloading the file, it just brings up this error message:

I'll point out that I am not now, nor have I ever been, at Georgia Tech, so I don't know what this message is all about.

    I'm guessing something isn't configured quite right on the hosting end. Maybe a bad mirror?

    --Matt

    Hey all,

So, I have a box with LibreElec 8.2.5 on it that I have been meaning to upgrade to the latest release for a while but just haven't gotten around to.

    All of a sudden today it is no longer booting (failing to start Xorg for some strange reason, considering nothing has changed). I'm going to guess the drive got corrupted or something.

    I figured its time to install the newer version anyway, so no great loss.

    My question is this. I understand VDPAU is no longer being developed.

I have been running my Kodi boxes on Nvidia GPUs (a GT 1030 in the main living room, and GT 720s in the master bedroom and guest room) for some time now, as the old MythTV plugin used to constantly crash on the Intel IGP when I originally set things up 5 years ago.

    I don't think the crashing problem with Intel IGP is still there, so my question is, as I reinstall LibreElec, am I better off leaving the GT1030 in the machine, or pulling it and using the Haswell IGP on the i5-4570T instead?

With the GT 1030 and VDPAU I had at least some limited 8-bit HEVC decode capability, but I have no idea what the capabilities of these older Haswell graphics chips are, and googling has just led to confusion.

The Wikipedia page on Intel Quick Sync groups Ivy Bridge and Haswell together, stating that they only decode MPEG-2 and AVC, but then I have found other stories suggesting that partial VP9 and full 8- and 10-bit decode was added via a driver update in 2015. It's unclear what capability is present in the Linux drivers.
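One way to check what the Linux driver actually exposes is to run `vainfo` on the box and look for decode entrypoints; here is a hypothetical sketch that filters its output (the sample text below is illustrative, not captured from real Haswell hardware):

```python
# Hypothetical sketch: filter `vainfo` output for hardware decode
# entrypoints (VAEntrypointVLD). The sample text is illustrative,
# not from real hardware.
SAMPLE_VAINFO = """\
      VAProfileMPEG2Main              : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSlice
"""

def decode_profiles(vainfo_text):
    profiles = []
    for line in vainfo_text.splitlines():
        if "VAEntrypointVLD" in line:   # VLD = full hardware decode
            profiles.append(line.split(":")[0].strip())
    return profiles

print(decode_profiles(SAMPLE_VAINFO))
```

Whatever profiles show up with a VLD entrypoint on the actual machine should be what the Linux driver can decode in hardware, regardless of what the Wikipedia tables say.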

    If anyone can help un-muddy the waters it would be greatly appreciated.

    There is some interesting discussion about NVDEC and how it is supported in FFMPEG now over here.

    Looks like it would be the way to go, but FernetMenta doesn't seem to have much interest, which is a shame.

    You'd think with support in FFMPEG, most of the heavy lifting would already have been done...

I can't speak for the Kodi Android repo, but the pvr.mythtv frontend is in both our 8.2 and 9.0 repos for a number of ARM devices.

    Good to know, thank you.

This definitely wasn't the case when I last built all of my HTPC boxes (two Haswell Celeron G1840s with GT 720s for the bedroom and guest room, and a Haswell i5-4570, recently upgraded with a GT 1030, in my living room).

The GT 1030 is installed and running with the latest stable LibreElec now. It does quite well with 8-bit HEVC but, as mentioned, does not help with 10-bit. It looks like a lot of 4K Blu-rays are compressed with 10-bit HEVC, which is kind of a bummer.