Kodi Community Forum

Full Version: I'm so confused. Minix U1 or Nvidia Shield?
I currently run Kodi on my Minix X8-H Plus and it has stood me in good stead for the past 1½ years. But I recently purchased a 4K TV (the Hisense 55H9B2, marketed in Europe as 55K720). To take full advantage of it, I intend to upgrade my box (probably remaining with android). The specs of the Nvidia Shield Android TV are more impressive than the Minix U1 and there are rave reviews (in this forum and elsewhere) ... but I've been reading of some problems with the Nvidia SATV. For example, comments on the GeForce forum such as this excerpt:
Quote:I think the Nvidia Shield TV picture quality, in a purely video sense (i.e Kodi, Netflix, Youtube, iplayer etc) is garbage. My Minix plus is far an away the better device & alot cheaper.... Kodi seems so basic on this box. Jarvis is not even native 4K now, downscales to 1080p....
FYI: My new TV has native Netflix 4K capability and I'm not into gaming. For me, such boxes are almost entirely for Kodi. [BTW, in addition to the Minix U1 (based on the Amlogic S905 chip), I'm also considering waiting for an S912-based box.]

Your thoughts, anyone? Thanks!
I'd go w/one of these:

- [Good] Minix U1. But, it has issues playing back certain 10-bit videos
- [TBD] Upcoming WeTek Hub, which will have an updated chip to resolve the above issue
- [Best] nVidia Shield. HDR support should be added in future firmware update
- [Best] HiMedia Q10 Pro. Plays 10-bit, but HDR needs a fix. Also supports HDR-competitor DolbyVision

Whoever posted that the Shield is "garbage" for video is mistaken.

All above just my opinions of course Smile.
Thanks for your input, hdmkv!

- The HiMedia Q10 Pro is outside my price range. [I'm seeing $300 listings on eBay, shipped from China. I doubt it will be less whenever it reaches Amazon.]

- I'll keep my eye out for more information on that upcoming WeTek Hub. Most of the specs seem to be slightly better than the Minix U1, but its 1 GB RAM is a turnoff. [The current WeTek Core does not impress me. The specs are no better than my current box.]

- The price of the Nvidia Shield is on the high side for someone like me who isn't into gaming. But I'll pay it, if indeed it's all it's cracked up to be. And I respect that you say it is. I hope you realize, however, that (since I don't know you) I need to pay some attention to the nay-sayers as well. Not just the "garbage" man, but there are others also who complain about the Shield's picture quality with 4K and 1080p video.

- I think I may hold off on the Minix U1. My Minix X8-H Plus plays 1080p (and lower-resolution) content just fine. And there ain't a lot of 4K content yet (beyond Netflix, which my TV itself has).

Thanks again, hdmkv!
A few facts:

- The nVIDIA Shield did have colorspace issues with Full vs Limited range color output; I believe this has since been fixed in a firmware update, and it is now user selectable. What is completely incorrect in that quoted statement is the overall comparison: you cannot get the Shield's far superior 4K/1080p Netflix or VP9 YouTube video playback out of the MINIX at all.
Personally, I would not buy a Shield if I needed quality deinterlacing for broadcast TV; there are too many users complaining about deinterlacing issues currently here on the forum.
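For anyone unsure what that Full vs Limited range setting actually does, here's a quick illustrative sketch. The 16-235 mapping is the standard "video levels" convention for 8-bit video; the function names are just for illustration, not anything from the Shield firmware:

```python
# Illustration of Full vs Limited range video levels (the setting the Shield
# firmware update reportedly made user-selectable). Full range uses 0-255;
# limited ("video") range maps black to 16 and white to 235. If the box and
# the TV disagree on this setting, you get crushed blacks or washed-out greys.

def full_to_limited(v: int) -> int:
    """Map a full-range 8-bit value (0-255) to limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Inverse mapping; values outside 16-235 are clipped first."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))   # black, white -> 16 235
print(limited_to_full(16), limited_to_full(235))  # -> 0 255
```

The mismatch symptoms follow directly: a full-range signal displayed as limited loses everything below 16 (crushed shadows), while a limited-range signal displayed as full never reaches true black or peak white.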

- Android Lollipop and Marshmallow are far and away better OSes than the old Android KitKat. There are OS runtime optimisations, better memory management, better garbage collection; the list goes on. A WeTek Core with the same AML S812 running Lollipop will be quicker than a MINIX X8-H Plus running KitKat, due to these improvements alone. And I have not even mentioned the HD audio passthrough now available on the Core.

- I happen to have a number of AML testboards and devices, and if you are using them for Kodi media playback only then 1GB of memory is just fine, especially if you are running OpenELEC or LibreELEC, which do not need much memory at all.
Android Lollipop memory management takes care of any memory issues if you are hitting the limit when using that OS.

- The new AML S905 is slightly faster and a tiny bit smoother than the older S812, so long as they are both running Android Lollipop; it's the Android Lollipop OS that makes the difference. I'm testing an S905 WeTek Hub at the moment, so I am able to give direct back-to-back comparisons. I'm also developing the lean and mean LibreELEC on the S905 ODROID C2, which is faster again than the Android Lollipop OS.

- Kodi does NOT need 2GB when running on Android Lollipop; you are, after all, only streaming video, not running a full-blown PC. I have 250MB free when running Kodi on the 1GB WeTek Hub test box. There are NIL performance issues with 1GB of memory. Decent firmware is very important too.
For semi-decent 3D gaming, then yes, 2GB or more would probably be prudent for all those textures.

- An interesting S905 vs S812 comparison. There really is not a lot of difference between them, performance wise.

- Yes, bugger all 4K content available apart from 10-bit HDR test clips and Netflix and Amazon streaming. UHD Blu-ray copy protection needs cracking first to even rip your own discs for home use. You may be able to hardware-decode 10-bit content with a new box, but by the time proper home ripping is sorted out there will be new hardware that is cheaper, more mature and more bug-free than what is currently available. Your new TV will decode 10-bit content with its built-in media player too.
(2016-04-19, 04:54)wrxtasy Wrote: [ -> ]- An interesting S905 vs S812 comparison. There really is not a lot of difference between them, performance wise.

Whilst you are right about general computing performance being similar between the two, there is one quite major difference though - which may be particularly relevant to Kodi users.

The S812 is limited to HDMI 1.4b, so it maxes out at 3840x2160/30p (i.e. it can output 4K only at 'film' frame rates), whilst the S905 has HDMI 2.0, so it can deliver 3840x2160/50p and 59.94p (i.e. output 4K at 'video' frame rates). I believe that the H265/HEVC decoders are also similarly limited, so the S812 can't play 2160/50p or 59.94p content even if you are running with 1080p output.

This is a reasonably significant issue, as one likely source of UHD 3840x2160 (aka 4K) content is broadcast, which will use 50p/59.94p. Sport, entertainment etc. run at 50p or 59.94p, and sport will be a big driver for UHD.
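To make the HDMI 1.4b vs 2.0 limitation concrete, here's a back-of-the-envelope check using the standard 4K video timing (4400 x 2250 total pixels per frame, including blanking) against the published maximum TMDS clocks. Treat the exact constants as my own assumptions rather than anything from the chip datasheets:

```python
# Rough sketch of why an HDMI 1.4b chip (like the S812) tops out at 2160/30p
# while an HDMI 2.0 chip (like the S905) can do 2160/60p. The pixel clock for
# 3840x2160 is total_pixels_per_frame * frame_rate, using standard 4K timing.

TOTAL_H, TOTAL_V = 4400, 2250        # total pixels incl. blanking for 3840x2160
HDMI_14_MAX_HZ = 340_000_000         # HDMI 1.4b max TMDS clock (~340 MHz)
HDMI_20_MAX_HZ = 600_000_000         # HDMI 2.0 max TMDS clock (600 MHz)

def pixel_clock_hz(fps: float) -> float:
    """Pixel clock needed for 3840x2160 at the given frame rate."""
    return TOTAL_H * TOTAL_V * fps

for fps in (24, 30, 50, 60):
    clk = pixel_clock_hz(fps)
    print(f"2160/{fps}p needs {clk / 1e6:.0f} MHz -> "
          f"HDMI 1.4: {'OK' if clk <= HDMI_14_MAX_HZ else 'no'}, "
          f"HDMI 2.0: {'OK' if clk <= HDMI_20_MAX_HZ else 'no'}")
```

2160/30p needs 297 MHz, which fits under HDMI 1.4b's cap, while 2160/50p and 60p need roughly 495-594 MHz, which only HDMI 2.0 can carry; that is exactly the 'film' vs 'video' frame-rate split described above.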

This is from the end of the CNX article you quote above.
Quote:The main advantage of Amlogic S905 over S812 is support for HDMI 2.0 ports allowing 2160p @ 60 Hz video output, and 4K H.265 hardware video decoding up to 60 fps, while both are limited to 30 Hz on S812.
Thank you for those facts, wrxtasy, and for your clarifications, noggin!

For my purposes, I feel no need to examine further the respective merits of the Minix X8-H Plus and WeTek Core. Since I have one of them, I have no interest in buying the other.

As for the forthcoming WeTek Hub: Although I'm not a technical person, for my own purchases I need to apply my own common sense to the technical information I gather. It just doesn't make sense to me that – all other things being equal – 1GB RAM would be as effective as 2GB or 3GB RAM (e.g., in buffering streaming videos). Since Minix is such a known (and beloved) quantity to me, I'd rather pay $130 for the Minix U1 than $300 for the WeTek Hub.

Regarding the 3.0 update of the Nvidia Shield: I have read many complaints about problems. Many folks seem to be reverting to the original firmware (and thereby losing that Full/Limited RGB selectability). I'm not yet clear whether the 3.1 update eliminates those problems.

Indeed, I think I'll sit tight with my current very effective device until Minix comes out with a box based on the S912 chip. If I were just starting out, I'd get the U1 (and I do recommend it to others), but it doesn't seem superior enough over the X8-H Plus for me to spend $130 of my very limited budget on it. [NOTE: If I were into gaming, I certainly would pay $200 for the Nvidia Shield and cross my fingers that it would handle my Kodi streaming needs okay. Koying of SPMC obviously likes it.]
(2016-04-19, 17:23)Don Grimme Wrote: [ -> ]Regarding the 3.0 update of the Nvidia Shield: I have read many complaints about problems. Many folks seem to be reverting to the original firmware (and thereby losing that Full/Limited RGB selectability). I'm not yet clear whether the 3.1 update eliminates those problems.
I don't know how exactly you will be using the Shield or another device, but based on your posts on this forum, Shield, Minix forums, I can say that the issues people were reporting with Shield update 3.0 are less likely to affect your Kodi usage. The only real concern that I have with you getting the Shield is compatibility with your Hisense TV. There is a known issue with 4K 60Hz mode with Vizio TVs and another similar issue with Philips/2016 LG OLED TVs. I have not seen anything about Hisense TVs. You may be the first one to test it out.
(2016-04-19, 18:07)wesk05 Wrote: [ -> ]I don't know how exactly you will be using the Shield or another device, but based on your posts on this forum, Shield, Minix forums, I can say that the issues people were reporting with Shield update 3.0 are less likely to affect your Kodi usage. The only real concern that I have with you getting the Shield is compatibility with your Hisense TV. There is a known issue with 4K 60Hz mode with Vizio TVs and another similar issue with Philips/2016 LG OLED TVs. I have not seen anything about Hisense TVs. You may be the first one to test it out.
If you've seen my posts on those three different forums, you are very well-read, wesk05! Yes, all I use these boxes for is Kodi. [My TV has native 4K Netflix capability and I'm thrilled with how it looks.] I'm going to have to take a closer look at those reported problems with the Nvidia Shield 3.0 update (and the 3.1 update) to confirm your insight.

As for Hisense 55H9B2 compatibility: as I understand it, it has to do with whether or not that TV's two HDMI 2.0 ports are HDCP 2.2. I've read conflicting claims. And I'm not sure how to determine what it is on my unit ... short of spending $200 on the Shield and plugging it into the TV.
I have 2x Shield TV (a 16GB and a 500GB) and no issues w/3.0 or latest 3.1. Some users complaining also tend to be the most vocal ones. BTW, I have the Vizio @wesk05 mentioned, even updated to latest firmware, and no issues.
(2016-04-19, 19:13)Don Grimme Wrote: [ -> ]If you've seen my posts on those three different forums, you are very well-read, wesk05! Yes, all I use these boxes for is Kodi. [My TV has native 4K Netflix capability and I'm thrilled with how it looks.] I'm going to have to take a closer look at those reported problems with the Nvidia Shield 3.0 update (and the 3.1 update) to confirm your insight.

As for Hisense 55H9B2 compatibility: as I understand it, it has to do with whether or not that TV's two HDMI 2.0 ports are HDCP 2.2. I've read conflicting claims. And I'm not sure how to determine what it is on my unit ... short of spending $200 on the Shield and plugging it into the TV.
I have replied to your post on those forums! If you are going to rely on your TV's apps for 4K streaming, you really don't need HDCP 2.2. The one clear advantage the Shield will have over the Minix U1 is that, if and when it is updated with HDMI 2.0a/HDR10 support, you will be able to play such videos from the Shield. The Minix U1 will probably never get that update; you will have to wait for the S912 boxes for that feature.
I rely on the TV's apps for Netflix 4K streaming. But I plowed through all of Netflix's 4K content during my first free month and have let my subscription lapse. [Yes, I watch a lot of television. Wink ] Virtually all of my TV watching is with Kodi ... although I'll probably switch to SPMC at some point. There's not much 4K content available yet, but I would want any new device to be capable of it (full 4K, 60Hz ... although not necessarily HDR, since my TV is not HDR). Otherwise, what I would look for is outstanding 1080p performance (and compatibility with my 55H9B2 TV, of course). Since my current box does deliver good 1080p with my TV, I think I'll stick with that for a while.

Thanks so much for all your advice, wesk05 ... here and elsewhere! And thank you, hdmkv, for your reassurance about the Nvidia Shield. The question about its compatibility with my Hisense TV remains.
I don't think you'll have an issue w/Shield with your Hisense. Only other streamers that can do 4K Netflix are Fire TV (gen2) and Roku4 I think. But, Shield also has 4K via YouTube and UltraFlix. And, unfortunately only Fire TV and Roku streamers have Amazon VOD/Prime in 4K.
Yeah, I know about the restrictions of 4K Amazon Prime ... either with my TV or with most boxes. Too bad, because I do have an Amazon Prime account (for other purposes). But I'm not going to get a Fire TV box just to play those few 4K videos. Its lack of gigabit ethernet makes it slower than the Minix U1 and Nvidia Shield (and even my X8-H Plus) for Kodi/SPMC streaming. If I had 100 bucks to throw around, I suppose I could use it as a supplemental box plugged into the second HDMI 2.0 port ... but I've not heard of any significant increase planned for social security benefits. Big Grin

I appreciate your reassurance about the Shield with my Hisense 55H9B2, hdmkv, but you don't say that you have experience with it. Actually, I'm not sure anyone has ... or, at least, has written about it. That model is not ubiquitous. A price I must pay for buying a lesser-known brand/model. But I did get a lot for my money ($750) ... and it's working just fine with 1080p Kodi on my X8-H Plus and its own 4K Netflix. [BTW, I like its curve. I sit directly in front of the TV (so there's no degradation) and the curved look is pleasing aesthetically, whether or not it is actually more immersive.]

I'm pretty much persuaded by the discussion here that the Shield works very well with SPMC. But whether it would work well with my particular TV remains to be seen.

Thanks, all!
I'm obviously not sure the Shield and your Hisense will play nice, but as both are HDMI 2.0a/HDCP 2.2, I believe they should. Also, no HDR nor 10-bit in the mix, right? Seems similar to my 'budget' Vizio D65u-D2 4K. I use the sole 4K@60 input on it with my Shield and it works well with all 4K content I've tested to date... mostly ones I've collected in the Kodi a/v samples wiki here.
Actually, I just received an answer to my question about Nvidia Shield compatibility that I posted on the Amazon page for the 55H9B2: "YES! The picture is nice, almost live. I love it."

What the hell! When my monthly SSA deposit is made tomorrow, I just may spend the 200 bucks for the Shield. Who knows! I might even get into gaming. Cool [In the past, I was an avid player of computer and PS2 RPGs.]

Thanks again, everyone!