Intel NUC - Skylake (6th Generation CPU)
#1
Dimensions (W x H x D): 115 x 34/49 x 111 mm
Quote:Barebone, WLAN, SDXC, Core™ i3-6100U: Intel® NUC Kit NUC6i3SYH/K290, Review,
Barebone, WLAN, SDXC, Core™ i5-6260U: Intel® NUC Kit NUC6i5SYH/K390
System-SSM (6Gb/s): Plextor M6G 128GB €70 / Crucial MX200 250GB/500GB €100/175
System-SSM (10Gb/s): Plextor M6E-2280 128GB/256GB/512GB €100/190/340
RAM: SO-DIMM Kit 8GB €40

Option:
Data-HDD (2.5", 24/7): Western Digital Red 750GB / 1TB / 2TB €50 / €65 / €90
Remote Control: Xbox 360 Media Remote $25
#2
Probably a good idea to make a new thread for it. As has been mentioned, 1 x mini HDMI 1.4a is not very reassuring, particularly since the roadmap shows no difference between Rock Canyon and Swift Canyon with regard to 4K support and still only mentions 4K support over miniDP.
#3
That miniHDMI version number only appears in the text of the related article, not anywhere on the actual Intel roadmap slide, so the claim that it's HDMI 1.4a could well be wrong. Still, it indeed doesn't look good that 4K is only mentioned for the miniDP connection.
#4
As posted in the other thread, maybe the lack of HDMI 2.0 is due to this.

Intel often makes strange decisions, like no USB 3.0 for the P55 chipset.
#5
No HDMI 2.0 until 2017 makes this really disappointing and NOT a UHD-proof HTPC.

If it did include it, I'm sure they'd mention it next to the mHDMI.
#6
DisplayPort to HDMI 2.0 solution?
#7
(2015-06-03, 01:06)myst4ry Wrote: No HDMI 2.0 until 2017 makes this really disappointing and NOT a UHD-proof HTPC.

If it did include it, I'm sure they'd mention it next to the mHDMI.

I too believe they'd mention it if it were 2.0, but I guess we'll see.

(2015-06-03, 01:41)onizuka Wrote: DisplayPort to HDMI 2.0 solution?

Once I've seen reports of one confirmed working without issues, maybe, but that doesn't make me buy into Skylake. As it is, I'm now more likely to wait for the next generation and its specs.

Now, if all the TV manufacturers would just start bringing out TVs with DisplayPort, and actually make use of the technology, that would be a different matter. I'd jump on Sony's current UHD range with DisplayPort added.
#8
Anandtech's Skylake review is out: http://www.anandtech.com/show/9483/intel...neration/4

It will support hardware encode/decode for the HEVC Main (8-bit) profile, but there is no hardware support for Main10 (10-bit).
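For anyone on Linux who wants to verify what their own driver actually exposes, here's a minimal sketch (assuming a VA-API stack with the vainfo tool from libva-utils installed; the profile names are the standard VA-API ones):

Code:
import subprocess

# Ask vainfo (libva-utils) for the advertised profile/entrypoint list.
out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
lines = out.splitlines()

# "VAProfileHEVCMain10" contains "VAProfileHEVCMain" as a substring,
# so exclude it explicitly when testing for the 8-bit profile.
main10 = any("VAProfileHEVCMain10" in ln for ln in lines)
main8 = any("VAProfileHEVCMain" in ln and "Main10" not in ln for ln in lines)

print("HEVC Main (8-bit) profile advertised: ", main8)
print("HEVC Main10 (10-bit) profile advertised:", main10)

On Skylake you'd expect the first to come back True and the second False, matching the review.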

Regarding video outputs and supported resolutions/framerates:
[Image: table of video outputs and supported resolutions/framerates from the AnandTech review]
#9
This is a strange play by Intel, as the Nvidia X1 supports 10-bit HEVC 4K decode.
So if you can live with Android, it's a better choice for a media player. Of course, if you need other functions that only a PC can give you, then you're stuck.
#10
So unless there's a DP 1.2 to HDMI 2.0 converter on the mainboard, this version of the NUC will just bring HEVC decoding. That'd be slightly disappointing, but then again Intel has been upgrading the NUCs one small step at a time...

I'd like to see USB-C in addition to HDMI 2.0, but I guess I'll need to wait for Kaby Lake, then.
#11
Intel (and AMD) are lagging behind in their HEVC implementations... Even VP9 seems to be supported only in a hybrid manner, while the Tegra X1 has full hardware support for it.

If encoding HEVC is not a requirement, I would rather opt for an Intel Braswell box instead of Skylake. Braswell hardware-decodes HEVC 8-bit and it's surely cheaper. Having said that, Skylake does offer a chance of supporting 4K@60Hz via a DP to HDMI 2.0 converter, while Braswell can't go beyond 4K@30Hz since it uses DP 1.1 (which can be a problem for live/recorded TV); the rough bandwidth math below shows why.
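To put rough numbers on that, here's a back-of-the-envelope sketch (my assumptions: 24 bpp RGB and no blanking overhead, so real-world requirements are somewhat higher):

Code:
# Approximate uncompressed video bandwidth vs. DisplayPort data rates.
def video_gbps(width, height, fps, bpp=24):
    return width * height * fps * bpp / 1e9

DP11_GBPS = 8.64    # DP 1.1: 4 lanes @ 2.7 Gb/s, after 8b/10b coding
DP12_GBPS = 17.28   # DP 1.2 (HBR2) doubles that

for fps in (30, 60):
    need = video_gbps(3840, 2160, fps)
    print(f"4K@{fps}Hz: ~{need:.1f} Gb/s -> "
          f"fits DP 1.1: {need <= DP11_GBPS}, fits DP 1.2: {need <= DP12_GBPS}")

4K@30Hz (~6 Gb/s) squeezes into DP 1.1's 8.64 Gb/s, while 4K@60Hz (~12 Gb/s) needs DP 1.2, which lines up with the limits above.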
#12
(2015-08-06, 11:40)Nekromantik Wrote: This is a strange play by Intel, as the Nvidia X1 supports 10-bit HEVC 4K decode.
So if you can live with Android, it's a better choice for a media player.

Personally, I've had so many Android devices end up worthless because there were no more updates for them that I do hold certain reservations when it comes to these devices. With standard PC hardware you know you can always install the latest kernels, and the user and developer base is huge, so if you run into problems they can be easier to troubleshoot.

In the PC world, the Nvidia GTX 960 is good (10-bit HEVC decoding, HDMI 2.0) - I just installed one in my desktop PC, but I don't want a big PC with a big graphics adapter in my living room (the GTX 960 alone pulls more watts than my NUC). Plus it's certainly not a cheap card.

I guess it'll still take some time before these things arrive in a NUC-size PC.
#13
Quote:In the PC world, the Nvidia GTX 960 is good (10-bit HEVC decoding, HDMI 2.0) - I just installed one in my desktop PC, but I don't want a big PC with a big graphics adapter in my living room (the GTX 960 alone pulls more watts than my NUC). Plus it's certainly not a cheap card.

You can feed in 10-bit HEVC, but the hardware transforms it to 8-bit and then decodes and outputs 8-bit. Don't get fooled by this 10-bit stuff - I don't know of _any_ hw decoder yet that really decodes the 10-bit directly.
First decide what functions/features you expect from a system, then decide on the hardware. Don't waste your money on crap.
#14
(2015-08-06, 13:20)fritsch Wrote: You can feed in 10-bit HEVC, but the hardware transforms it to 8-bit and then decodes and outputs 8-bit. Don't get fooled by this 10-bit stuff - I don't know of _any_ hw decoder yet that really decodes the 10-bit directly.

Indeed, but that's better than not decoding the 10-bit HEVC feed at all. I think none of the Nvidia consumer graphics adapters support 10-bit output anyhow (let alone most displays yet). I think a lot of anime content gets encoded in the 10-bit format.
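On that note, before blaming the decoder it's worth checking whether a given file really is 10-bit. A quick sketch (assuming ffprobe from FFmpeg is installed; "sample.mkv" is just a placeholder filename):

Code:
import json, subprocess

def video_info(path):
    # Ask ffprobe (FFmpeg) for the first video stream, as JSON.
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True).stdout
    stream = json.loads(out)["streams"][0]
    return stream["codec_name"], stream.get("pix_fmt")

# ("hevc", "yuv420p10le") means 10-bit Main10 content;
# ("hevc", "yuv420p") is plain 8-bit Main.
print(video_info("sample.mkv"))  # placeholder filename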
#15
(2015-08-06, 13:20)fritsch Wrote:
Quote:In the PC world, the Nvidia GTX 960 is good (10-bit HEVC decoding, HDMI 2.0) - I just installed one in my desktop PC, but I don't want a big PC with a big graphics adapter in my living room (the GTX 960 alone pulls more watts than my NUC). Plus it's certainly not a cheap card.
You can feed in 10-bit HEVC, but the hardware transforms it to 8-bit and then decodes and outputs 8-bit. Don't get fooled by this 10-bit stuff - I don't know of _any_ hw decoder yet that really decodes the 10-bit directly.

Does this also apply to the Tegra X1?
