Intel Apollo Lake
#16
(2016-04-21, 21:25)fritsch Wrote: What I meant above was: there is a whole lot of Android hardware out there ... with pretty impressive data sheets. But with the current level of Kodi support for it, it's just a paperweight.

Intel-wise I'm waiting for Broxton, and that's the one I have most likely confused with the chip above. Broxton has HEVC 10-bit support.

Guess we won't be waiting for Broxton any longer

http://www.anandtech.com/show/10288/inte...-cancelled
#17
(2016-04-21, 21:25)fritsch Wrote: What I meant above was: there is a whole lot of Android hardware out there ... with pretty impressive data sheets. But with the current level of Kodi support for it, it's just a paperweight.

Intel-wise I'm waiting for Broxton, and that's the one I have most likely confused with the chip above. Broxton has HEVC 10-bit support.

Looks like we'll be waiting for Apollo Lake instead. Broxton and SoFIA have both been canned.
#18
Hope they don't can Kaby Lake.
[H]i-[d]eft [M]edia [K]een [V]ideosaurus
My Family Room Theater
#19
Intel must really be feeling the pain from the explosion of excellent bang-for-the-buck ARM SoCs.

I mean, c'mon, the year is 2016 and we still have not seen HDMI 2.0 or HDMI-CEC control from Intel for a media player Sad
When half-decent SoC manufacturers like Amlogic finish modernising their kernels (it's already started), the situation is only going to get worse.

#20
(2016-05-01, 06:51)wrxtasy Wrote: Intel must really be feeling the pain from the explosion of excellent bang-for-the-buck ARM SoCs.

I mean, c'mon, the year is 2016 and we still have not seen HDMI 2.0 or HDMI-CEC control from Intel for a media player Sad
When half-decent SoC manufacturers like Amlogic finish modernising their kernels (it's already started), the situation is only going to get worse.

It's also possibly linked to the Android vs Windows issue. Windows 10 really only works on x86 (Intel and AMD) hardware (ignoring Windows RT on Surface and Windows Phone...), while Android runs on a LOT more...

Android, not Windows 10, is the platform of choice at the low end for tablets and phones... I think the canning of SoFIA (which was Intel's phone SoC) probably means they've realised ARM now has total control of the smartphone space, and Broxton is probably them (and Microsoft) giving up on the cheap tablet market (think <US$150) that the Bay Trail Z-series and Cherry Trail x5-Z8500 SoCs were aimed at.
#21
Wasn't Skylake doing 8-bit hardware and 10-bit hybrid HEVC decode?
#22
On Windows, yes.

Btw, the Intel people told me that Apollo Lake _is_ what they internally call BXT, so it should have HEVC 10-bit support and VP9 decoding support.
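For anyone who wants to verify that on an actual box once the boards ship: on Linux the VA-API driver advertises which decode profiles it supports. A minimal sketch, assuming the vainfo tool from libva-utils is installed and that the Intel driver reports profile names along the lines of VAProfileHEVCMain10 / VAProfileVP9Profile0 (exact names can vary by driver version):

Code:
# Minimal sketch: list VA-API decode profiles relevant to HEVC 10-bit and VP9.
# Assumes the libva-utils "vainfo" tool is installed; the profile-name strings
# matched below are assumptions about what the Intel driver prints.
import subprocess

def vaapi_profiles():
    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
    return [line.strip() for line in out.splitlines() if "VAProfile" in line]

if __name__ == "__main__":
    for line in vaapi_profiles():
        if any(key in line for key in ("HEVCMain10", "VP9")):
            print(line)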
First decide what functions / features you expect from a system. Then decide for the hardware. Don't waste your money on crap.
#23
Fritsch,

I hope you can help me out with some confusion about HDR.
Does a 10-bit capable system automatically mean HDR support? Or is that a whole different hardware implementation?

I understand that you need 10-bit and at least HDMI 2.0 for the wider colour gamut (BT.2020), so can I conclude that Kaby Lake will support that natively?
Can I draw the same conclusion about HDR?

In other words: does a 10-bit, HDMI 2.0 system meet the requirements to play "Ultra HD Premium" certified material? (Image resolution: 3840×2160, colour bit depth: minimum 10-bit signal, colour: BT.2020 colour representation, high dynamic range: SMPTE ST 2084 EOTF)

If you have a link to a good article, just point me to it. I will read it, because I don't understand that part of the whole 4K story very well...
Windows 10 Pro (64bit), Kodi v19.1 "Matrix"
Intel NUC8i3BEH (Samsung 970 Evo, G. Skill Ripjaws 8GB)
Samsung UE49KS7000, Logitech Harmony remote 350
AudioEngine D1, Synology DS218j NAS (SMB protocol)
#24
(2016-06-20, 11:00)Zokkel Wrote: Fritsch,

I hope you can help me out with some confusion about HDR.
Does a 10-bit capable system automatically mean HDR support? Or is that a whole different hardware implementation?

I understand that you need 10-bit and at least HDMI 2.0 for the wider colour gamut (BT.2020), so can I conclude that Kaby Lake will support that natively?
Can I draw the same conclusion about HDR?

In other words: does a 10-bit, HDMI 2.0 system meet the requirements to play "Ultra HD Premium" certified material? (Image resolution: 3840×2160, colour bit depth: minimum 10-bit signal, colour: BT.2020 colour representation, high dynamic range: SMPTE ST 2084 EOTF)

If you have a link to a good article, just point me to it. I will read it, because I don't understand that part of the whole 4K story very well...

TL;DR: IMHO it's too early to tell how HDR will be handled by Kodi, hardware manufacturers, SoC producers, etc.

This is quite a good primer on the issues surrounding HDR (clue: 10 bits isn't really enough, so you still need some processing to get a 12-14 bit signal into 10 bits): http://downloads.bbc.co.uk/rd/pubs/whp/w...WHP283.pdf Part of the paper is general, then it moves into the BBC/NHK Hybrid Log-Gamma arena, I think (just skimmed it).
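To put a number on why the squeeze into 10 bits works at all: the SMPTE ST 2084 (PQ) curve that HDR-10 and Dolby Vision use spreads the 10-bit code values very non-linearly over an absolute 0-10,000 cd/m2 range, spending most of the codes on the dark end. A minimal sketch of the published EOTF; the constants are the ones defined in ST 2084, the script around them is just for illustration:

Code:
# SMPTE ST 2084 (PQ) EOTF: map a normalised 10-bit code value to absolute
# luminance in cd/m^2. Constants are the ones defined in the ST 2084 spec.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code, bits=10):
    """Luminance in cd/m^2 for an integer code value at the given bit depth."""
    n = code / (2 ** bits - 1)        # normalise code value to 0..1
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

if __name__ == "__main__":
    for code in (0, 128, 256, 512, 768, 1023):
        print(f"code {code:4d} -> {pq_eotf(code):8.2f} cd/m^2")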

This also looks worth a read: http://www.lightillusion.com/uhdtv.html The ST 2084 EOTF is only part of the equation. Dolby Vision and HDR-10 both use it (BBC/NHK HLG doesn't, AIUI, as HLG replaces it?).

The Dolby Vision approach is described here: http://www.dolby.com/us/en/technologies/...-paper.pdf It looks like explicit Dolby Vision support is required (not just 10-bit decode). I assume the 12-bit vs 10-bit arguments are a (not so) subtle dig at HDR-10?

When it comes to HDMI output: http://www.flatpanelshd.com/news.php?sub...1457513362 AIUI HDMI 2.0a (or internal connectivity) is required for HDR-10, but Dolby Vision 'tunnels' through an HDMI 2.0 connection? Both are stuck with static grade metadata (one aspect of HDR is that metadata is sent to the display telling it what to do with the video it receives, AIUI), whereas HDMI 2.1 should add scene-by-scene metadata support (though if Dolby Vision tunnels, does it need HDMI 2.1?). Or am I misunderstanding this?

However it appears that HDR-10 and Dolby Vision both require specific metadata handling in addition to just sending 10 bit video over HDMI, so 10 bit decode and output may not be enough.
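For reference, the 'static grade metadata' in question is quite small: alongside the 10-bit video, an HDR-10 stream carries the SMPTE ST 2086 mastering display colour volume plus MaxCLL/MaxFALL content light levels. A purely illustrative sketch of those fields (the structure and example numbers are made up for illustration, not taken from any real API, though the BT.2020 primaries and D65 white point are the standard values):

Code:
# Illustrative only: the static metadata an HDR-10 stream carries alongside
# the 10-bit video (SMPTE ST 2086 mastering display colour volume plus
# MaxCLL/MaxFALL). Field names are descriptive, not from any real API.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class HDR10StaticMetadata:
    display_primaries: Tuple[Tuple[float, float], ...]  # R, G, B (x, y) chromaticities
    white_point: Tuple[float, float]                     # (x, y)
    max_mastering_luminance: float                       # cd/m^2
    min_mastering_luminance: float                       # cd/m^2
    max_cll: int                                         # max content light level, cd/m^2
    max_fall: int                                        # max frame-average light level, cd/m^2

# Example: a typical BT.2020 / 1000-nit grade (values made up for illustration).
example = HDR10StaticMetadata(
    display_primaries=((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)),  # BT.2020
    white_point=(0.3127, 0.3290),                                        # D65
    max_mastering_luminance=1000.0,
    min_mastering_luminance=0.005,
    max_cll=1000,
    max_fall=400,
)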
#25
Wow, Thanks,

I've got some reading to do :-)
Windows 10 Pro (64bit), Kodi v19.1 "Matrix"
Intel NUC8i3BEH (Samsung 970 Evo, G. Skill Ripjaws 8GB)
Samsung UE49KS7000, Logitech Harmony remote 350
AudioEngine D1, Synology DS218j NAS (SMB protocol)
#26
Just read some comments on the HDMI 2.1 article - and, as I thought, most people think the proprietary 'tunnelling' approach that Dolby Vision uses over HDMI (which means it can be carried over HDMI 1.4b connections in some cases) means that dynamic metadata should be possible before HDMI 2.1 with DV. However, if you look at the diagrams you need a lot of Dolby magic dust at every stage (and a licence for each element, no doubt...). I'm assuming some of that dust handles the tunnelling of metadata...
#27
After some quick reading I've come to this 'simplified' conclusion:

Since the information for HDR and WCG is in the metadata, Kaby Lake will be able to handle that within the known decoders (HEVC, AVC).
It will be limited to 10-bit: no problem for the HDR10 standard, but Dolby Vision...

"Dolby Vision can be decoded by a standard HEVC decoder, then post-processed using a Dolby Vision module to produce the full range 12 bit Dolby Vision signal" (whatever that module may be)

But you're right, this has nothing to do with Kaby Lake any more. I'll read that HDMI 2.1 article...

Discussion moved to http://forum.kodi.tv/showthread.php?tid=266870&page=2
Windows 10 Pro (64bit), Kodi v19.1 "Matrix"
Intel NUC8i3BEH (Samsung 970 Evo, G. Skill Ripjaws 8GB)
Samsung UE49KS7000, Logitech Harmony remote 350
AudioEngine D1, Synology DS218j NAS (SMB protocol)
#28
(2016-06-20, 13:57)Zokkel Wrote: After some quick reading I've come to this 'simplified' conclusion:

Since the information for HDR and WCG is in the metadata, Kaby Lake will be able to handle that within the known decoders (HEVC, AVC).
It will be limited to 10-bit: no problem for the HDR10 standard, but Dolby Vision...

Assuming that the metadata is passed from the HEVC/AVC decoder to the HDMI output stages, then yes, I'd agree. Though that is an assumption I don't know we can confirm yet. I assume it will require driver support to allow the metadata to be passed to the HDMI output sub-systems?

Quote:"Dolby Vision can be decoded by a standard HEVC decoder, then post-processed using a Dolby Vision module to produce the full range 12 bit Dolby Vision signal" (whatever that module may be)

But you're right, this has nothing to do with Kaby Lake any more. I'll read that HDMI 2.1 article...
I think they are saying that they have different post-processing with a different metadata path? (Or does HDR-10 not have post-processing and instead just have display metadata, or is this just semantics?)
#29
That's a lot of question marks Big Grin
Time will tell, time will tell...
Windows 10 Pro (64bit), Kodi v19.1 "Matrix"
Intel NUC8i3BEH (Samsung 970 Evo, G. Skill Ripjaws 8GB)
Samsung UE49KS7000, Logitech Harmony remote 350
AudioEngine D1, Synology DS218j NAS (SMB protocol)
#30
Apollo Lake boards are coming! Big Grin
Not particularly interesting, but a 30-40% boost in performance is welcome for people loading them up with things to do Smile

http://www.asrock.com/ipc/overview.asp?Model=IMB-157

This one with 4/6 Intel NICs... mmmm, pico-ITX... DIY NAS?
http://www.asrock.com/IPC/overview.asp?Model=NAS-9402

http://www.asrock.com/IPC/overview.asp?Model=SOM-P101

http://www.asrock.com/ipc/overview.asp?Model=SBC-230

http://www.asrock.com/ipc/overview.asp?Model=SOM-Q101


New NUCs

http://www.fanlesstech.com/2016/07/exclu...e-nuc.html

http://www.anandtech.com/show/10492/inte...-lake-socs


Quote:Apollo Lake vs Braswell Geekbench Comparison

Highest Pentium N3710 (4C/4T up to 2.56 GHz) Windows x86 Score:
Single-Core Score: 1030
Multi-Core Score: 3541

https://browser.primatelabs.com/geekbench3/5378070

Highest Pentium N4200 (4C/4T up to 2.4/2.5 GHz) Windows x86 Score:
Single-Core Score: 1418
Multi-Core Score: 4601

https://browser.primatelabs.com/geekbench3/7251504

Goldmont cores should bring a nice boost to ST performance, great news for budget notebooks.
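Those scores are consistent with the 30-40% figure above; a quick back-of-the-envelope check using the numbers from the linked Geekbench results:

Code:
# Rough uplift of the Pentium N4200 (Apollo Lake) over the N3710 (Braswell),
# using the Geekbench 3 scores quoted above.
n3710 = {"single": 1030, "multi": 3541}
n4200 = {"single": 1418, "multi": 4601}

for kind in ("single", "multi"):
    gain = n4200[kind] / n3710[kind] - 1
    print(f"{kind}-core: +{gain:.0%}")   # ~ +38% single-core, +30% multi-core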