Intel Gemini Lake
#31
(2018-01-25, 18:43)HDGMA Wrote: Do you think Intel will provide some "random Indian guy" with NUC review samples & early access to their latest driver? 
Probably not, but that still does not make him an Intel employee, and the source of his statement remains unclear. I'd assume that if he is actually in contact with someone at Intel, it's probably just the PR department. And when you think about that "HDR processing workflows" remark, it sounds more like something from a PR guy. As I said earlier, to me this sounds more like the workflow in professional photography than anything video/display related. HDR workflows for professional photography will suck on any low-end platform, without doubt. My guess is that this is just a misunderstanding.

The statement from Ville Syrjälä about HDR on Gemini Lake sounds more realistic to me. He works at Intel, he is a driver developer, he already has access to the hardware and he knows the full specs. So I still don't get your point, sorry.
 
(2018-01-25, 18:49)DenisDA Wrote: Support for HDMI 2.0 a does not mean that HDR is supported. It only means the ability to output 2160p@60Hz video. Apollo Lake plays 2160p@60Hz 10-bit video well, but without HDR. Alas, Gemini also cuts HDR, just as it cuts InTru 3D, because it is a low-end product.
Well, 2160p@60Hz is already supported in HDMI 2.0, even at 10 bit, while HDMI 2.0a (which is newer than 2.0) means 2160p@60Hz with HDR metadata support. Apollo Lake only has HDMI 2.0 (no a), which pretty much rules out HDR support. But you are right that HDMI 2.0a does not automatically mean HDR support, it just makes it more likely. That's why I took this as a clue that HDR could be supported.

Still, nobody has access to Gemini Lake hardware yet, so this is all speculation. But from the links I posted above it's still looking good, I think. I don't have a crystal ball handy for checking, though.
Main: CPU: Intel Core i7-4790K, GFX: AMD RX480 8GB OC, Debian Linux
HTPC: CPU: Intel Core i5-3475S, GFX: Intel HD4000, Gentoo Linux (Kernel 4-stable), Kodi with 4K@30Hz and HD audio, AVR: Denon X6300H with 7.1.4 setup, TV: Samsung UE55HU7590@SEK3500
Fun: Microsoft XBox, with XBMC :)
#32
(2018-01-25, 19:24)direx Wrote:
[...]
The space in the expression "Support for HDMI 2.0 a does not mean that HDR is supported." is superfluous. I meant HDMI 2.0a!
The implementation of HDMI 2.0a on Apollo and Gemini is the same: via a converter from DisplayPort.
If you do not know, do not try to prove something
#33
(2018-01-26, 07:46)DenisDA Wrote: The space in the expression "Support for HDMI 2.0 a does not mean that HDR is supported." is superfluous. I meant HDMI 2.0a!

You said "HDMI 2.0 a [...] only means the ability to output video format 2160p@60Hz". And I said this is wrong, because the "a" in the HDMI spec actually stands for HDR metadata support. And I also said that you are probably right that HDMI 2.0a does not necessarily mean the device will actually support HDR, it's just an indicator that it might.
 
(2018-01-26, 07:46)DenisDA Wrote: The implementation of HDMI 2.0a on Apollo and Gemini is the same: via a converter from DisplayPort.
If you do not know, do not try to prove something

Oh man, please do your homework (do your school's homework first, of course). There are so many wrong things in just these two sentences. Where should I begin?
 
  • Apollo Lake's graphics architecture is based on Skylake. So if Skylake does not support HDR, Apollo Lake cannot support HDR either.
  • Apollo Lake only has HDMI 1.4 and DP 1.2. You are right that many manufacturers solder an LSPCon chip onto their mainboards which converts DP 1.2 to HDMI 2.0. The MCDP2800BC is a well-known LSPCon chip which even supports HDMI 2.0a with later firmware versions. That still does not help us if the graphics engine cannot handle HDR information, so it still means no HDR on Apollo Lake (and Skylake). Apollo Lake NUCs, for example, are only advertised with HDMI 2.0 (no a). Gemini Lake NUCs are being advertised with HDMI 2.0a.
  • Gemini Lake's graphics architecture is based on Kaby Lake (Gen 9.5). It's actually newer, as Gemini Lake already has a Gen10 display block (see this). This means Gemini Lake has native HDMI 2.0a and does not need that crappy LSPCon chip any more.
  • So we know for sure that Gemini Lake has Kaby Lake graphics (fewer execution units, but the same media engine), and we also know that the actual Core i* Kaby Lakes do support HDR.
  • And I never tried to prove that Gemini Lake will have HDR for sure. From my very first post in this thread I have only said that it is way too early to give up on HDR support on GL, as there are strong indicators that it might be supported – despite the fact that one random guy claims he got his information straight from Intel.

But I am not going to argue any longer with you people about that. Sooner or later we'll know if HDR will be supported or not.
#34
(2018-01-25, 09:30)direx Wrote:
(2018-01-24, 23:23)honcho Wrote:  It's virtually the same as apollo lake. 

No HDR, sorry. This is from Intel.
 Please don't make claims without proof. And don't mix up the CPU family with the GPU architecture. Let me get things straight:
  • Apollo Lake has Gen8 graphics (basically the same GPU architecture that Broadwell has). This means no HDR support.
  • Gemini Lake has Gen9.5 graphics with Gen10 display (Kabylake GPU architecture with native HDMI 2.0). This means everything for HDR support is nicely in place.
And since you are saying "this is from Intel", all I can say is that this is from Intel:

https://communities.intel.com/thread/118457

Quote: With Windows® 10 RS3 (Version 1709) and Intel® Graphics driver version 15.60.0.4849 installed, all 7th generation Intel® NUC Kits and Mini PCs will support HDR natively.
 And you know what? Gemini Lake NUCs are 7th generation Intel NUCs, which is why they have a 7 in their product name (NUC7CJYH/NUC7PJYH).

So please give proof when you are saying HDR is not supported. And no, that one random guy that everybody was quoting from earlier in this thread is no proof at all.

I think that Gemini Lake NUCs will make a very good 4K HTPC with Kodi  Nod 
 I don't seem to be the one misinformed: 

Image
#35
Gemini Lake is not a 7th-generation chip like Kaby Lake. The graphics part is cut down, starting with Intel InTru 3D, and other parts are cut as well. I created a thread asking Intel for information about HDR: https://communities.intel.com/thread/121993
And here is a screenshot of the Gemini Lake block diagram:
Image
Apollo and Gemini have the same Gen9 graphics components. The only new thing in the diagram is the replacement of HDMI 1.4b with HDMI 2.0.
In all other respects they are identical
#36
(2018-01-26, 21:16)honcho Wrote:  I don't seem to be the one misinformed: 

Image
What do you mean? This block diagram shows Apollo Lake.
(2018-01-27, 08:44)DenisDA Wrote: Gemini Lake is not a 7th-generation chip like Kaby Lake. The graphics part is cut down, starting with Intel InTru 3D, and other parts are cut as well.
Strange. According to the Intel website the Pentium J5005 has an Intel® UHD Graphics 605. And according to this datasheet this GPU is of the Kaby Lake generation (Gen 9.5), as are all UHD Graphics 6xx parts.
(2018-01-27, 08:44)DenisDA Wrote: I created a theme with a request for Intel information about HDR https://communities.intel.com/thread/121993
OK, this will be interesting. Thank you.
(2018-01-27, 08:44)DenisDA Wrote: And here is a screenshot of the Gemini Lake block diagram:

Apollo and Gemini have the same Gen9 graphics components. The only new thing in the diagram is the replacement of HDMI 1.4b with HDMI 2.0.
In all other respects they are identical

This is the first actual fact in this thread in a very long time (that's why I decided to step back into the discussion). And the block diagram actually does not look good in terms of HDR support. What I still don't understand is why the model number of the GPU (UHD Graphics 605) suggests that we're dealing with a Gen 9.5 GPU. Strange. And there are sites which mention 10-bit VP9 support for Gemini Lake, which also sounds like Gen 9.5 (Kaby Lake) and not Gen 9 (Skylake).

I just checked the current Linux situation with Gemini Lake and found this. It is the current HDR playground for Intel Linux graphics, and the interesting part is these lines in drivers/gpu/drm/i915/intel_hdmi.c:

Code:
	if (INTEL_GEN(dev_priv) >= 10 || IS_GEMINILAKE(dev_priv))
		intel_hdmi_set_drm_infoframe(encoder, crtc_state, conn_state);
}

What they do there is set the HDR infoframe for Gen10 or Gemini Lake. Doesn't that look promising? This is another pro for HDR support, while the block diagram which @DenisDA posted is another con. Huh
#37
(2018-01-28, 20:26)direx Wrote: I just checked the current Linux situation with Gemini Lake and found this. It is the current HDR playground for Intel Linux graphics, and the interesting part is these lines in drivers/gpu/drm/i915/intel_hdmi.c:

Code:
	if (INTEL_GEN(dev_priv) >= 10 || IS_GEMINILAKE(dev_priv))
		intel_hdmi_set_drm_infoframe(encoder, crtc_state, conn_state);
}

What they do there is setting the HDR infoframe for Gen10 or Geminilake. Doesn't that look promising? This is another pro for HDR support, while the block diagram which @DenisDA posted is another con. Huh 
 
Am I missing something?

All that is needed for real-world HDR video playback support is the following, unless I'm mistaken:

1. HEVC 10-bit decoding and a clean 10-bit path through the video subsystem and out to the HDMI port.
2. The ability to set the correct infoframe elements in the HDMI signal, to flag HDR10 (and carry the right levels metadata) or HLG EOTFs, and to properly flag colour space and white point (possibly also colour primaries).

AIUI, 1. has been the case with Apollo Lake and similar via DisplayPort (at least under Windows), but the lack of an integrated HDMI 2.0 solution has meant that it hasn't been possible to do 2. and insert the right metadata, because the DP 1.2 to HDMI 2.0 external chipset doesn't support it, or Intel hasn't supported it.

Am I missing something? Aren't those the two basic things required, technically, for HDR support? There's nothing else, other than driver access to the HDMI system to generate the right infoframes etc.?

(Technically you could probably get an Apollo Lake to do HDR playback IF you used something like an HD Fury Vertex to inject the right HDR EOTF and colour space metadata, and had a 10-bit path through Windows? Or am I missing something?)
Reply
#38
(2018-01-28, 23:28)noggin Wrote: Am I missing something?

All that is needed for real-world HDR video playback support is the following - unless I'm mistaken?

1. HEVC 10-bit decoding and a clean 10-bit path through the video subsystem and out to the HDMI port.
2. The ability to set the correct infoframe elements in the HDMI signal, to flag HDR10 (and carry the right levels metadata) or HLG EOTFs, and to properly flag colour space and white point (possibly also colour primaries).
Correct. BTW, this has been my point for the entire time in this thread Smile
(2018-01-28, 23:28)noggin Wrote: AIUI 1. has been the case with Apollo Lake and similar via Display Port (at least under Windows) - but the lack of an integrated HDMI 2.0 solution has meant that it hasn't been possible to do 2. and insert the right metadata because the DP1.2->HDMI 2.0 external chipset doesn't support it, or Intel haven't supported it.
Pretty much. As far as I understand it, the HDR infoframe handling needs a tiny bit of support in the GPU; it's not just the LSPCon. The thing is that this tiny bit is missing in the Skylake-era GPUs (such as Apollo Lake). It is present in the Kaby Lake GPUs, and the big question now is whether Gemini Lake has a Gen 9 GPU (Skylake), a Gen 9.5 GPU (Kaby Lake), or maybe even a Gen 9 GPU with support for HDR infoframe handling (perhaps due to Gen10 display connections or firmware differences).

Since some sites claim that 10-bit VP9 decoding will be possible with Gemini Lake, it even looks like we could be dealing with a Gen 9.5 GPU. This would be great, but the block diagram that @DenisDA posted says otherwise. On the other hand, I have posted multiple facts earlier which suggest that HDR could be supported.
(2018-01-28, 23:28)noggin Wrote: Am I missing something? Aren't those the two basic things required, technically, for HDR support? There's nothing else, other than driver access to the HDMI system to generate the right infoframes etc.?
Correct, HDR support is actually just a tiny piece. The much bigger thing is the 10 bit decoding, which will work for sure on Gemini Lake (at least for HEVC).
(2018-01-28, 23:28)noggin Wrote: (Technically you could probably get an Apollo Lake to do HDR playback IF you used something like an HD Fury Vertex to inject the right HDR EOTF and colour space metadata, and had a 10-bit path through Windows?
I have never seen an HD Fury Vertex, but my guess is that it could enable HDR support even on Apollo Lake. You just have to inject the metadata into the HDMI stream somehow, and that's what that device seems to do.
#39
(2018-01-29, 22:48)direx Wrote:
(2018-01-28, 23:28)noggin Wrote: (Technically you could probably get an Apollo Lake to do HDR replay IF you used something like an HD Fury Vertex to inject the right HDR EOTF and colour space metadata, and had a 10 bit path through Windows?
 I have never seen an HD Fury Vertex, but my guess is that it could enable HDR support even on Apollo Lake. You just have to inject the metadata into the HDMI stream somehow, and that's what that device seems to do.

I have one. The GUI lets you inject custom infoframes etc. into an HDMI stream, including adding ST.2084 EOTF flags, selecting Rec 709, Rec 2020 or DCI-P3 primaries, specifying the white point, and adding the HDR10-style metadata for Max/Min Luminance, MaxCLL and MaxFALL.

It has a PC GUI that lets you do this, and also a comprehensive OSD and OLED display on the device to detail the precise properties of input and output signals etc. It's a very useful bit of kit. Not cheap, but incredibly useful. (It also lets you create custom EDIDs to get sources to do what you want them to.)

https://www.hdfury.com/docs/HDfuryVertex.pdf
#40
HEVC 10-bit plays fine on Apollo Lake, and Gemini Lake does not differ in this respect.

All disputes can be finished; there is an answer from Intel: Gemini Lake does not support HDR
#41
https://www.intel.com/content/dam/suppor...odSpec.pdf

Intel NUC7CJY/NUC7PJY Technical Product Specification.
#42
(2018-01-31, 11:30)HDGMA Wrote: https://www.intel.com/content/dam/suppor...odSpec.pdf

Intel NUC7CJY/NUC7PJY Technical Product Specification.

The tech specs leave room for speculation. It looks like we could actually be dealing with a Gen 9.5 GPU, as the specs mention full VP9 hardware decoding support (this was rather incomplete on Gen 9). What also speaks for Gen 9.5 graphics is the Intel Linux driver stack. This is what they say in the changelog:
Code:
LIBVA-INTEL-DRIVER-1.8.3
Add support for Gemini Lake (aka. GLK)
- Decoding: H.264/MPEG-2/VC-1/JPEG/VP8/HEVC/HEVC 10-bit/VP9/VP9 10-bit
- Encoding: H.264/MPEG-2/JPEG/VP8/VP9/HEVC/HEVC 10-bit/AVC low power CQP mode
- VPP: CSC/scaling/NoiseReduction/Deinterlacing{Bob, MotionAdaptive, MotionCompensated}/ColorBalance/STD

VP9 10-bit decoding is only available on Gen 9.5 (Kaby Lake graphics). And we know that Gen 9.5 is HDR-capable.

BTW and slightly OT: The Kaby Lake NUC spec does not mention VP9 at all, which also does not make sense. And the Kaby Lake spec of the NUC7I3BNK, for example, also does not list HDR as a feature of the GPU, although we know it is supported.

And what also bothers me is the part about HDMI compliance. The tech spec for the GLK NUCs confirms that "the HDMI ports are compliant with the HDMI 2.0a specification". I don't have access to the actual compliance requirements, but these are some interesting facts from the Kaby Lake NUCs:
 
  • Initially, the Kaby Lake NUCs were marketed with HDMI 2.0
  • MegaChips released a firmware update for the LSPCon in the NUCs which added support for HDR infoframe handling
  • Intel then updated the specifications for the Kaby Lake NUCs and changed the HDMI version to 2.0a (you can even see that in the changelog of the specs)
So you see that HDMI 2.0a is actually related to HDR. But since I don't have access to the HDMI spec, I don't know if 2.0a only describes the capabilities of the interface or if it means that HDR must actually be supported.
(2018-01-30, 06:34)DenisDA Wrote: HEVC 10-bit plays fine on Apollo Lake, and Gemini Lake does not differ in this respect.
All disputes can be finished; there is an answer from Intel: Gemini Lake does not support HDR

Okay, this is another strong fact which actually speaks against HDR support. Still, this could be a marketing thing from Intel, or it could even be an intentional crippling of their Windows driver for market positioning reasons.

If the latter is true, then things could be looking friendlier for Linux users, once the infrastructure for general HDR support is in place (independent of the GPU model and vendor). As I mentioned earlier, we already have proof-of-concept HDR support for Gemini Lake (although nobody has been able to try it in the wild yet), and we also have that statement from an Intel Linux graphics driver developer that you need at least Gemini Lake hardware for HDR support...

I say the situation remains interesting, although I admit that the post in the Intel forum is a bummer.
#43
(2018-01-31, 20:04)direx Wrote: So you see that HDMI 2.0a is actually related to HDR. But since I don't have access to the HDMI spec I don't know if 2.0a only describes the capabilities of the interface or if it means that HDR must actually be supported.
For HDMI adopters to market their product with a certain HDMI version, the product must comply with the particular HDMI specification. HDMI interface has to comply with CTA 861.3-A or CTA 861-G to meet HDMI 2.0a certification.
#44
(2018-01-31, 20:17)wesk05 Wrote:
(2018-01-31, 20:04)direx Wrote: So you see that HDMI 2.0a is actually related to HDR. But since I don't have access to the HDMI spec I don't know if 2.0a only describes the capabilities of the interface or if it means that HDR must actually be supported.
For HDMI adopters to market their product with a certain HDMI version, the product must comply with the particular HDMI specification. HDMI interface has to comply with CTA 861.3-A or CTA 861-G to meet HDMI 2.0a certification.   
Still, the question that remains is: if a product is marketed with HDMI 2.0a, does that mean HDR has to be supported by the product? The Gemini Lake NUCs are marketed with HDMI 2.0a, so I was wondering if that has any implications in terms of HDR support.
#45
(2018-01-31, 20:47)direx Wrote: Still, the question which remains is: if a product is marketed with HDMI 2.0a does it mean that HDR has to be supported by the product? The Gemini Lake NUCs are marketed with HDMI 2.0a, so I was wondering if that has any implications in terms of HDR support.
If a product is marketed as HDMI 2.0a it does have to support HDR static metadata to pass HDMI 2.0a compliance testing. This only means that it has to be supported under some OS. It could be Windows or Linux. Intel in the past has modified technical data specifications of a product while it was in production. So, we can never be certain unless it has been independently tested and verified.
