Android "Google Chromecast with Google TV" dongle with a new "Google TV" ecosystem and UI
(2021-01-20, 07:24)rexian Wrote: What version of desktop Kodi doesn't recognize HDR10 streams? I am still running Matrix Beta 2 on Win10 and it plays HDR10 videos fine (including P7 and P81 samples), and my 4K TV recognizes the HDR10 signal.

Sorry if I was unclear. What I wanted to say is that only HDR10-type HDR files are recognized as HDR files by desktop Kodi, so the opposite of what you read from my post: HDR10 is recognized by Kodi, but Dolby Vision is not. Files encoded at 10 bit with a wide color gamut but without one of those specific HDR formats (HDR10 or Dolby Vision) are also not handled as HDR videos.
Update#1: I did not claim that an HDR-capable TV cannot handle HDR10 streams. I only said that Kodi itself (not the display device) handles only HDR10 streams as HDR, i.e. it can tone map those streams to SDR for viewing on SDR displays. None of the other HDR 'types' are handled as HDR (by Kodi, not by your display device).

So, from a fresh, sterile viewpoint:

Currently we usually see three types of 10 bit (or, in theory, higher bit depth) files (a sketch for inspecting these flags follows below):
- 10 bit files with wide color gamut, encoded without any specific HDR standard: desktop Kodi does not handle these as HDR video
- 10 bit files with wide color gamut, encoded to the Dolby Vision standard: desktop Kodi does not handle these as HDR video
- 10 bit files with wide color gamut, encoded to the HDR10 standard: desktop Kodi handles these as HDR video
Update#2: OK, that was inaccurate: I'm using Linux, so my 'desktop Kodi' references mean Kodi on a Linux desktop.
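A minimal sketch of how those three cases can be told apart, assuming ffprobe (from FFmpeg) is available on PATH; the JSON field names are the ones recent ffprobe builds emit, and the file name is just a placeholder:

```python
import json
import subprocess

# Minimal sketch: read the colour metadata of the first video stream with
# ffprobe and bucket the file into the three categories above.
def classify(path: str) -> str:
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    trc = stream.get("color_transfer", "")         # e.g. smpte2084, arib-std-b67
    primaries = stream.get("color_primaries", "")  # e.g. bt2020, bt709
    side = [d.get("side_data_type", "") for d in stream.get("side_data_list", [])]

    if any("DOVI" in s for s in side):             # Dolby Vision configuration record
        return "Dolby Vision"
    if trc == "smpte2084" and primaries == "bt2020":
        return "HDR10 (PQ transfer + BT.2020 primaries)"
    if primaries == "bt2020":
        return "10 bit wide gamut without an HDR transfer flag (the 'generic' case)"
    return "SDR"

print(classify("sample.mkv"))  # hypothetical file name
```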

But these 'problems' are not specific to GCGTV... or at least not only to it... I will test them further when I have some time...
Update#4: From the end of 2019: https://forum.kodi.tv/showthread.php?tid...pid2868023
Quote:HDR10 and perhaps eventually HDR10+ are possible as these are open standards. However Dolby Vision is proprietary, so the only way Kodi will be able to do that is if there's a hardware decoder available on the device Kodi runs on; this is already possible on some Android boxes I believe.
So in theory Dolby Vision should work on Dolby Vision displays if the playback device supports it in hardware; Kodi itself will never tone map it, so I don't need to investigate that further. I will check on GCGTV in the evening. If GCGTV supports it, Dolby Vision files on an SDR TV should come out as SDR content after some HDR>SDR conversion done by GCGTV itself as a 'black box', instead of Kodi's software tone mapping. I think.

But '10 bit streams without any HDR standard' is still an open question...

(There is one more interesting test result: those Dolby Vision files in MP4 containers that MediaInfo really identifies as Dolby Vision simply cannot be properly re-encapsulated (remuxed) into an MKV container. The video stream is identified incorrectly, so the result is not a standard HEVC video stream. A 'wrong file'.)
Update#3: The remux was made with MKVToolnix.
Test results: https://docs.google.com/spreadsheets/d/1...sp=sharing

From the GCGTV perspective this is interesting. With HW decoding:

- Dolby Vision works fine.
- HDR10 does not work.
- 'Generic' 10 bit content works with some files but not with others (this was the case we investigated earlier...).

Off topic, but Linux desktop Kodi also behaves interestingly. Details in the table.
(2021-01-20, 11:07)Tamas.Toth.ebola Wrote: Currently we usually see three types of 10 bit (or, in theory, higher bit depth) files:
- 10 bit files with wide color gamut, encoded without any specific HDR standard: desktop Kodi does not handle these as HDR video
- 10 bit files with wide color gamut, encoded to the Dolby Vision standard: desktop Kodi does not handle these as HDR video
- 10 bit files with wide color gamut, encoded to the HDR10 standard: desktop Kodi handles these as HDR video
Update#2: OK, that was inaccurate: I'm using Linux, so my 'desktop Kodi' references mean Kodi on a Linux desktop.

What type of file are you describing as:
Quote:- 10 bit files with wide color gamut, encoded without any specific HDR standard: desktop Kodi does not handle these as HDR video

What you are describing here sounds like one of two things. Either it is Wide Colour Gamut Rec 2020 SDR, using a BT.1886 SDR EOTF or a legacy 2.2-2.4 Power Law Gamma EOTF (10-bit SDR has been a thing for decades; even 720x576 and 720x480 Rec 601 SD video can be 10-bit). Or you are describing a file that contains WCG Rec 2020 video captured or graded to an HDR10 ST.2084 PQ EOTF BUT not correctly flagged as such. The two are VERY different.

You would certainly not expect Rec 2020 WCG video that is encoded with an SDR EOTF to be handled as HDR - though you would expect it to be handled as WCG and tone mapped to Rec 709 if output to a Rec 709 display. (Rec 2020 WCG doesn't equal HDR - though a lot of HDR uses WCG. Just as an HDR EOTF can be used with Rec 709 gamut content too - some Sony cameras will capture Rec 709 HLG HDR for instance.)

HDR10 = Rec 2020 WCG and an ST.2084 PQ EOTF (the latter defines the HDR, the former defines the WCG)
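On the 'tone mapped to Rec 709' point above, here is a rough sketch of the gamut half of that conversion (my own illustration, not anything Kodi specifically does): Rec 2020 RGB converted to Rec 709 RGB in linear light with the commonly published 3x3 matrix, where anything the narrower gamut cannot represent comes out negative or above 1.0 and must then be clipped or gamut mapped.

```python
# Rough sketch: linear-light Rec 2020 RGB -> Rec 709 RGB.
# The 3x3 matrix is the commonly published BT.2020 -> BT.709 conversion;
# out-of-range results mark colours that fall outside the Rec 709 gamut.
M_2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def rec2020_to_rec709(rgb):
    """Convert a linear-light Rec 2020 triple to linear-light Rec 709."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in M_2020_TO_709]

def clip(rgb):
    """Crudest possible gamut mapping: clamp to the displayable 0..1 range."""
    return [min(max(c, 0.0), 1.0) for c in rgb]

# A fully saturated Rec 2020 green has no Rec 709 equivalent:
raw = rec2020_to_rec709([0.0, 1.0, 0.0])
print("raw Rec 709 values:", raw)   # negative R and B -> outside the 709 gamut
print("after clipping:    ", clip(raw))
```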
(2021-01-21, 13:27)noggin Wrote: What you are describing here sounds like one of two things. Either it is Wide Colour Gamut Rec 2020 SDR, using a BT.1886 SDR EOTF or a legacy 2.2-2.4 Power Law Gamma EOTF (10-bit SDR has been a thing for decades; even 720x576 and 720x480 Rec 601 SD video can be 10-bit). Or you are describing a file that contains WCG Rec 2020 video captured or graded to an HDR10 ST.2084 PQ EOTF BUT not correctly flagged as such. The two are VERY different.

What I was referring to are 10 bit encoded videos with BT.2020 gamut. As you also wrote, 10 bit videos on their own are not special. I admit I have not dug deeply enough into the color management side of video encoding (though I think I know this topic more than well enough for still images and visualization technology), so I don't really know how this information is encoded into video streams. But if video files are anything like still images, the color space information in the file is simply metadata defining the 'absolute positions' of the RGB primaries. So, without that technical knowledge, I assume 10 bit video files basically have 10 bit luminance resolution, while the color gamut is just metadata saying in which gamut those 10 bit 'color values' should, in theory, be interpreted.

From that perspective, 10 bit BT.2020-targeted videos are basically HDR videos without any further special definition. I think Dolby Vision and HDR10 also rely on this basis and just add their own standard tone curve for encoding, plus other details like that _(I will read up on it when I have time)_. The interesting part is that this is just my 'theory'. But there really are such videos on the net, exactly as I described: files labeled as 10 bit or HDR videos which don't actually carry Dolby Vision or HDR10 encoding. They are basically just 10 bit videos with a BT.2020 color space flag. And here I am: some of these files are handled properly by GCGTV and some are not. It could be the files' fault, sure, but the interesting thing is that there are really a lot of such files. Is it possible that all of them are wrong/fake? And one more aspect: if these files are not really HDR files _(I mean they are simply SDR videos with wrong flags that merely look like HDR)_, Kodi's tone mapping should also 'distort' their dynamic range, of course with a wrong result _(DRC-ed SDR content)_.

Of course this whole mess could stem from gaps in my knowledge. In my still-image-based mental model, HDR videos are wide color gamut videos, and because 8 bits are simply not enough to represent such a range without banding _(if there is no 'proper' dithering)_, we need 10 bit videos to carry that color information. So if HDR as a concept is basically just 'more than 8 bit WCG video', then what I wrote is correct. _(In still images this is a little different, as WCG is not a mandatory part of the HDR concept. There can be an HDR image with a standard REC.709/sRGB gamut. In practice, HDR images are simply images with more than 8 bit resolution; if your display device can render a wide enough luminance scale, an 8 bit version will band roughly. Precisely because of that last sentence, HDR as a concept should not sidestep the luminance question, but in practice HDR images have not had to be WCG.)_

But back on topic. The HDR files I mentioned are neither Dolby Vision nor HDR10 files. MediaInfo reports only that they are 10 bit encoded with a BT.2020 target. From my perspective those files are HDR files, and since such files really do circulate on the net under the 'HDR' label, I don't think I'm the only one who sees it that way. Of course I don't know whether those files actually contain real wide-gamut content, but they are clearly flagged that way.

The other problem is the decoding process itself. Independently of this HDR 'status mess', a 10 bit HEVC stream is a 10 bit HEVC stream. If we ignore correct colors and the dynamic range itself, every 10 bit HEVC file should decode properly from a performance standpoint, provided the hardware can handle it. But now we have several problems at once: performance, colors, dynamic ranges, and multiple standards for the same thing. And this is why I hate electronics from the '80s :)

I would just like to map out which files are correct and which are not, to help further development and my own 'clairvoyance'.

What's your opinion?
(2021-01-21, 17:03)Tamas.Toth.ebola Wrote: [the full post quoted above; snipped]

I work in broadcast video areas - so I deal less in opinion and more in international standards :)

Colour Gamut defines the R, G and B primaries in CIE colour space - i.e. what actual colour your primary red, green and blue are.  This is the gamut.  Rec 2020 defines different RGB primaries to Rec 709, and Rec 601 had slightly different colour primaries (technically) for 525 and 625 systems (aka 480 and 576 lines).  NTSC and PAL also had different primaries to each other.  (There are also additional differences like what colour temperature you defined as white)

Separate to the definition of the primary RGB colours used in each gamut standard, there are also different RGB->YCbCr (aka "YUV") matrix standards for Rec 2020, Rec 709 and Rec 601. These define the relationships between RGB camera and display values and the Y (Luminance), Cb (B-Y) and Cr (R-Y) colour difference signals used in most video signals, both at baseband and in compressed form (the Cb and Cr signals are often subsampled to reduce their resolution and the bandwidth required to carry them).

These quantised RGB and YCbCr values can be carried at a number of bit depths - SD video was often quantised in 8-bit resolution (16-235 and 16-240 for Y and CbCr respectively) but 10-bit versions were also defined (64-940 for Y for instance, sometimes also described as 16.00 to 235.00 I believe). SDR HD video is largely produced in the 10-bit domain these days (though some editing codecs are still 8-bit), and 12-bit representation can also be used.  Some Rec 2020 may not use YCbCr representation - other subsampled options are now available... (Dolby in particular can sometimes use these alternate colour spaces - albeit within the same gamut - Rec 2020)

The Colour Gamut stuff is not directly related to dynamic range in SDR/HDR terms - these are defined by EOTF (display) and OETF (camera) transfer functions (Electro Optical - i.e. the relationship between the electronic video signal and optical output, and Opto Electrical - i.e. the relationship between light hitting a camera sensor and the electronic video signal created). The EOTF is used to define the dynamic range of a video signal for a display (i.e. how to map the digital values it receives to the brightness values on a picture). There are EOTFs defined for SDR video (BT.1886) and for HDR video (ST.2084 for PQ HDR and ARIB B-67 for HLG HDR for instance). Before BT.1886 there was a less well-defined SDR EOTF - as all video was largely assumed to be displayed on CRT, and that EOTF was implicit in the display tech that CRT used I believe (Power Law Gamma of 2.2-2.4 typically?)
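To make the ST.2084 part of that concrete, here is a small sketch (my own illustration; the constants are the published ST.2084 ones) of the PQ EOTF, which maps a normalized code value straight to absolute light output in nits, alongside a legacy power-law curve that is only relative to whatever peak the display has:

```python
# Sketch of the SMPTE ST.2084 (PQ) EOTF: normalized code value -> nits.
# Constants are those published in the ST.2084 specification.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value (0..1) to absolute luminance in nits."""
    e = code ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

def gamma_eotf(code: float, peak_nits: float = 100.0, gamma: float = 2.4) -> float:
    """Legacy power-law SDR EOTF: relative to a chosen display peak."""
    return peak_nits * code ** gamma

# PQ is display-absolute: code 0.5 is ~92 nits, code 0.75 is ~1000 nits.
for c in (0.25, 0.5, 0.75, 1.0):
    print(f"code {c:.2f}: PQ {pq_eotf(c):8.1f} nits, gamma 2.4 {gamma_eotf(c):6.1f} nits")
```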

Both Colour Gamut and EOTF are metadata in a video signal BUT that metadata is vital in displaying them correctly - as the EOTF defines the relationship between video level and light output - and not usually linearly.  

If you have a video signal without either implicit or defined EOTF and gamut metadata you have literally no idea how to display it correctly.   HDR content can exist in both wide Rec 2020 and narrow Rec 709 gamut - though in consumer video pretty much all HDR content is Rec 2020.  Similarly you can use Rec 2020 primaries with both BT.1886 SDR and ST.2084 PQ or ARIB B-67 HLG HDR EOTFs.

As you say - 10-bit HEVC (if you ignore metadata embedded within the HEVC stream) is the same video and compression whether it is 10-bit SDR or 10-bit HDR, and whether it is 10-bit Rec 709 or 10-bit Rec 2020 (*). This is why the metadata is vital - if you don't know the gamut and dynamic range of the signal (and in the case of YCbCr encoding, also the RGB<->YCbCr matrix - which is part of the gamut standard) you are stuffed.

(*) You have to know whether the video is Rec 709 or Rec 2020 not just because the primaries are different but also because the two standards use different RGB<->YCbCr transfer matrices - if you transfer Rec 2020 video using a Rec 709 matrix, or Rec 709 using a Rec 2020 matrix, your RGB results will be mathematically wrong.
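To illustrate that footnote numerically (my own sketch, using the published Rec 709 and Rec 2020 non-constant-luminance luma coefficients): encode a colour with one matrix, decode it with the other, and the recovered R'G'B' values drift visibly.

```python
# Sketch: encode R'G'B' -> Y'CbCr with the Rec 2020 matrix, then decode it
# with the Rec 709 matrix, to show the mathematically wrong result.
def rgb_to_ycbcr(r, g, b, kr, kb):
    y = kr * r + (1 - kr - kb) * g + kb * b
    cb = (b - y) / (2 * (1 - kb))
    cr = (r - y) / (2 * (1 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

REC709 = (0.2126, 0.0722)    # Kr, Kb per Rec 709
REC2020 = (0.2627, 0.0593)   # Kr, Kb per Rec 2020 (non-constant luminance)

# A saturated red, encoded as Rec 2020 YCbCr but decoded as Rec 709:
ycc = rgb_to_ycbcr(0.9, 0.1, 0.1, *REC2020)
print("correct matrix:", ycbcr_to_rgb(*ycc, *REC2020))   # ~ (0.9, 0.1, 0.1)
print("wrong matrix:  ", ycbcr_to_rgb(*ycc, *REC709))    # visibly shifted values
```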

Running every file you want to know more about through MediaInfo will tell you a lot about the metadata in that file - Gamut, EOTF etc. I also only really use content I know and trust for that kind of testing - not random stuff I downloaded.
Quote:I work in broadcast video areas - so I deal less in opinion and more in international standards 
I'm glad to hear from someone who knows the relevant theoretical background professionally. What I am currently investigating is just how real-life practice can break any industrial standard that is not strictly enforced.
Quote:HDR content can exist in both wide Rec 2020 and narrow Rec 709 gamut...
This is exactly what I wrote about still images. Basically, every image that stores more than 8 bits of luminance resolution gets labeled an HDR image, independently of its target color gamut and its intended real-world luminance. Sadly, as I wrote, in that case no real-world luminance is defined by any kind of standard, so in the end there can be 10 bit images that contain no real high dynamic range and no wide color gamut information _(I mean relative to the source of the image)_.
Quote:- though in consumer video pretty much all HDR content is Rec 2020.
And here we are. It seems that in consumer practice, HDR content is content that uses at least 10 bits to store luminance values and has a wide color gamut (perhaps P3 for HDR10 and BT.2020 for Dolby Vision, but you know that better). So technically a 10 bit file originally targeted at BT.2020 is, in theory, an HDR file. On SDR display devices these files need DRC (its exact type is irrelevant). And at this point we have not even talked about the tone curve used to store those color*luminance values properly aligned to real-world luminances; we only have the facts that the file is 10 bit and that its color values are targeted at the BT.2020 RGB primaries. I deliberately go no further here, because from this point on the viewing environment matters as much as the file content itself (and the display device). For a proper dynamic range we would have to define one exact viewing environment and use a display device that can really reach the required luminance levels, etc. I think (and you know) that Dolby Vision and HDR10 as standards contain these kinds of regulations. But from Kodi's perspective, sadly, we have not even arrived there yet.

Of course, as you already wrote, this metadata is vital for a correct final result, but the original problem was not exactly that some files have off colors or a thin dynamic range. The problem is that there seem to be situations where Kodi does not handle the file's content properly, or worse, cannot decode/render it at all. Though I was inaccurate: with HW decoding this should be a problem of GCGTV itself rather than of Kodi.

If we always got fluid 4K HEVC playback, this whole investigation would never have started. So all the information you wrote is important, important for a correct final result, but I think we are not 'there' yet. If Dolby Vision uses a special tone curve to store the data and we decode that data without the proper tone curve, of course we can get wrong colors. No problem (of course it is a problem, just not yet). But, for example, it seems that Dolby Vision files work perfectly while HDR10 files do not: HDR10 files simply kill my Kodi on GCGTV, while on the desktop they are perfect even on an SDR display device, with Kodi's tone map function working. So our problems are not really the problems you wrote about. At least not yet. You know far more about the video standards than I do; I know almost nothing beyond generic A/V technology.

As I wrote, I really do not expect a frozen Kodi with any type of 4K HEVC file (merely because of its HDR type), as the metadata content should not affect the ability to play back at all (the 'wrong colors' question is another matter...). Of course, in practice there can be problems, and as we see, there are.

In short: in theory we should get fluid playback performance from every 4K HEVC file, with working HW or SW tone mapping (or any other type of DRC) on SDR display devices.
The actual visual result (the basic technical background you wrote about) would only be the next step.

Anyway, thanks for the details you wrote up; I'm really glad to read them, as this is a soft hobby of mine...
(2021-01-21, 22:13)Tamas.Toth.ebola Wrote:
Quote:HDR content can exist in both wide Rec 2020 and narrow Rec 709 gamut...
This is exactly what I wrote about still images. Basically, every image that stores more than 8 bits of luminance resolution gets labeled an HDR image, independently of its target color gamut and its intended real-world luminance. Sadly, as I wrote, in that case no real-world luminance is defined by any kind of standard, so in the end there can be 10 bit images that contain no real high dynamic range and no wide color gamut information _(I mean relative to the source of the image)_.
In video - bit depth has no relationship to display dynamic range - 10-bit video can be used to carry SDR and HDR content. In consumer video - DVD and HD Blu-ray are both 8-bit formats, but I have quite a lot of 10-bit SDR HD content I have mastered myself from 10-bit broadcast video masters (with massively reduced banding artefacts as a result). In consumer devices h.264/AVC decode is usually limited to 8-bit, but h.265/HEVC can be 10-bit (in both SDR and HDR dynamic ranges). My SDR 10-bit HD content is in Rec 709 gamut, in h.265/HEVC.

In video - there should always be a real world dynamic range that is either implicit or defined. For Rec 601 and Rec 709 - if no EOTF is defined in the video, then it is expected to be a Power Law Gamma (of around 2.2-2.4 I believe), as that is the implicit definition in the respective Rec 601 and Rec 709 standards.

If BT.1886 EOTF is defined as the dynamic range, that is the newer SDR standard (which is often used for SDR Rec 2020 WCG content).

If ST.2084 EOTF is defined as the EOTF, then this is a PQ standard that defines a precise relationship between display pixel absolute light output (in nits) and the source video value. This allows you to set a pixel to 100nits, 50nits, 500nits etc. (with the display handling pixel values that it cannot display correctly)

If HLG (ARIB B-67) EOTF is defined then this flags that the video is in Hybrid Log Gamma format, which doesn't mandate a fixed light-output to pixel-value mapping, but does define how this content should be displayed on a high dynamic range display.

HLG is backwards compatible with SDR (the EOTF tracks the SDR Power Law Gamma EOTF for around 60-70% of the dynamic range) within the same colour gamut, whereas ST.2084 is totally different to SDR Power Law Gamma EOTF.

In other words - HLG treated as SDR looks 'OK' (within the same colour gamut), HDR10 doesn't.
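A quick numeric sketch of that backwards compatibility (my own illustration, using the constants from the ARIB B-67 standard): below signal level 0.5 the HLG OETF is a pure square root, i.e. a conventional gamma-style camera law, and only the upper part is the HDR log segment.

```python
import math

# Sketch of the ARIB B-67 (HLG) OETF with the constants from the standard.
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e: float) -> float:
    """Normalized scene light (0..1) -> HLG signal level (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like square-root segment
    return A * math.log(12 * e - B) + C  # HDR log segment for highlights

# Everything up to signal 0.5 follows the square-root law, which is why an
# SDR display fed HLG still shows a roughly correct picture; ST.2084 PQ has
# no such overlap with legacy gamma.
for e in (0.0, 0.02, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:5.3f} -> signal {hlg_oetf(e):.3f}")
```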
Quote:
Quote:- though in consumer video pretty much all HDR content is Rec 2020.
And here we are. It seems that in consumer practice, HDR content is content that uses at least 10 bits to store luminance values and has a wide color gamut (perhaps P3 for HDR10 and BT.2020 for Dolby Vision, but you know that better). So technically a 10 bit file originally targeted at BT.2020 is, in theory, an HDR file. On SDR display devices these files need DRC (its exact type is irrelevant). And at this point we have not even talked about the tone curve used to store those color*luminance values properly aligned to real-world luminances; we only have the facts that the file is 10 bit and that its color values are targeted at the BT.2020 RGB primaries. I deliberately go no further here, because from this point on the viewing environment matters as much as the file content itself (and the display device). For a proper dynamic range we would have to define one exact viewing environment and use a display device that can really reach the required luminance levels, etc. I think (and you know) that Dolby Vision and HDR10 as standards contain these kinds of regulations. But from Kodi's perspective, sadly, we have not even arrived there yet.

Afraid I don't understand what you are saying here. HDR10 and Dolby Vision (if you ignore Dolby's IQ variant) don't include any tailoring for viewing environment - they are PQ-based (a 1:1 defined mapping between display light output and video signal levels), with additional metadata in the stream to let the display optimise its output for out-of-range values (i.e. a display that can only go to 700nits has to be able to cope with 1000nit sources and display them in a way that looks OK). These standards are 'display referred'. HLG has ambient viewing conditions compensation incorporated into the standard, doesn't mandate a 1:1 light-output to pixel-value EOTF, and is 'scene referred' (as is SDR Power Law Gamma), so this gives a lot more flexibility in end-user options for display within the standard.

Dolby Vision includes a new-ish "IQ" variant I believe - which is kind of an admission that PQ isn't ideal for home display viewing... (i.e. you can't mandate fixed pixel light values for a display that is viewed in darkness and in bright sunshine...)

My guess is that most files that 'look wrong' in Kodi are probably amateur re-encodes where the HDR metadata has been lost in conversion. (ffmpeg can be tricky to use to retain HDR metadata)
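For anyone who wants to experiment with that failure mode, here is a sketch (assuming the stream really is PQ/BT.2020 and merely lost its flags; file names are placeholders) using FFmpeg's hevc_metadata bitstream filter to restore the VUI colour flags without re-encoding. Note this only fixes the basic EOTF/gamut signalling; HDR10 static metadata (mastering display and MaxCLL SEI) is a separate matter and may still be missing.

```python
import subprocess

# Sketch: re-flag an HEVC stream that is genuinely PQ/BT.2020 but lost its
# colour metadata in a re-encode, without touching the video itself.
# H.273 code points: 9 = BT.2020 primaries/matrix, 16 = SMPTE ST.2084 (PQ).
subprocess.run([
    "ffmpeg", "-i", "input.mkv",   # hypothetical input file
    "-c", "copy",                  # stream copy; no re-encode
    "-bsf:v", "hevc_metadata="
              "colour_primaries=9:"
              "transfer_characteristics=16:"
              "matrix_coefficients=9",
    "output.mkv",
], check=True)
```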

And to get back on topic - Kodi on a number of platforms handles HDR10 and HLG HDR content that has been correctly mastered with few problems - either by handing off the content to Android and letting it deal with it, or by Kodi and/or the OS having the correct device-specific support implemented (AMLogic, Rockchip etc.). Support on x86 hardware is more patchy in Linux, though HDR10 on Windows is entirely possible (again because there is OS-level support for HDR10, though this may not pass-through some display optimisation metadata specific to the content being played that is contained in addition to the basic EOTF and Gamut metadata)
Woah, so much text and so many posts :D

So... I just wanted to recall what I wrote earlier.

Used stuff:
HDR TV: LG CX
SDR TV: Sony X9005A
Soundbar: LG SN11RG

HEVC + HDR (10bit) on HDR TV  -> Working
HEVC + HDR (10bit) on SDR TV -> not working, stutters like hell
HEVC removed HDR (10bit -> 8bit) on SDR TV -> working
Any other HEVC 8bit on SDR TV -> working


So in general I still think it is something about the HDR->SDR mapping. It can't be a general issue with 10bit decoding, because that works on a proper HDR-capable TV.

I can check deactivating HW acceleration for MediaSource, but I think I have tried all the combinations already and it didn't really help.

I'm also wondering why HD audio passthrough is not working from inside Kodi. On Netflix I get Dolby Atmos.
Connected directly to the SN11RG soundbar, no E-ARC or anything.
(2021-01-26, 20:28)myblade Wrote: HEVC removed HDR (10bit -> 8bit) on SDR TV -> working

What specifically does this mean? I assume you mean you are playing an HEVC 10-bit file but have removed any Rec 2020 and PQ EOTF flags from the file - or have found an SDR 10-bit HEVC file - so Kodi/Android TV thinks it is playing an HEVC 10-bit Rec 709 SDR file? (So it has no need to tone map?)
(2021-01-26, 20:28)myblade Wrote: HEVC + HDR (10bit) on HDR TV  -> Working
HEVC + HDR (10bit) on SDR TV -> not working, stutters like hell
That is my experience as well. I tried with "media surface" disabled for the SDR TV as Tamas suggested, but I wasn't happy with the playback, and not all the files worked either. This matches my experience with Plex, so at this time GCGTV isn't ready for my bedroom (SDR TV) or my living room (DTS-HD/TrueHD-capable AVR) for personal media. For YouTube and Disney+ users this is awesome.
Hi guys,

I was reading some news about a new Google TV device. Does anyone know anything about it?
I mean, it's strange for Google to present a new device just a few months after the new Chromecast. I understand that Google wants all new devices to be AV1 compatible, but I don't think they are going to announce something new.
Hi,

is there any fix/workaround for the raised blacks of HDR10/DV content?

Or is there another media player that does not have the issue with raised blacks?

This is the only thing about this device that really bugs me.
(2021-01-26, 20:28)myblade Wrote: I'm also wondering why HD audio passthrough is not working from inside Kodi. On Netflix I get Dolby Atmos.
Connected directly to the SN11RG soundbar, no E-ARC or anything.

Dolby Atmos does not imply lossless audio. There are EAC3 Atmos and TrueHD Atmos.

Is the HDR to SDR tone mapping still an issue with the latest firmware (from about 3 days ago)?
Seems like the new firmware does NOT help with 4k video stutter on non-4k TVs.
(2021-02-12, 15:28)quacked Wrote: Seems like the new firmware does NOT help with 4k video stutter on non-4k TVs.

Wait, is the stutter issue only on non-4k TVs, or also on non-HDR 4k TVs? People often assume that 4k TVs are HDR, but that is not necessarily the case.