First box: what to look for, what to avoid?
OK, I didn't read a single page from this thread. (I knew where it was going)

Hansolo and others:

Star Wars: Attack of the Clones, and onwards, was filmed in 12-bit RGB and was in some ways a forerunner to today's DCI standard. I'm not sure if some movies were filmed in 10-bit after AotC, but it's safe to say that not too long afterwards pretty much everything was standardized on 12-bit, at least for the big Hollywood productions. So this happened 16 years ago, close to 2 years before even the first Xbox was launched. A high-quality 32" widescreen CRT still cost an arm and a leg. Etc. etc. We've come a long way since then :)

Regarding the Rec.2020 gamut, think of it as a container. NTSC, sRGB, Adobe RGB, DCI-P3, etc. all fit almost perfectly inside the Rec.2020 gamut. What this means is that it's future-proof: we don't want to go over this again in 10 years' time. Rec.2020 is also a standard. What's more, interlaced formats are no longer supported, and progressive formats are supported up to 120fps. Expect to upgrade to HDMI 3.0 in a few years :) Everything should be in this colorspace, and everything will be in this colorspace, since this is the way forward and part of the UHD spec. But there will be metadata describing the usable colorspace inside the Rec.2020 colorspace, hence I'm referring to it as a container.
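
(Purely to illustrate the container idea, here's a quick Python check against the published CIE 1931 xy primaries. Note that in plain xy coordinates the DCI-P3 red primary actually pokes a hair outside the Rec.2020 triangle, which is why "almost perfectly" is the right wording:)

```python
# Test whether the primaries of common gamuts fall inside the Rec.2020
# chromaticity triangle (published CIE 1931 xy coordinates).
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
GAMUTS = {
    "Rec.709/sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB":    [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "DCI-P3":       [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
}

def cross(o, a, p):
    """Which side of the edge o->a the point p lies on (signed area)."""
    return (a[0] - o[0]) * (p[1] - o[1]) - (a[1] - o[1]) * (p[0] - o[0])

def inside(p, tri):
    """p is inside the triangle if all three edge tests agree in sign."""
    a, b, c = tri
    s = [cross(a, b, p), cross(b, c, p), cross(c, a, p)]
    return all(x >= 0 for x in s) or all(x <= 0 for x in s)

for name, tri in GAMUTS.items():
    out = [p for p in tri if not inside(p, REC2020)]
    print(name, "->", "contained" if not out else f"outside Rec.2020: {out}")
```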

We're all a little unsure on how all this is going to pan out, so it's probably best to stay on the fence just a little longer.
Reply
(2016-02-16, 00:11)Soli Wrote: OK, I didn't read a single page from this thread. (I knew where it was going)

If you didn't read a single page, how come you are posting about something that was mentioned in certain posts? Huh? :rofl:
Reply
Some people can be dicks on this forum just because they think they know more or have the "bigger" or "better" box. I'm new to the forum and to the "media box game", per se, and I asked one question, got talked to like a kid, and was told to read something I had already read. I ended up buying a $50 box just to experiment with and more than likely fuck it up; that's how I learn, hands on. If I don't know something, I ask. If you don't want to give me the best answer to your knowledge, then don't comment. I know this isn't the right place to say my 2 cents, but some things said on here pissed me off. Thanks,
Reply
(2016-02-15, 23:18)noggin Wrote: I don't agree.

10-bit has been used in studio production for many years now, because it is acknowledged that 8 bits isn't sufficient for good quality rendition, particularly on saturated blue content. When I started working in digital TV in the late 80s we were all warned 'beware blue grads'... Quantel introduced Dynamic Rounding as a way of minimising banding during 10-bit to 8-bit conversion (and higher); effectively it is a pseudo-random dither seeded by content. Sony included it in DigiBeta decks.
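
(A minimal sketch of the idea, for illustration only. This is plain uniform dither, not Quantel's actual Dynamic Rounding, which seeds the dither from the content itself:)

```python
# Requantizing a smooth 10-bit ramp to 8 bits, with and without dither.
import numpy as np

rng = np.random.default_rng(2020)
ramp10 = np.linspace(64.0, 96.0, 1920)  # very shallow dark gradient, 10-bit codes

plain8 = np.round(ramp10 / 4)                                # straight rounding
dither8 = np.round((ramp10 + rng.uniform(-2, 2, 1920)) / 4)  # +-0.5 LSB(8-bit) dither first

def widest_band(a):
    """Length of the longest run of identical pixel values."""
    change = np.flatnonzero(np.diff(a) != 0)
    edges = np.concatenate(([-1], change, [len(a) - 1]))
    return int(np.diff(edges).max())

print("widest band, straight rounding:", widest_band(plain8), "px")   # broad visible steps
print("widest band, dithered:        ", widest_band(dither8), "px")   # steps broken into noise
```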

These days the main bottleneck in bit-depth in live production has been playout servers (which for a long time used 8-bit codecs), but now that ProRes and DNxHD offer 10-bit, that's less of an issue. HDCAM SR is 10-bit, as was DigiBeta.

10-bit in production pre-dates Rec.2020 by many years. It's been argued that 1080/50p with 10-bit ITU 709 would be a really clean upgrade to HD (even without HDR), but the numbers in marketing and the nuanced differences (which are definitely there on some content) are difficult to sell compared to 4K (MOAR pixels) and HDR (BRIGHTER and DARKER!)...

Thanks, I always feel like I learn a lot from your posts. There are a few areas that are still unclear to me:
  1. For a fixed bit-depth, does banding increase as the color space is enlarged (e.g. Rec. 709 to Rec. 2020)?
  2. What bit-depth is needed to exceed human vision (i.e. perfectly render a noiseless, high resolution, CIE 1931 gamut image)? In other words, where does bit-depth stop being a bottleneck?
  3. Isn't the visible bit-depth of most media reduced by 16-235 luma levels?
  4. You mention upgrading Rec. 709. Would increasing the bit-depth have a bigger impact than increasing the color space? Both seem like subtle changes to me and I wonder what matters more.

Quote:The control rooms and edit suites I see are mainly OLED these days - at least for quality monitoring. (With Dolby displays in very high end grading suites)

Don't OLEDs have relatively poor motion resolution compared to CRTs? My understanding was that OLED phosphors are significantly dimmer than CRT phosphors (~10x) and must resort to sample & hold to get a bright enough image. Is that an issue for editing video?
Reply
(2016-02-16, 02:26)wesk05 Wrote:
(2016-02-16, 00:11)Soli Wrote: OK, I didn't read a single page from this thread. (I knew where it was going)

If you didn't read a single page, how come you are posting about something that was mentioned in certain posts? Huh? :rofl:

I didn't make popcorn and read the whole thread from page 1 to the end :) just an FYI in case I missed something obvious. I did read some of the last posts, obviously. I saw noggin had commented, and he's always got something interesting to say, so I thought I'd chime in on that account, but skipped over the flaming bits :P
Reply
(2016-02-16, 03:20)Soli Wrote: I didn't make popcorn and read the whole thread from page 1 to the end :) just an FYI in case I missed something obvious. I did read some of the last posts, obviously. I saw noggin had commented, and he's always got something interesting to say, so I thought I'd chime in on that account, but skipped over the flaming bits :P

You should have made that popcorn. This thread was a really entertaining movie!
Reply
(2016-02-15, 22:32)wesk05 Wrote:
(2016-02-15, 10:06)hansolo Wrote: This is strange. Why/when Rec.2020? There are no movies graded with the Rec.2020 extended gamut, so it must be a mistake.
HEVC 10-bit makes sense for codec efficiency.

Dolby Cinema movies were graded in Rec. 2020 HDR (Star Wars: The Force Awakens, Inside Out, Tomorrowland, etc.). Christie dual-laser 4K projection is capable of covering the entire Rec. 2020 gamut. As for Netflix, it is their own productions that are in 4K UHD Rec. 2020 now. I am not sure whether they were graded in DCI-P3 or Rec. 2020, but they are using the Rec. 2020 color space for encoding and streaming.
My mistake, I was referring to existing Netflix content. Are you sure it's Rec.2020?
No current TV supports this; the current "premium HDR" sets cover 90% or more of DCI-P3, which is less than Rec.2020. A good article here
Maybe it's just an Nvidia Shield setting, with no more relevance than RGB output.


(2016-02-16, 00:11)Soli Wrote: Regarding the Rec.2020 gamut, think of it as a container. NTSC, sRGB, Adobe RGB, DCI-P3, etc. all fit almost perfectly inside the Rec.2020 gamut. What this means is that it's future-proof: we don't want to go over this again in 10 years' time. Rec.2020 is also a standard. What's more, interlaced formats are no longer supported, and progressive formats are supported up to 120fps. Expect to upgrade to HDMI 3.0 in a few years :) Everything should be in this colorspace, and everything will be in this colorspace, since this is the way forward and part of the UHD spec. But there will be metadata describing the usable colorspace inside the Rec.2020 colorspace, hence I'm referring to it as a container.
Rec.2020 is NOT a container, it's a standard.
The EOTFs for extended gamut/HDR are different, and the source must be graded for a specific workflow (a good explanation here).
As I mentioned, no current commercial TV supports Rec.2020, so there is no need for this extended gamut.
For the current state of 4K/HDR it's funny and informative to look at AVSForum
Sure, it's the future, but it's still in its infancy.

(2016-02-16, 03:00)ZwartePiet Wrote: Don't OLEDs have relatively poor motion resolution compared to CRTs? My understanding was that OLED phosphors are significantly dimmer than CRT phosphors (~10x) and must resort to sample & hold to get a bright enough image. Is that an issue for editing video?
For OLED TVs the lower motion resolution comes from the current sample-and-hold technology. Only professional OLED monitors use PWM (which has better motion resolution); it's a matter of cost and the lifespan of OLED.
Reply
I said think of it as a container. Being a container and a standard at the same time is entirely possible. Now, you might not agree with the wording "container", but then we're just discussing semantics, aren't we? As I said, the metadata defines the gamut within Rec.2020, so for the foreseeable future we'll see films graded in DCI-P3, but they'll contain metadata that defines the gamut boundaries within the Rec.2020 colorspace. In the future, when everyone has laser projectors, we won't have to switch to another standard.
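
(For the curious, this is roughly what that metadata looks like in practice. The sketch below mirrors the fields of the HEVC mastering display colour volume SEI (SMPTE ST 2086); the values are a hypothetical DCI-P3/D65 master at 0.005-1000 cd/m², carried inside a Rec.2020-encoded stream:)

```python
from dataclasses import dataclass

@dataclass
class MasteringDisplayColourVolume:
    """SMPTE ST 2086 metadata as carried in the HEVC SEI message."""
    primaries_x: tuple  # G, B, R chromaticity x, in increments of 0.00002
    primaries_y: tuple  # G, B, R chromaticity y, in increments of 0.00002
    white_x: int        # white point, same units
    white_y: int
    max_luminance: int  # units of 0.0001 cd/m2
    min_luminance: int  # units of 0.0001 cd/m2

# Hypothetical P3-D65 grade signaled inside a Rec.2020 container:
p3_master = MasteringDisplayColourVolume(
    primaries_x=(13250, 7500, 34000),  # 0.265, 0.150, 0.680
    primaries_y=(34500, 3000, 16000),  # 0.690, 0.060, 0.320
    white_x=15635, white_y=16450,      # D65 (0.3127, 0.3290)
    max_luminance=10_000_000,          # 1000 cd/m2 peak
    min_luminance=50,                  # 0.005 cd/m2 black
)
```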

But this switching-on-the-fly opens a whole can of worms regarding the calibration of such TV sets. Do we calibrate for Rec.709 and let the TV interpolate to DCI-P3 and ultimately the full Rec.2020 color space? Will we be able to calibrate for Rec.709, DCI-P3, and Rec.2020 at the same time? And if some movies use some unusual colorspace, will the TV be able to use our calibration data to interpolate correct settings on-the-fly for that colorspace? And what about HDR? Will it be able to shift the EOTF to correctly accommodate different mastering nit levels? That, my friend, is the question. And nobody knows the answer quite yet.
Reply
After a bit of research, I've come to a tentative conclusion about the following:

Quote:1) For a fixed bit-depth, does banding increase as the color space is enlarged (e.g. Rec. 709 to Rec. 2020)?
Yes, increasing the color space increases the number of visible colors. The bit-depth must have enough resolution to account for all perceivable colors within that color space. True color (24-bit) can resolve ~16 million colors; Deep color (30-bit) can resolve ~1 billion colors. However...

Quote:2) What bit-depth is needed to exceed human vision (i.e. perfectly render a noiseless, high resolution, CIE 1931 gamut image)? In other words, where does bit-depth stop being a bottleneck?
Numerous studies have shown that humans are only capable of perceiving ~10 million colors, therefore 24-bit color already exceeds human vision. It appears to me that banding is caused by the loss of information during image processing. Something as simple as a color space conversion degrades the image if done entirely in 8-bit.
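
(A small experiment supporting that last point: round-trip RGB through YCbCr using the BT.709 luma coefficients, rounding the intermediate values to integers the way an all-8-bit pipeline would. The +128 chroma offset is omitted since it doesn't change the rounding error:)

```python
import numpy as np

Kr, Kb = 0.2126, 0.0722          # BT.709 luma coefficients
Kg = 1.0 - Kr - Kb

# Full-range RGB -> YCbCr matrix derived from the coefficients above.
M = np.array([
    [Kr, Kg, Kb],
    [-0.5 * Kr / (1 - Kb), -0.5 * Kg / (1 - Kb), 0.5],
    [0.5, -0.5 * Kg / (1 - Kr), -0.5 * Kb / (1 - Kr)],
])
Minv = np.linalg.inv(M)

rgb = np.indices((16, 16, 16)).reshape(3, -1).T * 17.0  # sample the 8-bit cube

ycc = np.round(rgb @ M.T)                           # 8-bit intermediate
back = np.clip(np.round(ycc @ Minv.T), 0, 255)      # back to 8-bit RGB

errs = np.abs(back - rgb).max(axis=1)
print("colors changed by the round trip:", int((errs > 0).sum()), "of", len(rgb))
print("worst per-channel error:", int(errs.max()), "codes")
```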

A video shot and produced in deep color, then mastered in true color would presumably show no banding when played back on a true color reference monitor. The move to 10-bit then seems primarily for allowing an image enough headroom to be manipulated (and degraded) without an end-user noticing.



(2016-02-16, 08:45)hansolo Wrote: For OLED TVs the lower motion resolution comes from the current sample-and-hold technology. Only professional OLED monitors use PWM (which has better motion resolution); it's a matter of cost and the lifespan of OLED.
My understanding is that all OLEDs use PWM to control brightness. After perusing a few reference monitor brochures, it doesn't appear that they're doing anything fundamentally different to increase motion resolution.

A little bit of research tells me that motion resolution is determined by the amount of information encoded in the source -- 24fps film inherently has low motion resolution while 60fps video has high motion resolution. How that limited amount of motion resolution is presented must rest somewhere on a continuum between judder and blur. In other words, an accurately rendered image will always be somewhere between sharp/stuttery and blurry/smooth. The television controls the presentation of motion resolution through:
  1. Pixel response time
  2. Sample & hold length (SAH)
I believe that pixel response time determines how accurately (i.e. lines of resolution) a television can display motion, while SAH controls where along the judder/blur continuum the image lies. OLED panels have fast pixel response times (i.e. accurate motion) but long sample & hold lengths, owing to their low peak brightness (~600 nits). Since the eye tracks motion smoothly while each frame is held static, images appear blurry as the eye traverses a stationary frame. Without a high frame-rate source or motion interpolation, OLEDs sit close to the smooth/blurry end of the spectrum.
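
(Back-of-the-envelope numbers for that continuum, using the standard persistence-blur model; the tracking speed is a made-up example, not a measurement of any panel:)

```python
# With sample & hold, a tracked object smears across the retina by
# (eye tracking speed x frame hold time).
speed = 960.0  # px/s: an object crossing a 1920-px screen in two seconds

cases = [
    ("24fps, full-frame hold", 1 / 24),
    ("60fps, full-frame hold", 1 / 60),
    ("60fps, 25%-duty strobe/PWM", 0.25 / 60),
]
for label, hold_s in cases:
    print(f"{label:27s} -> ~{speed * hold_s:4.1f} px of smear")
```
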
Reply
(2016-02-16, 18:59)ZwartePiet Wrote: After a bit of research, I've come to a tentative conclusion about the following:

Quote:1) For a fixed bit-depth, does banding increase as the color space is enlarged (e.g. Rec. 709 to Rec. 2020)?
Yes, increasing the color space increases the number of visible colors. The bit-depth must have enough resolution to account for all perceivable colors within that color space. True color (24-bit) can resolve ~16 million colors; Deep color (30-bit) can resolve ~1 billion colors. However...

Quote:2) What bit-depth is needed to exceed human vision (i.e. perfectly render a noiseless, high resolution, CIE 1931 gamut image)? In other words, where does bit-depth stop being a bottleneck?
Numerous studies have shown that humans are only capable of perceiving ~10 million colors, therefore 24-bit color already exceeds human vision. It appears to me that banding is caused by the loss of information during image processing. Something as simple as a color space conversion degrades the image if done entirely in 8-bit.

A video shot and produced in deep color, then mastered in true color would presumably show no banding when played back on a true color reference monitor. The move to 10-bit then seems primarily for allowing an image enough headroom to be manipulated (and degraded) without an end-user noticing.
No, increasing the color space does not increase the number of visible colors, it increases the range of possible colors. Increasing bit depth increases the number of visible colors. (And by the way, for 8-bit video there are actually about 10.5 million colors, not counting rare whiter-than-white (WTW) levels.)

The problem is not the number of different colors per se, but the number of luminance steps. With 8-bit video you only have 219 steps, which is hardly enough to get a smooth grayscale even at only 0-100 cd/m².
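
(The arithmetic behind those two figures:)

```python
levels = 235 - 16 + 1     # 220 legal video codes per channel
steps = levels - 1        # 219 luminance steps on the grayscale ramp
colors = levels ** 3      # every R,G,B combination within video range
print(levels, steps, f"{colors:,}")   # 220 219 10,648,000 (~10.5 million)
```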

As for color: on our displays we don't have discrete levels for saturation/hue (in other words, the exact color) *and* separate ones for the luminance of that color. In other words, saturation/hue/luminance are always connected, and any given color/luminance combination is itself a different mixture of RGB luminances. This means you can never get exactly the same color at different luminance levels with 8 bits. (Strictly you never can, but with higher bit depth we can approximate it better.)
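
(A toy example of that coupling: take one hypothetical color expressed as a fixed R:G:B ratio and scale it to different luminance levels in 8-bit. The rounded codes can no longer hold the ratio, so the color drifts as it gets darker:)

```python
ratio = (1.0, 0.73, 0.41)   # hypothetical chromaticity as a fixed RGB mixture

for peak in (235, 60, 30, 10):
    codes = [round(peak * c) for c in ratio]
    shown = ", ".join(f"{c / codes[0]:.3f}" for c in codes)
    print(f"peak {peak:3d}: codes {codes} -> ratio actually shown: {shown}")
```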

Now, it's clear that 8-bit is not even enough for Rec.709 calibrated to around 100 cd/m². So it's only logical that we need to increase the bit depth if we are to increase the colorspace, *and* we have to increase the bit depth when expanding the luminance range (HDR). Plus, with HDR we need a new EOTF (gamma curve) to make the best use of our increased bit depth, or we would be wasting unnecessary bits on specular highlights.
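
(Here is what that new EOTF looks like, using the published SMPTE ST 2084 "PQ" constants. Note how roughly the lower half of the 10-bit code range is spent below about 100 cd/m², where our eyes are most sensitive, instead of being wasted on highlights:)

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal [0,1] -> luminance in cd/m2.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# A few 10-bit narrow-range codes (64 = black, 940 = peak, HDR10 convention):
for code in (64, 256, 512, 768, 940):
    v = (code - 64) / (940 - 64)
    print(f"code {code:3d} -> {pq_eotf(v):9.3f} cd/m2")
```
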
Reply
(2016-02-16, 18:59)ZwartePiet Wrote: My understanding is that all OLEDs use PWM to control brightness. After perusing a few reference monitor brochures, it doesn't appear that they're doing anything fundamentally different to increase motion resolution.

A little bit of research tells me that motion resolution is determined by the amount of information encoded in the source -- 24fps film inherently has low motion resolution while 60fps video has high motion resolution. How that limited amount of motion resolution is presented must rest somewhere on a continuum between judder and blur. In other words, an accurately rendered image will always be somewhere between sharp/stuttery and blurry/smooth. The television controls the presentation of motion resolution through:
  1. Pixel response time
  2. Sample & hold length (SAH)
I believe that pixel response time determines how accurately (i.e. lines of resolution) a television can display motion, while SAH controls where along the judder/blur continuum the image lies. OLED panels have fast pixel response times (i.e. accurate motion) but long sample & hold lengths, owing to their low peak brightness (~600 nits). Since the eye tracks motion smoothly while each frame is held static, images appear blurry as the eye traverses a stationary frame. Without a high frame-rate source or motion interpolation, OLEDs sit close to the smooth/blurry end of the spectrum.

No, current OLED TVs made by LG are not using PWM, but sample & hold with some weak interpolation.
A good read about motion blur and OLED: here
Reply