Kodi Community Forum
HEVC (also known as h.265) - Review - Printable Version



RE: HEVC (also known as h.265) - Review - AbRASiON - 2015-02-21

(2015-02-20, 23:51)nickr Wrote: Every re-encode loses quality.

Exactly, we're losing quality going from h.264 to h.265, and it's unfortunate that the source material is encoded in such a way.
Does anyone know if they use higher colour depth (is it 10 or 12bit or some such) on the Blu-ray encodes? Is there an industry standard that ensures the Blu-rays themselves are well encoded?


RE: HEVC (also known as h.265) - Review - nickr - 2015-02-21

How would you prefer them encoded? H.264 was certainly the best option at the time. Would you have preferred mpeg2 at double the bitrate or half the quality? Or maybe no compression and shuffle half a dozen discs to play one movie?

I am sure the basic requirements for bluray compatibility are easy to find. Start at wikipedia.


RE: HEVC (also known as h.265) - Review - DJ_Izumi - 2015-02-21

(2015-02-21, 00:23)AbRASiON Wrote: Exactly, we're losing quality going from h.264 to h.265, and it's unfortunate that the source material is encoded in such a way.
Does anyone know if they use higher colour depth (is it 10 or 12bit or some such) on the Blu-ray encodes? Is there an industry standard that ensures the Blu-rays themselves are well encoded?

You mean the Moving Picture Experts Group? Tongue


RE: HEVC (also known as h.265) - Review - AbRASiON - 2015-02-21

(2015-02-21, 00:32)nickr Wrote: How would you prefer them encoded? H.264 was certainly the best option at the time. Would you have preferred mpeg2 at double the bitrate or half the quality? Or maybe no compression and shuffle half a dozen discs to play one movie?

I am sure the basic requirements for bluray compatibility are easy to find. Start at wikipedia.

Obviously h.264 was the best option, however in the past 3 or 4 years I've seen many posts about all the different and subtle features buried in advanced h.264 encoding. The anime guys are generally the ones taking advantage of all the best features.
Just throwing bitrate at a problem won't always solve it. I'm probably being paranoid, but it would be nice to know that the Blu-rays are at least generally encoded with the best colour depth possible, for example (is it 10:10:10 or some such?).


RE: HEVC (also known as h.265) - Review - nickr - 2015-02-21

Don't forget Blu-rays are designed not to run on general purpose computers, but on specialised low power chips in Blu-ray players. They must play on lowest common denominator hardware. There is not a lot of room to experiment in the way the anime fiends do.

Pretty sure Blu-ray maxes out at 8-bit colour.


RE: HEVC (also known as h.265) - Review - poplap - 2015-02-21

Just so you know, 4k (well UHD, sadly not true 4k) Blu-rays will use HEVC with 10-bit color depth, HDR support, and up to 60 fps, plus increase the maximum standard disc size to 100GB. Though apparently there will be more DRM tech stuffed into it.

The video libraries are going to get so much larger Tongue


RE: HEVC (also known as h.265) - Review - AbRASiON - 2015-02-21

Gotta be honest, 4k doesn't interest me until I've got 3x the internet quota, 4x the internet speed and a 10TB HDD is under $149
furthermore, I need a 100" TV minimum :/

I do like better compression / quality in the video tho


RE: HEVC (also known as h.265) - Review - jjd-uk - 2015-02-21

Re-encoding from H.264 to H.265/HEVC simply makes no sense at the moment; at about 6 hours per encode it's going to take you quite some time if you've a sizable collection. However, personally I never see the need to re-encode with Handbrake or whatever. We now live in times where big storage is cheap, so why waste hours, days, or weeks encoding when the cost of a new hard disk is less than the value of my time that would be spent encoding?
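
As a rough back-of-envelope (everything below except the 6 hours per encode is an assumed example number, not a measurement):

Code:
    # Rough numbers on re-encoding a library vs just buying more disk.
    # Only the 6 hours per encode comes from the post; the rest is illustrative.
    movies = 300                 # assumed library size
    hours_per_encode = 6         # from the post above
    gb_per_movie = 20            # assumed size of an existing h.264 rip
    hevc_saving = 0.5            # often-quoted h.265 space saving, varies a lot

    total_hours = movies * hours_per_encode
    total_gb_saved = movies * gb_per_movie * hevc_saving
    print(f"{total_hours} machine-hours to free roughly {total_gb_saved:.0f} GB")
    # -> 1800 machine-hours to free roughly 3000 GB, i.e. about one extra hard disk.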


RE: HEVC (also known as h.265) - Review - nickr - 2015-02-21

(2015-02-21, 10:51)jjd-uk Wrote: Re-encoding from H.264 to H.265/HEVC simply makes no sense at the moment; at about 6 hours per encode it's going to take you quite some time if you've a sizable collection. However, personally I never see the need to re-encode with Handbrake or whatever. We now live in times where big storage is cheap, so why waste hours, days, or weeks encoding when the cost of a new hard disk is less than the value of my time that would be spent encoding?
+1


RE: HEVC (also known as h.265) - Review - DJ_Izumi - 2015-02-21

(2015-02-21, 04:44)poplap Wrote: 10-bit color depth, HDR support

10bit IS high dynamic range, so listing both is redundant. 10bit will be an OPTION. However you're going to find that the vast majority of content will be 8bit. To be frank, 10bit won't really add much to the viewer experience; sure, most cinematic cameras do 14bit color and the workflow is kept at high bit depth to allow greater latitude in manipulating the color in post production. But the majority of displays are 8bit, most storage formats for distribution are 8bit, and that many more colors doesn't really add to the experience. (If you actually look at 14bit RED RAW footage, it'll actually -look- flatter in color than 8bit until some color correction is applied in the workflow.) 8bit really represents more colors than your eye can see and anything higher doesn't offer much. Go out looking for 'amazing looking HDR images' and you're still looking at a post processing effect, 'cause all them pretty images you Google are still stored as 8bit JPEG. Tongue


RE: HEVC (also known as h.265) - Review - BLKMGK - 2015-02-22

I'm starting to rip new files with HEVC and file sizes are all over the place depending on settings. I'm using Handbrake as the front-end for this and only compressing the video after ripping from a BluRay. What settings are folks finding work best? There don't seem to be many to tweak without putting in settings by hand in the Advanced window. I've been compressing with constant quality between 17 and 16.5 and am now trying the Main10 Profile. Tips and tricks to try?
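
For comparison, here's roughly what I'm doing now, sketched as a HandBrakeCLI call driven from Python. Paths are placeholders and the flag spellings should be checked against HandBrakeCLI --help for the build you have:

Code:
    # Sketch of the settings described above, via HandBrakeCLI from Python.
    # Input/output names are placeholders; verify flags on your HandBrake build.
    import subprocess

    cmd = [
        "HandBrakeCLI",
        "-i", "ripped_bluray.mkv",   # placeholder: the already-ripped source
        "-o", "movie_hevc.mkv",      # placeholder: the HEVC result
        "-e", "x265",                # HEVC via the x265 encoder
        "-q", "17",                  # constant quality (RF), as mentioned above
    ]
    subprocess.run(cmd, check=True)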


RE: HEVC (also known as h.265) - Review - poplap - 2015-02-22

(2015-02-21, 16:59)DJ_Izumi Wrote:
(2015-02-21, 04:44)poplap Wrote: 10-bit color depth, HDR support

10bit IS high dynamic range, so listing both is redundant. 10bit will be an OPTION. However you're going to find that the vast majority of content will be 8bit. To be frank, 10bit won't really add much to the viewer experience; sure, most cinematic cameras do 14bit color and the workflow is kept at high bit depth to allow greater latitude in manipulating the color in post production. But the majority of displays are 8bit, most storage formats for distribution are 8bit, and that many more colors doesn't really add to the experience. (If you actually look at 14bit RED RAW footage, it'll actually -look- flatter in color than 8bit until some color correction is applied in the workflow.) 8bit really represents more colors than your eye can see and anything higher doesn't offer much. Go out looking for 'amazing looking HDR images' and you're still looking at a post processing effect, 'cause all them pretty images you Google are still stored as 8bit JPEG. Tongue

HDR is not 10bit. HDR is a technique of basically taking more than one image or frame (usually 3 or more) and putting them together in such a way that no part is over or underexposed. It uses complex algorithms to achieve this effect. HDR video is a new thing though, since camera tech needed to improve. HDR is post processing; nothing happens on the camera itself other than taking different exposures of the same image, which is hard with video and that's why it's new. When your iPhone takes HDR images it's taking 3 shots really quickly then stitching them together. 10bit is referring to the color range and yes, cinema workflows are already higher than 10bit; hell, the stuff that you see in movie theaters is higher than that. You can have HDR in 8bit space just fine, I should know, I have made plenty of HDR photos before.

As to the need for 10bit vs 8bit I'd say it really depends; I am fine with 8bit but I'm not going to say no to more data in the video. Though I will say all video looks flat until color correction is completed, that's why they do that anyway. 10bit will be part of the standard though, so in theory all new Blu-rays that use the new standard will be in 10bit, though HDR is optional (though I'm not sure why it's listed as part of the standard since it doesn't really affect the Blu-ray itself). As well, the bit depth doesn't change the color space, just the fineness of the gradient within said color space; think of it more as the resolution of the color space than as representing more colors (see the quick numbers at the end of this post). Most color spaces used on computers don't encompass the full range of the human eye (usually cutting the green off a bit). Cinema standards (usually DCI for newer digital stuff) have rather large coverage but still do not cover all the colors the human eye can see (we see a hell of a lot of green).

Also I will say not every photo on Google is an 8bit JPEG, since, well, Google doesn't store the photos themselves; I have uploaded photos in TIFF, JPEG, etc. in various bit depths. And yeah, consumer screens don't really do 10bit; hell, some don't even do true 8bit, they might be doing 6bit or even 4bit with some post processing in the screen driver to clean it up, though that's usually the cheaper or older screens these days. But hell, if people are upgrading to 4K UHD they could easily upgrade the color depth to 10bit.
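
To put some quick numbers on the "resolution of the color space" point above (this is only about how finely each channel's gradient is sliced, it says nothing about gamut):

Code:
    # Per-channel steps and total representable values at each bit depth.
    # Bit depth changes the number of steps, not the gamut being covered.
    for bits in (8, 10, 12, 14):
        levels = 2 ** bits          # steps per channel
        total = levels ** 3         # R x G x B combinations
        print(f"{bits}-bit: {levels} levels per channel, {total:,} values total")
    # 8-bit:  256 levels per channel, ~16.8 million values
    # 10-bit: 1024 levels per channel, ~1.07 billion values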


RE: HEVC (also known as h.265) - Review - DJ_Izumi - 2015-02-22

(2015-02-22, 03:07)poplap Wrote: HDR is not 10bit. HDR is a technique of basically taking more than one image or frame (usually 3 or more) and putting them together in such a way that no part is over or underexposed. It uses complex algorithms to achieve this effect. HDR video is a new thing though, since camera tech needed to improve. HDR is post processing; nothing happens on the camera itself other than taking different exposures of the same image, which is hard with video and that's why it's new. When your iPhone takes HDR images it's taking 3 shots really quickly then stitching them together. 10bit is referring to the color range and yes, cinema workflows are already higher than 10bit; hell, the stuff that you see in movie theaters is higher than that. You can have HDR in 8bit space just fine, I should know, I have made plenty of HDR photos before.

No, that's HDR bracketing, which is a tried and tested way of getting more dynamic range out of a sensor limited in its dynamic range. There are sensors that can capture higher dynamic range in a single frame, such as the commercially available Red Dragon, which reaches about 16 stops without even using its HDRx mode (which is a form of bracketing). Similarly, in the computer generated renders I do at work, we see monstrously high dynamic range as we spit out our renders at 32bit float, because that level of dynamic range gives us a tremendous amount of latitude in compositing.
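
For a sense of scale: a stop is a doubling of light, so dynamic range in stops maps to a contrast ratio of 2^stops (just the arithmetic, nothing camera specific):

Code:
    # A "stop" doubles the amount of light, so dynamic range in stops maps
    # to a contrast ratio of 2**stops between the brightest and darkest detail.
    for stops in (10, 14, 16):
        print(f"{stops} stops -> about {2 ** stops:,}:1 usable contrast")
    # 16 stops is roughly 65,536:1, which is the latitude available before the
    # footage is graded down to an 8-bit or 10-bit delivery format.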


RE: HEVC (also known as h.265) - Review - AbRASiON - 2015-02-22

(2015-02-21, 16:59)DJ_Izumi Wrote:
(2015-02-21, 04:44)poplap Wrote: 10-bit color depth, HDR support

10bit IS high dynamic range, so listing both is redundant. 10bit will be an OPTION. However you're going to find that the vast majority of content will be 8bit. To be frank, 10bit won't really add much to the viewer experience; sure, most cinematic cameras do 14bit color and the workflow is kept at high bit depth to allow greater latitude in manipulating the color in post production. But the majority of displays are 8bit, most storage formats for distribution are 8bit, and that many more colors doesn't really add to the experience. (If you actually look at 14bit RED RAW footage, it'll actually -look- flatter in color than 8bit until some color correction is applied in the workflow.) 8bit really represents more colors than your eye can see and anything higher doesn't offer much. Go out looking for 'amazing looking HDR images' and you're still looking at a post processing effect, 'cause all them pretty images you Google are still stored as 8bit JPEG. Tongue

What you're saying makes sense and I realise I've created a really pedantic subthread here, but I'm being a perfectionist because sometimes it's good.
We may well 'throw out' (delete?) our media in the next 5 years, I don't know, but ultimately in the next 3 to 10 years we're going to have OLED displays with insane colour depth; perhaps the full 14bit colour range could be realised (I don't know, this is complete speculation on my part).

Wouldn't it be nice to know that the files we're encoding and in fact the media we're buying contain the best possible version of the movie / TV available?
I certainly know I see some weird stuff in some movies on my plasma, and I have to wonder if people haven't encoded the films using an LCD thinking "this is good enough".


RE: HEVC (also known as h.265) - Review - twelvebore - 2015-02-22

(2015-02-22, 10:03)AbRASiON Wrote: Wouldn't it be nice to know that the files we're encoding and in fact the media we're buying contain the best possible version of the movie / TV available?

You're talking about an industry that gave you the opportunity to buy its products on VHS, then pay for it again on DVD (which was "better"), then pay for it again on Blu-ray (because it was "better"). You think they're going to give you the future "better" now, and miss out on the opportunity of having you pay for it a fourth time?