Android Popcorn Hour RockBox Claims 4K@60 HDR 10bit w/Kodi fork RKMC
#1
Stumbled upon this. It's a Rockchip RK3328-based Android box claiming to handle 4K@60 HDR 10-bit with Rockchip's Kodi fork, RKMC. Anyone have any insights? CloudMedia/Popcorn Hour has done a decent job to date with non-Android boxes like the A-400/500 series, but I'm skeptical of this one.

I've only found this brief impression from a user at AVSForum.
#2
How can it do HLG with a Mali-450? The Odroid C2 has a Mali-450 too, and HDR or HLG is not possible on it.

Sent from my ONEPLUS A3003 using Tapatalk
#3
(2017-11-22, 19:23)flyingernst Wrote: How can it do HLG with a Mali-450? The Odroid C2 has a Mali-450 too, and HDR or HLG is not possible on it.

Isn't it the VPU that dictates most of the video functionality, not the GPU? The Mali-450 is the GPU, but it's the VPU that handles hardware video decoding. They may share the same GPU, but I doubt they share the same VPU. However, since a GPU-rendered GUI would also need to be output in PQ or HLG, I don't know where that comes into play.

ISTR (I may be wrong) that the C2 (and other S905 devices) have 10-bit decode but only an 8-bit output path from the VPU, so HDR10/PQ and HLG content, which requires a 10-bit path to meet the standard, wouldn't be played back properly. It could also be that things like the HDMI InfoFrames or VICs that carry the HDR flagging can't be set on the older S905 VPU/GPU/HDMI output combination.
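
To make the "HDR flagging" concrete: at the HDMI level it boils down to the CTA-861 Dynamic Range and Mastering (DRM) InfoFrame, which carries the EOTF (SDR, PQ or HLG) plus the SMPTE ST 2086 static metadata. Here's a rough Python sketch of that 26-byte payload, purely for illustration; it isn't any vendor's driver code, and the numbers in the example are just typical HDR10 values.

```python
# Illustrative sketch only (not AMLogic or Rockchip driver code): the "HDR flagging"
# on HDMI is the CTA-861 Dynamic Range and Mastering (DRM) InfoFrame, a 26-byte
# payload carrying the EOTF plus SMPTE ST 2086 static metadata. If a SoC's HDMI TX
# block can't emit this packet, 10-bit decode alone won't get you an HDR picture.
import struct

EOTF_SDR, EOTF_HDR_GAMMA, EOTF_PQ, EOTF_HLG = 0, 1, 2, 3  # CTA-861-G EOTF codes

def drm_infoframe_payload(eotf, primaries, white, max_dml, min_dml, max_cll, max_fall):
    """Pack the 26-byte payload: EOTF, metadata descriptor ID, display primaries,
    white point, mastering luminance (max in cd/m2, min in 0.0001 cd/m2), MaxCLL, MaxFALL."""
    body = struct.pack("<BB", eotf, 0)              # descriptor ID 0 = Static Metadata Type 1
    for x, y in primaries + [white]:                # chromaticities in units of 0.00002
        body += struct.pack("<HH", x, y)
    body += struct.pack("<HHHH", max_dml, min_dml, max_cll, max_fall)
    return body

# Example: HDR10 (PQ) with BT.2020 primaries and a 1000-nit mastering display.
bt2020 = [(35400, 14600), (8500, 39850), (6550, 2300)]    # R, G, B
payload = drm_infoframe_payload(EOTF_PQ, bt2020, (15635, 16450), 1000, 50, 1000, 400)
assert len(payload) == 26
```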
#4
(2017-11-23, 12:43)noggin Wrote: It could also be that things like the HDMI InfoFrames or VICs that carry the HDR flagging can't be set on the older S905 VPU/GPU/HDMI output combination.

Chips older than GXL (S905X) don't have amvecm (the Amlogic Video Enhancement Module).
This limits the level of post-processing that can be performed, particularly with regard to colour management.

The HDMI TX module is the same, although there are soft checks that will prevent signalling AV InfoFrames which would have no effect.
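
For illustration, a "soft check" of that kind amounts to looking at the sink's EDID before sending anything: if the CTA-861 extension has no HDR Static Metadata Data Block, a DRM InfoFrame would have no effect. A minimal Python sketch (not the actual kernel code; the sysfs path in the comment varies by platform):

```python
# Minimal sketch of an EDID-based HDR capability check (illustrative, not kernel code).
def sink_supported_eotfs(cta_ext: bytes) -> dict:
    """Parse a 128-byte CTA-861 extension block and report which EOTFs the sink claims."""
    assert cta_ext[0] == 0x02, "not a CTA-861 extension block"
    dtd_offset = cta_ext[2]          # data block collection runs from byte 4 to here
    i = 4
    while i < dtd_offset:
        tag, length = cta_ext[i] >> 5, cta_ext[i] & 0x1F
        # tag 7 = "use extended tag"; extended tag 0x06 = HDR Static Metadata Data Block
        if tag == 7 and length >= 2 and cta_ext[i + 1] == 0x06:
            eotfs = cta_ext[i + 2]   # bit 0: SDR, 1: traditional HDR, 2: PQ/HDR10, 3: HLG
            return {"sdr": bool(eotfs & 1), "hdr_gamma": bool(eotfs & 2),
                    "pq_hdr10": bool(eotfs & 4), "hlg": bool(eotfs & 8)}
        i += length + 1
    return {"sdr": True, "hdr_gamma": False, "pq_hdr10": False, "hlg": False}

# On many Linux boxes the raw EDID can be read from somewhere like
# /sys/class/drm/card0-HDMI-A-1/edid (path varies by platform/kernel); the CTA-861
# extension is the second 128-byte block when the base EDID reports one extension.
```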
#5
Regarding "10-bit decode but only 8-bit output paths":

Does that mean some of the video features, such as HDR, wide colour gamut, colour volume etc., are degraded because of the 10-bit to 8-bit reduction?

Are there any SoCs that have a complete 10-bit path? The S912, maybe?
#6
(2017-11-24, 12:38)JustAnotherUser Wrote: Regarding "10-bit decode but only 8-bit output paths":

Does that mean some of the video features, such as HDR, wide colour gamut, colour volume etc., are degraded because of the 10-bit to 8-bit reduction?

Are there any SoCs that have a complete 10-bit path? The S912, maybe?
S905X and S912 both have amvecm.
There is no Dolby Vision on the S905X, and it's not really working on the S912 at this time.
#7
(2017-11-24, 12:38)JustAnotherUser Wrote: Regarding "10-bit decode but only 8-bit output paths":

Does that mean some of the video features, such as HDR, wide colour gamut, colour volume etc., are degraded because of the 10-bit to 8-bit reduction?

Are there any SoCs that have a complete 10-bit path? The S912, maybe?
I thought the HDR10-compatible S905X and S912 had 10-bit decode and output; it was only the non-HDR S905 that may be limited to 10-bit decode with an 8-bit output.

(HLG compatibility in replay is also possible on the S905X, and I guess the S912, but I don't know if it is flagged in the HDMI InfoFrames. HLG doesn't have any metadata requirements and is identical to an SDR Rec.2020 video signal in carriage terms, as it is backwards compatible with SDR Rec.2020. If you play HLG content in SDR mode and then switch to HLG mode on an HDR TV, you can see how it works: as you switch, most of the picture looks identical, but all the highlights suddenly get brighter and show more detail, which is one of the cleverest bits of HLG. For most TV viewing set-ups, HDR10 and Dolby Vision probably offer no real advantages over HLG, and may in fact be less advantageous. You really do need 10-bit paths, though.)
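
To put a number or two on that backwards compatibility, here's a small sketch of the BT.2100 HLG OETF (nothing device-specific, just the published curve): everything at or below 1/12 of peak scene light maps through the familiar square-root camera curve, and only the top half of the signal range carries the extra highlight information.

```python
# Sketch of the BT.2100 HLG OETF (scene light -> signal), to show where the
# SDR-compatible part ends and the logarithmic highlight region begins.
import math

A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map scene-linear light e in [0, 1] to an HLG signal value in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # the SDR-style square-root segment
    return A * math.log(12 * e - B) + C  # the logarithmic highlight segment

for e in (0.01, 0.05, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:6.4f} -> signal {hlg_oetf(e):.3f}")
# Signal values up to 0.5 come from the square-root segment an SDR display already
# expects; only values above 0.5 carry the stretched highlights, which is why a
# clean 10-bit path matters so much once they are re-expanded on an HDR display.
```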
#8
(2017-11-25, 11:14)noggin Wrote:
I thought the HDR10-compatible S905X and S912 had 10-bit decode and output; it was only the non-HDR S905 that may be limited to 10-bit decode with an 8-bit output.

Even I'm not sure. I just did a Spears 2160p HEVC rotating 8-bit and 10-bit quantization artifact test:

https://drive.google.com/file/d/0B68jIlC...sp=sharing

And I'm definitely seeing reduced color banding in the 10-bit window versus the 8-bit one when played on the S905 C2 and displayed on my 10-bit 4K TV (8-bit + FRC).
This is also using a standard AML Linux kernel with no HDR modifications (Marshmallow-based).

As Sam N. said, you definitely need an S905X / S912 for HDR BT.2020 output from the upgraded Amlogic Video Enhancement Module, and naturally a compatible HDR AML Linux (Nougat) kernel.
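
For anyone who hasn't seen that kind of pattern, the effect it exposes is easy to reproduce numerically; this toy sketch (not the test pattern itself) just quantises a smooth ramp at 8 and 10 bits to show why the coarser steps read as banding.

```python
# Toy illustration of 8-bit vs 10-bit quantisation on a smooth ramp (not the actual
# test pattern): the 8-bit version has 4x coarser steps, which is what shows up as
# visible banding on slow gradients.
def quantise(value: float, bits: int) -> int:
    """Map a 0..1 value onto an n-bit code."""
    return round(value * ((1 << bits) - 1))

ramp = [i / 9999 for i in range(10000)]           # a smooth 0..1 gradient
codes_8 = {quantise(v, 8) for v in ramp}
codes_10 = {quantise(v, 10) for v in ramp}
print(f"8-bit:  {len(codes_8):4d} distinct codes, step = {1 / 255:.5f} of full range")
print(f"10-bit: {len(codes_10):4d} distinct codes, step = {1 / 1023:.5f} of full range")
```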

#9
Why get it when you can get something you can put LibreELEC and actual Kodi on, like an S905X or S912? It seems like you're just asking for problems, and you still won't get actual Kodi.
#10
Man, I remember the last Popcorn Hour device and all the users from their forum coming over here. They all got banned, right?!

#11
(2017-11-25, 12:29)wrxtasy Wrote:
Even I'm not sure. I just did a Spears 2160p HEVC rotating 8-bit and 10-bit quantization artifact test:

https://drive.google.com/file/d/0B68jIlC...sp=sharing

And I'm definitely seeing reduced color banding in the 10-bit window versus the 8-bit one when played on the S905 C2 and displayed on my 10-bit 4K TV (8-bit + FRC).
This is also using a standard AML Linux kernel with no HDR modifications (Marshmallow-based).

As Sam N. said, you definitely need an S905X / S912 for HDR BT.2020 output from the upgraded Amlogic Video Enhancement Module, and naturally a compatible HDR AML Linux (Nougat) kernel.

I've got both S905 and S905X boxes, so I'll have a look.

I've currently got a 10-bit + FALD Sony XE900 display, so I may be able to see some differences (though beware: the current firmware on that TV has major issues in many modes with 50/59.94Hz content, as it drops and repeats 50p frame pairs...).
