YCbCr 444 vs YCbCr 422... Why the difference?

Discussion in 'ZIDOO X9S' started by litlgi74, Jan 9, 2018.

  1. ursiel

    ursiel Active Member

    Regarding the banding, what puzzles me is that if I play the UHD/HDR test file with the strong banding directly in my TV's media player, from a USB 3 thumb drive, there are no banding issues (it's extremely mild). That's also how it looks with the Zidoo if I use AUTO, YCbCr 444, or YCbCr 420 (which might actually be sending out YCbCr 444) - but only if I use 12bit. If I use 10bit with those settings, it's not terrible, but there is a bit more banding than at 12bit. That makes no sense, because the file is not 12bit color, it's 10bit, so it should look the same whether the Zidoo is set to 10bit or 12bit. But 12bit definitely looks better - as good as when I play it in my TV. So my panel is certainly capable of displaying the file with minimal banding; I just have to set the Zidoo to 12bit to get that, and I don't know why 10bit doesn't look just as good. My panel certainly isn't 12bit, so 10bit should be all I need to send.

    And of course in YCbCr 422 or RGB 444 the deep color setting has no effect at all - the banding is extremely bad regardless of the deep color setting. This was all covered in my post on banding in the 2.0.5 firmware thread.
     
  2. n_p

    n_p Active Member

    It's probably correct. ;) The only thing that made my mind do logic overrides is the following - in the same 'Play with' screen I can choose between

    - ZidooPlayer (default)
    - DVDPlayer

    DVDPlayer is the usual Kodi player core (it has nothing to do with DVDs per se); ZidooPlayer is the one Zidoo developed, and should be the same one that gets used when you start a file via the X9S File Browser.

    DVDPlayer may have been renamed to Video Player in Kodi 17.

    There is no third "ZDMC player"; we are either talking about the player developed by the Kodi team or the player developed by Zidoo. On that video we apparently get different defaults between different Kodi versions (my ZDMC is still based on Kodi 16), but yeah - ok, that's apparently a thing. :)

    Also - don't start second-guessing your observations. I did the same tests (on an LG B6) and got the same results. It's Zidoo's fault.

    They probably do the internal rendering in 12 bit, then downconvert the result to 10 bit, probably with an 8 bit algo.
    Also, God knows what they do in 4:2:2 - because the deep color setting obviously doesn't work in that mode (yet all 10 bit content actually is mastered in 4:2:2).
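
    To illustrate the kind of error that theory implies (just a sketch of the suspected failure mode, not Zidoo's actual code): a 12 bit frame reduced to 10 bit by first quantising at 8 bit precision ends up with only 256 distinct levels instead of 1024, which is exactly the coarser stepping that shows up as banding.

        import numpy as np

        # A 12-bit gradient standing in for the player's internal 12-bit render.
        ramp12 = np.arange(0, 4096, dtype=np.uint16)

        # Proper 12 -> 10 bit conversion: drop the two least significant bits
        # (rounding or dithering would be better still).
        good10 = ramp12 >> 2                  # 1024 distinct levels

        # Suspected sloppy path: quantise to 8-bit precision first, then pad back out.
        bad10 = (ramp12 >> 4) << 2            # only 256 distinct levels survive

        print(len(np.unique(good10)))         # 1024 -> smooth gradient
        print(len(np.unique(bad10)))          # 256  -> 4x coarser steps = visible banding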
     
    ursiel likes this.
  3. tosh123

    tosh123 Member

  4. ursiel

    ursiel Active Member

    Okay, so according to that chart:

    10bit 420 only supports 4k@50/60. It does not support 4k@24/25/30.
    10bit 444 only supports 4k@24/25/30. It does not support 4k@50/60.
    12bit 444 and 4:2:2 support everything - 4k@24/25/30/50/60.
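
    As a rough sanity check on those entries, here is the bandwidth arithmetic (a sketch only - it assumes the nominal CTA-861 pixel clocks of 297 MHz for 2160p24/25/30 and 594 MHz for 2160p50/60, HDMI 2.0's 600 MHz TMDS limit, that 4:2:2 always travels in a fixed 12 bit container, and that 4:2:0 is only defined at 50/60 Hz):

        # Rough HDMI 2.0 feasibility check: TMDS character rate must stay <= 600 MHz.
        PIXEL_CLOCK_MHZ = {"2160p24/25/30": 297.0, "2160p50/60": 594.0}
        TMDS_LIMIT_MHZ = 600.0  # 18 Gbps / (3 lanes x 10 bits per character)

        def tmds_clock(pixel_clock, subsampling, bits):
            if subsampling == "4:2:2":            # fixed 12-bit container, depth-independent
                return pixel_clock
            if subsampling == "4:2:0":            # half the 4:4:4 clock at a given depth
                return pixel_clock / 2 * bits / 8
            return pixel_clock * bits / 8         # 4:4:4 (and RGB)

        for mode, clk in PIXEL_CLOCK_MHZ.items():
            for sub in ("4:4:4", "4:2:2", "4:2:0"):
                for bits in (8, 10, 12):
                    rate = tmds_clock(clk, sub, bits)
                    fits = "OK" if rate <= TMDS_LIMIT_MHZ else "over the limit"
                    print(f"{mode:14} {sub} {bits:>2} bit -> {rate:6.1f} MHz  {fits}")

    By that arithmetic only 4:2:2 and 4:2:0 leave room for 10/12 bit at 50/60 Hz, and 4:2:0 simply isn't defined below 50 Hz, which would line up with why the chart splits the 10bit modes by frame rate.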

    So it sounds like the Zidoo might actually be doing the right thing? If you set the player to output 10bit 420 and try to play 4k@24/25/30, the spec doesn't allow that combination, so it forces 444. If so, then when the player is set to 10bit 444 and you play 4k@50/60, it should do the same thing in reverse - force a compatible color mode (420 in this example). Does it do that? Can somebody check whether the Zidoo forces 422 or 444 at 4k depending on the video's frame rate? If the player really is handling this properly, it would be nice to know, so we could scratch it off our list of issues. Of course, Zidoo could also chime in here and confirm, but yeah...

    Can somebody check on this, then? Is it possible that the Zidoo box knows to switch to 444 when playing 24/25/30p, and that's why it seems to force 444? Could that be why it seems to output 444 when set to 422?
     
  5. ursiel

    ursiel Active Member

    According to the chart linked to above, 422 doesn't support 10bit at all, at any frame rate. 12bit supports both 444 and 420 at all frame rates.

    Your theory is interesting - that they render everything at 12bit internally, and the 10bit output is just a downconversion using a less-than-ideal algorithm. That could explain why 10bit looks worse than 12bit, even though the content itself is 10bit and both output settings should therefore look the same when playing 10bit content.

    Ideally we should just use 12bit 444, as that seems to support all frame rates, but due to the frame dropping issue, that's a problem.

    I'd love to hear your thoughts on all this.
     
    Last edited: Feb 28, 2018
