Video MP4 DV in 8-bit instead of 10-bit on my receiver !!!

Discussion in 'HDD Media player(RTD 1619DR)' started by OlivierQC, Sep 12, 2020.

  1. OlivierQC

    OlivierQC Well-Known Member

    Hello,

    I need your expertise because I do not know whether this is normal or whether my equipment has a problem or a bad setting.

    Here are my devices:

    Sony 77A9G, with all HDMI inputs set to Enhanced (the TV is up to date)

    Monster Black Platinum cable

    Pioneer SC-LX704

    Monster Black Platinum cable

    Zidoo Z9X with the latest FW


    Here are the settings of my Zidoo:

    [screenshots of the Zidoo settings]

    -----------

    When I watch a UHD HDR MKV movie, the Pioneer gives me this information:

    [screenshot of the Pioneer's signal info for the MKV]

    -----------

    When I watch a Dolby Vision MP4 movie, the Pioneer gives me this information:

    [screenshot of the Pioneer's signal info for the MP4]

    The Pioneer indicates 8-bit instead of 10-bit with the MP4 video file.

    -----------

    Is it my Pioneer receiver that does not recognize the 10-bit signal of the DV video file?

    And when I compare the two versions (MKV HDR and MP4 DV) on my two Zidoo devices, I cannot see any difference in quality with the MP4 file: the colors are as beautiful as in the MKV HDR version, there is no banding, etc., as if it really were 10-bit but the receiver reported 8-bit...

    -----------

    Here is the NFO of the MP4 video

    [screenshot of the MP4's NFO/MediaInfo]

    -----------

    Could you give me your opinion/advice?


    Thank you in advance for your help

    OlivierQC
     
    Last edited: Sep 13, 2020
  2. DennisTheMenace

    DennisTheMenace Active Member

    Last edited: Sep 12, 2020
    OlivierQC likes this.
  3. DennisTheMenace

    DennisTheMenace Active Member

    The method Dolby Vision (DV) uses to transport the signal over HDMI is referred to as “RGB tunneling”: the 12-bit ICtCp DV signal plus its metadata is encapsulated inside a regular 8-bit RGB video signal. In other words, the DV “tunnel” carries 12-bit 4:2:2 data in an RGB 4:4:4 8-bit transport. This is possible because, at 4K24/30, both signal formats have the same 8.9 Gbps data-rate requirement.
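
    A minimal sketch of the data-rate arithmetic behind that claim (my own illustration, assuming the standard CTA-861 4K24 pixel clock of 297 MHz and 10 transmitted TMDS bits per 8 data bits per channel; neither figure comes from this thread):

    Code:
    # Rough HDMI TMDS data-rate arithmetic for the 4K24 formats discussed above.
    # Assumptions: 297 MHz pixel clock for 4K24 (CTA-861), 3 TMDS channels,
    # 10 transmitted bits per channel per pixel clock.

    PIXEL_CLOCK_4K24_MHZ = 297.0
    TMDS_CHANNELS = 3

    def tmds_gbps(pixel_clock_mhz, bits_per_component, chroma="4:4:4"):
        """Approximate TMDS link rate in Gbps for one video format."""
        if chroma == "4:2:2":
            # HDMI carries YCbCr 4:2:2 (up to 12-bit) at the base pixel clock,
            # i.e. the same link rate as 8-bit 4:4:4.
            clock = pixel_clock_mhz
        else:
            # Deep-colour 4:4:4 scales the TMDS clock with the bit depth.
            clock = pixel_clock_mhz * bits_per_component / 8.0
        return clock * TMDS_CHANNELS * 10 / 1000.0

    print(tmds_gbps(PIXEL_CLOCK_4K24_MHZ, 8,  "4:4:4"))   # 8.91   -> the 8-bit RGB "tunnel"
    print(tmds_gbps(PIXEL_CLOCK_4K24_MHZ, 12, "4:2:2"))   # 8.91   -> 12-bit YCbCr 4:2:2, same rate
    print(tmds_gbps(PIXEL_CLOCK_4K24_MHZ, 12, "4:4:4"))   # 13.365 -> 12-bit RGB 4:4:4 would not fit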

    DV requires dynamic luminance metadata, which cannot be explicitly carried in an HDMI 2.0 (18 Gbps max) data stream, so it was designed to be transportable even over HDMI 1.4 (whose bandwidth covers the ~8.9 Gbps signal), at least up to 4K@30. The DV base content and the DV luminance (meta)data are encapsulated in an HDMI 1.4-compatible (except for HDCP 2.2) RGB 4:4:4 8-bit video stream. That is why Dolby claims DV can be sent via HDMI 1.4, but in reality HDMI 2.0 is needed due to the HDCP 2.2 encryption.

    The DV metadata is encoded into the least significant bits of the chroma channels. During the HDMI EDID exchange (handshake), the sink (AVR, display, or HDMI switch) signals the source that it supports Dolby Vision “tunneling”. The source then signals the sink that it is transmitting Dolby Vision through a Dolby vendor-specific InfoFrame, which triggers the Dolby Vision mode in the sink. The display's DV engine extracts the components and produces a tone-mapped image.

    As a result, video pass-through components must be DV 'aware' so that they do not alter the signal, which is in effect 'hidden' inside the 8-bit RGB 'container'.
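
    A toy Python illustration of that least-significant-bit idea (my own sketch; the real Dolby Vision packing is proprietary and far more elaborate, so treat this only as a picture of why a bit-exact pass-through matters):

    Code:
    # Toy LSB "tunneling": hide metadata bits in the least significant bits of
    # 8-bit samples -- exactly the kind of data that is destroyed if any device
    # in the chain re-quantizes, dithers or colour-converts the video.
    # (Illustration only, NOT the actual Dolby Vision bitstream format.)

    def embed_bits(samples, meta_bits):
        """Overwrite the LSB of each 8-bit sample with one metadata bit."""
        out = list(samples)
        for i, bit in enumerate(meta_bits):
            out[i] = (out[i] & 0xFE) | bit
        return out

    def extract_bits(samples, n):
        """Read the first n embedded bits back out of the LSBs."""
        return [s & 1 for s in samples[:n]]

    frame = [128, 64, 200, 33, 17, 250, 90, 101]   # pretend 8-bit chroma samples
    meta  = [1, 0, 1, 1, 0, 1, 0, 0]               # pretend DV metadata bits

    tunnelled = embed_bits(frame, meta)
    assert extract_bits(tunnelled, len(meta)) == meta   # survives a bit-exact chain

    # Any "helpful" processing breaks it, e.g. rounding away the LSBs:
    processed = [s & 0xFE for s in tunnelled]
    assert extract_bits(processed, len(meta)) != meta   # metadata lost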

    AVRs may report DV signals in one of two ways, but both are correct:

    Resolution: 4K 24Hz -> 4K 24Hz

    HDR: Dolby Vision

    Color Space: RGB 4:4:4 -> RGB 4:4:4 -OR- YCbCr 4:2:2 -> YCbCr 4:2:2

    Color Depth: 8 bits -> 8 bits -OR- 12 bits -> 12 bits
     
  4. OlivierQC

    OlivierQC Well-Known Member

    Hello Dennis,

    Well, that's good to hear, and it's very informative.

    Thank you so much
     
  5. DaMacFunkin

    DaMacFunkin Active Member

    For your info, if you switch to LLDV output it should then show 12-bit, as the player is decoding the stream itself instead of tunnelling it to the TV.
     
    Reza-D likes this.
  6. OlivierQC

    OlivierQC Well-Known Member
