Hello, I need your expertise because I don't know whether this is normal or whether my equipment has a problem or a bad setting.

Here are my devices:
- Sony 77A9G (all HDMI inputs set to Enhanced; the TV is up to date)
- Monster Black Platinum cable
- Pioneer SC-LX704
- Monster Black Platinum cable
- Zidoo Z9X with the latest FW

Here are the settings of my Zidoo:
-----------
When I watch a UHD HDR MKV movie, the Pioneer gives me this information:
-----------
When I watch a Dolby Vision MP4 movie, the Pioneer gives me this information: the Pioneer indicates 8-bit instead of 10-bit with the MP4 video file.
-----------
Is it my Pioneer receiver that does not recognize the 10-bit signal of the DV video file?

And when I compare the two versions (MKV HDR and MP4 DV) on my two Zidoo devices, I can't distinguish any difference in quality in the MP4 video file: the colors are as beautiful as in the MKV HDR version, there is no banding, etc., as if it really were 10-bit but the receiver indicated 8-bit...
-----------
Here is the NFO of the MP4 video
-----------
Could you give me your opinion/advice?

Thank you in advance for your help,
OlivierQC
That's normal. All my HDMI Dolby Vision devices use 8-bit (Nvidia Shield, Xbox One X, my Ugoos AM6 Plus), and an Apple TV too. See here: https://discussions.apple.com/thread/250715577

It's the way the Dolby Vision signal gets packed before it's transferred over HDMI. It's called Dolby Vision tunnelling: https://www.nvidia.com/en-us/geforc...62/netflix-dolby-vision-switches-to-8bit-rgb/
The method Dolby Vision (DV) uses to transport the signal over HDMI is referred to as "RGB tunneling": the 12-bit ICtCp DV signal plus metadata is encapsulated inside a regular 8-bit RGB video signal. The DV "tunnel" carries 12-bit YCbCr 4:2:2 data in an RGB 4:4:4 8-bit transport. This is possible because both signal formats have the same 8.9 Gbps data-rate requirement.

DV requires dynamic luminance metadata, which cannot be explicitly carried in an HDMI 2.0 (18 Gbps max) data stream, so it was designed to transport over HDMI 1.4 (8.9 Gbps max), at least up to 4K@30. The DV base content and DV luminance (meta)data are encapsulated in an HDMI 1.4-compatible (except for HDCP 2.2) RGB 4:4:4 8-bit video stream. That's why Dolby claims DV can be sent via HDMI 1.4, but in reality HDMI 2.0 is needed because of the HDCP 2.2 encryption. The DV metadata is encoded into the least significant bits of the chroma channels.

During the HDMI EDID exchange (handshake), the sink (AVR, display, or HDMI switch) signals the source that it supports Dolby Vision "tunneling". The source then signals the sink that it's transmitting Dolby Vision through an AVI InfoFrame, which triggers the Dolby Vision mode in the sink. The display's DV engine extracts the components and produces a tone-mapped image. As a result, video pass-through components must be DV-aware so they don't alter the signal, which is in effect hidden inside the 8-bit RGB container.

AVRs may report DV signals in one of two ways, and both are correct:

Resolution: 4K:24Hz -> 4K:24Hz
HDR: Dolby Vision
Color Space: RGB 4:4:4 -> RGB 4:4:4 -OR- YCbCr 4:2:2 -> YCbCr 4:2:2
Color Depth: 8 bits -> 8 bits -OR- 12 bits -> 12 bits
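The "same data rate" claim above is easy to sanity-check with back-of-the-envelope arithmetic. Here is a minimal sketch; the 297 MHz pixel clock for 4K@30 (including blanking) and the TMDS 8b/10b encoding overhead are standard HDMI figures rather than numbers taken from this thread:

```python
# Why 12-bit YCbCr 4:2:2 fits in an 8-bit RGB 4:4:4 transport:
# both carry the same number of bits per pixel on the HDMI link.

rgb_444_8bit = 3 * 8         # R, G, B at 8 bits each              = 24 bits/pixel
ycbcr_422_12bit = 12 + 12    # 12-bit Y + alternating 12-bit Cb/Cr = 24 bits/pixel

assert rgb_444_8bit == ycbcr_422_12bit == 24  # identical payload per pixel

# 4K@30 uses a 297 MHz pixel clock (with blanking); TMDS encodes 8b/10b.
pixel_clock_hz = 297_000_000
tmds_overhead = 10 / 8

link_rate_gbps = rgb_444_8bit * pixel_clock_hz * tmds_overhead / 1e9
print(f"{link_rate_gbps:.2f} Gbps")  # -> 8.91 Gbps, the HDMI 1.4 ceiling
```

Either pixel encoding lands on the same ~8.9 Gbps figure, which is why an HDMI 1.4-class link can carry the tunneled DV stream at 4K@30.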
For your info, if you switch to LLDV output it should then show 12-bit, because the player is decoding the stream itself instead of tunnelling it to the TV.
Hello Dennis, markswift2003 has another theory: https://www.avsforum.com/threads/zidoo-z9x-rtd1619-thread.3140924/post-60097992 See you