Hey guys, I was browsing around this forum and there's tons of info! I hope you don't mind me asking a few questions that otherwise might have taken days to find answers to.

I'll be playing UHD 4K Blu-ray MKV files with HDR and some with DV. After using the Z9X for a bit, I get constant HDMI re-syncs every time I start or stop a movie. I primarily watch 1080p Blu-ray files or 4K HDR (and soon DV) files. Are there any particular settings I should adjust to reduce the HDMI sync issues while staying as close to the original source as possible?

I was also reading up on the VS10 DV engine. I plan to begin ripping my MKV files with the DV layer included (I thought I had been all along, but apparently I missed a checkbox... ugh). Once I begin playing those movies, what HDR setting is best for my particular TV (LG C1 OLED)? I could not determine which profiles it supports. I'm not sure I want to convert all non-HDR sources to HDR; I'd rather use it strictly for DV content. Thoughts on what's best? Historically, most videophiles prefer not messing with the original source unless you have an uber-expensive processor.

Edit - I forgot to mention I use DVDFab for ripping. Any setting in there I need to be aware of other than adding the DV layer? Thanks again!
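Edit 2 - for anyone else wondering whether their rips actually kept the DV layer, here's a rough Python sketch I put together around ffprobe's JSON output. It assumes ffprobe is on your PATH, and the side-data field names match recent FFmpeg builds, so adjust if yours differ:

```python
import json
import subprocess
import sys

def check_dolby_vision(path):
    """Run ffprobe and report any Dolby Vision config on the first video stream."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    )
    streams = json.loads(result.stdout).get("streams", [])
    for stream in streams:
        for side_data in stream.get("side_data_list", []):
            # Recent FFmpeg builds expose DV as a "DOVI configuration record"
            if "DOVI" in side_data.get("side_data_type", ""):
                print(f"Dolby Vision found: profile {side_data.get('dv_profile')}, "
                      f"level {side_data.get('dv_level')}")
                return True
    print("No Dolby Vision metadata found - probably a plain HDR10 rip")
    return False

if __name__ == "__main__":
    check_dolby_vision(sys.argv[1])
```

MediaInfo shows the same thing under "HDR format" if you'd rather not script it.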
Playback / Automatic frame rate / Frame rate mode: Switch frame rate
Display / Resolution: 3840x2160p 23Hz
HDR: Dolby Vision VS10 for all contents

Get the rest from this thread - Recommended Settings and other useful stuff for RTD1619 Players (JVC LLDV PDF added) | Zidoo forum
I don't think that's possible, as Dolby Vision is Dolby Vision no matter what you do. If anything, LLDV outputs in BT.2020 whereas the other is Rec. 709 or something, then tunnels to your TV.
It is possible. Profile 7 Dolby Vision consists of the HDR10 base video layer and the RPU tone-mapping metadata (and the enhancement layer, but that is not applicable to the Zidoo). In regular TV-led DV, the player sends both parts out as an 8-bit HDMI tunneling transport stream and the TV assembles them to produce the picture. In LLDV, the player combines the parts itself and sends the fully assembled picture out as a 12-bit HDMI signal, which the TV has to downsample to 10-bit. So it is possible that the source can do a better job than the display, or vice versa.
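If you want to poke at those parts yourself, dovi_tool can split them out of the raw HEVC stream. A minimal Python sketch, assuming dovi_tool is installed and on your PATH (the flags are from recent dovi_tool releases, so double-check against your version):

```python
import subprocess

def split_profile7(hevc_path):
    """Split a profile 7 stream into its parts using dovi_tool.

    You'd first pull the raw HEVC out of the MKV, e.g.:
        ffmpeg -i movie.mkv -c:v copy movie.hevc
    """
    # Separate the HDR10 base layer from the enhancement layer
    # (written as BL.hevc / EL.hevc by default)
    subprocess.run(["dovi_tool", "demux", hevc_path], check=True)
    # Extract the RPU tone-mapping metadata on its own
    subprocess.run(["dovi_tool", "extract-rpu", hevc_path, "-o", "RPU.bin"],
                   check=True)

split_profile7("movie.hevc")
```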
Fully agree. Sometimes people have televisions that are stronger machines than the Zidoo, and it would be better to let the television handle DV and have the Zidoo just pass through the original video. I have a problem with DV videos being too bright and foggy; that can be adjusted a little with brightness settings, but then the video starts to get too dark. If the television cannot handle it, then it is better to let the Zidoo process the DV video and send the result to the television.
That video is comparing the Oppo SoC's LLDV vs TV-led; in that video the Oppo engine's color replication isn't as good as TV-led, maybe due to it being an older SoC. I've compared LLDV on the Zidoo against a TV-led still capture of the same frame, and honestly they looked the same to me. Though I still use TV-led because I do sometimes use DV Home for image quality.
I think people just see what they want to see. Honestly, there is no difference in most cases, unless you have a really cheap TV or player. Maybe @Markswift2003 could shed some light on this discussion.
I said at the time that this test was fundamentally flawed, but that aside, there are definitely issues surrounding LLDV - it can produce artifacts that don't appear in TV-led, and the severity very much depends on the player. The Oppo, for example, is particularly poor at LLDV, but it's all relative. For those of us who prefer to watch movies rather than test patterns, you'll probably never notice any difference. But if you have the ability to choose between TV-led and player-led, my advice would be to go TV-led, and not to worry about it one bit if you only have player-led available.