Recommended Settings and other useful stuff for RTD 1619DR Players

Discussion in 'HDD Media player(RTD 1619DR)' started by Markswift2003, Oct 21, 2020.

  1. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

It's all a complete minefield, isn't it!!
     
  2. @Markswift2003
I have another question. I used HDR10 test patterns to do some basic testing on my TV with DV: the Mehanik HDR10 calibration and test patterns set.

As for the black level setup:
If I use Auto to output HDR10 as-is, black is at 64 as expected.
If I turn on VS10 to output DV, black is at 80, which is not what I expected. Any idea what's going on?

    Thank you!
     
    Letmein likes this.
  3. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

The Mehanik test patterns are 1000-nit HDR10, so they will be most accurate when output as HDR10.

Both HDR10 and DV use the PQ gamma curve, which comes out of black slowly as opposed to the more linear response of power gamma. My guess is that on your display, when converting HDR to DV, black responds slightly more slowly, hence the raised level. But don't forget this is an HDR to DV conversion, so you can forgive such inaccuracies to a degree - and 80 is still reasonable with the Mehanik patterns, depending on which pattern and what the viewing conditions are.

Generally I use Masciolla patterns and stick to those so I have a level playing field for comparison, but I just tested the Mehanik 1000-nit HDR10 patterns using VS10 to output LLDV and got the following results in a darkened room:

    Black Level V1 (48-120) resolved to 68
    Black Level V2 (48-80) resolved to 68
    Black Level V3 (56-112) resolved to 68
    Black Level V4 _crop (56-112) resolved to 68

If I use Masciolla 1000-nit HDR10 patterns I can also resolve to 68.

Top tip here - to remove idiosyncrasies of the display or viewing environment, press "Menu", select Advanced/Picture Parameter and knock the brightness up. This raises all grey output levels equally but does not raise the relative black floor, so you can easily see the clip level accurately.
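For anyone wondering what those code values actually mean in light output: the 10-bit limited-range codes above (64 = black, 940 = peak white) can be converted to absolute luminance with the standard SMPTE ST 2084 PQ EOTF. A minimal sketch of that maths - the constants are straight from the standard, everything else is just arithmetic:

```python
# SMPTE ST 2084 (PQ) EOTF constants
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_nits(code: int) -> float:
    """Luminance in cd/m2 (nits) for a 10-bit limited-range code (64..940)."""
    e = max(code - 64, 0) / 876          # normalise: 64 -> 0.0, 940 -> 1.0
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

for code in (64, 68, 80, 940):
    print(code, pq_nits(code))
```

Running this shows why PQ "comes out of black slowly": even code 80 is still well under 0.01 nits, so a clip level of 68 rather than 64 is a far smaller real-world error than the raw code numbers suggest.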
     
    Last edited: Oct 28, 2021
4. Thank you for the explanation. I was thinking it could have something to do with the VS10 HDR to DV conversion.

    Here is another question if you don't mind :)

About MaxCLL and MaxFALL, I am wondering how the TV should behave. Typically the HDR MDL is 1000 and the DV MDL is 4000. Let's say my TV's peak is 2000.
For 01. 240-1000nits-MaxCLL-1000-MDL-1000, I saw all values up to 1000, which is expected (although I don't know whether the absolute value is really 1000 without measuring it).
For 03. 900-4000nits-MaxCLL-4000-MDL-4000, I still saw all values up to 4000. This behaviour puzzles me. Does the TV somehow map MaxCLL 4000 to its peak of 2000? Infuse on the Apple TV is different: with this pattern I can only see up to 2000, which I think is expected? As far as I know, Infuse on the Apple TV sends a fixed MaxCLL/MaxFALL (4000/1000), which happens to match this pattern, yet the Zidoo and the Apple TV show it differently on the same TV. Why?
If MaxCLL does get mapped to the TV's peak, does that mean that when MaxCLL is greater than the TV's peak we see a darker image than in the studio? If MaxCLL is smaller than the TV's peak, I assume the TV keeps the values as they are - otherwise we'd see a brighter image than in the studio.
     
  5. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

So the first thing to understand is MDL - this simply reports the min/max luminance that the display used for mastering can achieve and is really only there for information - it has no bearing on how your display at home reacts. In my (limited) experience these are either Dolby Pulsars or, more usually, Sony PVMs.

In the early days of HDR (when quite honestly everyone had a bunch of new kit but no clue how to use it!!) you used to see wildly inaccurate MDLs reported. You'd see 10,000 nits when there was no monitor that could get anywhere near that (and there still isn't), and you'd often see 4000 nits but know damn well it wasn't mastered on a Pulsar.

MaxCLL/MaxFALL are the important ones, defining the luminance of the brightest pixel in the entire presentation and the average overall luminance level respectively. Some TVs use this information to tone map and some don't.

Your TV obviously does use it, and the reason you see values up to 4000 nits is that the TV is tone mapping values above 1000 nits. Technically it's tone mapping values below that as well so the curve fits - basically compressing the high values so that the entire 4000-nit range fits into its capabilities.

Ideally the PQ curve would look like the yellow line, but in reality the display can't do that, so the curve is rolled off and the higher luminance levels are compressed - in the instance above, 4000 nits being mapped to the display's capability of 2000 nits.

[IMG]

This looks a bit drastic, but the thing to remember is that anything above diffuse white (traditionally 100 nits and recently raised to 200 nits) is specular highlights, so you are really losing very little and you don't see a darker picture. This is the paradox of HDR - everyone seems to think it's supposed to make the overall image brighter. It most certainly isn't; it's supposed to enhance the specular highlights and provide a more natural luminance sweep. The APL is overall pretty much the same as SDR.

    Not really sure what the Apple TV is doing though o_O
    Being Apple I guess.
     
    susanstone2021 likes this.
6. Thanks for the information. I forgot the curve is not necessarily linear. :) It can keep the lower parts as they are and only map the parts beyond 1000 or its peak value. "The APL is overall pretty much the same as SDR." is important. TVs with different peak values should only look slightly different in the highlights in normal scenes, and it may not even be noticeable. The more highlights in the scene (for example, a big blast), the more noticeable it probably is.
     
  7. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

Even high-APL scenes will still have the vast majority of their information at or below diffuse white - all those nonsense images showing a dull grey picture for SDR and a bright colourful one for HDR are just plain wrong, marketing hyperbole... That's not to put down HDR at all - PQ gamma is a definite improvement over power gamma, which was, after all, created to accommodate the shortcomings of CRTs.
     
    susanstone2021 likes this.
  8. billchm

    billchm New Member

Mark, is it possible to get an EDID for an XR65X90J Sony LED TV? Or do I even need one?
     
  9. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

No, custom EDIDs are only needed for very specific uses - for instance, to fool the Zidoo into producing LLDV for projectors, to restrict the HDMI frequency if your HDMI chain won't sustain 600MHz, or to restrict output to HDR10 for DV TVs.

There are also instances where amplifiers modify a TV's EDID detrimentally, which can be corrected with a custom EDID.
     
  10. billchm

    billchm New Member

OK, thanks. I reference a lot of what you say on here - definitely a great source of info.
     
  11. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    Happy to help :)
     
12. HDR is used in several places in daily life. For example, in photography it's about taking an HDR picture. In the old days (not sure about now), a single shot could not capture HDR; it took three shots combined into one picture so that highlight and shadow detail could be preserved in an SDR picture, because when you print the photo there is only so much colour or dynamic range you can get.
    I would guess HDR video production is similar. First you need to capture the scene in some way that preserves the HDR detail. Later, when you master the video, you reproduce the HDR scene on a display, and then create the SDR/HDR video from it.
     
  13. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

HDR photography is a bit different - you're right, you typically shoot 3 to 5 shots at increasing exposures and then combine them. I started messing with HDR photography 11 years ago - you can get some really cool surreal effects.

    This is probably my first attempt circa 2010:

    upload_2021-10-29_22-41-34.png

With video it's not about combining low, mid and high exposures, it's about widening the dynamic range - in other words, having many more sample points between black and white - hence the move to 10-bit, which gives 1024 samples rather than the 256 of 8-bit (and higher still in the case of Dolby Vision) - and also using the perceptual quantiser (PQ) curve to spread those samples over the grey scale in a way that supposedly mimics how we perceive light.
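The two ideas in that last paragraph - more samples, and spending them where the eye is most sensitive - can be sketched numerically. This reuses the standard ST 2084 PQ constants; the choice of full-range codes and the specific step positions are just for illustration:

```python
# Grey-scale sample counts per bit depth (Dolby Vision works at 12-bit internally)
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} code values")

# SMPTE ST 2084 PQ EOTF for a normalised signal e in [0, 1]
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(e: float) -> float:
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

# Luminance difference between adjacent 10-bit codes (1023 steps, full range):
step = 1 / 1023
print("step near black:", pq(10 * step) - pq(9 * step))  # a tiny fraction of a nit
print("step near peak: ", pq(1.0) - pq(1 - step))        # tens of nits
```

The step between adjacent codes is minuscule in the shadows and huge in the highlights - that's the PQ curve packing precision where banding would otherwise be most visible.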
     
    muha and pcristi like this.
  14. billchm

    billchm New Member

Not sure if this has been discussed, but if your chain of devices is media player (Zidoo), receiver (Anthem 740), TV (Sony XR65X90J), which should do the video processing? In other words, should bypass be used on the receiver and media player, letting the TV do it, or vice versa? And on the TV, should it be set to LLDV or TV-led?
     
  15. Claudiu

    Claudiu Member

Hi, I just bought the Z10 Pro and I'm trying to understand the difference between the HDR modes Auto and VS10. My TV is an LG CX OLED, so it is DV compatible.
    I keep reading here on the forum that Auto mode uses Realtek processing and VS10 uses DV conversion, said to be better. I set DV compatibility to standard DV (not LLDV), as I understand that in this case it is the TV doing the processing, and I guess it does a better job than the Z10 Pro and TV-led gives better picture quality (see Vincent's videos).
    Should I enable VS10 for all content (as seen in some reviews here, it gives better picture quality, less banding), or keep Auto HDR?
    I can't do a side-by-side comparison and I'm not sure my observations are correct. I'm looking for the best picture quality.
    Thank you.
     
  16. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    Auto will choose the appropriate HDR format depending on the capabilities of the TV.

In your case, SDR, HDR and Dolby Vision will all be output natively, so neither the Realtek nor the VS10 engine needs to perform any conversion.

Feel free to test VS10 (for all content), which will convert SDR and HDR to DV (DV content will still be output natively) - it's a matter of personal choice.

    With a decent display able to display SDR, HDR and DV natively, the logical choice is generally Auto.
     
  17. Claudiu

    Claudiu Member

    Thank you for answering.
So in this case the difference would only be in the format, not the content.
    I played with the settings; in some scenes I can see some difference, but as I said, I might be biased - plus, I'm not an expert.

    I have one more question: is VS10 for SDR something similar to the TV's artificial HDR picture mode? I can see on SDR material that VS10 adds some HDR look, more than the OLED's HDR picture mode does.
    Is VS10 for SDR a reasonably correct way to watch SDR material? I know it's not what was intended, but to me at least it seems an improvement. I'm just wondering if VS10 was intended for this and whether it does a good job of it. I don't want to wander too far from the original intent.
     
  18. AngryVirginian

    AngryVirginian Active Member

@Markswift2003, are we supposed to leave these settings alone during playback?

[IMG]


[IMG]
     
  19. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    VS10 for SDR means that any SDR content is processed by the VS10 engine. The output depends on the TV - in your case this would convert SDR to Dolby Vision.

If it's an improvement then it's absolutely the right choice - always choose what looks best to you, not what anyone else tells you should be the right setting.

    Trust your eyes.
     
    blenky and serg fedorov like this.
  20. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

Personally I would leave well alone and always calibrate the display, as opposed to the source, if need be.

These controls aren't subjective - Brightness (black level) should be calibrated so that black is black and anything above it isn't, and similarly with Contrast (white level). Hue (or tone) shouldn't need to be touched either if colours are correct.

    It may seem a bit random to be set at 32, but that equates to 50% in the Android settings and in my experience is absolutely spot on.
     
    AngryVirginian likes this.
