How exactly does the MaxCLL/MaxFALL toggle work?

Discussion in 'HDD Media player(RTD 1619DR)' started by Jimbo Randy, Apr 20, 2023.

  1. Jimbo Randy

    Jimbo Randy Active Member

    Is the Zidoo simply passing through the HDR10 metadata to the display? Or is the Zidoo actually reading the file, GENERATING the correct HDR10 metadata, and then sending that to the display? I'm trying to figure out whether my LG C1 uses these values, but first I need to take a step back and understand exactly what the Zidoo is doing.

    I wish there was a way to toggle it on and off in real time, since that would obviously answer the question of whether it's being used; however, I cannot find a way to test that.
     
  2. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    It passes the values, if they exist, from the HDR metadata in the file to the display.

    Just leave it on.

    You could test whether they are used with say a 4000nit clipping pattern.

    Mehanik makes open-source ones and has a 900-4000nit colour and white clipping pattern with MaxCLL at 4000nits and Mastering Display Luminance at 1000nits.

    If the display tone maps correctly and uses MaxCLL correctly, you should see more or less the full range mapped into the capabilities of the display (775nits for a C1).

    https://mega.nz/file/BCZRgbrT#MAbmjhEhIqCq5bmfb1wzqUXLM1B0qgrKJiQxeJfl8Jw

    https://mega.nz/file/VaAVjLgR#4Li6mWDJ8W4pyxDk_FKJgtG4GcSBzS_kWOJpgKqrGGU

    So, for example, my Samsung display doesn't use MaxCLL but instead, incorrectly, uses Max Display Luminance. That is simply the maximum capability of the display used for grading - generally 1000nits (usually a Sony BVM) or 4000nits (usually a Dolby Pulsar) - and should not really be used for tone mapping. So with these files, I can resolve the 900nit point but nothing above, because MDL is set at 1000nits in the files.

    One reason (in addition to the coolness of using DV on a Samsung) that I use VS10 to tone map to Dolby Vision at 1000nits (which my Samsung is capable of).
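    In code terms, the difference between a well-behaved display and one like my Samsung is roughly this (a Python sketch of the decision, not any manufacturer's actual logic - in HDR10 metadata a value of 0 means "not present"):

    ```python
    def source_peak_nits(max_cll, max_dml):
        """Sketch of how a well-behaved HDR10 display picks the peak it
        tone maps from. MaxCLL is the brightest pixel actually present in
        the content; Mastering Display Luminance (MaxDML) only says what
        monitor the content was graded on. Zero means 'not signalled'."""
        if max_cll > 0:
            return max_cll      # correct: map from the content's real peak
        return max_dml          # fallback when MaxCLL is absent

    # Mehanik's 900-4000nit pattern carries MaxCLL=4000, MaxDML=1000.
    # A display that ignores MaxCLL behaves as if it were absent:
    source_peak_nits(4000, 1000)   # correct display -> tone maps from 4000
    source_peak_nits(0, 1000)      # MDL-only behaviour -> clips above 1000
    ```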
     
    Last edited: Apr 20, 2023
    Jimbo Randy likes this.
  3. Jimbo Randy

    Jimbo Randy Active Member

    Wait, sorry, just a bit confused. If I play the 900-4000nit colour/white clipping pattern and my TV is actually using the metadata, I should see the entire 900-4000nit range visible? And if the TV is NOT using the metadata, I will see clipping around 800?

    EDIT: FYI I'm just trying to understand how this all works. I will definitely keep this option ON. Just curious how it works since it will help me to understand more about tone mapping and how metadata is used by different panels.
     
  4. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    Yes - the idea is that the TV maps 4000nits to the 775nits your TV can do, and the in-between bit is compressed using a roll-off in the gamma curve.
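    For anyone who wants to see that in numbers, here's a toy Python roll-off. This is not LG's actual curve (real displays work in PQ space, typically with a BT.2390-style knee) - it just illustrates "linear up to a knee, then squeeze everything up to the source peak into the remaining headroom":

    ```python
    def tone_map(nits, source_peak=4000.0, display_peak=775.0, knee_frac=0.75):
        """Toy roll-off: pass luminance through unchanged below a knee,
        then ease everything up to source_peak into the headroom so that
        source_peak lands exactly on display_peak."""
        knee = knee_frac * display_peak
        if nits <= knee:
            return nits                               # shadows/midtones untouched
        x = (nits - knee) / (source_peak - knee)      # 0..1 across the shoulder
        return knee + (display_peak - knee) * (1 - (1 - x) ** 2)

    tone_map(500)    # -> 500.0 (below the knee, unchanged)
    tone_map(4000)   # -> 775.0 (source peak maps to display peak)
    ```

    The point of the shoulder is that nothing hard-clips: 2000nit and 3000nit highlights still come out at different (rising) levels, just compressed, which is why the clipping pattern's boxes stay distinguishable on a display that uses MaxCLL.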
     
  5. Jimbo Randy

    Jimbo Randy Active Member

    Thanks man. I'm going to go test that out. I've been trying to figure this out for ages so this will finally (hopefully) put that question to rest.

    What got me interested was that I bought a 27GR95QE-B monitor, and I am trying to understand how the panel does tone mapping. It has a peak brightness of 650 nits (in the more accurate HDR mode), so I set it to 650 in the Windows Calibration app. The weird thing (and what got me interested in this) is that when I limited the Win11 calibration to something like 400 nits and then opened the 240-1000nit test pattern (also using Mehanik patterns), it showed clipping at around 900 nits. Then when I set the calibration to 10,000 nits, it showed clipping at around 550 nits. So I was trying to figure out what in the world was going on with the tone mapping. I THOUGHT it should clip at whatever you set the calibration to, but I am now seeing that Windows probably does some sort of tone mapping to squeeze/expand the test patterns to whatever you have set. If you know anything about that, let me know. Anyway, I am off to test the TV. Will update you on my findings in case anyone else has the C1/C2, etc. and was wondering this.
     
  6. Jimbo Randy

    Jimbo Randy Active Member

    Okay, I just tested it on the C1. On the 900-4000nit pattern, I see WHITE clipping at the 2nd or 3rd to last box, which is around 3200-3300 nits I believe. Same with the 240-1000nit pattern. On the 10k nit patterns, it basically shows all of the white boxes. To clarify, I was only looking at WHITE clipping.

    Given all of this, is it safe to assume that the LG C1 IS in fact using the MaxCLL/MaxFALL values correctly?
     
    Last edited: Apr 20, 2023
  7. Jimbo Randy

    Jimbo Randy Active Member

    Alright, so I actually tested it again with the MaxCLL/MaxFALL passthrough toggled OFF, and I'm seeing it clip at the exact same point in the 4000nit test pattern (2nd or 3rd box from the right). What would this mean? Because LG OLEDs apparently just default to a 4000nit curve if there is no metadata, so how would we know? Is there another way to confirm?

    Tested with both the 1k and 10k patterns as well. No change between MaxCLL/MaxFALL on/off.
     
    Last edited: Apr 20, 2023
  8. Jimbo Randy

    Jimbo Randy Active Member

    Ok, so it seems like my TV uses the MaxCLL data regardless of whether the MaxCLL/MaxFALL toggle is on or not? Is toggling that off supposed to disable metadata passthrough completely? Because if so, it doesn't seem to do that, unless I'm misunderstanding something.
     
  9. Oldpainless

    Oldpainless Active Member

    Well, that's odd....with the 900-4000nit clips I get no flashing boxes at all, well, maybe a faint one at 900-1000.
     
  10. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    Some LGs use dynamic tone mapping - no idea if yours does or not, but that would explain it.
     
  11. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    Depending on the HDR mode, that probably just indicates that MaxCLL and MaxFALL are not used (assuming they're switched on!).
     
  12. Oldpainless

    Oldpainless Active Member

    Yeah, it's on. I've put my LG into PC mode, as that's the only way to get proper 4:4:4 on my OLED, which also switches off a lot of the crap, so it might be that.
     
  13. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    What do you mean "proper 4:4:4"?
     
  14. Oldpainless

    Oldpainless Active Member

  15. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    Thing is, the content is encoded in 4:2:0, and the only reason I recommend outputting 4:4:4 from the Zidoo is that 4:2:0 is not a legal HDMI mode at 23~30Hz. So it's still 4:2:0 chroma, even though it's carried as 4:4:4 at the output.

    So unless the display is doing something else that you like to the image, I don't see any advantage.
     
  16. Oldpainless

    Oldpainless Active Member

    Well, one of the benefits, like I said, is that it switches off most of the crap, so the image you get is closest to what the source is sending, along with other things.

    It's easy enough for people to check the differences by going into the LG dashboard and setting the HDMI input name/type to PC.
     
    Last edited: Apr 21, 2023
  17. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    To be fair, that's always the very first thing I do with any display - turn off all the crap!
     
    Oldpainless likes this.
  18. Jimbo Randy

    Jimbo Randy Active Member

    So I am pretty sure that my LG IS in fact using the MaxCLL metadata. I found the below:

    - If I play the 1000nit test pattern and go to the HDMI Signaling Override menu on the TV, then switch the MaxCLL and MaxDML fields from the default "Auto" to 1000 nits, nothing changes. If I switch to 2000, 4000, or 10000, I see the test patterns clip at different points. So AUTO is using 1000 nits. I tested this with the 4000nit and 10000nit test patterns too. So the TV IS using the values. Wouldn't you agree?

    Surprisingly, toggling MaxCLL/MaxFALL off in the Zidoo settings doesn't impact this in the slightest. Do you know why? I'd expect disabling it on the Zidoo to block the metadata. Is that not how that field works?
     
    Last edited: Apr 21, 2023
  19. Markswift2003

    Markswift2003 Well-Known Member SUPER Administrator Beta test group Contributor

    The very fact that switching off the MaxFALL/CLL switch on the Zidoo still results in the 4000nit and 10000nit signal tone mapping correctly means that something else is going on.

    It's not MaxDML because that's 1000nits for that file so that would result in clipping at 1000nits.

    So that's why I mention DTM - as I say, some LGs have DTM, I just don't know if yours does, but it would explain the behaviour (a quick Google seems to indicate that the C1 does in fact have DTM - so there's the explanation).

    And in case you wondered, the MaxFALL/CLL switch is certainly working correctly:

    HDR Metadata with switch on:

    EOTF 2: SMPTE ST 2084 [PQ], MT: unknown, WP: D65
    GRN: 34000, 16000 [0.68, 0.32]
    BLU: 7500, 3000 [0.15, 0.06]
    RED: 13250, 34500 [0.265, 0.69]
    WP: 15635, 16450, [0.3127, 0.329]
    Max/Min Lum: 1000 / 0.0005 nits
    MaxCLL/FALL: 4000 / 1600 nits

    HDR Metadata with switch off:

    EOTF 2: SMPTE ST 2084 [PQ], MT: unknown, WP: D65
    GRN: 34000, 16000 [0.68, 0.32]
    BLU: 7500, 3000 [0.15, 0.06]
    RED: 13250, 34500 [0.265, 0.69]
    WP: 15635, 16450, [0.3127, 0.329]
    Max/Min Lum: 1000 / 0.0005 nits
    MaxCLL/FALL: 0 / 0 nits
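    Incidentally, for anyone wondering how the raw numbers in that dump relate to the bracketed ones: in the CTA-861 HDR InfoFrame, primaries and white point are stored in units of 0.00002, and minimum mastering luminance in units of 0.0001 nits. A quick Python decode of the values above (variable names are mine, and the raw min-luminance value is implied by the displayed 0.0005 nits rather than shown in the dump):

    ```python
    CHROMA_UNIT = 0.00002     # CTA-861: primaries/white point step size
    MIN_LUM_UNIT = 0.0001     # CTA-861: min mastering luminance step, in nits

    # Raw values as they appear in the InfoFrame dump above
    grn_raw = (34000, 16000)
    wp_raw = (15635, 16450)
    min_lum_raw = 5           # implied by the decoded 0.0005 nits

    grn_xy = tuple(v * CHROMA_UNIT for v in grn_raw)  # (0.68, 0.32): BT.2020 green
    wp_xy = tuple(v * CHROMA_UNIT for v in wp_raw)    # (0.3127, 0.329): D65
    min_lum = min_lum_raw * MIN_LUM_UNIT              # 0.0005 nits
    # Max mastering luminance and MaxCLL/MaxFALL are already in whole nits.
    ```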
     
    gymnos likes this.
  20. Jimbo Randy

    Jimbo Randy Active Member

    So yes, the LG C1 with Dynamic Tone Mapping set to OFF doesn't actually disable tone mapping, as you said - it just does static tone mapping to a specific tone curve rather than adapting dynamically. But I'm confused, because the TV is clearly adapting its tone curve based on the test patterns. When I play the 1k nit test pattern, it uses a 1k nit tone curve. When I play the 4k nit test pattern, it uses a 4k nit tone curve. Same with 10k. The odd thing, though, is that with the LG C1 and other LG OLEDs, if there is NO metadata, it will always default to a 4,000nit tone curve. That is the behaviour it should show with Dynamic Tone Mapping set to OFF, which is what I always use.

    That is NOT the behavior I am seeing here, so it is clearly reading the metadata of the file. The odd thing is that it still works even with the Zidoo toggle off. Is there any other way that I can test this that you know of? I am quite puzzled with this.
     
