Hi all. Given the new settings/firmwares, I thought I'd start a new thread to try and gauge whether people watch their SDR titles in DV/LLDV, HDR10 or SDR, and whether there's any difference one way or the other. I know this is going to be subjective and will be difficult to compare, given A/B tests will not be instantaneous. But still, I for one would be keen to see what people say, together with their reasoning, as well as voting of course. Thanks. PS: Added SDR as a third playback option.
Ahh thanks - so no enhancement then, or have you tried it but think SDR titles are best played as SDR? I'll amend the poll if you think this would be more appropriate.
Yes, please add that to the poll. For me it's SDR as SDR as well. While VS10 does the best job of converting that I've seen, it still looks a bit too artificial to me.
Interesting - so far I've found HDR10 to be amazing, compared to the native SDR - I haven't tried LLDV as yet. I've added "SDR" as a third poll option.
Today's quality TVs do an excellent job upscaling to the capabilities of the TV. Very difficult to beat that. No doubt for me AUTO is the setting to use, resulting in SDR => SDR. Default video mode is 1080P23 for me for the same reason, and obviously also TV-led for DV. @rozel You forgot to vote yourself, it seems (no HDR10 vote).
Yes, I've left the setting on that for when playing my MEL7 titles, but when playing an SDR title I move to Auto - that was until last night. Last night I was going to play the 50th Anniversary edition of "The Sound of Music", an HD SDR title, in HDR10, but decided against it and played it in LLDV. It was marvellous, really. My TV does an excellent job upscaling to 8K, so I figured I'd see any imperfections using LLDV more clearly. But no - the PQ was absolutely superb - I've never seen the inside of the church (during the wedding scene) look so clear and bright except in real life.

No, I haven't voted yet as I haven't really compared like for like - I've played a couple of SDR titles in HDR10, but I want to watch some Sound of Music scenes in HDR10 and of course SDR too first.

I would like to ask why people set their frame rates to "23" and not "24". 23 fps is actually 23.98, so I would have thought 24 would be a better setting. I feel sure there's a logical reason to set it at 23 - anyone?
That is an easy one. As I (and many others) mostly play cinema movies in 1080P23 format, that is the default setting - no frame-rate switching at the start of a movie, mostly. For menus there is no visible difference.
I get that, but isn't the true frame rate 23.98 and not 23? My Vertex2 shows 23.98 on just about everything.
With Zidoo, P23 = 23.976 (24000/1001) and P24 = 24.000, which may be confusing as other brands often denote P24 as the cinema rate. Be aware that movies at a true 24.000 fps also exist and are displayed at the correct rate by the Zidoo. I am a bit surprised the Vertex2 refers to 23.98, as it should really be the more precise 23.976 frame rate. The early Zidoo models using the RTD1295 were in fact limited to 23.98, which some customers still observed as suboptimal (the RTD1195 used 24.000). With the RTD1296, amongst others, that was further improved.
Ahh right - just checked my Vertex and it does display 23.976, my bad, but I didn't know Zidoo's interpretation of 23.976 was 23! Surely that needs changing? I'm fully aware of cinematic frame rates and set my FPS to what's shown on my V2, but that 23 anomaly has always confused me - as good as my Zidoo is, it doesn't half confuse! Thanks
@mirror Zidoo should indeed give a hint with these settings in Setup, providing the explanation given above. They sometimes do and on others don't. These choices need one.
23 is standard industry naming, using the truncated integer, simply because there is also a true 24.000 frame rate:

23 = 23.976
24 = 24.000
25 = 25.000
29 = 29.970
30 = 30.000
50 = 50.000
59 = 59.940
60 = 60.000

So 24 can't be used for 23.976, and 23 is used instead. Same with 29.970 and 59.940. You also see this in computer monitor setup. Zidoo is doing it correctly, but some explanation wouldn't hurt of course. This only applies when the list is truncated to integers.
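The convention above can be sketched in a few lines of Python (purely illustrative - nothing Zidoo actually ships). The NTSC-derived rates are the nominal rate times 1000/1001, and the menu label is just the integer part of the exact value:

```python
from fractions import Fraction

# Integer menu labels mapped to exact frame rates.
# NTSC-derived rates (23.976, 29.970, 59.940) are nominal_rate * 1000/1001.
EXACT_RATES = {
    23: Fraction(24000, 1001),   # ~23.976, the usual cinema rate
    24: Fraction(24, 1),         # true 24.000, also exists on some discs
    25: Fraction(25, 1),
    29: Fraction(30000, 1001),   # ~29.970
    30: Fraction(30, 1),
    50: Fraction(50, 1),
    59: Fraction(60000, 1001),   # ~59.940
    60: Fraction(60, 1),
}

for label, rate in EXACT_RATES.items():
    # The label is simply the truncated integer of the exact rate.
    assert label == int(rate)
    print(f"{label} -> {float(rate):.3f} fps")
```

This makes it clear why the cinema rate is labelled "23": truncating 23.976 gives 23, leaving "24" free for the true 24.000 rate.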
Early on I watched the complete 70s TV series Space: 1999 in VS10 mode, and compared to the original SDR files it looked fantastic - the starry skies surrounding the Moon tended to be smokey grey, but with DV on they were inky black with decent stars all around, and the uniforms went from faded red to something fresh off the shelf. The only issue was it went a little too black, but all the other colours suddenly jump out; the original SDR, even on my OLED, just looked washed out.

At the moment I'm using HDR10, as it still adds a lot to the original SDR but with far less of the black crushing. I'm just about finished with the SG Atlantis series, which I ran through MakeMKV to disk, so each episode is like 8 GB for 1080p, lol. But the difference versus SDR, even from my Blu-ray player to the TV, is very noticeable. I'd go back to trying forced DV if they ever sort the black crush - it's more about bringing all the colours back up in level, rather than eye-blinding HDR highlights.
Thanks @cucnz - this is the sort of response I was looking for; not because you've chosen HDR10 right now, nor because you'd choose DV once they sort the blacks, but because of the detail. This is much appreciated - thank you. I still haven't voted, but I intend to once I've watched The Sound of Music again in HDR10 - maybe later today.
Well, after taking some advice, it seems that at the moment HDR10 conversions are really awful due to the VS10 black levels, and that the Realtek engine doesn't do a good job. So I've voted DV/LLDV: having watched The Sound of Music in both SDR and LLDV, LLDV gives the better viewing experience in my opinion. I will come back to this thread as and when the VS10 black levels have been fixed.