Thread: Dell UltraSharp 32″ HDR PremierColor Monitor

  1. #31  
    Senior Member jake blackstone's Avatar
    Join Date
    Nov 2007
    Location
    Los Angeles
    Posts
    3,975
    Quote Originally Posted by andrewhake View Post
"Inversely, watching properly tone mapped 1000-nit DolbyVision material on a 1600-nit monitor will result in dark highlights." Isn't true at all. It will result in 1000-nit peak highlights.
Not surprisingly, I would have to disagree. One of the reasons tone mapping exists is to map the peak brightness of the image to the peak brightness of the display. You wouldn't want to watch Rec-709 at 70 nits, right? Yet that is essentially what happens if you watch DolbyVision mastered at 1000 nits on a 1600-nit monitor: the peak brightness of the image on that 1600-nit monitor will be permanently fixed at a dimmer 1000 nits. An even bigger issue arises if the situation is reversed, say, mastering done at 1600 nits and played back on a 1000-nit monitor. That monitor is simply not capable of reproducing 1600 nits, so anything over 1000 nits gets clipped. Luckily, RIGHT NOW this scenario is not even possible, as DolbyVision at 1600 nits doesn't exist. RIGHT NOW DolbyVision tone mapping can only be used with 100, 600, 1000 and 4000 nit targets, period. And the reason for that is to prevent the very situation we're discussing. I specifically said "right now" because Dolby has mentioned that they plan to add other, non-standard brightness levels to those specific targets in the future. But the tone mapping for those non-standard levels, I imagine, may be handled by the TV itself?
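To put numbers on the two cases above: this is just a toy sketch of the hard ceiling a panel imposes, not Dolby's actual tone-mapping math (which works from per-shot metadata), but it shows why the mismatch cuts differently in each direction.

```python
def displayed_nits(master_peak_nits: float, display_peak_nits: float) -> float:
    """A panel cannot emit more light than its peak: content above the
    display's peak clips; content below passes through untouched."""
    return min(master_peak_nits, display_peak_nits)

# 1000-nit DolbyVision master on a 1600-nit monitor:
# highlights sit at 1000 nits, the top 600 nits of headroom go unused.
print(displayed_nits(1000, 1600))  # 1000

# 1600-nit master on a 1000-nit monitor:
# everything above 1000 nits clips to the panel's ceiling.
print(displayed_nits(1600, 1000))  # 1000
```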
    Jake Blackstone
    Colorist
    Los Angeles
    MOD Color Web Site
    Demo Reel

  2. #32  
    Senior Member rand thompson's Avatar
    Join Date
    Aug 2011
    Posts
    7,944
    Phil,


Thanks for that info. I'm looking forward to your test results!!

  3. #33  
    Senior Member rand thompson's Avatar
    Join Date
    Aug 2011
    Posts
    7,944
    Quote Originally Posted by jake blackstone View Post
Not surprisingly, I would have to disagree. One of the reasons tone mapping exists is to map the peak brightness of the image to the peak brightness of the display. You wouldn't want to watch Rec-709 at 70 nits, right? Because that is what happens when you watch a 1000-nit master on a 1600-nit monitor; in this case the monitor will simply sit at a dimmer 1000 nits than it is capable of. An even bigger issue arises if mastering was done at 1600 nits and played back on a 1000-nit monitor: that monitor is simply not capable of reproducing 1600 nits, so anything over 1000 nits will get clipped. Bottom line, there is a reason why RIGHT NOW DolbyVision tone mapping is only used with 100, 600, 1000 and 4000 nit targets, period. And the reason for that is to prevent the very situation we're discussing. I specifically said "right now" because Dolby has mentioned that they plan to add other, non-standard brightness levels in the future.
    I always wondered why those specific preset nit targets in the Dolby Vision palette.

  4. #34  
    Senior Member jake blackstone's Avatar
    Join Date
    Nov 2007
    Location
    Los Angeles
    Posts
    3,975
    Quote Originally Posted by andrewhake View Post
    One of the keys is if a display can actually do 1000nits full screen sustained as well. "1000 nits" doesn't mean shit if it isn't full screen sustained. Very few can.
The Sony X300 can only do 1000 nits on no more than 10% of the screen; it can barely hit 200-300 nits full screen. In any case, you sure as hell wouldn't want to drive that monitor that hard for any significant period of time; there is a reason it has an overdrive indicator. The screen will quickly burn in, and there goes your $30k investment.
That said, this very monitor was THE MASTERING MONITOR for HDR delivery for many years, despite not being "the shit" according to you. It is now being supplanted by a new dual layer LCD technology, the Sony X310, which addresses the burn-in and brightness issues. HDR is not about overall image brightness, and I can't imagine any colorist staring at searing 4000-nit images all day without significant detriment to their health. The majority of HDR images are not that much brighter than SDR images. The main difference between SDR and HDR shows up mostly in an image's specular highlights and, surprisingly enough, in the apparent sharpness of those specular highlights compared to SDR. Baselight actually has a specific tool that deals with this visual phenomenon. There is also a much wider color gamut available when working in P3 or Rec 2020 for HDR delivery. Wide gamut color spaces allow saturated highlights that are plainly impossible in SDR (Rec-709) images. But that has nothing to do with peak brightness...
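For reference, the absolute-luminance encoding behind all of these nit figures is SMPTE ST 2084 (PQ). A minimal sketch of its inverse EOTF (nits in, normalized signal out) backs up the point that HDR range is mostly spent on highlights: a 100-nit SDR-level peak already occupies roughly half the PQ signal range.

```python
def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: absolute luminance (0-10000 nits)
    mapped to a normalized 0-1 PQ signal value."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

# 100 nits lands near 0.51, 1000 nits near 0.75, 10000 nits at 1.0 --
# the "extra" HDR code values go to highlights, not overall brightness.
for peak in (100, 1000, 4000, 10000):
    print(peak, round(pq_encode(peak), 3))
```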

  5. #35  
    Senior Member jake blackstone's Avatar
    Join Date
    Nov 2007
    Location
    Los Angeles
    Posts
    3,975
    Quote Originally Posted by rand thompson View Post
    I always wondered why those specific preset nit targets in the Dolby Vision palette.
    And now you know

  6. #36  
    Senior Member rand thompson's Avatar
    Join Date
    Aug 2011
    Posts
    7,944
    HaHa Thanks!

  7. #37  
    Quote Originally Posted by jake blackstone View Post
    The main problem with LG OLED is not resolution or the peak brightness, but the W in WOLED tech. As a result, anything over a standard 100 nits is not really a valid image. Yes, it can be pretty, but not valid...
    Oh I agree 100%. That's why we would check on the $30K Canon reference monitor afterwards.

The budget did not stretch to $30K Canon monitors for the entire graphics team, so they had to use LGs, but again, the LG wasn't the last line of defense in the grading suite.

    As I was saying, if this new Dell monitor (or the Apple monitor) were a lot cheaper, it may make sense as an upgrade from the LG... but it's in a weird price bracket of "a LOT more expensive than the LG" but "also still not good enough REALLY."

    I'm sure there are people with the projects this makes sense for - I just find it an odd product category for Apple and Dell to both go for.

    Bruce Allen
    www.bruceallen.tv

  8. #38  
    Senior Member
    Join Date
    Oct 2009
    Posts
    1,191
What's the cheapest setup for a Mac laptop that's decent for grading? I'm only working off 2K sources (S16 and ARRI Alexa 4:3 ProRes) and don't mind grading at 1080p.

    Is there some crazy way to calibrate a 16" rMBP or one of the 4k 21.5" LG displays? Most of my work is personal projects but I want to get in the ballpark. It doesn't need to be better than ballpark.

  9. #39  
    Senior Member jake blackstone's Avatar
    Join Date
    Nov 2007
    Location
    Los Angeles
    Posts
    3,975
    Quote Originally Posted by Matt W. View Post
What's the cheapest setup for a Mac laptop that's decent for grading? I'm only working off 2K sources (S16 and ARRI Alexa 4:3 ProRes) and don't mind grading at 1080p.

    Is there some crazy way to calibrate a 16" rMBP or one of the 4k 21.5" LG displays? Most of my work is personal projects but I want to get in the ballpark. It doesn't need to be better than ballpark.
The problem is not the calibration per se, but how you get the image on screen. If you want an image you can trust, you need to bypass the OS color management. If you can get the cheap BM UltraStudio Monitor 3G, you can use it to feed any number of decent monitors that can be calibrated to Rec-709.

  10. #40  
    Senior Member
    Join Date
    Oct 2009
    Posts
    1,191
    Quote Originally Posted by jake blackstone View Post
The problem is not the calibration per se, but how you get the image on screen. If you want an image you can trust, you need to bypass the OS color management. If you can get the cheap BM UltraStudio Monitor 3G, you can use it to feed any number of decent monitors that can be calibrated to Rec-709.
What would you recommend? And how would you recommend calibrating the monitor? Again, it can be pretty bad, it just needs to be ballpark.
