Thread: Dynamic Range... two different tests

  1. #11  
    Moderator
    Join Date: Apr 2012
    Location: Atlanta, Georgia
    Posts: 2,759
    Most manufacturers calculate the DR as a theoretical number, based on the SNR of their sensor. They do this because it's an objective way to do it, based on maths.
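    For illustration, here's a minimal Python sketch of how such a theoretical figure is typically derived; the full-well and read-noise values are made-up placeholders, not any manufacturer's spec-sheet numbers.
    Code:
    import math

    # Engineering dynamic range: the ratio of the largest recordable
    # signal (full-well capacity) to the noise floor (read noise),
    # expressed in stops. Both values are hypothetical placeholders.
    full_well_e = 30000.0   # full-well capacity, electrons
    read_noise_e = 2.0      # read noise, electrons RMS

    dr_stops = math.log2(full_well_e / read_noise_e)
    print(f"theoretical DR ~ {dr_stops:.1f} stops")   # ~13.9 stops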

    All the other methods we argue about involve (rightly) subjective judgements about what counts as acceptable noise. Things like compression, in-camera pre-processing and internal bit-depth handling also affect what this number can be. And you can't really compare cameras from different brands that use different bit depths except by photographing the same thing and comparing the results.

    These numbers vary not only because of testing methodology but also because of subjective judgement. I laugh when people compare DR on a still frame. Human vision perceives noise differently in motion than in a still frame, so why judge on a still frame when the noise threshold is what defines DR?

    Arriflex have long understated the DR of their sensor by at least a stop, maybe more. It doesn't matter to me because it's usually "enough" for what I need to do. The best DR claims are the ones I can make for myself when I shoot side by side with a known camera like the Alexa.

    In my experience the Ursa Mini is very, very close to matching the Alexa for DR, whatever number they claim.

    I even wrote about it here last time I checked it formally.
    https://johnbrawley.wordpress.com/20...-side-by-side/

    Since I did this test, I've shot three TV series mixed with Alexa and I've not seen any shots that made me think any less of what I've found here.

    I feel so good about this camera and what it can do that I actually CHOSE to shoot Ursa Mini 4.6K over Alexa on the last show I did, "The Warriors".

    JB

  2. #12  
    Senior Member
    Join Date: Nov 2016
    Posts: 203
    Quote Originally Posted by sammy View Post
    Arri lists the Alexa at 14.5 stops (we must agree on that to keep the argument going).
    That seems an odd way to proceed: giving more credence to a claim than to a real-world test.

  3. #13  
    Senior Member
    Join Date: Nov 2016
    Posts: 203
    Quote Originally Posted by Howie Roll View Post
    Cool test but the guy can't count.
    I noticed that as well. Shouldn't the proper count be between steps? So from 100 IRE to 90 IRE would be 1 stop, for example, where he was counting them as 2?
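    As a tiny sketch of that fence-post arithmetic (the patch count is a made-up example; the point is counting intervals, not patches):
    Code:
    # On a step chart where adjacent patches sit one stop apart,
    # N distinguishable patches span N - 1 one-stop intervals.
    visible_patches = 15
    stops_spanned = visible_patches - 1   # count the gaps, not the posts
    print(stops_spanned)                  # 14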
    Last edited by Jim Simon; 01-11-2017 at 02:55 PM.

  4. #14  
    Senior Member
    Join Date: May 2012
    Posts: 2,522
    DxO gave the RED Helium's DR as 15.2 stops.

    https://www.dxomark.com/Reviews/RED-...me-high-score2

  5. #15  
    A DR test of a single camera will always be subjective as hell: a sequence of methodical actions on the way to a semi-useless result. But if the same test is later repeated with other cameras, there may be some benefit.
    Much more informative are tests made with multiple cameras at the same time, with the results compared against each other. At least those give us immediate relative information, which is more useful.
    Even then the results will be a bit subjective, because there are too many camera-specific variables that are hard to take into account while counting the stops (noise type, sharpness, signal allocation, formats, etc.).

    Quote Originally Posted by DPStewart View Post
    I'm really sick of DR measurements of cameras that are not in a useful analytical context.

    There are probably 10,000 articles out there that say the DR of camera 'x' or camera 'y' is this, or that, or not what the maker claims.... But they don't immediately place "their" rating alongside other cameras tested with the EXACT same methodology. That makes it useless information. And it spreads confusion and misconceptions. It gets to be information pollution.

    I'm reminded of Ken Rockwell's lens reviews. Many people like his style, and many people HATE his style, but one thing that's noteworthy is that his methodology is very consistent and that produces USEFUL information. I have shot with easily 20 lenses that Ken Rockwell has reviewed and his methodology is so consistent that I can tell if I am going to like a lens even if he DOESN'T like it because his baseline descriptions are standardized (even within his own personal vernacular) and I can know how to contextualize the information he is providing.

    In summary, I wish folks would do DR reviews of cameras with high levels of procedural integrity and standardization, or not do them at all.

    ....but everyone wants that advertising revenue on their website...

    At least Tom Antos' analysis uses the exact same methodology across the board.
    Agreed.

  6. #16  
    Senior Member
    Join Date: Dec 2012
    Location: Fairfield, CA
    Posts: 168
    Cinema5D is about as objective as it gets with dynamic range measurements, along with DxO. C5D uses a Xyla-21 chart, a Zeiss lens, and Imatest image profiling/calibration software to analyze and determine the camera's dynamic range.

    https://www.cinema5d.com/canon-measu...-c300-mark-ii/

    They also measured the ALEV III at 14.5 EV.
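    As a rough sketch of that style of SNR-threshold measurement, here's some Python with synthetic patch statistics rather than C5D's actual data; real tools like Imatest work from calibrated charts and linearised sensor values, and the cutoff of 2 is just an assumption here.
    Code:
    import numpy as np

    # Model a 21-patch step chart: the mean signal halves with each
    # patch over a constant noise floor (a simplification; real
    # sensors also have signal-dependent shot noise).
    noise_floor = 4.0                      # noise std-dev, arbitrary units
    means = 4096.0 / 2 ** np.arange(21)    # 21 patches, one stop apart
    snr = means / noise_floor

    threshold = 2.0                        # assumed "usable" SNR cutoff
    usable_stops = int((snr >= threshold).sum())
    print(usable_stops)                    # 10 with these made-up numbers
    Whether the headline number is the patch count itself or one less (the fence-post question raised earlier in the thread) is exactly the kind of convention that makes these figures disagree.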
    Eddie Barton

    My blog where I write about math stuff related to video: https://iaremrsir.wordpress.com/

  7. #17  
    Quote Originally Posted by iaremrsir View Post
    Cinema5D is about as objective as it gets with dynamic range measurements, along with DxO. C5D uses a Xyla-21 chart, a Zeiss lens, and Imatest image profiling/calibration software to analyze and determine the camera's dynamic range.

    https://www.cinema5d.com/canon-measu...-c300-mark-ii/

    They also measured the ALEV III at 14.5 EV.
    I agree they seem to have a good thing going. But something is either wrong with their UM4.6K or with the software, because in their test image of the Xyla chart 14 swatches are visible, yet they say the FS7 has more dynamic range. Lots of people in the comments mentioned the crosshatching issue (which doesn't seem to happen if one shoots raw and processes in Resolve). It's not surprising that their software said 12 when 14 steps are visible, though that 14th is definitely a bit of a gravy stop, because their crosshatching is so bad. Yet they haven't redone the test despite it being pretty badly flawed.
    James Iain Barber
    Editor, writer, director, shooter, yadda yadda.

  8. #18  
    Senior Member
    Join Date: Dec 2012
    Location: Fairfield, CA
    Posts: 168
    Quote Originally Posted by JIB View Post
    I agree they seem to have a good thing going. But something is either wrong with their UM4.6K or with the software, because in their test image of the Xyla chart 14 swatches are visible, yet they say the FS7 has more dynamic range. Lots of people in the comments mentioned the crosshatching issue (which doesn't seem to happen if one shoots raw and processes in Resolve). It's not surprising that their software said 12 when 14 steps are visible, though that 14th is definitely a bit of a gravy stop, because their crosshatching is so bad. Yet they haven't redone the test despite it being pretty badly flawed.
    The only flaw in the test is shooting at ISO 1600, which cuts off the top stop of highlights in BMD Film 4.6K. Other than that, the test was consistent with everything else we've seen from C5D. As for the measurement software, its figure isn't far off and is actually somewhat generous, considering the maze pattern is visible in the bottom 8 stops of the image, which is over half the camera's range. It's not C5D's fault that the camera exhibits mazing or FPN, so their test is valid; only once BM fixes the green imbalance and FPN completely will the test no longer be accurate. And yes, the crosshatching happens in raw and when processed in Resolve, as shown in the thread on the official BM forum (and he processed the DNGs in Resolve for the test, not just ProRes). Once that issue is fixed, though, the camera will have a measured range closer to 14 EV, probably more.
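    For what it's worth, here's a hedged sketch of why fixed-pattern noise drags a measured figure down: averaging a stack of identically exposed frames cancels temporal noise but leaves the fixed component behind. The frame stack is synthetic, not from any real camera.
    Code:
    import numpy as np

    def split_noise(frames):
        """Separate fixed-pattern from temporal noise in a frame stack."""
        mean_frame = frames.mean(axis=0)          # temporal noise averages out
        fpn = mean_frame.std()                    # what stays put frame to frame
        temporal = (frames - mean_frame).std()    # what varies frame to frame
        return fpn, temporal

    rng = np.random.default_rng(0)
    clean = rng.normal(100.0, 2.0, size=(64, 32, 32))        # temporal noise only
    column_fpn = np.tile(rng.normal(0.0, 3.0, 32), (32, 1))  # fixed column pattern
    print(split_noise(clean))               # FPN near 0 (residual only)
    print(split_noise(clean + column_fpn))  # FPN near 3: a raised noise floor
    Because an SNR-based tool sees the combined noise floor, pattern noise like mazing eats into the measured range even when the temporal noise is low.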
    Eddie Barton

    My blog where I write about math stuff related to video: https://iaremrsir.wordpress.com/

  9. #19  
    Senior Member
    Join Date: Mar 2016
    Posts: 552
    These DR tests are getting old to me. At the end of the day, most cameras these days have enough DR to get what you need, especially if you're lighting a scene properly. It seems rather pointless to consider only this one aspect of a camera.
    Last edited by Jason Finnigan; 01-23-2017 at 02:40 PM.

  10. #20  
    Senior Member
    Join Date: Jun 2015
    Posts: 162
    Quote Originally Posted by DPStewart View Post
    I'm really sick of DR measurements of cameras that are not in a useful analytical context.

    There are probably 10,000 articles out there that say the DR of camera 'x' or camera 'y' is this, or that, or not what the maker claims.... But they don't immediately place "their" rating alongside other cameras tested with the EXACT same methodology. That makes it useless information. And it spreads confusion and misconceptions. It gets to be information pollution.

    I'm reminded of Ken Rockwell's lens reviews. Many people like his style, and many people HATE his style, but one thing that's noteworthy is that his methodology is very consistent and that produces USEFUL information. I have shot with easily 20 lenses that Ken Rockwell has reviewed and his methodology is so consistent that I can tell if I am going to like a lens even if he DOESN'T like it because his baseline descriptions are standardized (even within his own personal vernacular) and I can know how to contextualize the information he is providing.

    In summary, I wish folks would do DR reviews of cameras with high levels of procedural integrity and standardization, or not do them at all.

    ....but everyone wants that advertising revenue on their website...

    At least Tom Antos' analysis uses the exact same methodology across the board.
    My personal "favourites" are the DR tests where you get halfway through the video and a note pops up like "and then a cloud came out, so all this is underexposed on the RED".