Thread: Dynamic Range... two different tests

  1. #1 Dynamic Range... two different tests 
    for the UM46K.

    Soo... Cinema5d did their test and showed 12 stops (though even on their chart it looks like more, and they definitely suffered from the crosshatching issue, which I've never seen in Resolve when shooting raw).

    And now this test shows a full 15 stops:
    https://youtu.be/vrnqe93OJIU?t=8m12s

    Thoughts?

    I'd tend to believe the latter test considering I can count at least 14 stops in the chart that Cinema5d counts 12 on.

  2. #2  
    For me the gold standard for DR is the Alexa; Arri lists it at 14.5 stops (we have to agree on that for the argument to hold). If that is the best-case baseline, then comparing the Alexa and the URSA Mini 4.6K visually, having used both (not tested them), I can easily see the Alexa is 2 stops better. With that in mind, to me the URSA Mini 4.6K has 12.5 stops of usable dynamic range. I do feel the 4.6K shines in the mids, but the shadows and highlights lose a lot of color info fast when underexposed or overexposed; there's not a lot of wiggle room.

  3. #3  
    One thing I noticed also: the 4.6K loves red. It just seems to pick the red out and oversaturate it.

  4. #4  
    I still haven't had the opportunity to test the 4.6K against an Alexa of any flavour, despite working with tons of people using Alexas last year. Hopefully I can get that done soon. I do feel like Alexa has something nailed a bit better than the 4.6K.

    Dynamic range tests all vary based on methodology... but it's very interesting that the guy metered discrete stops with his light meter for each patch, and fifteen of them show up discretely on the waveform. That's not getting into paycheck stops vs. gravy stops, of course.
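    A simplistic way to picture that waveform count (illustrative numbers only, not the reviewer's actual data): chart patches one stop apart in linear luminance stay countable until they sink below the noise floor.

    ```python
    # Illustrative sketch only -- the patch spacing and the noise floor here
    # are assumptions, not measurements from the video.
    patches = [2 ** -n for n in range(20)]  # linear luminance, each patch 1 stop down
    noise_floor = 2 ** -15                  # assumed sensor noise level

    visible = sum(1 for p in patches if p > noise_floor)
    print(visible)  # 15 patches still read as distinct stops above the floor
    ```

    Real tests are messier than this, of course: near the floor the last patch or two are judgement calls, which is exactly where these counts start to disagree.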

    With the red colour, I had huge issues until I made sure I was using IR filtration on every shot. The blacks become ungradeable when it's not filtered, since pulling the red back down screws up the rest of the colour. I wonder if what you mention and that are related.

    I'm still, months later, getting to grips with grading the 4.6K. Sometimes I get it right and it looks incredible. Sometimes I take longer than I wish and then start over. But it's an incredible tool. Not just for the price. It's really damn good.

  5. #5  
    Quote Originally Posted by JIB View Post
    for the UM46K.

    Soo... Cinema5d did their test and showed 12 stops (though even on their chart it looks like more, and they definitely suffered from the crosshatching issue, which I've never seen in Resolve when shooting raw).

    And now this test shows a full 15 stops:
    https://youtu.be/vrnqe93OJIU?t=8m12s

    Thoughts?

    I'd tend to believe the latter test considering I can count at least 14 stops in the chart that Cinema5d counts 12 on.
    Cool test, but the guy can't count. Somehow he's already counted 2 stops at the 98% chip; I think he's given it a little head start.

  6. #6  
    Clearly Arri rates the Alexa's DR more conservatively than other manufacturers rate theirs.

    This test confirms what I've always believed: the BMPC 4K has around 10 stops, and there are about 4 or 5 stops of difference between it and the Mini 4.6K.

  7. #7  
    Quote Originally Posted by Howie Roll View Post
    Cool test but the guy can't count. Somehow he's already counted 2 stops at the 98% chip, I think he's given it a little headstart.
    Wait. Can you count?

  8. #8  
    DPStewart
    I'm really sick of DR measurements of cameras that are not in a useful analytical context.

    There are probably 10,000 articles out there saying the DR of camera X or camera Y is this, or that, or not what the maker claims... but they don't place their rating alongside other cameras measured with the EXACT same methodology. That makes it useless information, and it spreads confusion and misconceptions. It gets to be information pollution.

    I'm reminded of Ken Rockwell's lens reviews. Many people like his style and many people HATE it, but one thing that's noteworthy is that his methodology is very consistent, and that produces USEFUL information. I've shot with easily 20 lenses Ken Rockwell has reviewed, and because his baseline descriptions are standardized (even within his own personal vernacular), I can tell whether I'm going to like a lens even if he DOESN'T like it, because I know how to contextualize the information he's providing.

    In summary, I wish folks would do DR reviews of cameras with high levels of procedural integrity and standardization, or not do them at all.

    ...but everyone wants that advertising revenue on their website...

    At least Tom Antos' analysis uses the exact same methodology across the board.
    Last edited by DPStewart; 01-11-2017 at 01:01 AM.

  9. #9  
    Quote Originally Posted by jonesyjones View Post
    Wait. Can you count?
    Maybe not. Perhaps you can explain to me how the distance from the clipped chip to the 98% chip adds up to 2 stops of DR?
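    For what it's worth, the gap between any two chips can be put in stops with a quick log2. A sketch, using an assumed fully clipped patch against a 98% white chip (illustrative values, not data from the test):

    ```python
    import math

    def stops_between(brighter, darker):
        """Stops separating two linear patch luminances (one stop = a doubling)."""
        return math.log2(brighter / darker)

    # A clipped (100%) patch vs. a 98% white chip: a tiny fraction of a stop,
    # nowhere near 2 stops.
    print(round(stops_between(1.00, 0.98), 3))

    # A true 2-stop gap needs a 4:1 luminance ratio:
    print(stops_between(1.00, 0.25))  # 2.0
    ```

    So whatever those first two chips are, counting 2 full stops between clip and 98% needs some explaining.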

  10. #10  
    Just my two cents on the subject. I think Cinema5D either got a bad sensor (it can happen) or they are being very conservative with their rating. If I remember right, they always had the MX sensor at something like 11.5 stops and we always found 12.5. I do think 12.5 is a real safe zone for this camera: shoot in that range and you can play a lot with those highlights and shadows.

    The other guy's test comes close to what I've found in the few DR tests I've done with the camera... I just don't agree with his counting. That first stop is gone and he counts it, and that very last stop is a half stop at best. So that gets you somewhere in the 13.5 range, which is what I have consistently seen as the goalpost for this camera. In my tests that works out to about 5.5 stops over, plus your exposed stop, and about 7 under, but we always pull back a half stop on the low end as a guard, which puts you in a safe zone of about 13 stops. I really think there is a true 14 stops in the camera; I just don't think it's safe to make use of it, but keeping your shots in that 13-stop zone helps with a smoother roll-off.

    Just my thoughts, don't take it for gospel, but it's worked well for our team.
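    The latitude arithmetic above, restated (these are just the figures from the post, not independent measurements):

    ```python
    # Figures as quoted in the post above -- restated, not re-measured.
    over = 5.5     # stops of highlight latitude above the exposed stop
    exposed = 1.0  # the stop you expose for
    under = 7.0    # stops of shadow latitude below
    guard = 0.5    # half stop pulled back on the low end as a safety margin

    counted = over + exposed + under  # what the tests show
    safe = counted - guard            # the zone actually trusted
    print(counted, safe)              # 13.5 13.0
    ```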
