Thread: KOMODO....

  1. #1241  
    Quote Originally Posted by Björn Benckert View Post
From what I recall when I talked to Graeme about it, the data is there, so it's more that people / third-party VFX applications don't care to implement it. The thing is, you need more than just the gyro data, sensor size, shutter speed and readout time. You also need to know the focal length and the actual lens, its distortion, and preferably also the vignetting for it to work properly... There are quite a few lenses out there, so quite a bit of data to implement in the SDK... But yes, RED is building focus mapping for the most common EF lenses into the camera firmware, so they could start adding distortion as well, I guess.
I have looked for this data in the past, but you can't get anything via Nuke for sure, and I couldn't actually find any data in RedcineX either - so I have no idea where it might be. I do think it is a missed opportunity, but then I also have to ask: if it's not commonly exposed, is it just not accurate, and why is this data there in the first place? What was it designed for?

    cheers
    Paul

  2. #1242  
    Quote Originally Posted by paulcurtis View Post
I have looked for this data in the past, but you can't get anything via Nuke for sure, and I couldn't actually find any data in RedcineX either - so I have no idea where it might be. I do think it is a missed opportunity, but then I also have to ask: if it's not commonly exposed, is it just not accurate, and why is this data there in the first place? What was it designed for?

    cheers
    Paul
I know people have used it. But it could be that they got it via the THC, which also records focus data.

If it's there, it can't be too difficult to write a script that extracts it.

But agreed, take roll data for example... It would be dead easy to make a script that lets you adjust roll, set a hold on a frame, and then use the rotation data to give a fixed horizon, which would be far more than what the competition offers. So yes, Christoffer is right. It really sucks that RED does not implement at least a roll neutraliser in the SDK. It would not take more than a few days, or even hours, for a halfway talented programmer in the field to write such a feature. I have guys here who can do it.
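For illustration, here is a rough sketch of that hold-frame idea in plain Python. The counter_roll helper and the gyro values are made up for the example, not anything taken from the RED SDK.
Code:
    # Given per-frame roll angles (degrees) from the camera gyro, compute the
    # counter-rotation needed to keep the horizon at the angle it had on a
    # chosen reference ("hold") frame.
    def counter_roll(roll_angles, hold_frame=0, strength=1.0):
        """Per-frame rotation (degrees) to apply in comp to lock the horizon.

        strength=1.0 fully locks the horizon; lower values only damp the roll.
        """
        reference = roll_angles[hold_frame]
        return [strength * (reference - angle) for angle in roll_angles]

    # Made-up gyro roll values, one per frame:
    roll = [0.0, 0.4, 1.1, 0.7, -0.3, -0.9]
    print(counter_roll(roll, hold_frame=0))
    # -> [0.0, -0.4, -1.1, -0.7, 0.3, 0.9]  (apply as a per-frame rotation)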
    Björn Benckert
    Creative Lead & Founder Syndicate Entertainment AB
    +46855524900 www.syndicate.se/axis
    VFX / Flame / Motion capture / Monstro

  3. #1243  
Senior Member Blair S. Paulsen
Join Date: Dec 2006 | Location: San Diego, CA | Posts: 5,053
It would appear that RED saw the potential of gyro data, put basic hooks into the SDK, and expected third-party interest. AFAIK the uptake was weak, which meant RED saw little reason to devote engineering resources to it. Can't argue with that; they had plenty of other things to tackle. Whatever the actual case, and whatever the inherent limitations of the gyro in current RED bodies might be, it would be great to see RED come on strong on this. As Björn suggests, the smartphone vertical has driven the development of chips that seem ideal for this use case - so perhaps it's just a question of coding resources...

    Would love to see DSMC3 include robust tools for exploiting positional data, and I don't think I'm the only one.

    Cheers - #19

  4. #1244  
Moderator Phil Holland
Join Date: Apr 2007 | Location: Los Angeles | Posts: 11,144
I've seen two VFX productions tap into the gyro data thus far. It's in there, but putting it on RED to develop the tools to do more with it is a bit weird. Maybe I'll make a dumber program or something. I did something similar with clipJoiner, but I'm a full-time filmmaker; it's hard to be tech support on stuff like that :/
    Phil Holland - Cinematographer - Los Angeles
    ________________________________
    phfx.com IMDB
    PHFX | tools

    2X RED Monstro 8K VV Bodies and a lot of things to use with them.

    Data Sheets and Notes:
    Red Weapon/DSMC2
    Red Dragon

  5. #1245  
Member Alex.E
Join Date: Aug 2017 | Location: CA | Posts: 58
    You can pull the data with REDline and import it into your software.
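For anyone who wants to script it, here is a rough sketch of the parsing side in Python. The REDline invocation in the comment and the "Gyro Roll" column name are assumptions on my part; check REDline's help output for the exact metadata flag and header names on your build.
Code:
    # Parse per-frame metadata that REDline has dumped to CSV and pull out one
    # column (e.g. gyro roll). The flag shown below and the column name are
    # guesses; inspect your own CSV header for the real field names.
    #
    #   REDline --i A001_C001.R3D --printMeta 3 > clip_meta.csv   (assumed)
    import csv

    def read_column(csv_path, column_name):
        """Return one metadata column as a list of floats, one value per frame."""
        values = []
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                values.append(float(row[column_name]))
        return values

    roll = read_column("clip_meta.csv", "Gyro Roll")
    print(len(roll), "frames, first values:", roll[:5])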

  6. #1246  
Senior Member Blair S. Paulsen
Join Date: Dec 2006 | Location: San Diego, CA | Posts: 5,053
There are a lot of angles (pun fortunate) on capturing positional data during acquisition for a host of post processes. Custom integration using RED's SDK is great if that's your jam, but it seems like an awfully heavy lift for a large segment of the user base.

Komodo is interesting in no small part because it's small - heck, even the current DSMC2 is very light and compact compared to historical norms. This means more gimbal, handheld and drone shots will seem like a good idea to more people, and the modern style tends toward an active camera. These are scenarios where a tool in RedCineX - and in the SDK - that supported stabilization on at least two axes, likely pitch and roll, could be a real lifesaver.

Note to potential developers: [I'd like a slider in the -100 to +100 range: +100 is as glass-smooth as possible, zero means as shot with no adjustment, and -100 amplifies the motion to make a stable shot look as if it was shot wildly.]
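A minimal sketch of how that slider could map onto a per-frame gyro curve (plain Python; the moving-average smoothing is purely illustrative, not a proposal for how RED would implement it):
Code:
    # s in [-100, +100]: +100 locks motion to a smoothed path, 0 leaves the
    # shot as captured, -100 exaggerates the deviation from that path.
    def smooth(values, radius=6):
        """Simple moving-average smoothing of a per-frame curve."""
        out = []
        for i in range(len(values)):
            window = values[max(0, i - radius):i + radius + 1]
            out.append(sum(window) / len(window))
        return out

    def apply_slider(values, s):
        """Blend between smoothed (+100), as-shot (0) and exaggerated (-100)."""
        smoothed = smooth(values)
        return [sm + (1 - s / 100.0) * (v - sm)
                for v, sm in zip(values, smoothed)]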

I want to note that bringing AI stabilization into the workflow is not just for taming shaky cam. We already use intentional camera movement as a way to "spot" story beats the way the audio does, but there are shooting scenarios where that's not going to be possible. What if you shot handheld/active style, but in post you could keyframe the stabilization effect to moments, POV changes, etc. - where the camera reveals a character, or other key moments in a chaotic scene? You could even add some active camera in post when desired, which is especially nice for matching coverage where a static shot would kill the momentum of the scene.

    Should RED be responsible for adding features such as these to their free software? That is certainly NOT what I am saying.

What I am suggesting is that if RED were to add this kind of functionality to their core value proposition, it might be a compelling reason to choose RED over other options. Yes, they'd have to spend money on engineering resources to implement it. FWIW, as the disparity in image quality between low-end cameras and the top end continues to narrow, the feature set becomes the key differentiator. Other than huge increases in HFR capabilities, I can't think of a better pitch than offering simple, intuitive stabilization in post [plus, perhaps, some engagement with the VFX folks to foster a more accessible framework, if they'll buy in]. Metadata may not be sexy, but if REDs were the darling of the VFX peeps...

    Cheers - #19

  7. #1247  
Senior Member Mike Smith
Join Date: Jan 2007 | Posts: 290
Just bear in mind that stabilization in post requires forethought. If you stabilize a shaky shot that was captured at 24 fps with a normal 180-degree shutter, you'll end up with a bunch of blurred frames. You'd need to shoot at a much smaller shutter angle and, if necessary, add temporal blur in post to match (as well as possible) a normal 180-degree blur.
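The arithmetic behind this, as a quick sanity check (a tiny Python example; the frame rates and shutter angles are just illustrative numbers):
Code:
    # Per-frame exposure time = (shutter angle / 360) * (1 / fps).
    # A frame shot at 24 fps with a 180 degree shutter carries ~1/48 s of motion
    # blur that no amount of stabilization can remove afterwards.
    def exposure_time(fps, shutter_angle_deg):
        """Per-frame exposure time in seconds."""
        return (shutter_angle_deg / 360.0) / fps

    print(exposure_time(24, 180))  # ~0.0208 s (1/48): normal-looking blur
    print(exposure_time(24, 45))   # ~0.0052 s (1/192): crisp frames that
                                   # stabilize cleanly, then add blur in post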

  8. #1248  
    Quote Originally Posted by Alex.E View Post
    You can pull the data with REDline and import it into your software.
Aha, I will try it. It would be easy enough to bring the raw data into Flame and scale it to taste.

    Quote Originally Posted by Mike Smith View Post
Just bear in mind that stabilization in post requires forethought. If you stabilize a shaky shot that was captured at 24 fps with a normal 180-degree shutter, you'll end up with a bunch of blurred frames. You'd need to shoot at a much smaller shutter angle and, if necessary, add temporal blur in post to match (as well as possible) a normal 180-degree blur.
Not all bad camera moves are hard shakes. If you use a damping device like a Steadicam or a gimbal, for example, you can use the gyro data to smooth and level things out if they are drifting.
The data is also usable when inserting CG. If you have your camera on a tripod and are panning around, you could for example do split screens as if you had a fixed camera, etc. So there are lots of places where this data can be used where the motion blur from the original camera move is a non-issue. If the camera and lens are well known, you can also counter-steer rolling shutter and motion blur to a very high degree; just take a look at GoPro Fusion footage before and after post processing and you will understand what I mean. The capture can look like a turd, and after processing it's still silky smooth without rolling shutter issues.

    Quote Originally Posted by Blair S. Paulsen View Post
There are a lot of angles (pun fortunate) on capturing positional data during acquisition for a host of post processes. [...] These are scenarios where a tool in RedCineX - and in the SDK - that supported stabilization on at least two axes, likely pitch and roll, could be a real lifesaver. [...]
Again, correcting tilt is on a whole different level than correcting roll. To correct tilt you need to know the lens and its distortion, and likely your camera is not at the nodal point, so there will be parallax change that cannot be accounted for. So that's a much, much bigger thing to tap into than roll.
    Roll you can eliminate without knowing anything about the lens.

All that's needed to stabilize roll is negating the gyro values for the roll axis, then scaling and offsetting those values (which can all be done in any half-decent channel editor). So all you need for roll is basically a raw string of numbers for each frame that imports as a raw curve. I will try to extract it from a shot in REDline as soon as I get to the office.
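For what it's worth, the negate/scale/offset step could be as small as this (a Python sketch; the "frame value" text format is just an example, adjust to whatever raw-curve format your channel editor actually imports):
Code:
    # Turn per-frame gyro roll values into a counter-rotation curve and write it
    # out as one "frame value" pair per line for import as a raw curve.
    def roll_to_curve(roll_degrees, scale=1.0, offset=0.0):
        """Negate, scale and offset per-frame roll values."""
        return [-(r * scale) + offset for r in roll_degrees]

    def write_curve(path, values, start_frame=1):
        with open(path, "w") as f:
            for i, v in enumerate(values, start=start_frame):
                f.write(f"{i} {v:.6f}\n")

    # Roll values would come from the clip metadata (see the REDline note above).
    roll = [0.0, 0.4, 1.1, 0.7, -0.3]
    write_curve("roll_correction.txt", roll_to_curve(roll))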
    Björn Benckert
    Creative Lead & Founder Syndicate Entertainment AB
    +46855524900 www.syndicate.se/axis
    VFX / Flame / Motion capture / Monstro

  9. #1249  
Senior Member Joel Arvidsson
Join Date: Nov 2008 | Location: Sweden | Posts: 2,883
It would still be a win if you could easily extract the tilt axis. Say you have a static tripod shot that you would like to project onto geometry; then it would be nice to have the exact tilt angle in the metadata. Or you already have a backplate for a shot and you are going to shoot a foreground element on green screen to match that background.
    Epic #06696
    Epic-W #004069

  10. #1250  
Senior Member Blair S. Paulsen
Join Date: Dec 2006 | Location: San Diego, CA | Posts: 5,053
There are clearly limitations on what can and cannot be addressed with a particular technique. SteadXP suggests there's potential in embedding positional data in the metadata to enhance stabilization coherence. If a first-tier camera company like RED made it a priority, I'd be very interested in what might be possible. Thanks to smartphones, game controllers, etc., the tiny low-power chips needed should be both capable and cheap. I think it could be a meaningful value add.

    Cheers - #19
