Thread: New Beta 4 Redcine-X with NVIDIA CUDA decode

Page 2 of 8
Results 11 to 20 of 78
  1. #11  
    Moderator Phil Holland's Avatar
    Join Date
    Apr 2007
    Location
    Los Angeles
    Posts
    10,713
    Quote Originally Posted by Misha Engel View Post
And you reached 33 fps at 8K 5:1 playback on a single RTX 2080 Ti? That is very impressive.

Especially when you consider that Puget only reached 23 fps (which must be a typo for 25 fps) at 8K 9:1 playback on a single RTX 2080 Ti combined with an i9-9960X in DaVinci Resolve.
    https://www.pugetsystems.com/labs/ar...formance-1328/

Does this also mean that RCX has a superior de-bayer (~30% faster) compared to DaVinci Resolve?
Misha, this is utilizing the new GPU Decode Acceleration from NVIDIA, which Puget hasn't tested yet and which hasn't yet been rolled out to third-party applications via the SDK.

And yes, at the moment, at full-debayer playback, RCX is literally superior to everything with the new tech; that is, until it's implemented in other programs, likely in a couple of months.
    Phil Holland - Cinematographer - Los Angeles
    ________________________________
    phfx.com IMDB
    PHFX | tools

    2X RED Weapon 8K VV Monstro Bodies and a lot of things to use with them.

    Data Sheets and Notes:
    Red Weapon/DSMC2
    Red Dragon

  2. #12  
    Senior Member
    Join Date
    Jun 2017
    Posts
    1,503
    Quote Originally Posted by Phil Holland View Post
Misha, this is utilizing the new GPU Decode Acceleration from NVIDIA, which Puget hasn't tested yet and which hasn't yet been rolled out to third-party applications via the SDK.

And yes, at the moment, at full-debayer playback, RCX is literally superior to everything with the new tech; that is, until it's implemented in other programs, likely in a couple of months.
I know; in Resolve the 2080 Ti only has to de-bayer, and it is around 30% behind RCX New Beta 4, where it has to decode and de-bayer. It must be a typo or some other mistake in Puget's testing, because the GTX 1080 Ti was already capable of debayering 8K at 25 fps (de-bayering wasn't the problem with previous-gen GPUs).

  3. #13  
    Moderator Phil Holland's Avatar
    Join Date
    Apr 2007
    Location
    Los Angeles
    Posts
    10,713
    Quote Originally Posted by Misha Engel View Post
I know; in Resolve the 2080 Ti only has to de-bayer, and it is around 30% behind RCX New Beta 4, where it has to decode and de-bayer. It must be a typo or some other mistake in Puget's testing, because the GTX 1080 Ti was already capable of debayering 8K at 25 fps (de-bayering wasn't the problem with previous-gen GPUs).
Compression ratio matters. I need to go back through their tests to see what was done with the different compression types at 1/2 and full debayer.
    Phil Holland - Cinematographer - Los Angeles

  4. #14  
    Senior Member
    Join Date
    Jun 2017
    Posts
    1,503
    Quote Originally Posted by Phil Holland View Post
Compression ratio matters. I need to go back through their tests to see what was done with the different compression types at 1/2 and full debayer.
For decoding, yes; for de-bayering it shouldn't matter.

  5. #15  
    Senior Member Kevin Marshall's Avatar
    Join Date
    Jul 2009
    Location
Baltimore, MD
    Posts
    422
    Quote Originally Posted by Misha Engel View Post
I know; in Resolve the 2080 Ti only has to de-bayer, and it is around 30% behind RCX New Beta 4, where it has to decode and de-bayer. It must be a typo or some other mistake in Puget's testing, because the GTX 1080 Ti was already capable of debayering 8K at 25 fps (de-bayering wasn't the problem with previous-gen GPUs).
Checking one of the other Puget tests (specifically the 2070 one: https://www.pugetsystems.com/labs/ar...formance-1264/), the 2080 Ti does seem to be hitting 25 fps on that 25 fps 8K clip, so that 23 could be a typo. Alternatively, the test that reads 23 fps uses an Intel 16c/32t i9-9960X, whereas the system in the 2070 test uses a 32-core Threadripper. Since the CPU is often the limiting factor in the current R3D decode implementation, that could explain the difference.

So there are a few things to consider about the difference between Puget's test and the current RCX implementation. First, Puget's test sets the playback speed to the clip's native frame rate, so we don't know whether that 25 fps is a cap or whether there's still headroom. Second, since the existing R3D decode implementation is still CPU-limited, it's unlikely that the 23/25 fps figure in the Puget tests represents 100% GPU utilization.
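The cap-vs-headroom question is easy to settle empirically: let the decoder free-run instead of pacing it to the clip's native frame rate, and time it. A minimal sketch of that kind of benchmark (the `decode_frame` stand-in is hypothetical; a real test would call the actual per-frame decode):

```python
import time

def decode_frame(index):
    """Hypothetical stand-in for a real per-frame R3D decode call."""
    time.sleep(0.030)  # pretend one frame takes ~30 ms to decode

def benchmark(num_frames=50):
    """Free-run the decoder and report the achieved frame rate.

    A player pinned to the clip's native 25 fps that reads 25 fps tells
    you nothing; a free-run figure above 25 shows headroom, while a
    figure below 25 shows a bottleneck.
    """
    start = time.perf_counter()
    for i in range(num_frames):
        decode_frame(i)
    elapsed = time.perf_counter() - start
    return num_frames / elapsed

print(f"achieved decode rate: {benchmark():.1f} fps")
```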

  6. #16  
    Senior Member
    Join Date
    Jan 2007
    Posts
    288
This build is soooooo much better than the last. Now I can view 5K footage at full rez with my older system using a GTX Titan X. It uses less of its memory too, about a third. Excellent work, guys!

There is still about the same amount of stuttering as previously, however. Interestingly, the stuttering is about the same regardless of resolution, and the GPU is not running out of steam. For 1/2 and 1/4 rez, once it buffers enough memory, it's running at less than 50% GPU processing power. Also, the instantaneous frame rate is dancing around between 22 and 24+ fps. On older versions of Redcine-X Pro, once the memory has had enough time to buffer, playback is butter-smooth and the instantaneous frame rate is rock solid. It just couldn't handle full resolution at all, or even half resolution, without buffering into memory first.

There is also still some occasional momentary ghosting, most noticeably off the sides of buildings against the sky. This does not appear in older non-GPU-decoding versions of Redcine-X Pro (but did appear in the previous beta of this one), so it's definitely not in the footage itself.

I'm also seeing some occasional bugs when viewing the footage on a separate monitor if I start that up before running the footage, but I haven't quite worked out the sequence that makes this particular bug happen. Maybe someone else here has seen that.

Perhaps all this is a function of me not yet using RTX technology, and I do intend to upgrade. I just wanted to see how much it had improved on GTX, and it has. I think it's getting close.

I will send a report to RED Support to see if I can help at all, but I just wondered if anyone else is seeing some of this so we can all help.

  7. #17  
    Quote Originally Posted by Mike Smith View Post
This build is soooooo much better than the last. Now I can view 5K footage at full rez with my older system using a GTX Titan X. It uses less of its memory too, about a third. Excellent work, guys!

There is still about the same amount of stuttering as previously, however. Interestingly, the stuttering is about the same regardless of resolution, and the GPU is not running out of steam. For 1/2 and 1/4 rez, once it buffers enough memory, it's running at less than 50% GPU processing power. Also, the instantaneous frame rate is dancing around between 22 and 24+ fps. On older versions of Redcine-X Pro, once the memory has had enough time to buffer, playback is butter-smooth and the instantaneous frame rate is rock solid.

...
What is your screen refresh rate? I assume the refresh rate needs to be an even multiple of the playback frame rate. Having the screen set to 30 Hz with 25p playback, for example, will normally create stuttering.
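Björn's point can be illustrated with a quick cadence check: count how many display refreshes each frame occupies under vsync. When the refresh rate isn't an integer multiple of the playback rate, the hold times alternate and motion judders. A small sketch:

```python
def cadence(fps, hz, frames=10):
    """How many whole display refreshes each frame is held on screen
    when a vsynced player shows fps-rate content on an hz monitor."""
    ticks = [round(i * hz / fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(cadence(25, 60))  # mixes 2- and 3-refresh holds: visible judder
print(cadence(25, 50))  # every frame held exactly 2 refreshes: smooth
```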
    Björn Benckert
    Creative Lead & Founder Syndicate Entertainment AB
    +46855524900 www.syndicate.se
    Flame / VFX / Motion capture / Monstro

  8. #18  
    Senior Member
    Join Date
    Jan 2007
    Posts
    288
    Quote Originally Posted by Björn Benckert View Post
What is your screen refresh rate? I assume the refresh rate needs to be an even multiple of the playback frame rate. Having the screen set to 30 Hz with 25p playback, for example, will normally create stuttering.
No, it's not that. I can start up the same footage in an older, non-GPU-decoding version of Redcine-X Pro and it runs butter-smooth. Same PC. Same monitor(s). Same footage. Same frame rates. Of course the older version cannot handle the resolutions that Beta 4 can. But if I'm willing to let the older version buffer into memory at, say, 1/2 or 1/4 rez, the playback is very smooth, as is the instantaneous frame rate indication below the screen. Not so in Beta 4, where it wanders about, sort of in sync with the stuttering.

  9. #19  
    Senior Member AndreasOberg's Avatar
    Join Date
    Oct 2011
    Location
    Leicestershire, United Kingdom
    Posts
    1,464
If I run half debayer it takes 6.8GB of GPU RAM according to Task Manager. Quarter is about 4.2GB. If I try full res it either goes very slowly or it crashes.
This is on an MSI 75 Titan laptop with a full GTX 1080 8GB and a hexa-core CPU at 4.3GHz, so Redcine takes 5.8GB for half debayer, since 1GB is reserved for other software.
So it seems it is running out of memory at full res.

    Another thing I note is that it only uses 11GB for the cache, even though it is set to use 52GB. So the cache is very short.
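As a rough sanity check on these numbers, here is a back-of-envelope VRAM estimate. The 16-bit-per-channel RGBA intermediate format is an assumption, not a documented fact about RCX's internal buffers:

```python
def frame_bytes(width, height, channels=4, bytes_per_sample=2):
    """One decoded frame buffer, assuming 16-bit RGBA
    (an assumption; the real internal format may differ)."""
    return width * height * channels * bytes_per_sample

GIB = 1024 ** 3
BUDGET = 5.8 * GIB  # the VRAM measured as available to RCX above

# 8K full res vs the half/quarter debayer sizes being tested
for label, (w, h) in {"full": (8192, 4320),
                      "half": (4096, 2160),
                      "quarter": (2048, 1080)}.items():
    size = frame_bytes(w, h)
    print(f"{label:>7}: {size / GIB:.3f} GiB/frame, "
          f"~{int(BUDGET // size)} frames fit in the budget")
```

Even under this generous assumption a single full-res 8K frame is only ~0.26 GiB, so running out of memory at full res would point at the depth of the frame cache rather than any single buffer.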
    Andreas
    Last edited by AndreasOberg; 02-23-2019 at 11:18 AM.
    www.ObergWildlife.com- Natural History Filmmaking
    www.WildlifeRescueMovie.com- Saving the animals of the Rainforest!
    CF Weapon Helium 8K Strummer Bright Tangerine, OConnor 2560, Canon 50-1000mm

10. #20
    Quote Originally Posted by Mike Smith View Post
    This build is soooooo much better than the last. Now I can view 5k footage at full rez with my older system using a GTX Titan X. It uses less of it's memory too. About a third. Excellent work guys!

    ...

    I will send a report to RED Support
    Thanks Mike!

    This beta version was obviously optimized for higher playback speeds. A side effect of that optimization is potential stuttering when the system isn't capable of hitting the target FPS. We know of a few ways we can compensate for this. To help decide which way (or ways) to use, we've added additional information to this version's log files.
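One common way players compensate when the decoder can't hit the target FPS (not necessarily what RED will choose) is to keep the playback clock authoritative and repeat or drop frames when decoding falls behind, trading smoothness for sync. A toy sketch of that scheduling:

```python
def schedule(target_fps, decode_fps, ticks=12):
    """Source frame index shown at each display tick when the decoder
    can only sustain decode_fps < target_fps: the clock stays correct,
    but some frames are held twice, which the viewer sees as stutter."""
    return [int(tick * decode_fps / target_fps) for tick in range(ticks)]

print(schedule(24, 22))  # frame 0 is held for two ticks: one hiccup
print(schedule(24, 24))  # decoder keeps up: a new frame every tick
```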
    Last edited by Steven Yaeger; 02-23-2019 at 11:40 AM.
    If you can't reproduce it, it must have been caused by the pulse generated by the mass synchronization of digital watches.
