Thread: Apple + RED

  1. #161  
    Senior Member
    Join Date
    Nov 2009
    Location
    Norway
    Posts
    1,074
    Quote Originally Posted by Jarred Land View Post
    Quote Originally Posted by Jon Michael Puntervold View Post
    Any news regarding the implementation of Metal support in Redcine, DaVinci or FCPX?
    Yes indeed, one of those teams has been working hard over the quarantine. Ask me the same question tomorrow :)
    Never got to ask you the question again before the great news arrived. Finally got myself a 4K reference monitor literally yesterday, so I can't wait to test it!!! Great news for all of us Mac users!

    However, I still find RedCine an incredibly elegant tool for reviewing RAW footage, and I'm looking forward to Metal support. I guess that's the next question... any news regarding implementation in RedCine?

  2. #162  
    Senior Member Bastien Tribalat's Avatar
    Join Date
    Dec 2009
    Location
    Cannes area, France
    Posts
    773
    Today Apple announced the switch from Intel to ARM chips on the Mac, with a transition period of two years (but it seems the first computers will come out at the end of this year, according to Bloomberg)... How is it going to affect working with RED products?
    (And all the software we know and love?)
    VIDEO EDITING - COLOR GRADING - VFX
    APPLE FINAL CUT PRO, AVID MEDIA COMPOSER
    ADOBE CREATIVE CLOUD, DAVINCI RESOLVE, REDCINE X PRO...

  3. #163  
    Senior Member Blair S. Paulsen's Avatar
    Join Date
    Dec 2006
    Location
    San Diego, CA
    Posts
    5,288
    Regardless of what CPU type is used, I think the more critical issue for RedUsers is Metal development. Otherwise, it's a CUDA world and the Apple/NVIDIA feud is the killer.
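    If it helps to picture what "Metal development" actually means on the code side, here is a minimal sketch of a Metal compute dispatch in Swift - just a toy gain kernel with made-up names and sizes for illustration, not RED's or anyone else's actual debayer/grading pipeline:
    Code:
    import Metal

    // Toy "apply gain" kernel, compiled from an inline Metal shader string.
    let source = """
    #include <metal_stdlib>
    using namespace metal;
    kernel void applyGain(device float *pixels [[buffer(0)]],
                          constant float &gain [[buffer(1)]],
                          uint id [[thread_position_in_grid]]) {
        pixels[id] *= gain;
    }
    """

    let device = MTLCreateSystemDefaultDevice()!   // system GPU (force-unwrapped for brevity)
    let queue = device.makeCommandQueue()!
    let library = try! device.makeLibrary(source: source, options: nil)
    let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "applyGain")!)

    let pixels = [Float](repeating: 0.18, count: 4096)   // stand-in for image data
    var gain: Float = 2.0
    let buffer = device.makeBuffer(bytes: pixels,
                                   length: pixels.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    let commands = queue.makeCommandBuffer()!
    let encoder = commands.makeComputeCommandEncoder()!
    encoder.setComputePipelineState(pipeline)
    encoder.setBuffer(buffer, offset: 0, index: 0)
    encoder.setBytes(&gain, length: MemoryLayout<Float>.stride, index: 1)
    encoder.dispatchThreads(MTLSize(width: pixels.count, height: 1, depth: 1),
                            threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
    encoder.endEncoding()
    commands.commit()
    commands.waitUntilCompleted()
    // Because the buffer is .storageModeShared, the CPU can read the result via buffer.contents().
    A CUDA pipeline expresses the same idea through a different API, which is why a CUDA-first tool takes real work to bring over to Metal.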

    Cheers - #19

  4. #164  
    Senior Member
    Join Date
    Jun 2017
    Posts
    1,850
    Quote Originally Posted by Blair S. Paulsen View Post
    Regardless of what CPU type is used, I think the more critical issue for RedUsers is Metal development. Otherwise, it's a CUDA world and the Apple/NVIDIA feud is the killer.

    Cheers - #19
    Nothing beats a Threadripper at the moment, and the Titan RTX is the only viable "cheap" option for 8K when you don't want to run out of memory all the time while color grading and running other effects (with 8K output).
    Apple got f*cked in the b*t by Jensen more than once; NVidia is a no-go for them, and in the near future AMD will also be a no-go for them.
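    To put the 8K memory point above in numbers, here is a back-of-the-envelope sketch (the 8192x4320 frame size, 32-bit float working precision, and the "20 resident frames" figure are all assumptions for illustration):
    Code:
    // Rough math: one 8K RGBA frame held at 32-bit float working precision.
    let width = 8192, height = 4320        // assumed full-frame 8K dimensions
    let channels = 4                       // RGBA
    let bytesPerChannel = 4                // 32-bit float
    let bytesPerFrame = width * height * channels * bytesPerChannel
    print(Double(bytesPerFrame) / 1_000_000_000)        // ~0.57 GB per frame

    // A grade that keeps, say, 20 intermediate frames resident needs roughly
    // 11 GB before caches, LUTs and the UI are counted - part of why 24 GB
    // cards are attractive for 8K work.
    print(Double(bytesPerFrame) * 20 / 1_000_000_000)   // ~11.3 GB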

    The new Apple ecosystem is the most closed ecosystem we've seen in a long time. Good to see that others are opening up their ecosystems (RED, Windows/Linux WSL, Samsung's exFAT driver for Linux, NVidia A100/AMD EPYC, Samsung/AMD with RDNA, etc.).

  5. #165  
    Senior Member Bastien Tribalat's Avatar
    Join Date
    Dec 2009
    Location
    Cannes area, France
    Posts
    773
    Quote Originally Posted by Misha Engel View Post
    Nothing beats a Threadripper at the moment, and the Titan RTX is the only viable "cheap" option for 8K when you don't want to run out of memory all the time while color grading and running other effects (with 8K output).
    Apple got f*cked in the b*t by Jensen more than once; NVidia is a no-go for them, and in the near future AMD will also be a no-go for them.

    The new Apple ecosystem is the most closed ecosystem we've seen in a long time. Good to see that others are opening up their ecosystems (RED, Windows/Linux WSL, Samsung's exFAT driver for Linux, NVidia A100/AMD EPYC, Samsung/AMD with RDNA, etc.).
    So you think there's no way to integrate AMD GPUs into the new "Apple Silicon" ecosystem?
    This is kind of my big question, because they just launched the new Mac Pro and it's a great machine for its modularity (without taking pricing into account), and I feel that yesterday's announcement just killed it (unless they can make an ARM chip that supports all the PCIe and DIMM slots?).
    VIDEO EDITING - COLOR GRADING - VFX
    APPLE FINAL CUT PRO, AVID MEDIA COMPOSER
    ADOBE CREATIVE CLOUD, DAVINCI RESOLVE, REDCINE X PRO...

  6. #166  
    Moderator Phil Holland's Avatar
    Join Date
    Apr 2007
    Location
    Los Angeles
    Posts
    11,840
    Quote Originally Posted by Bastien Tribalat View Post
    Today Apple announced the switch from Intel to ARM chips on the Mac, with a transition period of two years
    I see two sides to this coin. On one hand, an exclusive, enclosed ecosystem can raise a good deal of concern from a user and developer perspective, and to a degree it already does. On the other, the move to ARM will let Apple develop in newer, perhaps less expected ways, without being constrained by chipset limitations or architecture cycles when designing future hardware. Specifically for the core Apple user base, I see a lot of potential here for the mass market and possibly even at the workstation level.

    This slowly puts the ball in Intel's court to come up with a strategy to stay competitive in this game. Then again, the move to ARM was pretty much the worst-kept secret in the tech sector over the last 5 years, and movement is already in place. At the end of the day I'm betting we'll see a truly complete ecosystem developed by Apple, likely this decade, including the GPU, which puts them in a pretty unique position even against Intel's GPU offerings.

    The net result is that this will slowly change some of the norms in the personal and professional computing space over the next 2-5 years. AMD is actually the most impacted by this, which will become clearer soon, but to their credit, landing two big fairly recent contracts has likely softened the blow of what this means industry-wise. I suspect they are having chats about how to proceed, but they won't do much until Intel and Nvidia reveal their bigger game plans, as that is their general mode of operation.

    The curious questions here are what Intel will do with this "knowledge" and why Nvidia isn't doing X, Y, or Z. I'll throw in a spicy one about which refined efforts Apple will expand on with ARM and how that will look to the user base. The stage is set for a trio of giants, and oddly, some of the supporting manufacturers will need to dig deep and get creative.

    For software developers, yep, more work ahead. But the good news is that much of the advancement in hardware is paired with real concern for developers and how taxing it is to develop for a platform. We'll likely see some sort of more integrated, app-like atmosphere for software that has traditionally required deep development. Probably a relief, actually, for a few companies that have been around a while.

    For hardware developers, well, there's a lot to think about and digest here. We've recently navigated into the fresh waters of PCIe 4.0, 5.0, and USB4; this alone will create a new vision of what certain hardware even looks like in the next few years. Further out on the not-so-distant horizon, but very much in sight as longer-term efforts, are PCIe 6.0 and USB V. And yes, those are being developed before the current gen, or gen 2, is even out in the market, as is the normal way of this sector.

    My bigger personal and professional concern is the widening gap between building systems for the masses and building them for professionals with higher-end demands. Specifically in our industry, storage is something we all devour. We've developed much faster solutions, but price scaling hasn't kept in sync during much of this party. Fairly recently we're seeing some drops here and there, and that will be more evident once the world is closer to normal.

    To put these longer-term plans and developments in context, keep in mind that zero computer (and, I should add, mobile) technology companies met their expected revenue targets in 2020. Yet the ball was set in motion long ago for the announcements and bigger ideas around developing new tech. The last major cycle of mobile technology releases in many ways "just didn't happen" market-wise. Meanwhile, it's been an interesting time for a pandemic and global pause, as future devices can either show a notable leap or a notable sit when Q4 and Q1 roll around.

    Fun stuff. Curious to see how it truly unfolds. I have some ideas of what this looks like as a whole, but there are many major gaps to be filled. By 2022 we'll likely be laughing at the tech we were using in 2017. Things are moving faster than markets can respond to, even in a normally functioning global economy.

    I'm not really worried about our workflow. The companies will need to support the user base on whatever platforms users choose to work on. Production work happens on Microsoft, Apple, and Linux, and the order of importance of those ecosystems truly depends on the user or studio. Market-wise, we know who has the most systems, where, and who uses them. Truly, all three are trying to attract more users in a glorious tug of war.

    I'm down for however this plays out. If Apple comes out with some radical new take on hardware that is exciting to me, well, I'm interested if it helps me do what I do.
    Phil Holland - Cinematographer - Los Angeles
    ________________________________
    phfx.com IMDB
    PHFX | tools

    2X RED Monstro 8K VV Bodies, 1X RED Komodo, and a lot of things to use with them.

    Data Sheets and Notes:
    Red Weapon/DSMC2
    Red Dragon

  7. #167  
    Senior Member
    Join Date
    Jun 2017
    Posts
    1,850
    Quote Originally Posted by Bastien Tribalat View Post
    So you think there's no way to integrate AMD GPUs into the new "Apple Silicon" ecosystem?
    This is kind of my big question, because they just launched the new Mac Pro and it's a great machine for its modularity (without taking pricing into account), and I feel that yesterday's announcement just killed it (unless they can make an ARM chip that supports all the PCIe and DIMM slots?).
    It's no problem to integrate anything into the new ARM system; Apple will just do everything in their power to block it.
    The new Mac Pro is indeed almost as modular as a normal off-the-shelf PC.

  8. #168  
    Quote Originally Posted by Misha Engel View Post
    It's no problem to integrate anything into the new ARM system; Apple will just do everything in their power to block it.
    The new Mac Pro is indeed almost as modular as a normal off-the-shelf PC.
    Apple, which failed the first time around with the Lisa, is taking a second crack at it. Will they create the Mona Lisa? (Check the colloquial translation of these Italian words before answering.)
    Michael Tiemann, Chapel Hill NC

    "Dream so big you can share!"

  9. #169  
    Senior Member
    Join Date
    Jun 2017
    Posts
    1,850
    Apple didn't fail, and neither did their customers. It's LVMH all over again.

  10. #170  
    Senior Member Antony Newman's Avatar
    Join Date
    Mar 2012
    Location
    London, UK.
    Posts
    1,641
    Apple has overtaken Intel in IPC (instructions per clock of the CPU).
    Apple uses TSMC's leading-edge process, which draws far less power than Intel's current process.
    Intel's finely tuned process is pushed to the limit on clock frequency (and can run faster than a TSMC chip).
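    The trade-off in that last line is easier to see as simple arithmetic: single-thread throughput is roughly IPC times clock, so a wider core at a lower clock can match a narrower core at a higher clock. The numbers below are purely illustrative, not measurements of any real chip:
    Code:
    // Single-thread throughput ~ instructions-per-clock x clock frequency (GHz).
    // Purely illustrative numbers - not benchmarks of any actual CPU.
    let ipcWide = 8.0,   clockWide = 3.2     // wide, lower-clocked core (ARM-style design)
    let ipcNarrow = 5.0, clockNarrow = 5.1   // narrower, higher-clocked core (tuned x86 part)

    let throughputWide = ipcWide * clockWide         // 25.6
    let throughputNarrow = ipcNarrow * clockNarrow   // 25.5

    print(throughputWide / throughputNarrow)   // ~1.0: similar speed at very different clocks (and power)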

    In the laptop space, a CPU + GPU + DSP on one chip will mean:
    +) Far lower latency
    +) Far higher bandwidth
    +) Energy not wasted sending huge amounts of data between devices that are physically separated (see the sketch just after this list)
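    To make that last point concrete, here is a minimal Metal sketch (in Swift) of the difference between a unified-memory allocation and staging a frame into discrete-GPU memory across the bus; the 64 MB size is arbitrary and this is illustrative only, not any vendor's actual pipeline:
    Code:
    import Metal

    let device = MTLCreateSystemDefaultDevice()!

    // Unified memory (SoC): one allocation visible to both CPU and GPU - no bus copy needed.
    let sharedFrame = device.makeBuffer(length: 64 << 20, options: .storageModeShared)!
    sharedFrame.contents().storeBytes(of: Float(0.18), as: Float.self)   // CPU writes land where the GPU reads

    // Discrete GPU over PCIe: the fast path is GPU-private memory the CPU cannot
    // touch, so every frame has to be staged and blitted across the bus first.
    let privateFrame = device.makeBuffer(length: 64 << 20, options: .storageModePrivate)!
    let queue = device.makeCommandQueue()!
    let commands = queue.makeCommandBuffer()!
    let blit = commands.makeBlitCommandEncoder()!
    blit.copy(from: sharedFrame, sourceOffset: 0,
              to: privateFrame, destinationOffset: 0, size: 64 << 20)
    blit.endEncoding()
    commands.commit()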

    In the short term, Intel can make an AMD- or Apple-equivalent desktop that uses 1000 watts of power.
    On the laptop, however, an Apple SoC that integrates everything could be very hard for anyone to beat.

    Fujitsu's 7nm SoC (CPU) is more energy efficient than the current top-of-the-line NVidia GPU.
    If Apple adopts a CPU (SVE2 and ARMv9) + GPU, all on a 5nm process, we may find a 45-watt laptop that behaves like a high-end desktop.

    There is no reason why Apple would prevent a desktop from having PCIe. It's a convenient off-the-shelf standard, and it allows many a third party to customise and accelerate Apple's base offering.

    I hope, however, that Apple uses this next transition to additionally introduce Gen-Z (or CCIX/CXL) to the desktop: Gen-Z has legs.

    AJ

    Quote Originally Posted by Bastien Tribalat View Post
    So you think there's no way to integrate AMD GPUs into the new "Apple Silicon" ecosystem? <snip>
