Thread: Apple + RED

  1. #131  
    Senior Member Michael Lindsay's Avatar
    Join Date
    Apr 2007
    Location
    London UK
    Posts
    2,683
    Never met a Mac user who doesn't want Nvidia GPUs in their laptops... (including me), so I am assuming there must be some huge behind-the-scenes story not yet told...

    Does your Metal support put all the de-wavelet work onto the GPU? I only ask because at the moment, with OpenCL, only the debayer/scaling is done on the GPU, and it shows pretty high utilisation on a 16" with the top GPU..

    Quote Originally Posted by Jarred Land View Post
    Agree. And even though we just got on the Metal train, I still do wish the 16 had the option for an RTX inside it.

  2. #132  
    Senior Member
    Join Date
    Sep 2009
    Location
    Northern VA
    Posts
    1,628
    Quote Originally Posted by Michael Lindsay View Post
    Never met a Mac user who doesn't want Nvidia GPUs in their laptops... (including me), so I am assuming there must be some huge behind-the-scenes story not yet told...
    The unofficial story is that there is bad blood at the executive level between Apple and Nvidia, stemming from the Nvidia GPU debacles of the mid-2000s, when Apple was making its big comeback, and from the CUDA vs. OpenCL stance: being locked into Nvidia's CUDA was not in line with Apple's then-preferred OpenCL, which would have let them use both ATI and Nvidia GPUs (or whatever else might have popped up back then) in their offerings. After the delayed and faulty GPU hardware, a serious string of executive-level fuck-ups at Nvidia and their legal issues with Intel (whom Apple was in bed with) only made things worse. Both the iPhone and the iPad were originally intended to be powered by Nvidia's Tegra chips, but Tegra kept hitting hardware execution delays and wasn't available for Apple to use in engineering, so Apple went with its own design after using Samsung chips in the first iPhones.

  3. #133  
    Unofficially, Apple's "new Mac Pro" was woefully late before finally being released at the end of 2013. Sporting zero nVidia GPUs, the new Mac Pro did not usher in a new wave of innovation and strategic capability. Indeed, it under-performed on day one, and during its miserable six-year lifespan no new CPUs or GPUs were ever offered. No wonder they are so angry with nVidia, who clearly got their act together and then would not stop leapfrogging themselves while the 2013 Mac Pro went from "not insanely great" to "insanely not great".

    Six years later, the new new Mac Pro has arrived. But will the petulance that doomed the new Mac Pro doom the new new Mac Pro? I'm waiting and watching.
    Michael Tiemann, Chapel Hill NC

    "Dream so big you can share!"

  4. #134  
    Senior Member
    Join Date
    Jun 2017
    Posts
    1,756
    Quote Originally Posted by andrewhake View Post
    Intel has already discussed implementing TB3 directly on newer CPUs as well. The current TB3 controllers are still quite large, so it's hardly a trivial task. It makes more sense to integrate them into a much larger GPU, something Apple has already done on the Mac Pro.

    Also: just stop. I don't want this, and nobody else wants this. I don't know why you bring this upon everyone and yourself in nearly every thread. It isn't "just install it, it works", and it's not even close to that. It is an extremely Mickey Mouse solution with a lot of limitations. If you are a Mickey Mouse operation, I'm sure it is a great solution. If you live in a world where you think the machine you described is in any way similar in terms of setup, specs, design, build quality, sound, capability, upgradeability, IO, and so on to the new Mac Pro, then I don't even have a response.

    Driving a 6K HDR display over a single TB3 cable is a very non-trivial task as well, not to mention four from the same machine. A 32" 4K display is a really terrible solution for desktop use. The size of the display relative to its resolution is a very bad mix and is not in any way an alternative to the 32" 6K display.
    Intel has also been discussing 10nm for a couple of years... and Apple had AMD build an Intel TB3 controller onto the GPU(s).

    And yes, it just works, also on a TR 1950X (vanilla AMD).
    All the components we use are tested by independent reviewers, and the quality has held up (100% working for the last 5 years so far).
    The upgradeability of an EPYC workstation is way more flexible than anything Apple can deliver at the moment (with respect to RAM, storage, IO, PCIe lanes and version, sound, build quality, performance). Design is a matter of taste (the Dune Case has a nice design IMO).
    Driving a 6K HDR display over a single TB3 cable is peanuts; only at 60 Hz does it get a bit difficult (36,644,659,200 bps; see the quick sketch below).
    I don't see why 2K, 2.5K, 4K, 5K or 8K displays would be a terrible solution for desktop use, or why 6K would be the only resolution to aim for.

    I'm just teasing...;)
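
    For anyone curious, here is a quick back-of-the-envelope sketch of where that ~36.6 Gbit/s figure comes from. It assumes a 6016x3384 "6K" panel, 10 bits per channel (30 bpp) for HDR and a 60 Hz refresh, and it ignores blanking and compression overhead, so treat it as an illustration rather than a spec:

        # Rough uncompressed bandwidth estimate for a 6K HDR video stream
        width, height = 6016, 3384      # "6K" panel resolution
        bits_per_pixel = 3 * 10         # RGB, 10 bits per channel for HDR
        refresh_hz = 60

        bps = width * height * bits_per_pixel * refresh_hz
        print(f"{bps:,} bps (~{bps / 1e9:.1f} Gbit/s)")
        # -> 36,644,659,200 bps, i.e. ~36.6 Gbit/s, close enough to TB3's nominal
        #    40 Gbit/s that 60 Hz "gets a bit difficult" once real overhead is added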

  5. #135  
    Quote Originally Posted by andrewhake View Post
    A 32" 4K display is a really terrible solution for desktop use. The size of the display to resolution is a very bad mix
    ..?
    Analog > Apollo wooden handgrip http://omeneo.com
    Digital > Primers - professional image transformation tools http://omeneo.com/primers

    imdb


    "Como delfines en el fondo del oceano
    volamos por el universo incentivados por la esperanza"

    "L'esperanza", Sven Väth
    "It's a poor sort of memory that only works backwards"
    Lewis Carroll

  6. #136  
    Senior Member Rakesh Malik's Avatar
    Join Date
    Apr 2013
    Location
    Seattle, WA
    Posts
    846
    Quote Originally Posted by Jarred Land View Post
    You mean the RED ROCKET? Ha ha, I kid, I kid, but it is kinda the same thing.

    I don't think the Windows side needs it. The Rocket is dead because Nvidia stepped up, and as you all know, I celebrated that funeral. And for what CUDA / GPU can't do... what AMD is doing with CPUs now... I mean, it's just insane. Absolutely insane.
    nVidia's not going to slow down... it's prepping Ampere for 2020 release on 7nm EUV lithography, so AMD won't have that edge for long either... but not even Intel believes that Intel is going to catch up to AMD on the CPU side.
    --------------------------------------------------------
    Rakesh Malik
    Director of Photography, Colorist
    http://WinterLight.studio
    http://www.imdb.me/rakeshmalik

  7. #137  
    Fire Chief Jarred Land's Avatar
    Join Date
    Dec 2006
    Posts
    10,542
    Quote Originally Posted by Rakesh Malik View Post
    nVidia's not going to slow down... it's prepping Ampere for 2020 release on 7nm EUV lithography, so AMD won't have that edge for long either... but not even Intel believes that Intel is going to catch up to AMD on the CPU side.
    Pretty spot on.

  8. #138  
    Senior Member andrewhake's Avatar
    Join Date
    Oct 2009
    Posts
    585
    Quote Originally Posted by Hrvoje Simic View Post
    ..?
    A 4096- or 3840-pixel-wide display at 32" is really annoying to use with a regular UI on a display that sits on your desk in front of you. At 1:1 resolution the UI tends to be annoyingly small, and you can't really run a 2x HiDPI resolution because the UI will be too large. And if you use an in-between scaled resolution, most displays will look pretty bad and not perfectly sharp.

    4K at 21", 5K at 27" or 6K at 32" is pretty much perfect for a 2x HiDPI UI with pixel-for-pixel viewers in apps that support it. If all you plan to use a display for is full-screen video, then it isn't really a concern.
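
    For reference, a rough sketch of the pixel-density arithmetic behind this (panel diagonals are the usual marketing sizes, so the ppi figures are approximate, and the 21.5" entry stands in for the "4K at 21"" case above):

        import math

        def ppi(width_px, height_px, diagonal_in):
            # Approximate pixels per inch for a panel of the given resolution and diagonal
            return math.hypot(width_px, height_px) / diagonal_in

        panels = [
            ("4K @ 32\"",   3840, 2160, 32.0),   # ~138 ppi: too fine for 1x UI, too coarse for 2x
            ("4K @ 21.5\"", 4096, 2304, 21.5),   # ~219 ppi: comfortable 2x HiDPI
            ("5K @ 27\"",   5120, 2880, 27.0),   # ~218 ppi: comfortable 2x HiDPI
            ("6K @ 32\"",   6016, 3384, 32.0),   # ~216 ppi: comfortable 2x HiDPI
        ]
        for name, w, h, diag in panels:
            print(f"{name}: {ppi(w, h, diag):.0f} ppi")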

  9. #139  
    Senior Member andrewhake's Avatar
    Join Date
    Oct 2009
    Posts
    585
    Quote Originally Posted by Jarred Land View Post
    Agree. And even though we just got on the Metal train, I still do wish the 16 had the option for an RTX inside it. I am typing on the 16" as we speak with a gnarly KeyShot render going on in the background (another program that only uses NVIDIA GPUs), and it would have been done an hour ago, but now it won't be done till the morning.
    Keep talking to Doug and convince Nvidia to agree to develop some drivers that support Metal and maybe it will happen! :P

    Yeah, as someone whose job is VFX, and who enjoys video more as a side hobby, CUDA and OptiX have been the only real option, so it is definitely annoying. More annoying that developers only focus on developing for one platform, honestly, but it's good to see that slowly changing. To be perfectly honest though, I don't care that much about using the GPU variants of the renderers I use. I mainly use Arnold and V-Ray, and I'm rendering things that take much longer than any GPU renderer will really help with, so I rely on a render farm. The big issue with most production GPU renderers is time to first pixel. CPU renderers are much better at that currently, whereas GPUs are good for rendering noise-free finished images more quickly. The dream is a version of Arnold with CPU/GPU hybrid rendering to make use of the advantages of each.

  10. #140  
    Senior Member Karel Šimůnek's Avatar
    Join Date
    Feb 2014
    Location
    Czech Republic, Europe
    Posts
    656
    Nice! Looking forward to METAL + RED since I'm planning to buy a new RED camera at the end of 2020 :)
    KAREL ŠIMŮNEK fan of RED :)
