

Apple + RED

I think this is great news for MacBook Pro 16-inch purchasers, but I am not clear on one thing... any chance you can help? ;-)

Will my brand-new maxed-spec MacBook Pro 16 play back R3Ds better than it does now?

What a win it would be if it could play one stream at full res.
 
What Apple is not doing with AMD CPUs is just insane. Absolutely insane.

It was only very recently that Intel opened up the Thunderbolt 3 spec for use by other CPU and chipset manufacturers, so if AMD decides to implement Thunderbolt 3 support directly on the CPU, it could definitely open up some interesting possibilities. Obviously Apple is extremely well versed in Thunderbolt 3 (likely more so than any other company).

If you want a Mac with AMD CPUs, please let it be known!

The implementation of the MPX modules, with additional PCIe connectors to supply extra power (without the extra power cables all other big GPUs need) as well as multiple Thunderbolt 3 ports on the GPU, is one of the coolest things about the new Mac Pro. The amount of power available through those connectors is much higher than normal.
 

Our hopes rest not only on the power delivery, but on everything else too.
 
The new 16" is a massive jump in Metal performance over the 15". The new CUDA SDK drops this morning, so Resolve on both sides will work once BM finishes integrating it; obviously much, much better under CUDA on your 28-core/Titan machine.

Thanks for the info... looking forward to the improvements!
 

That would be fun: AMD having implemented Thunderbolt 3 directly on the CPU while Intel is still using a separate Thunderbolt controller IC.
If you want an AMD machine running macOS, just install it; it works.
A 3950X + ASRock Creator (10G + 2x TB3) is fast enough to process 8K R3D 5:1 at 24 fps at full resolution in realtime (editing) on macOS, Windows and Linux. If you prefer the look of the new Mac Pro, you can get a Dune Case with a similar look for a fraction of the price.
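For what it's worth, a back-of-the-envelope sketch of the sustained data rate that claim implies. The 8192x4320 photosite count and 16-bit Bayer samples are my assumptions (not stated above); REDCODE ratios are quoted relative to the raw sensor stream:

```python
# Rough sustained data rate for 8K R3D at 5:1 compression, 24 fps.
# Assumes an 8192x4320 sensor and 16 bits per photosite (Bayer data).
width, height = 8192, 4320
bits_per_photosite = 16
fps = 24
compression = 5

raw_bps = width * height * bits_per_photosite * fps
compressed_mb_per_s = raw_bps / compression / 8 / 1e6
print(f"{compressed_mb_per_s:.0f} MB/s")  # ~340 MB/s
```

So the storage side is easy for any NVMe drive; the realtime challenge is the wavelet decode and debayer, not moving the bits.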
 

Intel has already discussed implementing TB3 directly on newer CPUs as well. The current TB3 controllers are still quite large, so it's hardly a trivial task. It makes more sense to integrate them into a much larger GPU, something Apple has already done on the Mac Pro.

Also: just stop. I don't want this, and nobody else wants this. I don't know why you bring this upon everyone and yourself in nearly every thread. It isn't "just install it, it works", and it's not even close to that. It is an extremely Mickey Mouse solution with a lot of limitations. If you are a Mickey Mouse operation, I am sure it is a great solution. If you live in a world where you think the machine you described is in any way similar in terms of setup, specs, design, build quality, sound, capability, upgradeability, IO, and so on to the new Mac Pro, then I don't even have a response.

Driving a 6K HDR display over a single TB3 cable is a very non-trivial task as well, not to mention four from the same machine. A 32" 4K display is a really terrible solution for desktop use. The size-to-resolution ratio is a very bad mix, and it is not in any way an alternative to the 32" 6K display.
 

Patiently waits for some nice pre-Metal R3D and post-Metal R3D numbers. :)

The new 8GB 5500M on the 16" chomps pretty hard so far in my testing.
 

Agreed. And even though we just got on the Metal train, I still wish the 16 had the option for an RTX inside it. I am typing on the 16" as we speak with a gnarly KeyShot render going in the background (another program that only uses NVIDIA GPUs); it would have been done an hour ago, but now it won't be done till the morning.
 
Never met a Mac user who doesn't want Nvidia GPUs in their laptop... (including me), so I am assuming there must be some huge behind-the-scenes story not yet told...

Does your Metal support put all the de-wavelet work onto the GPU? I only ask because at the moment, with OpenCL, only the debayer/scaling is done on the GPU, and it shows pretty high utilisation on a 16" with the top GPU.


The unofficial story is that there is bad blood at the executive level between Apple and Nvidia, dating back to the Nvidia GPU debacles from when Apple was making its big comeback in the mid-2000s, and to their CUDA vs. OpenCL stance: being locked into Nvidia's CUDA was not in line with Apple's then-preferred OpenCL, which would have let them use both ATI and Nvidia GPUs (or whatever else might have popped up back then) in their offerings. After the delayed and faulty GPU hardware, a serious string of executive-level fuck-ups at Nvidia, plus their legal issues with Intel (whom Apple was in bed with), only made things worse. It was originally intended that both the iPhone and iPad be powered by Nvidia's Tegra chips, but Tegra kept hitting hardware execution delays and wasn't available for Apple's engineering, so Apple went with their own design after using Samsung chips in the first iPhones.
 
Unofficially, Apple's "new Mac Pro" was woefully late before finally being released at the end of 2013. Sporting zero nVidia GPUs, the new Mac Pro did not usher in a new wave of innovation and strategic capability. Indeed, it under-performed on day one, and during its miserable six-year lifespan, no new CPUs or GPUs were ever offered. No wonder they are so angry with nVidia, who clearly got their act together and then would not stop leapfrogging themselves while the 2013 Mac Pro went from "not insanely great" to "insanely not great".

Six years later, the new new Mac Pro has arrived. But will the petulance that doomed the new Mac Pro doom the new new Mac Pro? I'm waiting and watching.
 

Intel has also been discussing 10nm for a couple of years... and Apple had AMD build an Intel TB3 controller onto the GPU(s).

And yes, it just works, also on a TR 1950X (AMD vanilla).
All the components we use are tested by independent reviewers, and the quality has held up (100% working for the last 5 years).
The upgradeability of an EPYC workstation is way more flexible than anything Apple can deliver at the moment (with respect to RAM, storage, IO, PCIe lanes and version, sound, build quality, performance). Design is a matter of taste (the Dune Case has a nice design, IMO).
Driving a 6K HDR display over a single TB3 cable is peanuts; only at 60 Hz does it get a bit difficult (36,644,659,200 bps).
I don't see why 2, 2.5, 4, 5 or 8K displays would be a terrible solution for desktop use, or why 6K would be the only resolution to aim for.
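That bps figure checks out if you assume a 6016x3384 panel at 10 bits per channel (my assumptions for this sketch; the post doesn't state the panel specs):

```python
# Raw pixel bandwidth for a 6K HDR display at 60 Hz, assuming a
# 6016x3384 panel and 10-bit-per-channel RGB (30 bits per pixel).
width, height = 6016, 3384
bits_per_pixel = 30  # 10 bits x 3 channels
refresh_hz = 60

bandwidth_bps = width * height * bits_per_pixel * refresh_hz
print(bandwidth_bps)  # 36644659200 - the figure quoted above, before blanking overhead
```

That raw rate alone is well beyond the ~25.9 Gbps payload of a single DisplayPort 1.4 HBR3 link, which is why uncompressed 6K HDR at 60 Hz over one cable is genuinely hard.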

I'm just teasing...;)
 
You mean the RED ROCKET? Ha ha, I kid, I kid, but it is kinda the same thing.

I don't think the Windows side needs it. The Rocket is dead because Nvidia stepped up, and as you all know, I celebrated at that funeral. And for what CUDA/GPU can't do... what AMD is doing with CPUs now... I mean, it's just insane. Absolutely insane.

nVidia's not going to slow down... it's prepping Ampere for 2020 release on 7nm EUV lithography, so AMD won't have that edge for long either... but not even Intel believes that Intel is going to catch up to AMD on the CPU side.
 
nVidia's not going to slow down... it's prepping Ampere for 2020 release on 7nm EUV lithography, so AMD won't have that edge for long either... but not even Intel believes that Intel is going to catch up to AMD on the CPU side.

Pretty spot on.
 

A 4096px or 3840px wide display at 32" is really annoying to use with a regular UI on a display that sits on your desk in front of you. At 1:1 resolution the UI tends to be annoyingly small, and you can't really run a 2x HiDPI resolution as the UI will be too large. And if you use an in-between scaled resolution, most displays will look pretty bad and not perfectly sharp.

4K at 21", 5K at 27" or 6K at 32" is pretty much perfect for a 2x HiDPI UI with pixel-for-pixel viewers in apps that support it. If all you plan to use a display for is fullscreen video, then it isn't really a concern.
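Those pairings come down to pixel density. A quick sketch (the diagonal sizes are nominal assumptions on my part, e.g. 21.5" for the 4K iMac-class panel):

```python
# Pixels per inch for the size/resolution pairings discussed above.
# Around 215-220 PPI is the sweet spot for a 2x HiDPI UI.
import math

def ppi(width_px, height_px, diagonal_inches):
    # Diagonal pixel count divided by diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_inches

for name, w, h, d in [
    ('4K @ 21.5"', 3840, 2160, 21.5),
    ('5K @ 27"',   5120, 2880, 27),
    ('6K @ 32"',   6016, 3384, 32),
    ('4K @ 32"',   3840, 2160, 32),
]:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
```

4K at 32" lands around 138 PPI, far below the ~216 PPI of a 6K 32" panel, so neither 1x nor 2x scaling looks right on it.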
 

Keep talking to Doug and convince Nvidia to agree to develop some drivers that support Metal and maybe it will happen! :P

Yeah, as someone whose job is VFX and who enjoys video more as a side hobby, where CUDA and OptiX have been the only real option, it is definitely annoying. More annoying that developers only focus on developing for one platform, honestly, but good to see that slowly changing. To be perfectly honest, though, for the renderers I use I really don't care that much about their GPU variants. I mainly use Arnold and V-Ray, and I am rendering things that take much longer than any GPU renderer will really help with, relying on a render farm. The big issue with most production GPU renderers is time to first pixel. CPU renderers are much better at that currently, whereas GPUs are good for rendering noise-free finished images more quickly. The dream is a version of Arnold with CPU/GPU hybrid rendering to make use of the advantages of each.
 