

RED + NVIDIA Announce Realtime 8K R3D GPU Decoding

I would be most happy if Apple and/or AMD can offer a solid alternative to nVidia based solutions... but when?

Cheers - #19

Apple, I don't know.

Part of AMD's affordable solutions (8K .R3D 5:1 at 24+ fps in real time) will be announced at CES 2019 (and released around March 2019), and it will not only accelerate .R3D but everything you throw at it.
The second part of the affordable solutions (8K .R3D 5:1 at 48+ fps in real time) will be released around August 2019.

And for the bigger budgets (like Phil Holland's workstation), they have EPYC "Rome" with its massive number of PCIe lanes.

Intel already has a solution with its i9-7960X and 7980XE for real-time 8K .R3D 5:1 at 24+ fps; you just have to cool it really well when overclocking (Phil's workstation: http://www.phfx.com/articles/workstation_2018/).
 
I would be most happy if Apple and/or AMD can offer a solid alternative to nVidia based solutions... but when?

Cheers - #19

I think at this point, though nothing is firm, Apple will likely jump back into supporting Nvidia products around Q2/Q3 2019. Their upcoming Pro is going to be something a bit different, and we'll have to wait and see what develops there.

Just a theory, not like I know anything or anything like that.

I'm Jon Snow. Swings sword, swoosh. Swoons over dragon lady. Avoids death by dying. Wait.
 
I would imagine the target user for the 2019 Mac Pro would be highly likely to want fast performance in apps that rely on CUDA for accelerated processing. Not sure what Apple would have to do to get back in business with nVidia, but if they could...

Cheers - #19
 
I think at this point, though nothing is firm, Apple will likely jump back into supporting Nvidia products around Q2/Q3 2019. Their upcoming Pro is going to be something a bit different, and we'll have to wait and see what develops there.

I really want this to be true. I was looking at eGPUs (i.e., only AMD cards) for macOS today and was not impressed by the marginal performance gains. Hopefully Jarred's plea to Apple during the presentation sent that message home.
 
[attached image: chart from the Puget Systems website]

I know you stole this from the Puget Systems website, but a bit of context is needed.

Whether or not the blower cards are better depends entirely on the situation: which motherboard, how many cards, etc.

If you're planning to have space between the cards anyway for improved cooling, or to use the 3- or 4-slot NVLink connectors, then the dual-fan cards are generally better as they are quieter.

If you are trying to stack 4 cards in an 8-slot case, then yes, the blower cards are better.
 
I know you stole this from the Puget Systems website, but a bit of context is needed.

Whether or not the blower cards are better depends entirely on the situation: which motherboard, how many cards, etc.

If you're planning to have space between the cards anyway for improved cooling, or to use the 3- or 4-slot NVLink connectors, then the dual-fan cards are generally better as they are quieter.

If you are trying to stack 4 cards in an 8-slot case, then yes, the blower cards are better.

I don't steal; it's open for everyone to see.

One can always use EK water blocks when you want to squeeze the most out of it; too bad it only works with 2 GPUs on NVLink.
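For anyone weighing blower vs. dual-fan (or water-cooled) cards in a stacked build, a small monitoring sketch can make the airflow question concrete. This is not from the thread, just a hypothetical example (file name, say, gputemps.cpp, is made up); it assumes the NVML header and library that ship with the NVIDIA driver (link with -lnvidia-ml) and simply polls each GPU's temperature and fan speed so you can see whether the middle cards are cooking.

// Hypothetical sketch, not from the thread: report per-GPU temperature and
// fan speed via NVML so you can compare how stacked cards behave under load.
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    unsigned int count = 0;
    nvmlDeviceGetCount(&count);

    for (unsigned int i = 0; i < count; ++i) {
        nvmlDevice_t dev;
        nvmlDeviceGetHandleByIndex(i, &dev);

        char name[NVML_DEVICE_NAME_BUFFER_SIZE];
        nvmlDeviceGetName(dev, name, sizeof(name));

        unsigned int tempC = 0, fanPct = 0;
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC);
        nvmlDeviceGetFanSpeed(dev, &fanPct);  // water-cooled/passive cards may not report a fan

        printf("GPU %u (%s): %u C, fan %u%%\n", i, name, tempC, fanPct);
    }

    nvmlShutdown();
    return 0;
}

Run it under a sustained render/export and the difference between open-air cards sandwiched together and blower or water-cooled cards usually shows up immediately in the temperatures.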
 
Which of these do you guys think would yield the best performance on a system with an Intel 6700K? They’re all pretty similar in price, and I’m trying to do some research, specifically for editing 6-8K footage. Right now, my system barely handles 8K. Short of starting over on a new build:

Quadro RTX 5000
$2300
3072 Cuda Cores
16GB

Titan RTX
$2499
4608 Cuda Cores
24GB

GeForce RTX 2080 Ti x2
$2600
4352 Cuda Cores each
11GB

I’m completely unsure of how the resources on a dual card setup are used. So, I’m looking to you guys for help. Thanks!
 
Which of these do you guys think would yield the best performance on a system with an Intel 6700K? They’re all pretty similar in price, and I’m trying to do some research, specifically for editing 6-8K footage. Right now, my system barely handles 8K. Short of starting over on a new build:

Remember that everything is still in the beta stage; if everything goes according to plan (as far as I know), NAB 2019 will be a good time for the official release.

Quadro RTX 5000
$2300
3072 Cuda Cores
16GB

About perfect for 6K and the bare minimum for 8K (memory-wise), but don't expect full-res premium decode in real time.

Titan RTX
$2499
4608 Cuda Cores
24GB

Looks to be around the sweet spot for real-time 8K editing.

GeForce RTX 2080 Ti x2
$2600
4352 Cuda Cores each
11GB

Fastest, but it will run out of memory quickly with 8K when doing effects like noise reduction.

I’m completely unsure of how the resources on a dual card setup are used. So, I’m looking to you guys for help. Thanks!

Don't buy too fast on promises, and keep an eye on this thread: http://www.reduser.net/forum/showthread.php?172922-NVIDIA-CUDA-RCX-Public(ish)-Build. On Wednesday everything can be different.
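To make the memory point concrete: a single 8K RGBA float working frame (8192 x 4320 pixels x 4 channels x 4 bytes) is roughly 566 MB, so a handful of intermediate buffers for decode plus noise reduction chews through 11 GB quickly. And on a dual-card setup the memory pools are separate; two 11 GB 2080 Tis do not behave like one 22 GB card unless the software explicitly splits the work (or pools memory over NVLink). A minimal CUDA sketch, purely illustrative and not RED's or Adobe's actual pipeline, that just enumerates the cards and shows per-device memory:

// Hypothetical sketch: list CUDA devices and their free/total memory.
// The key point is that each GPU has its own memory pool.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);

    // One 8K RGBA float frame: 8192 x 4320 pixels x 4 channels x 4 bytes ~= 566 MB.
    const double frameBytes = 8192.0 * 4320.0 * 4 * 4;

    for (int i = 0; i < count; ++i) {
        cudaSetDevice(i);
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);

        size_t freeB = 0, totalB = 0;
        cudaMemGetInfo(&freeB, &totalB);  // reported per device, not pooled across cards

        printf("GPU %d: %s, %.1f GB free of %.1f GB (~%.0f 8K float frames)\n",
               i, prop.name, freeB / 1e9, totalB / 1e9, freeB / frameBytes);
    }
    return 0;
}

Whether an app actually uses the second card for decode versus only for effects is up to the software, which is why "x2" pricing doesn't automatically mean 2x performance or 22 GB of headroom.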
 

I saw a few responses on the NVIDIA forum that gave the impression that Apple and NVIDIA were moving forward on Mojave, but I'm not sure how true it is.

This recent Apple Insider article goes deeper into explaining the contentious history between the two companies.
https://appleinsider.com/articles/1...in-macos-and-thats-a-bad-sign-for-the-mac-pro

The change.org petition on the matter now has over 10K signatures and growing.
https://www.change.org/p/tim-cook-a...-work-with-nvidia-on-drivers-for-mac-os-10-14

I'm starting to see that the issue may be as much about NVIDIA not wanting to support Apple's Metal as it is about Apple not wanting to support NVIDIA's CUDA.

Regardless of who's right or wrong, this is creating a real mess for users.

Brian Timmons
BRITIM/MEDIA
 
Apple wants a single, unified solution that runs on all GPUs. That is why they backed OpenCL for so long. NVidia wants their proprietary hardware and software in as many places as possible. It has cost them big money in HPC deals, but they have invested billions in CUDA and hope to make billions from it, so they can't let it go. AMD and Intel just want to make money with volume sales to Apple and will back whatever Apple wants.
 
I'm starting to see that the issue may be as much about NVIDIA not wanting to support Apple's Metal as it is about Apple not wanting to support NVIDIA's CUDA.

Imagine if Nvidia decided they didn't want to support DirectX...
 