

GTX 680M or QUADRO 5010M in a NOTEBOOK?

I believe those are 2nd-gen i7 (Sandy Bridge) and not the 3rd-gen (Ivy Bridge) that came out in April.

With the 3rd-gen i7 (Ivy Bridge) you get a slight speed bump, Thunderbolt (if they include it), and the
Intel HD 4000 on-die graphics subsystem (good for background transcoding in certain programs).

Also, the 4 GB GTX 680M is almost nonexistent; I seem to remember reading in a couple of
forums that the lead time for the 4 GB (not 2 GB) version is really, really long. So by the time you
actually get your laptop with a 4 GB card, the Ivy Bridge version could start shipping. And you definitely
want the 4 GB one (I would think), because with R3Ds and support for all those hungry CUDA cores
it will most definitely make a big difference.

This means, with the long wait on the 4 GB GTX 680M, you might be better off waiting until Origin
upgrades to the 3rd-gen Ivy Bridge i7 and the GTX 680M is more readily available.

If I were you, I would call Origin and ask them for a ballpark on the 3rd-gen i7 (Ivy Bridge) and whether
they plan to include the Thunderbolt connections that can be done with that i7.

Note... from what I understand, not everybody will include Thunderbolt on their first Ivy Bridge runs.

Good luck and post back if you do get some answers.
 
Thanks for the reply.
They said they have the 4 GB GTX 680Ms in stock.
I am not interested in Thunderbolt, only USB 3.

But I will ask them about the Ivy Bridge.

I will be leaving FCP and going back to Premiere... and CS6.
 
I will get my 680M in a week. Today I work with the 670M.
As soon as I get the 680M, I will give you information.


Olivier
 
Cool Gary, good to hear on the 4 GB.

I think it's smart to check on the Ivy Bridge, and ask other co-workers/friends what they think.

Maybe Jeff Kilgore will chime in on the need for Ivy Bridge or not, and what exact benefit the
Quadro has.

Rule of thumb, though (I believe), is that a Quadro is not really needed (and is even slower than a GeForce) if
you are strictly doing video editing and 2D effects. When you do heavy 3D and 3D-style compositing,
Quadros have much more precision in 3D model presentation, etc.

So I am betting the need for a Quadro is more for the CAD/CAM and 3D crowd.
 
Merci bien Olivier. I will look forward to your impressions.

Yes Michael, I have noticed that Jeff Kilgore seems very knowledgeable indeed.
I think you are right about the Quadro. Someone else told me pretty much the same thing.
 
Firstly, great choice with the Origin EON-17S. Sure, there are faster laptops, like the 17X, but you pay a price in portability.

Two clarifications - the CPU used is Ivy Bridge. You can tell from the model numbers: Core i7 3xxx is Ivy Bridge, 2xxx is Sandy Bridge.
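That model-number rule is easy to apply mechanically; here is a small illustrative helper (my own sketch, not from any Intel tool) that decodes a mobile Core i7 model string using just the leading digit:

```python
import re

def i7_generation(model: str) -> str:
    """Decode a Core i7 model string: 3xxx -> Ivy Bridge, 2xxx -> Sandy Bridge."""
    m = re.search(r"i7[- ]?(\d)\d{3}", model)
    if not m:
        return "unknown"
    return {"2": "Sandy Bridge", "3": "Ivy Bridge"}.get(m.group(1), "unknown")

print(i7_generation("i7-3920XM"))  # Ivy Bridge
print(i7_generation("i7-2860QM"))  # Sandy Bridge
```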

For Premiere Pro CS6, go for the GTX 680M 4 GB. If you want to know the benefits of Quadro, check out NVIDIA's guide (search "Quadro vs GeForce"; it should be the first result). The Quadro K5000M is the GPU to look for if you really require a Quadro. Avoid the 5010M; it is EOL now.
 
Does someone here have an idea of a good test showing the real benefit of using the 680M vs the 670M in Adobe CS6?

Olivier
 
Where do I get the 17X?
I want the fastest I can possibly configure.
 
That's interesting; the 17X is no longer listed. After a bit of digging around, it seems the Clevo 17 SB-E chassis has supply issues. It's not just Origin: some competitors with similar products are also showing out of stock or delisted.

No worries, 17-S Pro is still plenty capable.

The Dell Precision M6700 is worth a look as well; they have far superior display options (100% AdobeRGB 10-bit IPS!) along with up to a Quadro K5000M if you need Quadro. Like I mentioned before, the K5000M is faster than the desktop Quadro 6000! For general Premiere Pro CS6 work, the 17-S Pro will be better value for money.
 
I got my Clevo notebook back with the GTX 680M instead of the 670M.
Premiere works with the 680. I don't know how to see the improvement. In Resolve 9, the difference between the 670M and the 680M is not big: I can read Quad HD RED raw files at 19 fps at 1/4 res (good) and 16 fps at 1/2 res (good). With the 670M, it was about 2 or 3 fps less.
If you want me to do test, just ask.
My Clevo notebook config:
- Clevo 170 EM
- Win 7 64-bit SP1
- i7 3920XM 2.9 GHz
- 32 GB of RAM
- 3x Samsung 512 GB SSD
- GTX 680M

Olivier
 
Neat Video seems to be a good real-world performance tester. In the preferences menu, you have a "performance" checker that evaluates the maximum number of frames/sec you can get, and the best combination: number of CPU cores, CPU alone, or CPU + GPU.
With the 670M, I got my system optimized at 4 CPU cores + GPU (6.76 frames/sec).
With the 680M, I get the best performance with 5 CPU cores + GPU (8.77 frames/sec).

Roughly 30% more in denoising, which is a very power-hungry task, is not bad at all!
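For anyone checking the arithmetic on those two Neat Video benchmark numbers, the relative speedup works out like this:

```python
# Relative speedup from the Neat Video benchmark figures quoted above.
fps_670m = 6.76  # best combo with the GTX 670M (4 CPU cores + GPU)
fps_680m = 8.77  # best combo with the GTX 680M (5 CPU cores + GPU)

speedup = (fps_680m - fps_670m) / fps_670m
print(f"{speedup:.1%}")  # about 29.7% faster
```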


Olivier
 
I tried to see what the load is on the GPU with the GPU Observer widget.

With Premiere CS6 and the GTX 680M, I don't use more than 10% of the GPU power. With Resolve, I use 30% of it. So I think we need to wait for new software versions to be able to use the increased GPU power.



Olivier
 
Olivier, thank you for the test!
Have you tried rendering for example 4K footage scaled down in a 2K timeline with 'maximum render quality' checked? I think there should be a huge speed boost coming from the GPU.
 
Yes, I have tried, but I am very disappointed. With Maximum Render Quality checked, my rendering time gets very long (12x real time) and my GPU is hardly used (only 5 to 10%, and sometimes 0%). So I think CS6 does not like the GTX 680M at this time, and I need to wait for some optimization from Adobe.
All my tests are done on a 1080 timeline to a 1080p file. Maybe the GPU is only useful with image reduction. I will do a test with a Quad HD timeline to 1080p.


Olivier
 
Olivier,
Are you testing these times for an H.264 output file?

I think CUDA only accelerates certain render outputs more than others.
 
No, I always render with the "same settings" checkbox. It produces an MPEG file with maximum information. Then I do my H.264 from that file, which is my master (the same way I did it with ProRes).


Olivier
 
I just did two rapid tests:

1) With a Quad HD timeline --> 1080p iPad 2 preset (H.264). For 5 min of footage, it takes 1 hour 25 minutes (17x real time). The processor is always at 100%, but the GPU goes from 0 to 10% max.
2) With the Quad HD timeline --> "same settings as the sequence" checked (so Quad HD in MPEG-2). For 5 min it takes 7 minutes to render. Same processor and GPU activity.

So, if you want to render quickly, make your sequence the same size as your final output. I think it is good to always check the "same settings as your sequence" checkbox. And don't trust that the GPU is doing a big job for rendering (or maybe you need a GPU validated by Adobe, like the 580; could someone test it?).
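The "x real time" figures in those two tests are just render time divided by clip length; a quick sketch of the calculation:

```python
# Express render time as a multiple of the clip's real duration,
# as in the two tests above.
def realtime_factor(clip_minutes: float, render_minutes: float) -> float:
    """How many minutes of rendering per minute of footage."""
    return render_minutes / clip_minutes

# 5 min Quad HD -> 1080p H.264 (iPad 2 preset) took 1 h 25 min:
print(realtime_factor(5, 85))  # 17.0, i.e. 17x real time
# 5 min Quad HD -> same-settings MPEG-2 took 7 min:
print(realtime_factor(5, 7))   # 1.4, i.e. 1.4x real time
```

The second test being so much faster is consistent with the advice above: matching the sequence settings avoids an expensive scaling step at export.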


Olivier
 
Thanks!
Yes, it should render fastest when the timeline's resolution/framerate is identical to the output.
Have you tried the workaround to "activate" the 680M for CS6 (adding the GPU's name to the CUDA supported cards text file)?
Rendering to H.264 for iPad is heavy compression, so it should naturally take much longer than the other output, I guess.
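For anyone who hasn't seen that CS6 workaround: it amounts to appending the card's name to Premiere's CUDA whitelist file. Here is a minimal sketch of the edit as a script; the install path in the comment is an assumption (check your own machine), and the helper name is mine:

```python
# Sketch of the CS6 CUDA-whitelist workaround: append the GPU's name to
# cuda_supported_cards.txt if it isn't already listed.
from pathlib import Path

def whitelist_gpu(cards_file: Path, gpu_name: str = "GeForce GTX 680M") -> bool:
    """Add gpu_name to the card list; return True if it was added."""
    lines = cards_file.read_text().splitlines()
    if gpu_name in lines:
        return False  # already whitelisted, nothing to do
    cards_file.write_text("\n".join(lines + [gpu_name]) + "\n")
    return True

# Typical location (assumed; run the editor with admin rights to write here):
# whitelist_gpu(Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6"
#                    r"\cuda_supported_cards.txt"))
```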
 