I have been noticing a pervasive pattern of driver update problems with the Quadro. At times, no driver is to be found. Currently, it appears that nVidia is just bumping the date and making only minor changes, so their driver vastly lags behind the GeForce drivers.
I realize that Quadros were meant for more professional use, but an older driver isn’t always better. I am fairly certain that the laptop Quadros don’t have ECC memory on them, so unless they do, the value-add doesn’t seem real.
[I have a Quadro RTX A3000 in my Lenovo P15 Gen 2.]
I think the general consensus is the Nvidia GeForce RTX cards with as much VRAM as you can comfortably afford are the best option currently.
Just make sure your power supply is big enough to drive them and that everything is connected to power correctly.
Hopefully others will weigh in with their experiences and recommendations.
Which driver version do you have? I tested all of them, and personally I found that version 474.xx provides the best reliability/performance. These drivers are good enough for Rhino tasks, but some rendering applications may complain about unsupported (or old) drivers.
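If you want to compare notes on driver versions without digging through the control panel, `nvidia-smi` will report the installed version directly. A minimal sketch in Python — the `sample` parameter and the example version strings are made up for illustration; on a real machine you’d let it call `nvidia-smi` itself:

```python
import subprocess

def get_driver_version(sample=None):
    """Return the NVIDIA driver version as reported by nvidia-smi.

    If `sample` is given, parse that string instead of calling the tool
    (handy for trying this on a machine without an NVIDIA GPU).
    """
    if sample is None:
        # --query-gpu with csv,noheader prints one bare value per GPU
        sample = subprocess.run(
            ["nvidia-smi", "--query-gpu=driver_version",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
    # First line corresponds to the first GPU
    return sample.splitlines()[0].strip()

# Hypothetical output for a machine on the 474 branch:
print(get_driver_version(sample="474.64\n"))  # -> 474.64
```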
I’m running 527.27 from November ’22 on a Quadro P2000 under Windows 10. Nothing fancy at all. I don’t push it very hard doing tech support, but it’s been spectacularly reliable driving two 27″ standard-resolution screens.
The only “value add” with a Quadro is driver support for archaic programs that require a Quadro because that’s the only thing they were ever tested with.
There’s nothing stored on a video card that’s important enough to need protection from cosmic-ray errors; that’s not really a “thing.” They have error-correction strategies that may resemble ECC, but those exist so the card might not crash when you overclock it too much; it’s not really a comparable use case to ECC RAM at all.
For video, that’s a fair observation, but with a lot of applications using the graphics processor for parallel computation, it’s just as important as for main memory.
If you are running, for instance, structural analysis for your slender-tower skyscraper on a program that uses the graphics card, I can’t imagine you’d be happy if your analysis came back after an hour or two with results that look like modeling too far from the origin in Rhino. Or even worse, results that come back looking right but are off in a couple of critical bits. And your unhappiness depends on your eventually discovering the error; otherwise it could be the building’s occupants who are unhappy.
Just remember: “It only takes a few more payments to go first class”.
GeForce and Quadro use identical GPU chips for the most part.
GeForce cards are clocked hot to be merely “gaming stable” — no critical application will be affected by an occasional glitch, while you gain a good amount of FPS.
Quadros are clocked lower for stability, which results in some performance loss, but when it comes to medical imaging, scientific applications, or even low-tolerance manufacturing, FPS and gaming benchmarks don’t mean anything.
Quadro chips get more QA/QC at Nvidia, and their drivers are released weeks later to make sure they don’t ruin everything.
I personally run an Nvidia A40 GPU, which uses the same GA102 chip as the RTX 3090 Ti, with the following differences:
1- Slower GPU clocks compared to the 3090 Ti — approximately 10% lower in benchmarks.
2- Lower power usage: the A40 draws up to 300 W vs. up to 450 W for the 3090 Ti.
3- Lower power usage = less heat = more stable operation.
4- The Quadro runs a lower memory clock but with higher capacity, which reflects the same idea: lower performance vs. stable operation, with ECC if needed.
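For anyone with an ECC-capable card who wants to check or toggle that last point, `nvidia-smi` exposes it. A quick sketch of the relevant commands (the GPU index `0` is an assumption; the flags are the standard `nvidia-smi` ones, and the mode change only applies to boards that actually carry ECC memory):

```shell
# Show current and pending ECC mode for all GPUs
nvidia-smi -q -d ECC

# Enable ECC on GPU 0 (takes effect after the next reboot); -e 0 disables it
nvidia-smi -i 0 -e 1
```

Note that enabling ECC typically reserves a slice of VRAM for the check bits and costs a little memory bandwidth, which is the performance-vs.-stability trade mentioned above.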
I get the impression Nvidia would like to merge the product lines - the A3000 doesn’t even have the word Quadro in its name, it’s just RTX A3000.
And I used to run a previous generation mixed Geforce and Quadro desktop rig: both cards ran fine under the Geforce driver (but you couldn’t run the Geforce card on the Quadro driver). The Quadro card was on the supported hardware list for the Geforce driver. I don’t think that’s the case for the A3000 though.
@Brenda, I’ll be in the market for a new laptop soonish so I’m intrigued: what do you find the A3000 with its driver can’t do that a Geforce card could?
No, I don’t agree. I do wish I had ECC memory on both my CPU and GPU. Just so you know, on my laptop I do Cycles rendering on the Quadro A3000 (an RTX 3060 plus a bit) because it’s much faster than an 8-core CPU.
The upsetting thing is: my A3000 is faster than my desktop GTX 1080, which is just a bit faster than the 12-core Ryzen 3900x.
I’ve done 4K renders on my laptop, but heat is certainly a concern, as is GPU RAM at 6 GB. I made a 3D-printed thing to prop its rear edge up to get more air into it. I would have bought a bigger GPU for it, but I feel that the P15 can’t handle any more heat.
Though I usually prefer Thinkpads because the rest of the laptop is physically robust. My previous one lasted 7 years, and it used to be carried in a backpack for 3.04 miles a day. For the new one I use a square cut from an old thin sheet to protect the screen from the keys. (Yes, Macs will mark the coating on the screen after a while, too.)
I just installed the 527.56 GeForce drivers on my Quadro A3000 laptop GPU. I downloaded the RTX 3060 drivers. No mod needed, as was required in the old days, because the driver saw it as “Supported Hardware.”