Is an Nvidia Quadro P400 good enough for Rhino 7?

Hello you wonderful people of the McNeel forums.
First of all, as a warning: I’m not “Rhino savvy”. I’m tech savvy, but not knowledgeable about the details of Rhino as a piece of software. Basically, I’ve never used Rhino.

I’m currently investigating Rhino for our team. I’ve noticed that they are facing a lot of hardware-related issues, and I’d like some insight or advice on hardware specs.

Currently, we’re using Intel UHD 620 integrated graphics, but I want to know for sure whether those are good enough for a Rhino workflow.
Can you confirm that with me?

And finally, the question for this topic: if we get all these computers an Nvidia Quadro P400, would that suffice for modelling?


It only has half of the minimum recommended VRAM for a single, standard-resolution monitor.
The minimum is 4 GB for each standard-resolution monitor, or 8 GB for each high-resolution monitor.
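To make that guideline concrete, here’s a trivial sketch of it in Python. The function name and monitor labels are my own shorthand, not anything from Rhino or McNeel:

```python
# Rough sketch of the VRAM guideline quoted above:
# 4 GB minimum per standard-resolution monitor, 8 GB per high-resolution one.

def min_vram_gb(monitors):
    """monitors: list of 'standard' or 'high' resolution labels."""
    per_monitor = {"standard": 4, "high": 8}
    return sum(per_monitor[m] for m in monitors)

# A single standard monitor already needs 4 GB,
# so the P400's 2 GB is half the stated minimum.
print(min_vram_gb(["standard"]))          # 4
print(min_vram_gb(["standard", "high"]))  # 12
```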

If you’re planning on rendering, then get a GPU with more VRAM and WAY MORE CUDA cores.


I mean, a Quadro P400 (you mean 400?) would be a big step up from integrated graphics, but it’s an incredibly basic, cheap card that’s still more expensive than it has any right to be, thanks to the Quadro label slapped on it.

If your current graphics are Intel UHD 620, then it sounds like your computers are basic “office” machines not intended for anything more strenuous than web surfing or connecting to a server. You need… actual computers: decent “gaming” machines, if not full-blown, insanely expensive workstations, not laptop hardware in a desktop case.


Hi John and Jim, thanks for your answer.
A couple of questions, for closure and for my report to the executives.

The Intel UHD 620 is a no-go. Correct?

The P400 is an improvement, but not ideal. Correct?

So, after those two questions, I guess the last question would be:
What’s the bare-minimum GPU (in terms of cost/benefit) that would let users work with Rhino without feeling that time drags on forever?
If that question is too broad or takes a lot of research, I can look up a list of GPUs and come back with another inquiry, if it’s not too much to ask; but this has probably been answered before.


PS: I’ll leave our current specs here for the record:
Samsung 970 EVO Plus
650 W 80+ Bronze PSU (different brands; this is the lowest wattage among them)

By the way, I know I sound dense, but I have to report back with the answers.

The Quadro is under spec but better than the Intel.
If you are only modeling, no rendering, and not big files, then either one will ‘work’.
Beyond that, you’re on your own. Everyone defines ‘minimum’ and ‘optimal’ very differently. I’ll not take that bait.

Yes, I guess that’s a whole answer in itself.
The team is not rendering the models they create.
Rhino is being used only for modelling, but I have seen that the Intel GPU really struggles.

So, just for modelling. Honestly, I don’t know what “not big files” means, but I assume it has something to do with the complexity of the models.
Would the P400 work?

I will not ever recommend an under spec GPU.

Please read this for some good perspective:


Ok, it’s understandable that you don’t want to recommend something that, well, can’t be recommended.

What would you recommend?
As a bare minimum improvement over the integrated GPU.
My plan is to get rid of the iGPU and switch to a dedicated one, and I’m looking for knowledgeable opinions and advice; that’s why I came to the forum.
I’ve read the link for perspective, and I already understand that the P400 is in the “not recommended” category.

Just from the Quadro line-up:
Would a P620 be “recommended”? My guess is no, because of its 2 GB of VRAM.
Would a P1000 be “recommended”?

Outside of the Quadro line, is there anything worthwhile?



My second-hand understanding is that you can get more performance for your dollar with an Nvidia GeForce RTX card with “enough” VRAM for your monitor setup and as many CUDA cores as you are willing to pay for.

Hmm, even if we don’t do renders, is that (RTX cards) still the recommended lineup?

This might help…


Thanks, that’s quite a comprehensive list.

They have a CPU list, too. It’s really a great website. It’s a secret. Shh! : )


+1 for future-proofing yourself with an RTX card.

Yes, an RTX card is the way to go for sure; best bang for your buck. Just avoid the RTX 3090: it has not been kneecapped to hamper crypto mining like the latest versions of the RTX 3080 and below, so it is very pricey compared to its siblings. I am using an RTX 3080 Ti in my new build and it performs well, but it would be overkill for your described usage. I suggest looking at lower-tier members of the RTX family, like the RTX 3060. Even for those, the challenge will be finding a source that is not sold out.


I won’t say a word :)

Hi Francisco, if your company has executives, it should be able to afford best-in-class computers: at the very least a cheap/budget gaming PC with an RTX 3070 or 3080, 64 GB of RAM, all-NVMe M.2 SSDs, a fast and cheap Ryzen CPU, and 4K display(s).

Simple rule: if the owners draw big profits, drive nice cars, etc., while you work on nerve-wracking slow computers, it’s time for you and your entire team to look for better jobs. No one can get very good at this game with slow computers. It’s a losing battle, a terrible quality of life, and career-limiting.

I’d be happy to talk with your executives and convince them, the same way I convinced mine (when I worked for someone else), that only fools, and the truly broke who want to go even more broke, buy slow computers.

I hope this helps,



Thanks Gustavo, I’ll keep your suggestion as an option.

I would at least go for the P2200; anything less than 4 GB of VRAM is a waste of money for a workstation.
Here is the current lineup of Quadro cards from Nvidia:

But I don’t recommend buying a card with less than 8 GB of VRAM.

Personally, I use an RTX 2070 in my workstation and am happy with it. It has 8 GB of VRAM, and I run one ultrawide screen and one 1200p screen on it. (Combined, that equals just over 75% of one 4K screen, so resolution draws a lot of VRAM.)

At home I have a single ultrawide (which equals half a 4K monitor), and when a large model is loaded I use 2 GB of VRAM:

If I open one more instance of Rhino with that same file, it jumps up to 4 GB, and so on. So VRAM is not about resolution alone; it is also about how much stuff you want to navigate, and about efficiency while working. Do note that the system can share “normal” RAM with the graphics card, which is quite efficient, but not as fast as keeping everything in the GPU’s VRAM.
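For what it’s worth, the “resolution draws VRAM” point can be sanity-checked with raw pixel counts. This is just a sketch: the panel resolutions below (3440×1440 for the ultrawide, 1920×1200 for the 1200p screen) are my assumptions, since the exact monitor models aren’t stated, so the ratios will vary with the actual panels.

```python
# Compare multi-monitor setups by total pixel count, relative to one 4K panel.
# Resolutions here are assumed, not taken from the posts above.

def pixels(width, height):
    return width * height

uhd_4k = pixels(3840, 2160)        # one 4K (UHD) monitor
ultrawide = pixels(3440, 1440)     # assumed ultrawide panel
screen_1200p = pixels(1920, 1200)  # assumed 1200p panel

combined = ultrawide + screen_1200p
print(f"ultrawide alone: {ultrawide / uhd_4k:.0%} of 4K")
print(f"combined setup:  {combined / uhd_4k:.0%} of 4K")
```

The same arithmetic works for any setup: add up the pixels your monitors push and compare against the resolution tier the VRAM guideline assumes.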

Hope that helps a bit.

Oh, final note: Quadros are not better than GeForce cards these days, but they are designed for professional use, so they are often clocked slower and can require less power. (They are designed for stability and durability, but so are most gaming cards; otherwise a world of gamers would be furious, and no producer wants to upset a bunch of angry online bloggers. That’s bad for the reputation.)
