Best laptop computer for Rhino 5



Yes. I have 7 computers, all with NVIDIA Quadro cards. Zero problems. The last time I read a McNeel advisory, Quadro was the one video card they recommended.


Is it possible to have access to the advisory?
PS: Does your Dell get hot and noisy, as it is so slim?


I did not bookmark it, but you can do a search here on the forum for ‘video cards’. You’ll see mostly problems with AMD and many recommendations for the Quadro cards.

The fans never come on when modelling in Rhino. Mostly when rendering. The SSD keeps it super quiet and cool.


Hi all

I’m looking at a laptop workstation for Rhino 5. Mostly I’ll be working from a fixed place with a dock to a larger monitor, external HDD etc so only limited portability is required:

Dell Precision 5510 i7-6820HQ

On the Dell site it says this comes with Intel HD Graphics 530, but it also lists the NVIDIA Quadro M1000M with 2GB GDDR5. Dell support are not clear on whether the NVIDIA card only comes with the Xeon processor, which is a fair bit more expensive.
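Once the machine arrives, you can confirm which NVIDIA GPU actually shipped with `nvidia-smi` (it installs with the NVIDIA driver). A minimal sketch of parsing its CSV output; the sample string below is illustrative only, not real output from a Precision 5510:

```python
import csv
import io
import subprocess

def parse_gpu_report(csv_text):
    """Parse `nvidia-smi --query-gpu=name,memory.total --format=csv`
    output into (name, memory) tuples, skipping the header row."""
    rows = csv.reader(io.StringIO(csv_text))
    return [(name.strip(), mem.strip()) for name, mem in list(rows)[1:]]

def query_gpus():
    """Run nvidia-smi and return the installed GPUs.

    Raises if no NVIDIA driver is present (e.g. an Intel-only config).
    """
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv"],
        capture_output=True, text=True, check=True,
    )
    return parse_gpu_report(out.stdout)

# Illustrative sample of what the M1000M config might report:
sample = "name, memory.total [MiB]\nQuadro M1000M, 2048 MiB\n"
print(parse_gpu_report(sample))  # [('Quadro M1000M', '2048 MiB')]
```

If `query_gpus()` raises or returns an empty list, you most likely got the Intel-graphics-only configuration.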

The 1920x1080 screen is a reasonable saving over the UHD 3840x2160, and (again according to Dell) I can plug in 1x external 4K monitor (40" TV screen?) or my existing 2x 1920x1080 monitors via a dock such as their Thunderbolt dock.

My son’s XPS 13 with i7 and 16GB RAM seems to run AutoCAD just fine and is a lower-cost option; the only significant difference seems to be the graphics card.

Thoughts welcome from those who have tried.


What have you heard about the store netbookreview? Can we believe them? I want to buy a laptop there.


Mine too. I’m waiting on a call from them to help me configure my machine for Rhino.


Within the last couple of weeks, I finally got a Boxx MXL VR! A true desktop CPU (i7-7700K @ 4.2GHz), NVIDIA GeForce GTX 1070, M.2 PCI-E SSD and 32GB of RAM! It’s a large laptop, but it’s powerful. One thing I wish I’d known about the VR series is that there is no option for using Intel integrated graphics. There is that option on the other MXL versions, but not the MXL VR. With the VR series, it’s always using the GTX card, so you can’t switch to integrated graphics when running on battery; you can only throttle it down. But I guess that’s the price of having the performance of a GeForce 1070 or 1080 in a laptop.


I believe the manufacturer of the chassis is CLEVO?
Does yours look similar to those I attached?
I have a Bullman (CLEVO chassis) from 2012. I’ve never had a more stable machine, not even a desktop.
But now it’s time to get a new one…

Still, there is no comparison between GeForce and Quadro cards.
I believe it would be really helpful to everyone if all the people with fairly new machines ran Holomark and posted the results here as screenshots. I could only run Holomark in Rhino 5; no solution yet.


I haven’t ever run the Holomark 2 test on this machine, so I just did. Here are the results from my Boxx MXL VR:


I noticed some of my GPU scores weren’t that great, so I cranked everything to performance in the NVIDIA control panel, knocked down Rhino’s anti-alias settings, and re-ran the test:

Yikes, big difference.


Nice, yeah, the “only” difference between Quadro and GeForce is the ability to run AA with close to no performance drop. But that is only apparent in the mesh test in Rhino 5.


Dammit, I have to pimp it up too, considering I’m one generation above and my scores are lower. O_O


Turn anti-aliasing off in Rhino and the results will go through the roof! Nice specs on that machine!


The RTX is a few days old, and I can’t wait to see it in some real action! However, with AA off and some NVIDIA control panel mumbo jumbo, it’s still nothing compared to yours. Well, maybe I’m not the PC builder master I thought I was. :neutral_face:


Try these settings in the control panel:


I’m no expert, but these are what I changed for the big increase. I can’t imagine your card won’t trump mine, all else being equal.


Can GeForce cards really detect whether an OpenGL function is being called from a CAD application or not? :astonished:

Or do some CAD applications call certain legacy fixed-function OpenGL routines (from the pre-shader era) that run slower on a GeForce because of how the driver executes those legacy paths?
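If it’s the second explanation, the underlying issue is per-call overhead: the legacy path issues one driver call per vertex (`glBegin`/`glVertex` style), while a VBO hands over the whole buffer in one call. This is just a rough analogy in Python, not real GL code, and the absolute timings are meaningless; the gap between the two styles is the point:

```python
import timeit

N = 100_000
data = list(range(N))

def per_call(buf):
    # Legacy-style path: one call per vertex, like the glBegin/glVertex era.
    out = []
    for v in buf:
        out.append(v)
    return out

def batched(buf):
    # VBO-style path: submit the whole buffer in a single call.
    return list(buf)

# Both paths produce the same "geometry"...
assert per_call(data) == batched(data)

# ...but the batched path avoids N function-call round trips.
t_per_call = timeit.timeit(lambda: per_call(data), number=20)
t_batched = timeit.timeit(lambda: batched(data), number=20)
print(f"per-call: {t_per_call:.4f}s  batched: {t_batched:.4f}s")
```

A driver can optimize the batched path aggressively, but there is only so much it can do when the application feeds it one vertex at a time, which would explain a GeForce looking slow on old CAD code regardless of any application detection.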