I am having problems with graphics performance when working with large files (1.3 GB) in Rhino 5 64-bit. Our hardware is Windows 7 Pro, 32 GB DDR3-1600 RAM, SATA drives, and an Intel i7-2600 CPU. We have tested GeForce 660 (2 GB) and GeForce 200 (512 MB) cards and found the GeForce 200 to have superior performance.
When the Perspective viewport is rotated with the mouse, CPU usage immediately goes to 100% for as long as the mouse button is held. It then drops to 4%-6% and nothing happens in the viewport. Disk usage shows a series of 100% spikes, then drops to 2%-4%.
Are there any settings we can use to load the large file (with associated object links) into RAM so that we do not need the disk swaps?
Are there any CPU settings that can be used to activate more cores / threads and prevent “overload”?
Can a graphics card solve this problem - has anyone had success in speeding up the graphics performance for very large files?
Most of my files are larger than 1 GB, and, yes, it’s not always easy.
Some suggestions for scene management:
Make sure you are not displaying many curves.
Modify the mesh settings so that the display meshes have fewer polygons.
Most of the time you’ll only be working on a few of the objects in the scene at the same time. You could try to adopt a workflow where you extract all render meshes and hide all NURBS objects. Then only show the NURBS object that you are currently working on. The downside of this is, of course, that the file size will increase (unless you also make sure you delete all the display meshes of the NURBS objects).
As for hardware, you could look into Neon, using a dedicated card to speed up its rendering. I have no experience with this, and I guess it might be a good option for fast rendering but perhaps less so as a working display mode.
It depends on what you mean by “large files”… Millions of small objects? or very few large objects using millions of polygons?
Graphics performance cannot be increased via the CPU or multiple cores. However, it again depends on the question above.
When you have a ton of objects, then performance can take a hit due to “object management” overhead in Rhino… In other words, simply iterating over a large database is causing performance hits.
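A toy cost model makes the point (the function and constants here are my own illustration, not anything measured from Rhino): when each object carries a fixed bookkeeping cost on top of its per-polygon drawing cost, a million small objects can cost far more per frame than a hundred huge meshes with the same total polygon count.

```python
# Toy per-frame cost model -- illustrative only; the constants are
# made-up assumptions, not Rhino internals.
def frame_cost_us(n_objects, polys_per_object,
                  per_object_us=50.0, per_poly_us=0.001):
    """Estimated microseconds to process one frame's worth of objects."""
    overhead = n_objects * per_object_us              # fixed cost per object
    drawing = n_objects * polys_per_object * per_poly_us  # cost per polygon
    return overhead + drawing

# Same 100 million polygons total, split two different ways:
many_small = frame_cost_us(1_000_000, 100)   # a million small objects
few_large = frame_cost_us(100, 1_000_000)    # a hundred huge meshes
print(f"many small: {many_small / 1e6:.1f} s, few large: {few_large / 1e6:.3f} s")
```

With these (invented) constants, the per-object overhead dominates by several hundred times in the many-objects case, which is why "iterating over a large database" can be the bottleneck even on a fast GPU.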
The first thing I would try is turning on the “Bounding Box Display” option in your display mode(s)… This will reduce every object in your display to a simple bounding-box wireframe while the view is being updated…once you release the mouse, the objects will display as normal. That should help a little…but again, if it’s a large database issue, then speeding up the drawing won’t help much.
Another thing…lots of Block objects (especially nested blocks) will slow the system to a crawl because of the way they’re managed and because each block level requires 4 Modelview matrix transformations for each object within the block, which goes up exponentially with nested blocks… So if you have thousands of block objects, then that too can be the primary performance hit…which really has nothing to do with the speed at which objects actually get drawn by the GPU.
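A back-of-envelope sketch of that nested-block cost (the tree shape and branching factor are my assumptions for illustration; only the 4-transforms-per-level figure comes from the post): object counts multiply with each nesting level, and every leaf object pays the per-level transform cost for each level above it.

```python
# Toy model of nested-block transform overhead -- an illustration of the
# claim above, not Rhino's actual bookkeeping. Assumes a "full" block
# tree: every block at each level contains `branching` sub-blocks, with
# real geometry only at the leaves.
def transforms_per_frame(branching, depth, ops_per_level=4):
    leaf_objects = branching ** depth            # objects multiply with nesting
    return leaf_objects * depth * ops_per_level  # each leaf pays 4 ops per level

for depth in (1, 2, 3):
    print(depth, transforms_per_frame(10, depth))
```

Going from one level of 10 blocks to three levels of nesting multiplies the transform count by several hundred in this model, which is the "slows to a crawl" effect described above.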
For now, the only thing I can suggest is the BBox Display setting I mentioned…but that will only help if the bottleneck is the GPU.
I have files here well over 5 GB that display just fine… I have a 120 million polygon mesh that gets 15 fps… So you can see that the GPU is quite capable… It’s the rest of Rhino and its object management (getting the objects to the GPU) that’s not working too well…and we’re working on cleaning that up in V6.
I have found another problem when using large files. The actual .3dm file is 1.447 MB, but it uses 1.355 GB of memory (linked objects, Rhino 5, etc). At this point, BlockManager does not work (the software quits responding). Insert (Ctrl+I) works, and the Move command can be used to place objects in 2D.
I thought that Rhino could manipulate files using the full 64-bit address space … am I doing something wrong?
As I stated, the file monitor indicates that the total file with linked object files is 1.335 GB. The main file links to many objects … we are designing factory layouts, etc.
Are you sure you’re using the 64-bit version of Rhino and not the 32-bit? If you have 32 GB of RAM on that machine, you shouldn’t be running into memory problems with such a small file.
Can you post a screenshot of your entire Rhino window?
I always create two files of each machine that I design … the engineered model and a “LowRes” version. In the low res version, I delete bolts, chain links, cable carriers (anything that requires a lot of meshes) since these are generally not required info when showing a factory floor layout.
In spite of everything, I am “hitting the wall” at about 1 GB (used memory). I understand how the Rhino database handling issues can create overhead … hopefully this is not the issue since I don’t know if I can wait until V6 to solve this problem.
Ideally, everything can be loaded into RAM and database issues can be minimized. I will keep working and will post any findings.
Start Rhino and resize the window so that it takes up half your screen.
Start the Task Manager and go to the Performance tab, and position the Task Manager on the other half of the screen.
Load your file.
Watch the memory meter in the Task Manager… I can’t believe that it’s going all the way to 32GB…
I have performed this exercise multiple times. The memory usage does not change (+1% etc). The CPU spikes momentarily to 100% - and the viewport is very erratic when rotating.
The problem increases with file size … the rendering seems to be a function of CPU capability.
Wow, ok…sorry for being slow here…I think I’m caught up now.
Is there any way you can send me the file(s)… I would love to see what happens here. You also mentioned that the BlockManager becomes unresponsive…right? It sounds like you’ve run into a bug/problem that we haven’t seen yet…but that’s hard to believe, which is why I really need to be able to load your files here and watch what happens in the debugger.
In my opinion, you should not be experiencing the problems you’re seeing given the files you’re describing… Clearly it’s not a memory issue, so it’s possible something is coming in and corrupting things.
Do you have a server I can FTP the files to? It will take a few hours to generate the file grouping … I am really interested in figuring out why my system doesn’t work correctly.
I find that Rhino5 is favoured by the shop techs (welders, millwrights, etc) who have extensive experience in working with paper drawings. I use it to advantage to explain the assembly and protocol issues (using viewports) when I co-ordinate machinery assembly … I really don’t want to change to any other products.
A great feature for keeping the display responsive was the general dynamic redraw view option - an automatic, frame-rate-based bounding-box reduction. The bounding boxes weren’t always visible, only when a user-defined frame rate limit was reached. A simple and great feature.
The option is still there, but the feature has been broken for months (years?). When can a fix be expected? It would help with the needs of this thread too.