Possible to modify a Rhino file outside of Rhino ? ( aka: command line )

I’m stumped, so it’s time to ask the experts :stuck_out_tongue:

I’m trying to help someone with a file they can’t open. It’s quite large ( 4.5GB !! ) and I suspect this is due to a crazy amount of texture files embedded within it. I believe it’s a project they were working on while in school, utilizing the school’s hardware and network.

They are trying to open the file on their own hardware ( a laptop ) and it’s just not happening. So I told them I would try to open the file on my system ( which has a bit beefier specs than their laptop ) and try to resave it via SaveSmall and / or strip the textures out to bring the file size down to something a bit more reasonable.

I can’t even open it on my system with the following specs:

3970x Threadripper 32 Core Processor
256GB Ram
Nvidia 4090 GPU
Samsung 990 Pro SSDs

I let Rhino try and it simply sat there thinking about it for ~45 minutes before I gave up on it. While it was making the attempt, I was able to watch what it was doing and it looked like Rhino was stuck in a loop trying to open texture files that resided on a remote network path I do not have access to. ( It appears to be a OneDrive file location )

So, are there any command line options to convert a Rhino .3dm file that would emulate the SaveSmall command without first loading the file into Rhino ? ( Because that’s just not happening lol )

Or !

Are there any options I can set to tell Rhino to ignore all the texture files ( some of them are 8k textures which might explain the file size ) upon opening or importing ?

I’ve never had to open a file this size nor run into an issue where Rhino took so long to open one at all so I figured I would run it by you all and see what your thoughts were.

Hi James -

I’d at least let it sit overnight before giving up…
At any rate, have you tried importing the file in a new blank scene? In a quick test here with a model that doesn’t contain referenced images, that imports instantly whereas opening it takes several seconds.
-wim
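If Rhino itself won’t cooperate, one headless route worth a try is McNeel’s open-source rhino3dm library ( `pip install rhino3dm` ), which can read and write .3dm files from a plain Python script and therefore never goes hunting for those remote OneDrive texture paths on open. A rough sketch, with placeholder file names; I haven’t tried it on anything near 4.5GB, and whether a straight re-save actually shrinks the file depends on how the textures were embedded:

```python
import os

def slim_path(path, suffix="_slim"):
    """big_model.3dm -> big_model_slim.3dm"""
    root, ext = os.path.splitext(path)
    return root + suffix + ext

def resave(src, version=7):
    # Deferred import so the path helper above works without the package.
    import rhino3dm  # pip install rhino3dm

    model = rhino3dm.File3dm.Read(src)  # parses the file headlessly; None on failure
    if model is None:
        raise RuntimeError("could not read " + src)

    # A quick census before rewriting -- handy for spotting material bloat.
    print(len(model.Objects), "objects,", len(model.Materials), "materials")

    dst = slim_path(src)
    if not model.Write(dst, version):   # re-serialize as a Rhino 7 .3dm
        raise RuntimeError("write failed: " + dst)
    return dst

if __name__ == "__main__":
    resave("big_model.3dm")  # placeholder path
```

From there you could also walk `model.Materials` and prune entries before writing, though how much of the embedded bitmap data rhino3dm exposes varies by version, so no promises on the final file size.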

I have tried importing it as well. Same results though, admittedly, I did not let it chew on it for the same amount of time.

I’m making a second attempt as I type this, and Rhino is currently eating 12GB of memory while thinking about it for the last 3.5hrs. Good thing I have plenty of RAM . . . .

One of the reasons this is taking so long is typical of a lot of software these days: it’s still rocking much of the same coding practices from a decade ago, with that particular bit of code utilizing only a single processor core where it could be using far more. ( Maya does it. Blender does it. It’s frustrating lol )

To note, I have 31 cores sitting idle and one core killing itself at 100% while Rhino is doing its thing. ( Hint, hint . . . time to update that code a bit to take advantage of multicore processors :stuck_out_tongue: It is 2023 )

Most tasks in “content creation” are inherently linear: they cannot be parallelized at all, never mind with any reasonable effort or with any guarantee that a benefit will be seen. Rhino certainly does use multithreading in some tasks, but invariably a bottleneck comes up where one thing has to happen, and then another thing, and then another thing.

Hint, it’s 2023, multi-core CPUs have been common for over 20 years now, people should know by now they’re not a magic bullet. The only use for 32 cores is archaic software rendering and having 31 Chrome tabs open while working.

Yes, even I don’t use the CPU for rendering these days. GPUs absolutely crush them. It’s not even a contest.

They are nice for having half a dozen applications open all at once, though, and / or for any virtual machines you might be running. It’s just frustrating to open the task manager and see all that potential number crunching just sitting there drooling on itself :expressionless:

On a more positive note, after FOUR HOURS, Rhino finally opened the file.

Now I have to figure out what I need to purge from this behemoth to get it down to a reasonable size.

So, while I managed to get the file open, it’s really impossible to do much with it, as every single step turns into a long wait. I was deleting all the materials ( 55 of them @ 8k resolution :expressionless: ) and one of them caused the entire thing to lock up. I spent the next two hours waiting and hoping it would resolve, but it never did.

I think I’m throwing in the towel on this one. I did manage to run an audit of the file so I figured I would share the results here.

Document Manifest:
Texture Mapping: 376102 active, 1 system.
RenderMaterial[4] name missing from document manifest.
Material: Error: differences between model table[4] and manifest information.
Material: Error: 1 errors found. Model: 376605 active, 0 deleted. Manifest: 376606 active, 0 deleted.
Line Pattern: 1 active, 3 system.
Layer: 190 active, 1 system.
Group: 564 active.
TextStyle: none.
Annotation Style: 2 active, 12 system.
Light: 57 active.
Hatch Pattern: 4 active, 9 system.
Block: 84 active.
Model Geometry: 376867 active.

Error: 1130476 model components (0 deleted). 28 system components. 1130505 manifest items

Audit Summary:
0 object errors detected.
0 linetype table errors detected.
0 layer table errors detected.
0 block table errors detected.
0 font table errors detected.
0 annotation style table errors detected.
Table tally:
190 layers
84 instance definitions
2 annotation styles
0 fonts
57 rendering lights
1 linetypes
376606 rendering materials
Object tally:
376161 normal objects
0 locked objects
2 hidden objects
0 deleted objects (in undo buffer)
704 block definition objects
0 reference normal objects
0 reference locked objects
0 reference hidden objects
0 reference block definition objects
Audit found problems.