I have the following challenge and am looking for a way to automate the process if possible.
Any feedback or suggestions would be highly appreciated.
There are multiple Rhino files (over 10-20), all “work in progress” files with hundreds of layers. My goal is to organize all of them. Instead of doing this manually, I would like to systematize the “scanning” of these files: compare two or more files at once and get an output of what is different between them, such as which extra layers were added and what extra geometry was added. The files all look pretty similar, but each one contains some small changes, and I’m looking to find those changes.
Any suggestions or pointers? Is there a script that can do something like this?
The script below should get you started. It asks for two files and then compares their layers by full path, using sets to list the layers common to both files and those missing from either one.
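The attached script itself isn't reproduced here, but its set logic can be sketched in plain Python. The layer lists and the `compare_layer_paths` helper below are illustrative stand-ins; inside Rhino you would collect the full layer paths from each document (e.g. with `rs.LayerNames()` in rhinoscriptsyntax):

```python
def compare_layer_paths(layers_a, layers_b):
    """Return (common, only_in_a, only_in_b) as sorted lists
    of full layer paths."""
    set_a, set_b = set(layers_a), set(layers_b)
    return (sorted(set_a & set_b),   # layers present in both files
            sorted(set_a - set_b),   # layers missing from file B
            sorted(set_b - set_a))   # layers missing from file A

# Hypothetical layer paths, using Rhino's "Parent::Child" convention:
layers_a = ["Walls", "Walls::Interior", "Furniture"]
layers_b = ["Walls", "Walls::Exterior", "Furniture"]

common, only_a, only_b = compare_layer_paths(layers_a, layers_b)
print("common:", common)       # ['Furniture', 'Walls']
print("only in A:", only_a)    # ['Walls::Interior']
print("only in B:", only_b)    # ['Walls::Exterior']
```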
LayerPathCompare.py (1.9 KB)
Comparing geometry is a bit more work. You might try to access the object table of each file, collect data, and then make your comparisons.
There is no quick-and-easy solution that I know of.
My first take would be to write a script to compare the current file with another.
The other file would be read in memory to query its content.
For layer names and structure this would be fairly easy. For geometry it would be more complicated, because only “low level” identical geometry is easy to find.
As a first pass, you could check for identical object ids; matching ids are a good indication of equal parts.
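That id check is plain set arithmetic. In the sketch below, `diff_by_object_id` is a hypothetical helper, and the string ids stand in for the GUIDs you would read from each file's object table:

```python
def diff_by_object_id(ids_a, ids_b):
    """Split two collections of object ids into shared,
    removed and added groups."""
    set_a, set_b = set(ids_a), set(ids_b)
    return {
        "in_both": set_a & set_b,    # likely unchanged parts
        "only_in_a": set_a - set_b,  # present in A, gone from B
        "only_in_b": set_b - set_a,  # newly added in B
    }

# Illustrative ids; in practice these would be RhinoObject GUIDs.
diff = diff_by_object_id(["aa11", "bb22"], ["bb22", "cc33"])
print(sorted(diff["in_both"]))    # ['bb22']
print(sorted(diff["only_in_a"]))  # ['aa11']
print(sorted(diff["only_in_b"]))  # ['cc33']
```

Note that a copied-and-modified object keeps a new id, so this only catches objects carried over unchanged between files.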
Deciding whether geometry is identical by human standards is more complicated. For example, if two lines coincide but one is flipped, there is no out-of-the-box solution that marks them as identical.
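For the simple line case, one workaround is to normalize orientation before comparing: build a key from the two endpoints sorted into a canonical order, so a flipped copy produces the same key. The `line_key` helper below is illustrative, not a Rhino API, and assumes you have already extracted endpoint coordinates as tuples:

```python
def line_key(p0, p1):
    """Orientation-independent key for a line segment: round the
    endpoint coordinates to absorb floating-point noise, then sort
    the two points so a flipped copy yields the same key."""
    a = tuple(round(c, 6) for c in p0)
    b = tuple(round(c, 6) for c in p1)
    return (a, b) if a <= b else (b, a)

# A line and its flipped copy now compare equal:
k1 = line_key((0, 0, 0), (10, 0, 0))
k2 = line_key((10, 0, 0), (0, 0, 0))
print(k1 == k2)  # True
```

The same idea generalizes to other curve types, but each type needs its own canonical form, which is why there is no general out-of-the-box answer.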