Change Management For Rhino Files


#1

I’m interested in hearing from anyone who has set up change management for Rhino using readily available SCM tools. As the number of files and changes in support of my product development grows, it becomes increasingly difficult to manage the changes. I’d like to know what other users are employing to manage their work other than flat file management through Windows Explorer. I’m particularly interested in the ability to branch off a baseline file in support of different product development streams.

Thanks,
Chuck


#2

We’ve tried managing Rhino files with Git. For smaller files this worked reasonably well, but when working with large files the Git repository got huge and became more and more difficult to manage. It should be doable, but in the end we reverted to plain file management.

The lack of diff and merge tools also made it quite difficult, but perhaps these could be created: tools could be developed to compare different geometries.
Git supports different diff/merge tools for custom file formats:
http://git-scm.com/book/en/Customizing-Git-Git-Attributes#Binary-Files
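Following the approach from that chapter, hooking a custom diff driver into Git comes down to an attributes entry plus a `textconv` setting. Everything below is illustrative: `rhino3dm-dump` is a hypothetical script that would print a text summary of a .3dm file (object GUIDs, layers, counts), which Git would then diff as plain text.

```
# .gitattributes — treat .3dm files as binary and route them to a custom diff driver
*.3dm binary diff=rhino3dm

# .git/config (or set via: git config diff.rhino3dm.textconv rhino3dm-dump)
[diff "rhino3dm"]
    textconv = rhino3dm-dump
```

With this in place, `git diff` would show changes between the text summaries of two file versions instead of "Binary files differ".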

E.g. a Rhino diff command that shows the differences between two files. Or a Rhino merge command that merges three files: an original file, a file with changes from user A, and a file with changes from user B. This is not entirely trivial, but I can imagine a script comparing the state of different GUIDs and, based on that, moving objects to a different layer.
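As a rough illustration of the GUID-comparison idea, here is a minimal sketch. It assumes each file has already been reduced to a mapping of object GUID to a geometry content hash (extraction would need openNURBS or similar and is not shown); the GUIDs and hashes below are made up.

```python
# Sketch of a GUID-based "diff" between two Rhino files, assuming each file
# has already been reduced to a {object_guid: geometry_hash} mapping
# (e.g. extracted with openNURBS -- extraction not shown here).

def diff_objects(base, other):
    """Classify objects as added, removed, or modified between two files."""
    added = sorted(set(other) - set(base))
    removed = sorted(set(base) - set(other))
    modified = sorted(g for g in base.keys() & other.keys()
                      if base[g] != other[g])
    return added, removed, modified

base = {"guid-1": "hash-a", "guid-2": "hash-b"}
other = {"guid-1": "hash-a", "guid-2": "hash-c", "guid-3": "hash-d"}

added, removed, modified = diff_objects(base, other)
print(added)     # -> ['guid-3']
print(removed)   # -> []
print(modified)  # -> ['guid-2']
```

A merge script could use the same classification on (original, file A, file B) and move conflicting GUIDs to a review layer.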

For geometry it should be doable, but including everything saved in a Rhino file can get quite heavy. (We gave up at the point where the Git repo had grown to around 5 GB; committing and fetching changes became unbearably slow, and we reverted to Dropbox.)

As for the Git process itself: Git keeps a copy of every version of every file, and with binary Rhino files that effectively means a full copy per commit. Fetching your repository can therefore take a while and becomes quite expensive, and committing becomes more expensive too. This means you’ll need some discipline to manage the repository: don’t always clone the full history, have a fast uplink to a powerful server hosting the remote repository, and perhaps learn how to do partial checkouts.
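For what it’s worth, one way to avoid fetching the entire history is a shallow clone. The snippet below builds a throwaway local repository with two commits just to demonstrate; the paths are illustrative.

```shell
# Demonstrate a shallow clone: only the latest commit is fetched, not the
# whole history (which matters when large .3dm files have bloated the repo).
rm -rf /tmp/rhino-demo
mkdir -p /tmp/rhino-demo/origin && cd /tmp/rhino-demo/origin
git init -q
git config user.email demo@example.com && git config user.name demo
echo v1 > model.3dm && git add . && git commit -qm "first version"
echo v2 > model.3dm && git commit -qam "second version"

cd /tmp/rhino-demo
# --depth 1 requires a URL-style remote (file://) to take effect locally
git clone -q --depth 1 "file://$PWD/origin" shallow
git -C shallow log --oneline | wc -l   # 1 commit, not 2
```

A full clone of the same repository would pull both commits (and both copies of the binary file).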

Perhaps it also requires some discipline from the team: people need to write meaningful commit messages and keep the repository tidy.


#3

If you would like to use Git for version management of Rhino files, I recommend you install it on your local computer or a local server. This speeds up the check-in/check-out and push/pull process. In addition to Git, I recommend SourceTree, a Git client with a nice user interface. With it you can manage branches etc.

http://git-scm.com/


(Nathan 'jesterKing' Letwory) #4

I would advise against Git as a versioning tool in projects where a lot of binary files are used, especially if they can become large (anything upward of 100 KB).

As already mentioned, a Git repository will grow large very quickly. When you clone a Git repository, the entire history is downloaded as well. Cloning only part of a Git repository is not possible, so you’re always stuck with getting the whole project.

If you work in a team where people manage distinct parts of the project, something like Subversion (SVN) works much better. SVN fetches only what is necessary, and it can check out just a small part of the project without running into problems.


#5

Thanks to all for the insights. Your responses pretty much echo my own thoughts about using an SCM tool to manage changes. My experience has been that most of these tools are geared towards text-based files, and those that do accommodate non-text file types are usually clunky at best. SVN probably has the most potential; however, I’m skeptical of its ability to merge changes effectively and accurately across branches (which, among other things, is really what makes these tools so useful). I’ll probably set up an instance of SVN and play around with it to see how it handles the files and to test out its ability to merge changes.

Cheers,
Chuck


#6

Since, as others have noted, Git and other SCM tools are REALLY not designed for binary files, I would propose that the best version-control schemes are the ones you design yourself. I can think of two possible solutions:

  1. Set up a project directory structure where you basically have a new folder for each day (or part of a day) and you save-as a new copy of your file at each of these points in time. Granted, depending on the size of your files, this can become as storage-intensive as the Git versioning mentioned before, albeit with the advantage that you don’t need to access ALL the data at once (as you do when cloning), but only the most recent file.

  2. Have one layer (or more likely, a master and sublayers) for each time increment you’d like to version on. Then you can just leave the older layers alone, and build new things only on the current time-layer. Again, the effectiveness and wieldiness of this depends on your file complexity. This solution is nice because you can basically traverse your version “tree” by showing/hiding each master time-layer.
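The dated-folder scheme in option 1 is simple enough to automate. Here is a minimal sketch; all paths and folder names are illustrative, and a real setup would wire this into whatever save workflow you already use.

```python
# Sketch of the dated-folder scheme: each call copies the working file into
# a fresh snapshots/<timestamp>/ folder, so only the newest copy ever needs
# to be opened. Folder layout and names are illustrative.
import shutil
from datetime import datetime
from pathlib import Path

def snapshot(working_file, archive_root="snapshots"):
    """Copy working_file into a new timestamped folder; return the copy's path."""
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
    dest_dir = Path(archive_root) / stamp
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / Path(working_file).name
    shutil.copy2(working_file, dest)  # copy2 preserves timestamps
    return dest
```

Run it once per day (or per milestone) and the archive becomes a browsable history, at the cost of one full file copy per snapshot.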

If you’re working on a networked drive (as I do) with other people, you get merge protection for free, since only one user can have save access at a time. You don’t get the nicety of true “merging”, but you could approximate it by having team members who need to work on the same files append their initials or something to their master time-layers. Then you could build a simple tool to pull the different copies of the same file together at the end of the day. Everything is on different layers, but it’s easy to merge everyone’s work that you want to keep into the master-master layer for that day.

Hope that wasn’t overly confusing :wink:


#7

Has there been any progress on this front?

As discussed before, Git itself is not a good fit for storing large binary files, but recommending Subversion or a network share doesn’t feel like a super solid solution either. As one simple alternative, there’s the Git LFS (Large File Storage) project, which allows exactly this to work. But besides that, it might be interesting to explore other possibilities.
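For reference, pointing Git LFS at Rhino files comes down to running `git lfs track "*.3dm"`, which writes the following attributes:

```
# .gitattributes entry written by `git lfs track "*.3dm"`:
# the repository stores small pointer files, while the actual
# binary content lives in LFS storage on the server.
*.3dm filter=lfs diff=lfs merge=lfs -text
```

This keeps clones small, though you still don’t get meaningful diffs or merges for the binary content itself.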

And before I continue, a little disclaimer: I’m very new to Rhino, so forgive me if I’m saying something that just makes no sense :slight_smile:

With that said, I was thinking it would be interesting to have the possibility of exploding the big Rhino file into smaller non-binary files, as an alternative on-disk representation. E.g. instead of opening one .3dm file, we would open a folder ending in .3dm that contains the same content as a current Rhino file, but stored in smaller files. These files could then be diffed, merged, and given all the other niceties we get from source control systems.
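To make the idea concrete, here is a hedged sketch of such an “exploded” representation: one small JSON file per object, keyed by GUID, inside a folder. The record format is invented for illustration; a real version would serialize actual geometry (e.g. via openNURBS), which is not shown.

```python
# Sketch of an "exploded" on-disk representation: each object (keyed by
# GUID) becomes its own small JSON file inside a folder, so text-oriented
# diff/merge tools can operate on individual objects. The record format
# here is invented for illustration.
import json
from pathlib import Path

def explode(objects, folder):
    """Write {guid: record} as one JSON file per object under folder/."""
    folder = Path(folder)
    folder.mkdir(parents=True, exist_ok=True)
    for guid, record in objects.items():
        path = folder / f"{guid}.json"
        # sort_keys keeps output stable, so diffs stay minimal
        path.write_text(json.dumps(record, sort_keys=True, indent=2))

def implode(folder):
    """Read the per-object files back into a single {guid: record} dict."""
    return {p.stem: json.loads(p.read_text())
            for p in Path(folder).glob("*.json")}
```

Because each object lives in its own file, two users editing different objects would touch different files, and ordinary line-based merging would just work for the non-conflicting case.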

Does that make any sense?

Cheers!
Gonzalo


(David Cockey) #8

Worksession allows opening multiple files in Rhino. The user does need to manually change which file is “active” to modify geometry in different files. http://docs.mcneel.com/rhino/5/help/en-us/commands/worksession.htm


#9

Hi David!

Thanks for pointing me at this worksessions stuff!
Definitely looks like a very good step in the right direction, but the individual .3dm files would still be stored in binary format, right?

Is there any chance to store the 3dm file in a text representation? Or would the only option be to create a Rhino diff tool using the openNURBS project?

Cheers
Gonzalo