We really need to have our userobjects in the cloud. I sometimes have to work from different locations, some userobjects need to be shared between teams, and they also have to be organized by project.
Right now, keeping them in network folders creates chaos: there are several versions of the same component, adapted by different people, and it’s hard to track changes.
Running Python scripts from a cloud folder could be an option, but the userobjects folder in GH cannot be changed.
The thing is that a single userobject might evolve quite fast, with several changes per day.
So it would be nice if updates could show up immediately in the userobject tabs, without extra manual uploading/downloading. Especially because we have to share them with users who are not so experienced, and also because there’s not enough time.
I’m nowhere near this level of detail for GH2, but the first iteration will almost certainly rely on some sort of external version control system (such as Git) since that would allow for a really nice separation of responsibilities.
You can set Git up to synchronize repositories at some interval. And if that repository is placed in GH’s userobjects folder, the objects should appear in Grasshopper.
This is untested, but it should work.
The problem with this kind of workflow is that if someone modifies a userobject after you’ve used it, you’ll have to go into all your old definitions and replace each instance of it. A validation process has to be implemented. I think it would be better if they are not synchronized that often; maybe twice a day (during lunch and after everyone has left).
Hi @DavidRutten ,
A good example could be the Dynamo platform, where I can scroll through several versions of the same objects.
But I could also see it as a network/cloud location for userobjects. Some could stay local, others could be picked up from there by several computers. And changes in that network/cloud location could be managed by admin.
Is it too complicated to add a remote userobject special folder?
No, but part of the mess we’re in now is exactly because of this sort of organic feature creep over time. I’d very much like to stop doing anything about plug-in loading and versioning and leave all that stuff to the Yak developers. @will is waaay more knowledgeable than me when it comes to distributing and managing packages and versioning. So if Yak provides the functionality for all of Rhino and GH2 can just piggy-back along on that… ideal world.
To me at least, adding a remote folder could be quite user-friendly. And it would put the ball in someone else’s court for sure. The folder could be in a Dropbox, for example. Versioning becomes the user’s problem; the folder would only provide a way of distributing the components.
There has to be some control over who can upload to Yak. I also doubt you’d want to share every definition you have with the public. I believe @DavidRutten is talking about plugins rather than userobjects. You will be able to distribute the plugins you use across the company/department to make sure GH definitions work on each workstation, but distribution of userobjects will have to go down a different pipe. A private one. Even if Yak starts to support userobjects and connects them to Food4Rhino, can you imagine the amount of requests they will get? Many of them will be “garbage” test userobjects. Seems like a hell of an infrastructure to maintain.
Be it Git, a network folder, a network drive, the cloud, or a local FTP server, you’ll have to write some scripts to distribute the files and some mechanism to check their versions. If you allow everyone to modify them, chaos will emerge for sure.
Yes, I am mostly talking about userobjects, and for those I think a remote folder could be enough. I totally agree with you that Yak should have a complex management system.
“You’ll have to make some scripts to distribute the files” - why? Once it’s there, the userobject will be picked up just as it gets picked up now from the local folder. And like I said, versioning is the user’s problem; modifying files is an IT administration problem.
To reduce the amount of work IT has to do. I’m not talking about Rhino/GH scripts, but batch/PowerShell/bash, whatever they use to link all machines to that folder. Perhaps you should ask your IT department; they may implement a quick solution.
At my workplace they use a Java application that deals with all software/plugin/configuration deployments.
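For what it’s worth, the “link all machines to that folder” part can be as small as a one-time symlink per workstation. This is only a sketch with hypothetical paths (`/mnt/office/gh-userobjects` and the UserObjects location are assumptions to adjust):

```shell
#!/bin/sh
# One-time setup sketch: point each workstation's Grasshopper
# UserObjects folder at an admin-managed network share, so new
# userobjects appear without any copying.
# Windows equivalent (needs admin rights or Developer Mode):
#   mklink /D "%AppData%\Grasshopper\UserObjects" \\server\gh-userobjects
SHARE="/mnt/office/gh-userobjects"
LOCAL="$HOME/AppData/Roaming/Grasshopper/UserObjects"
if [ -e "$LOCAL" ]; then
  mv "$LOCAL" "$LOCAL.bak"   # keep any existing local userobjects
fi
ln -s "$SHARE" "$LOCAL"
```

The trade-off is that a symlinked folder is only usable while the share is reachable, which may matter for laptops that leave the office.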
I’m sorry but still I don’t understand the problem.
The computing for the userobject would still be done locally; the folder is just for distribution. It’s a simple solution. (I can even imagine a SaaS-like scenario, but let’s not get into that.)
And we do share folders all the time between us, containing files which are far larger than the userobjects.
Everyone shares a folder and has their own edited files in their own folder. How do you transfer the files from everyone’s folder so that each folder ends up containing the same files? Through a distributing application or some script.
One approach is to keep all folders in constant sync with each other. This would be a bad solution, leading to many errors if two people edit the same file before the scheduled sync: a conflict that will lead to corruption or duplication.
The second approach is to save all files on a server and let the server handle the versioning and the conflicts, much like Git. For this, commits and push requests are done semi-manually, and the admin (IT) has to decide, or script, how conflicts are resolved.
You are describing a much more complicated scenario than what I’m proposing.
Think of it like this: there’s just one admin, and that admin is the only one who adds userobjects to a remote folder.
From that remote folder, the userobjects are picked up by everybody else.
That’s it, nothing more.
(I even changed the title of this post to avoid confusion)