Yes, bake. Keyword: reducing dependencies. No, wrong: eliminating dependencies altogether.
A GH definition must have the option of becoming “self-contained” with regard to plugins.
Not all users have the option of installing the plugins used in a GH definition, and Yak isn’t (and probably never will be) supported by every plugin (Yak is, in fact, only yet another dependency).
Pseudo self-contained GH definitions are also an option: dump all referenced plugins into a new folder (a “distributable”) containing the gh-file and the plugins, preferably zipped together but still recognized by GH as an “executable”.
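As a minimal sketch of what such a “distributable” bundler could look like (outside of GH, in plain Python): zip the definition together with the plugin libraries it references. The assumption here is that the list of referenced plugin files is already known; the folder layout and file names are hypothetical, not anything GH currently recognizes.

```python
import zipfile
from pathlib import Path

def bundle_definition(gh_file, plugin_files, out_zip):
    """Zip a Grasshopper definition together with the plugin
    libraries it references, producing one distributable archive.

    gh_file      -- path to the .gh/.ghx definition
    plugin_files -- paths to the referenced .gha/.dll files
    out_zip      -- path of the archive to create
    """
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        # The definition itself goes at the archive root.
        zf.write(gh_file, Path(gh_file).name)
        # Referenced plugins go into a "plugins" subfolder, so a
        # hypothetical loader could later recognize and unpack them.
        for plugin in plugin_files:
            zf.write(plugin, "plugins/" + Path(plugin).name)
```

The point is only that the archive carries everything needed to re-run the definition; whether GH could open such a zip directly would of course be up to McNeel.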
Yak? Well yes, very practical (when it works), but not when the aim is to eliminate dependencies so as to make a definition distributable. Yak is at best only yet another dependency.
Let me explain a little bit more about why the plugin-problem needs to be solved (and why Yak is far from solving it).
Hell won’t freeze over
The ability to “freeze” a complete set of referenced plugins with the correct versions (correct = as at the time of designing the GH definition) is crucial in research, for example (but in many other cases as well). But the DLL hell won’t freeze over; instead…
AVICI - A place in hell
“DLL hell” isn’t exactly a new discovery. With GH this problem (referenced plugins) reaches a whole new level: the level of Avīci. Self-containment is in fact the only solution to this particular problem.
The only solution. Yak isn’t that solution (since the problem domain involves version handling, archiving plugins “forever”, and so on).
Research no longer reproducible nor verifiable
Only self-containment can solve the problem of reproducibility in research, for example. And providing self-containment, done right, can also become a competitive advantage. To this end I provide three papers on a growing problem in science regarding something as “basic” as reproducibility and verifiability, which in many fields is already a thing of the past. Today’s rapid change in software (versions and hidden, evolving proprietary algorithms) and complex deployment simply make it impossible to reproduce, and thus verify, many research results out there.
The links below may inspire new ideas about how to package solutions in the future. This problem won’t go away, and awareness is on the rise (see the links), so it will be addressed, hopefully with McNeel among the first to recognize the problem and to understand why someone must start providing working solutions:
Strongly recommended reading. Turn this problem, which is drawing increasing attention these days, into an opportunity instead. Reproducible research computations may require special licenses (ensuring that archived software versions remain available for a longer period of time) as well as deployment solutions.
In any case, self-containment is a first step. It doesn’t solve all the related problems, but it at least recognizes them.
The folks over at provingground took this approach to the problem you are describing.
and…it’s open source. Maybe this could be built on? I do think the questions/challenges you have posed are totally relevant.
Like many of the people on this forum, I work in a studio that shares lots of files back and forth among various designers/team members. To combat the “missing plug-in” problem, we are taking two approaches. (I’m not suggesting this is ideal, nor that it solves the problem for individuals working outside the confines of a company network; it’s just what we did to combat the problem.)
1 - We try to standardize which plug-ins are used and push them out to our workstations.
2 - We include a custom component, in our company GH template, that lists all the plug-in libraries/names of components and writes them to a text panel in the definition.
I posted an early version of the component here: (on the old GH forum).
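The report such a component writes could be sketched in plain Python as follows. This is only an illustration of the output format, not the actual component: in GH the (component name, assembly name) pairs would come from the document via the SDK, whereas here they are passed in as a plain list, and the names used in the example are hypothetical.

```python
def dependency_report(components):
    """Group the components used in a definition by the plugin
    assembly that provides them, and format a plain-text report
    suitable for a GH text panel.

    components -- list of (component_name, assembly_name) pairs
    """
    by_assembly = {}
    for comp, assembly in components:
        by_assembly.setdefault(assembly, []).append(comp)

    lines = []
    for assembly in sorted(by_assembly):
        lines.append(assembly + ":")
        # De-duplicate: a component used many times is listed once.
        for comp in sorted(set(by_assembly[assembly])):
            lines.append("  - " + comp)
    return "\n".join(lines)
```

A reader of the definition then sees at a glance which plug-ins (and which of their components) the file depends on, even before any “missing plug-in” warning appears.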
With all of that said…I agree with your position. It would be great to have a more robust/built-in method to address the missing plug-in issue.