I’ve had a look at splitting my huge GH definitions using DataOutput and DataInput and yes, they would have done the job, but only if:
I could define the File name via an input (I need to concatenate file names), and
Save and Read back data from (default) the same directory as the Rhino model (or the GH def), or from an optional (input) Path
I need to have full control of my “data flow” (which can vary, and can be complex) so I will have to write my own Data Output/Input components for my needs.
Q: But how do I save individual GH objects to file (do I have to bake them first?) in a fast and compact format, in the most straightforward way? Are there any code examples/links? Preferred file format? Etc. A simple (C#) example would get me started.
You can attach a timer to them to update them (to the one without the number input; the one with the input will update every time you make a change), or make them self-update with some tweaking.
By the way, using sticky will update the data at a speed defined by the timer (or the self-update function), while DataInput/DataOutput are set to update at a 1 s delay. This slows everything down.
I need to write objects to disk from separate GH definitions, with a naming convention based on (Rhino || GH) source file as well as “processing step”.
Such an approach makes it possible to:
Distribute Processing.
Delay Distribution & Processing.
Restart a failing Process from the Processing Step where it went wrong.
Separate different concurrent jobs (avoiding naming conflicts).
Sticky, or any form of internal variables (plugin or document scope), is thus not an option.
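A naming convention like the one described above could be sketched in plain C#. This is only an illustration: the `MakeDataPath` helper, the `step` parameter, and the `.ghdata` extension are hypothetical, not part of any existing component.

```csharp
using System.IO;
using Rhino;

// Build a file path from the Rhino document name plus a processing step,
// defaulting to the directory of the Rhino model unless an explicit
// folder is supplied via an (optional) input.
string MakeDataPath(string optionalFolder, string step)
{
  RhinoDoc doc = RhinoDoc.ActiveDoc;
  string folder = !string.IsNullOrEmpty(optionalFolder)
    ? optionalFolder
    : Path.GetDirectoryName(doc.Path);               // same directory as the model
  string baseName = Path.GetFileNameWithoutExtension(doc.Path);
  return Path.Combine(folder, baseName + "_" + step + ".ghdata");
}
```

Because the step name is part of the file name, a failed pipeline can be restarted from the last step that produced a file, and different jobs keep distinct names.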
I’ve been increasingly serialising large chunks of RhinoCommon and custom objects to/from files as part of my modelling pipelines (especially for analysis stuff, but also when exploring design spaces and generating phenomes). This is using Python pickling, but .NET will surely have some namespaces for this as well. @dave_stasiuk added some components for writing/reading data to file in his TreeSloth plugin, this might work for you OOTB, or serve as a starting point for developing your own custom workflow perhaps.
You might want to use GH_IO.Serialization. But the serialized data will stay associated with the particular GH file that contains the component instance (right, @DavidRutten?) , so it’s probably not what you want.
Here’s how the Data Output component creates a file which contains multiple datatrees:
GH_LooseChunk archive = new GH_LooseChunk("Grasshopper Data");
archive.SetGuid("OriginId", InstanceGuid);
archive.SetInt32("Count", Params.Input.Count);

for (int i = 0; i < Params.Input.Count; i++)
{
  IGH_Param param = Params.Input[i];
  IGH_Structure structure = param.VolatileData;

  if (string.IsNullOrEmpty(param.NickName))
  {
    AddRuntimeMessage(GH_RuntimeMessageLevel.Warning, "Parameters without a name will not be included.");
    continue;
  }

  GH_IWriter chunk = archive.CreateChunk("Block", i);
  chunk.SetString("Name", param.NickName);
  chunk.SetBoolean("Empty", structure.IsEmpty);

  if (!structure.IsEmpty)
  {
    GH_Structure<IGH_Goo> tree;
    access.GetDataTree(i, out tree);
    if (!tree.Write(chunk.CreateChunk("Data")))
      AddRuntimeMessage(GH_RuntimeMessageLevel.Error, string.Format("There was a problem writing the {0} data.", param.NickName));
  }
}

byte[] data = archive.Serialize_Binary();
try
{
  File.WriteAllBytes(Destination, data);
}
catch (Exception e)
{
  AddRuntimeMessage(GH_RuntimeMessageLevel.Error, e.Message);
}
Looking at it now, it’s not the most efficient way; there’s no need to both access the VolatileData and call access.GetDataTree(). Those amount to the same thing.
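For the “read back” half, a minimal sketch along the same lines could look like this. It assumes the chunk layout produced by the writing code above ("Grasshopper Data", "Count", "Block", "Name", "Empty", "Data"); `Source` is a hypothetical input holding the file path.

```csharp
byte[] data = File.ReadAllBytes(Source);
GH_LooseChunk archive = new GH_LooseChunk("Grasshopper Data");
archive.Deserialize_Binary(data);

int count = archive.GetInt32("Count");
for (int i = 0; i < count; i++)
{
  // Each "Block" chunk holds one named data tree.
  GH_IReader chunk = archive.FindChunk("Block", i);
  if (chunk == null)
    continue;

  string name = chunk.GetString("Name");
  if (chunk.GetBoolean("Empty"))
    continue;

  GH_Structure<IGH_Goo> tree = new GH_Structure<IGH_Goo>();
  if (!tree.Read(chunk.FindChunk("Data")))
    AddRuntimeMessage(GH_RuntimeMessageLevel.Error,
      string.Format("There was a problem reading the {0} data.", name));
  // ...assign 'tree' to the matching output parameter here.
}
```

Deserialize_Binary is the counterpart of Serialize_Binary, and GH_Structure.Read mirrors the Write call used when saving.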
One problem I found is that, when I want to save a capsule menu option, the component’s Write method runs before the Menu_NameClicked event handler, which means that the wrong (boolean toggled) state is always stored in the file. For example, the event below is stored as false, although the actual state is true: