Load and save into own file format


We are developing a C++ Rhino 7 plugin with a rather complex data structure. We have some non-serializable data and want to save it in our own file format (basically a container format with our own data and a 3dm file inside). We also want to make it impossible for the user to save as 3dm when the data was created with our plugin.

Is there a way to modify the load and save process so that we can store data in our own format?

I know that you can create import and export plugins for this purpose, but we don't have our data in a shared DLL, and a plugin can only be an import or an export plugin. Restructuring the plugin would be a lot of work, so I am interested in whether there is another way.

Thanks for your help,

“Non-serializable data … want to save to file” - sounds like it is serializable?

Why not save it to the 3dm file as plug-in data? This is the preferred way to do it, and it sounds like you may not be aware that this is possible. In my experience, any type of data can be written to a 3dm file as plug-in data.

You should override these functions in your C++ plug-in code:

  • CallWriteDocument and have it return true if you want to write
  • WriteDocument and write to the ON_BinaryArchive object
  • ReadDocument and read from the ON_BinaryArchive object

Tip: start writing version information from day 1; it will help you update your data model later while still reading older files without problems.
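The versioning tip above is language-agnostic, so here is a minimal Python sketch of the idea (the function names `write_document` / `read_document` are illustrative, not Rhino API; in the C++ plug-in you would do the equivalent with `ON_BinaryArchive::WriteInt` at the top of `WriteDocument`):

```python
import io
import struct

CURRENT_VERSION = 2  # bump whenever the data model changes

def write_document(archive, payload: bytes) -> None:
    # Write a version number before the payload, analogous to writing
    # an int at the start of the plug-in's WriteDocument override.
    archive.write(struct.pack('<I', CURRENT_VERSION))
    archive.write(payload)

def read_document(archive):
    # Read the version first, then branch on it so older files stay readable.
    (version,) = struct.unpack('<I', archive.read(4))
    payload = archive.read()
    if version < CURRENT_VERSION:
        pass  # upgrade older data layouts here
    return version, payload

buf = io.BytesIO()
write_document(buf, b'my plug-in data')
buf.seek(0)
version, payload = read_document(buf)
```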

You are right, the data is serializable, but it does not consist of standard ON_ data types.
I was not aware that I can put arbitrary data into 3dm files. I thought that only ON data types are supported.

What would be the right way to store binary data like a jpg into a 3dm file?

Hi @Daniel_Ahlers ,
Rhino’s 3DM file format can be embedded within your custom file format. You can serialize the 3DM data using Rhino’s SDK and then include it in your custom file format.

Here is an example.
Save command:

import rhinoscriptsyntax as rs
import Rhino
import json
import os

def create_secretsauce():
    appdata_path = os.getenv('APPDATA')
    temp_3dm_path = os.path.join(appdata_path, "temp_model.3dm")
    merged_path = os.path.join(appdata_path, "merged.secretsauce")
    delimiter = b'---SECRETS---'
    # Write the current document to a temporary 3dm file
    Rhino.RhinoDoc.ActiveDoc.WriteFile(temp_3dm_path, Rhino.FileIO.FileWriteOptions())
    # Create dummy JSON data for each Brep, keyed by object id
    breps = [obj for obj in rs.AllObjects() if rs.IsBrep(obj)]
    data_to_save = {str(brep): "123" for brep in breps}
    # Merge .3dm and JSON data into .secretsauce
    with open(temp_3dm_path, 'rb') as _3dm_file, open(merged_path, 'wb') as merged_file:
        merged_file.write(_3dm_file.read())
        merged_file.write(delimiter)
        merged_file.write(json.dumps(data_to_save).encode('utf-8'))
    print("Created .secretsauce file at:", merged_path)

create_secretsauce()


Load :

#! python3
import rhinoscriptsyntax as rs
import json

def load_secretsauce():
    secretsauce_path = "C:\\Users\\admin\\AppData\\Roaming\\merged.secretsauce"
    if not secretsauce_path:
        return
    delimiter = b'---SECRETS---'
    with open(secretsauce_path, 'rb') as file:
        content = file.read()
    # Split the container back into its 3dm part and its JSON part
    delimiter_index = content.find(delimiter)
    _3dm_content = content[:delimiter_index]
    json_content = content[delimiter_index + len(delimiter):]
    # Write the embedded 3dm to a temp file and open it in Rhino
    temp_3dm_path = "temp_loaded_model.3dm"
    with open(temp_3dm_path, 'wb') as temp_3dm_file:
        temp_3dm_file.write(_3dm_content)
    rs.Command(f'_-Open "{temp_3dm_path}"')
    data = json.loads(json_content.decode('utf-8'))
    print("JSON Data:", data)

load_secretsauce()


It would be more efficient to just store an encrypted JSON file in the same folder, with any extension you desire, instead of embedding data into files; embedding gets messy and the files get heavy. But if you want to prevent users from working with your model unless they have your software, it can work.
This is something I hacked together in five minutes, and I'm sure you can find a remarkably better approach, but this is more or less how you could tackle the problem.

Hope this sparks some inspiration,


Any byte array can be converted to a Base64 string and saved / loaded as a string.
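For illustration, here is the Base64 round trip sketched with Python's standard library (in the C++ plug-in you would use whatever Base64 helper you have at hand and store the result as a string):

```python
import base64

raw = bytes(range(8)) + b'\xff\xfe'              # arbitrary binary data, e.g. a JPG's bytes
encoded = base64.b64encode(raw).decode('ascii')  # text-safe, can be stored as a string
decoded = base64.b64decode(encoded)              # restores the original bytes on load
```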

Another option is ON_BinaryArchive::WriteByte(size_t size, const void* data) with ON_BinaryArchive::ReadByte(size_t size, void* data). I would then first write the length using WriteLong(size), so upon reading you first read the size of the byte array using ReadLong, then allocate the correct size, and finally read the byte array into the allocated buffer.
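The length-prefix pattern described above can be sketched in Python like this (`write_byte_array` / `read_byte_array` are illustrative names, standing in for the `WriteLong` + `WriteByte` and `ReadLong` + `ReadByte` calls on `ON_BinaryArchive`):

```python
import io
import struct

def write_byte_array(archive, data: bytes) -> None:
    # First the length (like WriteLong), then the raw bytes (like WriteByte).
    archive.write(struct.pack('<I', len(data)))
    archive.write(data)

def read_byte_array(archive) -> bytes:
    # Read the size first, so we know how large a buffer to allocate
    # before reading the bytes back out.
    (size,) = struct.unpack('<I', archive.read(4))
    return archive.read(size)

buf = io.BytesIO()
write_byte_array(buf, b'\xffJPEG-ish bytes')
buf.seek(0)
restored = read_byte_array(buf)
```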


Thank you for your solutions, I will try them in my plugin.


Oh, I even see ON_BinaryArchive::WriteCompressedBuffer(size_t size, const void* data) and its counterpart ON_BinaryArchive::ReadCompressedBuffer(size_t size, void* data),

and a way to get the buffer size without keeping track of it yourself: ON_BinaryArchive::ReadCompressedBufferSize(size_t* size).
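The compressed-buffer idea can be mimicked in Python with zlib; this is only an analogy of the pattern (store the uncompressed size up front so the reader can query it, then the compressed payload), not the actual on-disk layout ON_BinaryArchive uses:

```python
import io
import struct
import zlib

def write_compressed_buffer(archive, data: bytes) -> None:
    # Store uncompressed size, compressed size, then the compressed bytes.
    comp = zlib.compress(data)
    archive.write(struct.pack('<II', len(data), len(comp)))
    archive.write(comp)

def read_compressed_buffer_size(archive) -> int:
    # Peek at the uncompressed size without moving the stream position,
    # analogous to ReadCompressedBufferSize.
    pos = archive.tell()
    (size,) = struct.unpack('<I', archive.read(4))
    archive.seek(pos)
    return size

def read_compressed_buffer(archive) -> bytes:
    size, comp_len = struct.unpack('<II', archive.read(8))
    return zlib.decompress(archive.read(comp_len))

buf = io.BytesIO()
write_compressed_buffer(buf, b'A' * 1000)
buf.seek(0)
size = read_compressed_buffer_size(buf)
data = read_compressed_buffer(buf)
```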

Hi @Daniel_Ahlers,

You might start here:

There are several sample projects on our Samples repo that demonstrate plug-in user data.

Let me know if you have any questions.

– Dale
