Hi @Gijs, let me share some more thinking on this: Currently blocks have an edit mode, and if I go into block-edit mode I should be able to drag/assign materials to objects (regardless of how the block was created), or be able to change its material-assignment rule. But I should also be able to apply/change materials to any other objects in other nested blocks inside that block. So it’s like an edit-block-materials mode that works through nested blocks.
The part we still need to figure out: how do we tell whether we want to change/apply materials to:
A. all instances of that block?
B. only that specific instance?
Example: I have a model of headphones. I make it into a block so I can have 4-5 headphones in a scene to try 5 colorways. Some parts of all 5 headphones share exactly the same materials/colors, including a nested block of imported internal components. Then some parts, like plastic molded shells and leather cushions, need to be separate colors per instance.
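To make the A/B distinction concrete, here is a minimal sketch of the kind of data model I mean. This is NOT the Rhino (or Blender) API; all the names (`BlockDef`, `Instance`, `resolve`) are made up for illustration. The idea: materials on the block definition apply to all instances, while an instance can carry its own overrides that win only for that instance, resolved through nested blocks.

```python
# Hypothetical data model, not real Rhino API.
# Definition-level materials apply to every instance; per-instance
# overrides win for that instance only, including through nesting.

from dataclasses import dataclass, field

@dataclass
class BlockDef:
    name: str
    materials: dict                               # object id -> material (ALL instances)
    children: list = field(default_factory=list)  # nested Instance objects

@dataclass
class Instance:
    definition: BlockDef
    overrides: dict = field(default_factory=dict)  # object path -> material (THIS instance)

def resolve(instance, path=()):
    """Walk one instance tree and return {object path: material},
    letting per-instance overrides beat definition defaults."""
    out = {}
    for obj_id, mat in instance.definition.materials.items():
        out[path + (obj_id,)] = mat
    for child in instance.definition.children:
        out.update(resolve(child, path + (child.definition.name,)))
    for obj_path, mat in instance.overrides.items():
        out[path + obj_path] = mat
    return out

# Headphones example: internals shared by all colorways, shell color per instance.
internals = BlockDef("internals", {"pcb": "green"})
headphone = BlockDef("headphone", {"shell": "black", "cushion": "black"},
                     [Instance(internals)])

red_colorway = Instance(headphone, overrides={("shell",): "red"})
stock = Instance(headphone)

assert resolve(red_colorway)[("shell",)] == "red"            # B: this instance only
assert resolve(stock)[("shell",)] == "black"                 # A: definition default
assert resolve(red_colorway)[("internals", "pcb")] == "green"  # nested block stays shared
```

The point of the sketch: the same headphone definition drives both instances, the nested internals block stays identical everywhere, and only the explicitly overridden shell differs per colorway.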
This is a very common scenario in the work we do all the time. KeyShot has pretty nice logic to deal with this, and I wish Rhino did too. And whatever we do in Rhino should transfer seamlessly over to Blender, since we use Rhino <> Blender for all our ID work.
@nicolas2 on our team has been exploring how to do this in Blender, which is also a challenge.
You might wonder why we need this in both Rhino and Blender. We do all our ID work in Rhino, even with imported SubDs modeled and edited in Blender. We also have a modelshop and build physical prototypes (3D printed, machined, plated, painted, the works…), so we need to document assembly and finishing instructions for our modelmakers in Rhino. Same for the final production engineering and CMF documentation we hand off to clients, and to their suppliers and manufacturers.
Besides all this Rhino-centric work for production, we also create virtual assets: renderings, virtual set photography, animations, and a real-time PBR-based pipeline for 3D web (three.js), AR (ARKit, ARCore), and sometimes VR authoring (Unreal). All of that virtual pipeline starts in Blender.
This is why I’ve been asking @nathanletwory for a live-link solution between Rhino/Blender.
Right now, keeping finished-looking 3D assets while the design is still evolving and changing is very labor intensive, yet absolutely necessary for the type of work we do. We need a way to update geometry while maintaining material assignments, views, etc., and that’s where blocks can help us. In the past we also tried Rhino’s Snapshots, but we gave up because they are slow, confusing and unreliable, and Snapshots don’t let us do side-by-side comparisons of design and/or colorway options. So we need a way to update and evolve our assets as the design iteration process continues, and at any given point we also need to see how those design iterations, or physical configurations, look in various colorway options of those updatable assemblies.
Tough problem, I know. Very different from modeling one product, rendering it once and being done. We have production-pipeline problems, and it would be great to have your team at McNeel help with some of that pain.
Happy to discuss more, and show you/your team some examples, and pain-points.
Added summary: We do not work in Rhino for everything, and we do not expect Rhino to be the best at everything, just to play well with others. This blocks stuff is a play-well function.
Examples of things that we do not do in Rhino are:
- Layout/documentation work (we only generate realtime views from Rhino and bring them into InDesign)
- SubD modeling (too slow, both as a workflow and geometrically; we only convert what we build in Blender)
- Rendering (too slow, low quality, and a slow/limited workflow for complex materials; done in Blender)
- Animation (same as above, done in Blender)
- 3D painting/PBR workflow (same as above, done in Substance)
- UV unwrapping (same as above, done in RizomUV)
Thanks!
Gustavo