'Bake to Layer' component with 'M' (Material) input?

The mMaterial component I posted two days ago, built from C# code found on this forum, offers more of the parameters that are useful for creating a material and can easily support additional properties defined for the class, such as ‘IndexOfRefraction’.



mMaterial_2020Jan14a.gh (46.4 KB)

Properties currently implemented:

  1. DiffuseColor
  2. AmbientColor
  3. EmissionColor
  4. ReflectionColor
  5. Reflectivity
  6. ReflectionGlossiness
  7. Transparency
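The property names above match attributes on RhinoCommon’s ‘Rhino.DocObjects.Material’ class, so extending the component is mostly a matter of setting more of them. Here is a minimal sketch of that pattern, using a plain stand-in class in place of the Rhino type so it runs anywhere (the stand-in and its default values are mine, not RhinoCommon’s; inside Rhino you would apply the same dict to a real ‘Material’ instance):

```python
# Stand-in for Rhino.DocObjects.Material so the pattern can run outside Rhino.
class FakeMaterial:
    def __init__(self):
        self.DiffuseColor = (128, 128, 128)
        self.Transparency = 0.0
        self.Reflectivity = 0.0
        self.IndexOfRefraction = 1.0

def apply_properties(material, props):
    """Set each named property on the material, refusing unknown names."""
    for name, value in props.items():
        if not hasattr(material, name):
            raise AttributeError("unknown material property: " + name)
        setattr(material, name, value)
    return material

mat = apply_properties(FakeMaterial(), {
    "DiffuseColor": (200, 40, 40),
    "Transparency": 0.25,
    "IndexOfRefraction": 1.52,  # e.g. glass
})
print(mat.DiffuseColor, mat.Transparency, mat.IndexOfRefraction)
```

Adding a new property to the cluster then only means adding one more key to the dict.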

And as I said before, why must I become dependent on a plugin for a basic, low level capability like this that should be built into Grasshopper?

As noted, encapsulating the C# code in a cluster allows two big improvements:

  • I can rename the output to match the geometry, which isn’t possible on a C# or Python script component.
  • I can edit the cluster and delete inputs that I don’t need to make the cluster footprint smaller on the canvas.
1 Like

interesting thread @Joseph_Oster.

I tried to look into the Jan16a GH file you posted. For me this is complicated stuff. One thing I found is that scriptcontext.doc.RenderMaterials.Add() works for adding materials to the document, but I couldn’t figure out how to retrieve the generated material’s index.

the document is set to not draw any geometry

@wim already pointed that out in the other thread.

So I looked at it again and tried to use some of that code, but failed. Besides, as noted in my reply to @wim, there is a bug in that example: it doesn’t use a ‘try:/finally:’ pattern, so it can fail to restore ‘scriptcontext.doc’ as recommended, which causes very confusing behavior if an error occurs:

    #we put back the original Grasshopper document as default
    scriptcontext.doc = ghdoc

Personally, based on decades of programming experience, the weird consequences of failing to restore ‘scriptcontext.doc = ghdoc’ suggest a sketchy, failure-prone coding pattern inherent to Rhino scripting - or is it confined to Python?
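For reference, the recommended pattern looks like this. The sketch below uses a tiny stand-in for the ‘scriptcontext’ module so it runs anywhere; inside a GH Python component, ‘sc’ would be the real ‘scriptcontext’, the string "RhinoDoc.ActiveDoc" would be the actual Rhino document you switch to, and ‘ghdoc’ the Grasshopper document that must be restored:

```python
# Stand-in for the scriptcontext module: a single mutable 'doc' attribute.
class FakeScriptContext:
    doc = "ghdoc"

sc = FakeScriptContext()
ghdoc = "ghdoc"
rhinodoc = "RhinoDoc.ActiveDoc"

def bake_something(fail=False):
    """Switch scriptcontext.doc to the Rhino document, then ALWAYS restore it."""
    sc.doc = rhinodoc          # rhinoscript calls now target the Rhino document
    try:
        if fail:
            raise RuntimeError("simulated error during baking")
        # ... baking work happens here ...
    finally:
        sc.doc = ghdoc         # restored even when an exception escapes

try:
    bake_something(fail=True)
except RuntimeError:
    pass
print(sc.doc)  # 'ghdoc' - restored despite the error
```

Without the ‘finally:’, any exception leaves ‘scriptcontext.doc’ pointing at the Rhino document, which is exactly the confusing behavior described above.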

And as I said before, why must I become dependent on a plugin for a basic, low level capability like this that should be built into Grasshopper?

You are not wrong, but as I understand it, GH development is mostly on hold for GH2. Have you been following David’s progress on GH2? https://www.youtube.com/channel/UCcqTyOLns1CwcGESoXAqyQg

Seems a lot of core stuff will change.

Here’s a quick and dirty version.
The first component creates a material and outputs both a “gh” material for use with the Custom Preview component (at which point you could bake that and be done… it’s a Python recreation of David’s C# code), and, on the M2 output, a NativeRenderMaterial.
The second component is a very rough “bake with material” option. A better solution would be to not set the material on the first layer in the document, but like I said… rough.

Note, the following comment in the developer example:

# At this point, you will see render materials show up in Rhino's
# Materials panel. Note, RhinoDoc.Materials.Count will equal 0. This
# is because we have not added any Rhino materials. We've only added
# Render materials. When you assign a Render material to an object
# or a layer, a compatible Rhino material will be added RhinoDoc.Materials,
# and this material will be referenced by the Render material.

Developer Sample

MaterialGHPY_test.gh (16.6 KB)
Version 6 SR22
(6.22.20009.23271, 01/09/2020)

OK, thanks for the effort. With all the talent on this forum, do we have to settle for “dirty” and “rough”?

Going through the code I see this in the first (bigger) Python component:

  • You added ‘Name’ and ‘SpecularColor’, neither of which are important to me.
  • You did it in Python instead of C#.
  • Just as I did, you export two versions of the material, one for display (‘GH_Material’ for Custom Preview) and another for rendering (baking). Guess I was on the right track but it’s a pity (and a source of possible confusion) that one output can’t serve both purposes?

Moving on to the second Python component for baking, it is “dirty” indeed with the very unwelcome side effect of assigning a material to ‘RhinoDoc.ActiveDoc.Layers[0]’. UGH!! That won’t fly, even though it appears to work.


How can that be fixed?

Also, I don’t see the ‘try: / finally:’ construct that restores ‘scriptcontext.doc = ghdoc’ in either of your Python scripts? Are you aware of that and what can happen if an error occurs without it?

Thanks again for giving it a go. I think “we” can do it a little better though, eh?

P.S. OH! Wait, there’s no layer name input to your bake component? :frowning: :cry: :sob: Adding that could fix the problem of assigning the material to layer zero?

So, couple things.
1- If you are using STRICTLY RhinoCommon, and NOT rhinoscript (as in my example), you can, unless it’s needed in another area of your code, skip the chaos of scriptcontext.
2- See the comment from the developer sample. I cheated and let RhinoCommon associate the RENDERMATERIAL with the MATERIAL by adding it into the document via assignment to a layer. Another option (not the best, but quick) would be to simply create a dummy layer, add that to the document, assign your material to the dummy layer, finish up whatever code you need to run, then dump the dummy layer.
3- Yes, I’m familiar with try/except… if I was writing this for production, I certainly would have included it.
4- while this is quite dirty and provided as a proof of concept, I do agree that it would be nice to have natively.
5- lastly…make the code I posted better! I would guess that the troublesome spot for you was what was quoted in my post.
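Point 2 (create a dummy layer, use it, then dump it) is naturally expressed as a context manager, so the cleanup can’t be skipped even when an error occurs. A sketch with the document stubbed out as a plain list of layer names; in Rhino the add and delete steps would be calls on the real document’s layer table instead (verify the exact RhinoCommon calls against the API before relying on them):

```python
from contextlib import contextmanager

# Stand-in document: just a list of layer names.
doc_layers = []

@contextmanager
def temp_layer(name):
    """Add a throwaway layer, yield its index, and delete it on exit."""
    doc_layers.append(name)
    index = len(doc_layers) - 1
    try:
        yield index
    finally:
        doc_layers.pop(index)  # dump the dummy layer even on error

with temp_layer("__material_scratch__") as idx:
    # assign the render material to layer idx, bake, etc.
    assert doc_layers[idx] == "__material_scratch__"

print(doc_layers)  # [] - the dummy layer is gone
```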

You’ve discussed a lot about technical details. I would like to voice my idea about software engineering.

Grasshopper used to be a personal project. Although David is employed by McNeel now, the infrastructure and the division of components are still well preserved. There is plenty of evidence that GH follows a rapid-iteration development pattern: tons of components are added, altered, or made obsolete. It’s never easy to make a comprehensive list of which components should be included, as Rhino itself is constantly progressing. I do think that neither Rhino nor GH should block the other from advancing just to ensure feature consistency.

My personal thoughts on why baking is not a component: baking involves ObjectAttributes, a very complicated structure that includes a lot of other complex data types, such as Material. It’s difficult to abstract those properties in a GH context without affecting the RhinoDoc at every intermediate step. I know you can do lazy evaluation, but that means representing every Rhino attribute in GH format again, which I believe David may not deem a priority.

Moreover, you must have noticed that all the “system” components come from libraries - the only difference is that they are shipped with Grasshopper. Extensibility is always a big factor in system design. One man’s effort is hard to compare with a community’s.

Why bother reinventing the wheel again and again? I don’t see making use of a plugin as a bad thing. The issue is in how GH plugins are managed and how new components are introduced. Dynamo has a package system: you type a package’s name and in a few seconds it is installed. Yak is far from optimal now. Besides, I also advocate the approach of Linux shells: when you type an unknown command, the shell prompts you with which packages include it and can potentially install one instantly. Then the feature can be used like a built-in one, just slower, depending on your network.


I got it. WAY COOL!!!

The ‘mSrc’ (Material Source) input defaults to ‘1 = from object’ (used by the Box in this image), but can be ‘0 = from layer’ (used by the Sphere in this image) or ‘2 = from parent’.
If you bake geometry with ‘mSrc = from layer’, all other geometry on that layer will be affected by any change to the material, like these spheres.
The boxes, baked with ‘mSrc = from object’, don’t affect each other when a new one is baked with a different material.

There is very little code involved. ~25 lines of Python not including comments and trace statements.
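Those few lines presumably revolve around RhinoCommon’s ObjectAttributes. Here is a hedged, pure-Python sketch of how the three ‘mSrc’ choices could be mapped into bake attributes: the dict stands in for ‘Rhino.DocObjects.ObjectAttributes’, the field names (MaterialSource, LayerIndex, MaterialIndex) mirror real ObjectAttributes properties, and the 0/1/2 codes are the component’s own input convention from the post above, not RhinoCommon enum values:

```python
# The component's mSrc input (0/1/2) mapped to RhinoCommon enum member names.
# Inside Rhino these would be members of Rhino.DocObjects.ObjectMaterialSource;
# here they are plain strings so the sketch runs anywhere.
MSRC_NAMES = {
    0: "MaterialFromLayer",
    1: "MaterialFromObject",   # the default in the post above
    2: "MaterialFromParent",
}

def bake_attributes(msrc, layer_index, material_index):
    """Build a dict mirroring the ObjectAttributes fields the bake needs."""
    if msrc not in MSRC_NAMES:
        raise ValueError("mSrc must be 0, 1 or 2")
    return {
        "MaterialSource": MSRC_NAMES[msrc],
        "LayerIndex": layer_index,
        "MaterialIndex": material_index,
    }

attrs = bake_attributes(1, layer_index=3, material_index=7)
print(attrs["MaterialSource"])  # MaterialFromObject
```

With ‘from object’, each baked object carries its own material; with ‘from layer’, every object on the layer follows the layer’s material, which is exactly the sphere-versus-box behavior described above.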

I deeply resented every moment spent on this task, for several reasons.

  • It completely distracted me from focusing on my primary goal of creating a semi-complex multi-part GH model using Data Output and Data Input.
  • It required sifting through many, MANY “haystacks”, searching for the bits to make it work: many dozens of forum posts, spanning a ten(!) year period, from others struggling with the same issues, and poring over Rhino API documentation and examples that never quite put all the pieces together.
  • I know there are many people here far more qualified than I am to write this feature, so why hasn’t anyone done it? Why isn’t it built into Rhino 6 where material rendering is so advanced over R5?
    Is it a test to qualify for the “cool kids” club? Is posting this code taboo? Will it disqualify me from membership in the priesthood?

I’m going to test it for a while and think about that…

Polite words are inadequate to describe the frustration I felt being forced to do this myself… :man_facepalming:

I don’t see it that way at all. Compared to the rapid release schedule for web applications in Silicon Valley twenty years ago (bi-monthly if not daily), the update pace for “major” new features and bug fixes in Grasshopper has been measured in years.

I’ve been playing with Grasshopper and active on the forum(s) for almost six years. There is an ivory tower atmosphere around Rhino that fully accepts the need to write your own tools in VB, C# or Python using RhinoCommon to get “serious work” done, as both you (@gankeyu) and @chanley suggested. Most people are completely incapable of that and are forced to settle for what is built into Grasshopper.

When you have a GH model with a dozen or more layers of geometry that need to be re-baked many times per day, the time, tedium and effort saved by a ‘DELETE All’ (by layer) button and a ‘Bake’ (all to separate layers) button, as shown here, is truly incredible. It changes everything about the workflow. And there is no need to send a Rhino .3dm file with a GH model to have full control of materials and layers.

I’ll probably have more to say about this but that’s my report from the Python trenches for now. Ciao.

1 Like

Good for you Joseph! And I feel you, I also try to make my definitions 100% native, for numerous reasons.
Some have to do with the horrible support for plug-ins that GH has: there have been some baby steps in the right direction with the package manager, but it is still in diapers. It hardly works, and I end up updating or installing the plug-in(s) manually. Not only this, but sharing definitions or using definitions from someone else is still a pain.

Some others have to do with the UI, which offers no customization or control over where the components of plug-ins go.

And some others have to do with a more personal and philosophical matter. I often felt that swarming my workspace with custom unofficial tools was untidy. I also don’t like how every plug-in has its own icon and color palette, which really messes up the definition; native components all share a common theme. (And no, I won’t use names on my components, because that means more information to process, which icons simplify. It’s like logos: why use sentences when you can have a logo?)

Having said that, I really don’t understand why you had to do it yourself. In my case, I don’t have the time to do so, but even if I did, I wouldn’t do it either.

Why waste time on a problem that many people faced before you, solved, and gave the solution away for free? Even more, they packaged all those solutions into a nice plug-in with explanations and tutorials.

I think you got away with it this time, but maybe in the future you will need to do more complex stuff, for example use blocks. Are you going to code custom components to deal with blocks in GH?

I remember you used Anemone a lot - maybe apply the same argument to using other must-have plug-ins.

I agree with you that simple functions like these should be native to GH. I was shocked when I found out that I needed a plug-in called Human or Elefront to achieve these, but David can’t do it all and that is something we must understand. This is not Autodesk, with over 10,000 employees.

I guess David figured that if someone went through the trouble to design, code, implement and offer functionalities for free, then these were no longer top priorities. And honestly, I think it’s the wisest decision. Who knows which other things we would be lacking if David had directed all his efforts into coding already achievable things with plug-ins.

As discussed here, the built in Create Material component in R6 has a flaw that hasn’t been fixed since it was reported two years ago:

And none of the plugins I looked at offered as much control over materials as the C# code from David that is the basis for my mMaterial cluster in that thread. It can be easily extended or, as I did in today’s example, unused inputs can be removed from the cluster to make it smaller.

As for ‘Bake To Layer’, I cobbled together a simpler version of that almost two years ago when I trialed R6. That version depended on the materials being manually created and assigned to each layer in Rhino but still, the ability to instantly delete and re-bake dozens of layers of related geometry was a revelation. Muy bueno! Now I want more, and I’ve got it.

My aversion to plugins, except when absolutely necessary (like Anemone), stems from decades spent learning and adapting to an endless series of “frameworks” for web application programming in Javascript, Java, JSP, PHP and even CSS. Each one considered itself to be the ultimate layer of abstraction. Very few were really that unique, like jQuery and Angular JS, which were groundbreaking concepts. Grasshopper is fun, but “big projects” (a relative term) generally require more than what it currently offers. Additional expense, commitment to someone else’s coding ideas, and C#/Python can each be deal breakers.

stems from decades spent learning and adapting to an endless series of “frameworks” for web application programming in Javascript, Java, JSP, PHP and even CSS.

I would say yes: nowadays the front-end industry is full of incompatible, difficult-to-learn frameworks. Angular introduces breaking changes with every major version. I really don’t like that methodology. It’s also true of today’s “serious” Java development: there are factory classes, factories of factory classes, etc.

I don’t see it that way at all. Compared to the rapid release schedule for web applications in Silicon Valley twenty years ago (bi-monthly if not daily)

That’s why I said “evidence”, instead of “Grasshopper is following a rapid-iteration model now”. Currently GH is a native part of a commercial product; maybe McNeel thinks it’s better to test it fully before release. I won’t comment on that.

My idea

First, I don’t look down upon anyone’s own effort to, for example, make your own Material implementation. I don’t see any plugin right now that supports Material well.

However, it may be too demanding to ask Grasshopper to include everything, at least for now, as the limited resources available are mostly going into GH2. Certainly Material can be listed as a feature request, and I would be more than happy to see it as a native feature (I’ve been tackling Texture Mapping recently, which is also difficult when there’s no native way to abstract Material).

Anyway, whether, what and when to support something are purely commercial decisions.

I added some exception handling to what I had three days ago and was pretty happy with it until I tried passing materials through Data Output/Input. I already knew that works fine for Custom Preview (Display Material) but found that I couldn’t do it with Render Materials for baking. So I resorted to exporting JSON from mMaterial and adding a json_M component to replicate both Display and Render materials from JSON via Data Input. It’s embarrassing and I’m sorry I couldn’t do it otherwise. Now three outputs from mMaterial instead of one… :man_facepalming:
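The JSON workaround amounts to serializing the plain material properties and rebuilding both the Display and Render materials on the far side of Data Input. A minimal round-trip sketch (the key names follow the property list earlier in the thread; encoding colors as RGB lists is my choice for the sketch, not necessarily what the posted json_M component does):

```python
import json

# Plain material properties, mirroring the mMaterial inputs.
material = {
    "DiffuseColor": [200, 40, 40],
    "EmissionColor": [0, 0, 0],
    "Reflectivity": 0.6,
    "ReflectionGlossiness": 0.9,
    "Transparency": 0.25,
}

# mMaterial side: emit JSON text through Data Output.
payload = json.dumps(material)

# json_M side: rebuild the property dict from Data Input, then
# re-create both the display and render materials from it.
restored = json.loads(payload)
print(restored == material)  # True - a lossless round trip
```

Since the JSON is just text, it passes through Data Output/Input without the type problems that the Render Material objects themselves hit.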


Here’s the code in demo models:

demo_mMaterial_2020Jan23a.gh (40.4 KB)

demo_mMaterial_bake_to_layer_2020Jan23a.gh (50.7 KB)

STEP1_mMaterial_bake_to_layer_2020Jan22a.gh (44.1 KB)

STEP2_mMaterial_bake_to_layer_2020Jan22a.gh (18.4 KB)

And for what it’s worth, a video:


By its nature, software is layer upon layer of abstraction, and every programmer creates their own layers when writing an application or creating a tool for others to use. It’s marvelous on the one hand and a complete Tower of Babel on the other.

I got an email from a recruiter two days ago for a “Senior Full Stack Developer” position with these requirements:

To be a good fit for this position you will have:

  • 5+ years of web applications development experience
  • Experience with large scale external facing web applications
  • Experience with most, or all, of the following tools/technologies:
    • Python (required), PHP, Java, MySQL, Javascript/Typescript, Node.js (Full Stack JS), CSS3, HTML5, MVC, GIT, or SVN
    • Proficiency in Web Services (REST/SOAP/GraphQL), SOA concepts, Web Front End, and Batch programming
    • Expert experience in JavaScript frameworks React, VueJS, Angular, etc.
    • Experience in MySQL Server or other RDBMS such as PostgreSQL, Oracle, SQL Server, DB2 or MS SQL
    • Multi-database and multi-language experience preferred

Very standard stuff, right? This list has been typical in job descriptions for at least a decade. I’m familiar with most of them, have struggled with some of them (STRUTS/Tiles) and developed a high level of proficiency with a few of them. But does any single project or programming position use ALL of those frameworks at the same time? No. Do average people talk or think like that? No, because at some point it’s just babel: a confusion of sounds or voices.

The idea that we can each program our way out of limitations in Grasshopper, individually solving the same problems over and over, or just load up on plugins and move forward with someone else’s abstraction layer (regardless of constraints, dependencies, and the cost of refactoring a large code base: changing horses in mid-stream), is just more babel to me.


Updated. Now handles groups. Open source Python.


mMaterial_2020Jan30a.gh (29.5 KB)
mMaterial_bake_to_layer_2020Jan30a.gh (39.5 KB)
STEP1_mMaterial_bake_to_layer_2020Jan30a.gh (40.0 KB)
STEP2_mMaterial_bake_to_layer_2020Jan30a.gh (16.1 KB)

NOTE: The ~5 second benchmark in this video is more than three times longer than the same operation takes when not running the video capture software…?

What are blocks? Again, I’d rather not be distracted at all by any of this. It would be much better for everyone if bake functionality like this were included in standard Grasshopper.



Hi, Joseph. Thanks a lot for the development, it works amazingly!!

Hello. Could you add the ability to bake a group of groups?

babel = open source
some parts fall apart, some make the tower much bigger and better. We’ll see what happens with your brick.

Honestly, David is doing a great job, and even people who are not on this forum but use Grasshopper on a daily basis have told me he is a genius for what he did with Grasshopper. Many very smart hackers, like the creators of Weaverbird, Elefront, Kangaroo, Pufferfish, etc., have brought improvements and knowledge to Rhino and Grasshopper that are worth much more than a “proper rebuild from scratch” every time. These have been bases for other people to push the limits of their knowledge and possibilities further. In addition, the people who made these plug-ins are very often on this forum to support users. We grow together.

But I admit to not understanding why you’d write in such a blaming manner. Someone who writes C# code for you should not be blamed for not doing it “properly”, as you wish. Take it as a base and develop it. Grasshopper should not be blamed officially on your website for that detail, just to underline that you made a plugin by assembling code snippets found on this forum and to take the props for yourself… that’s what I would call overkill.
Do like a real nerdy hacker does: learn C#, learn the Rhino structure, and write a nice and useful plug-in, like David did, like Daniel did, like Michael did. And then get real props for it.

You have been very helpful to users on this forum yourself. You’ve managed to add a useful function for the community of non-programmers, and everyone who needs these functions will be very happy you did this work.

don’t get me wrong, you are very valuable and helpful for this community.