How to create a Physically Based Material with Python

Hi @cristiano.pasini, you would probably want to skip using a physically based material and use a simple bitmap-based one with self-illumination enabled.
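
A quick sketch of that route (added for illustration, not from the original post; the file path is a placeholder), assuming RhinoCommon's basic Material with a diffuse bitmap and lighting disabled:

import Rhino
import scriptcontext as sc

# basic (non-PBR) material with a diffuse bitmap and self illumination
mat = Rhino.DocObjects.Material()
mat.Name = "Self-lit bitmap"
mat.SetBitmapTexture("C:\\path\\to\\image.png")  # placeholder path
mat.DisableLighting = True  # render it self-illuminated, ignoring scene lights

mat_index = sc.doc.Materials.Add(mat)
print(mat_index)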

_
c.

Hi @nathanletwory,

thank you for your example. I've tried to use it to write a converter which imports physically correct materials from other software into Rhino. I've found a few issues which I do not understand though:

How do I add e.g. "sheen" to the pbr and make it show up in the UI? I've tried to set it like so:

pbr.Sheen = 0.4
pbr.SheenTint = 0.75

but the created render material does not show the slot for Sheen. If I enable it manually after creation from the "Detailed settings" dropdown, I see that it has set the pbr.SheenTint value, but not the amount, which stays at 0.5?!

I have the same problem with other properties, e.g. after setting pbr.Specular = 0.1 the Specularity section does not show up in the UI. If I add it by hand, I see that it has set my value as Specularity F0.

Setting pbr.AnisotropicRotation works, but setting pbr.Anisotropic fails (it stays at 0.5).

Btw. what is the "PhysicallyBased.Opacitylor" property? A typo in the documentation?

thanks,
c.

The parameters are set, but not visible - that is a bug in the GUI part. You can verify the settings have been applied by right-clicking the material you created programmatically and saving it to a file. Then open the .rmtl and check the pbr-sheen and pbr-sheen-tint tags. You'll find that the values you set are recorded in that XML.
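
Since the .rmtl is plain XML, a minimal sketch of such a check (added for illustration; the file path is a placeholder and the element lookup is an assumption based on the tag names above):

import xml.etree.ElementTree as ET

rmtl_path = "C:\\path\\to\\MyMaterial.rmtl"  # placeholder path
root = ET.parse(rmtl_path).getroot()

# look for the sheen tags mentioned above anywhere in the XML
for tag in ("pbr-sheen", "pbr-sheen-tint"):
    node = root.find(".//" + tag)
    print(tag, "=", node.text if node is not None else "not found")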

I have logged RH-69463 Setting programmatic PBR property doesn’t show in GUI for you.

Hi @nathanletwory,

thank you for adding this to the bug tracker. I ended up using a completely different way to create the Physical Material from scratch, with all slots as defined in the rmtl. This way it was possible to skip messing around with SimulatedMaterial, and there is access to all slots without "jumping through hoops" :wink:
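
For illustration, a rough sketch of what such a direct approach could look like (an assumption, not the poster's actual code): create the PBR render material and fill it through SetParameter using the rmtl parameter names, skipping SimulatedMaterial entirely. Only the pbr-sheen, pbr-sheen-tint and pbr-show-ui-* names are confirmed elsewhere in this thread; treat anything else as an assumption.

import Rhino
import scriptcontext as sc

# create the PBR render material directly
new_pbr = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.PhysicallyBasedMaterialType)
new_pbr.Name = "Converted PBR"

# set slots via the rmtl parameter names and make the section visible in the UI
new_pbr.SetParameter("pbr-sheen", 0.4)
new_pbr.SetParameter("pbr-sheen-tint", 0.75)
new_pbr.SetParameter("pbr-show-ui-sheen", True)

sc.doc.RenderMaterials.Add(new_pbr)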

Speaking of the Physical Material GUI, I hope there is still room for improvement. Since the settings provide numeric input fields instead of sliders, I find them hard to edit. Especially if you position the mouse in an input field and scroll to change the value, it sometimes scrolls the panel content instead of the value. Can this please be improved? It does not happen with e.g. the numeric fields of the Custom Material.

A real showstopper of the Physical Material is the fact that bump / normal displacement displaces only bi-directionally. I asked about this some months ago, and it would be really helpful to displace in the normal direction (outward only). IMHO this is what an average user expects to happen by default instead of the current behaviour. Perfect would be a "Direction" dropdown which allows users to set this, e.g. Bi-Directional, Normal, +Z, -Z or a custom Vector, as it works in Blender.

Hopefully @andy reads this too.

The bug i’ve reported that the Physical Material displays reflections incorrectly when only a displacement is applied is still there and no bug report exists for it. A user is required to assign the same image as bump to get proper reflections. Is this by design and if so, what is the purpose of it ?

The density level when displacing SubD objects is ignored by the Physical Material shader in rendered or shaded viewports. It's stuck at the default division level, but I need much higher levels to use it.

Please, please fix these bugs in Rhino 7 and don’t postpone them to Rhino 8.

thanks,
c.

The UI sections can be set to visible with the code below.

# Set pbr ui sections visible
new_pbr.SetParameter("pbr-show-ui-basic-metalrough", True);
new_pbr.SetParameter("pbr-show-ui-subsurface", True);
new_pbr.SetParameter("pbr-show-ui-specularity", True);
new_pbr.SetParameter("pbr-show-ui-anisotropy", True);
new_pbr.SetParameter("pbr-show-ui-sheen", True);
new_pbr.SetParameter("pbr-show-ui-clearcoat", True);
new_pbr.SetParameter("pbr-show-ui-opacity", True);
new_pbr.SetParameter("pbr-show-ui-emission", True);
new_pbr.SetParameter("pbr-show-ui-bump-displacement", True);
new_pbr.SetParameter("pbr-show-ui-ambient-occlusion", True);

I modified @nathanletwory's code below:

import Rhino
import scriptcontext as sc

# create a bitmap texture pointing to an image file on disk
bmtex = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.BitmapTextureType)
bmtex.Filename = "C:\\Users\\Nathan\\Pictures\\uvtester.png"

simtex = bmtex.SimulatedTexture(Rhino.Render.RenderTexture.TextureGeneration.Allow)

#print(Rhino.Render.ContentUuids.PhysicallyBasedMaterialType)

# first create an empty PBR material
pbr_rm = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.PhysicallyBasedMaterialType)

# to get to a Rhino.DocObjects.PhysicallyBasedMaterial we need to simulate the
# render material first.
sim = pbr_rm.SimulatedMaterial(Rhino.Render.RenderTexture.TextureGeneration.Allow)

# from the simulated material we can get the Rhino.DocObjects.PhysicallyBasedMaterial
pbr = sim.PhysicallyBased

# now we have an instance of a type that has all the API you need to set the PBR
# properties. For simple glass we set color to white, opacity to 0 and opacity
# IOR to 1.52
pbr.Opacity = 0.0
pbr.OpacityIOR = 1.52
pbr.BaseColor = Rhino.Display.Color4f.White

pbr.SetTexture(simtex.Texture(), Rhino.DocObjects.TextureType.PBR_BaseColor)

# convert it back to RenderMaterial
new_pbr = Rhino.Render.RenderMaterial.FromMaterial(pbr.Material, sc.doc)
# Set a good name
new_pbr.Name = "My Own Glass"

# Set pbr ui sections visible
new_pbr.SetParameter("pbr-show-ui-basic-metalrough", True);
new_pbr.SetParameter("pbr-show-ui-subsurface", True);
new_pbr.SetParameter("pbr-show-ui-specularity", True);
new_pbr.SetParameter("pbr-show-ui-anisotropy", True);
new_pbr.SetParameter("pbr-show-ui-sheen", True);
new_pbr.SetParameter("pbr-show-ui-clearcoat", True);
new_pbr.SetParameter("pbr-show-ui-opacity", True);
new_pbr.SetParameter("pbr-show-ui-emission", True);
new_pbr.SetParameter("pbr-show-ui-bump-displacement", True);
new_pbr.SetParameter("pbr-show-ui-ambient-occlusion", True);

# Add it to the document
sc.doc.RenderMaterials.Add(new_pbr)

I am going to fix the YouTrack issue so that users do not need to turn on the UI sections manually.


Hi @maxsoder,

thanks for the example, it is basically what I ended up doing, except that I skipped the simulated material creation, which is not required when the slots are filled as they appear in the rmtl.

Can you comment on the bug I've reported regarding reflections of the PhysicalMaterial when only a displacement is applied?

Is there a chance to make the displacement go in one direction only instead of both directions?

_
c.

Sorry, I do not know about the displacement issue. Maybe @nathanletwory might know more?

There is currently no way to do that other than using mid-gray as your 0 level and using only the 0.5-1.0 range for positive displacement and 0.0-0.5 for negative displacement.
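
A small sketch of that remapping (added for illustration; it assumes Pillow is available and the paths are placeholders), which pushes an 8-bit grayscale height map into the 0.5-1.0 range so the displacement only goes outward:

from PIL import Image  # assumes Pillow is installed

src = Image.open("C:\\path\\to\\height.png").convert("L")
# 0 -> 128 (approximately the zero level), 255 -> 255 (full positive displacement)
remapped = src.point(lambda v: 128 + v // 2)
remapped.save("C:\\path\\to\\height_positive_only.png")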

Hi @nathanletwory,

unfortunately I cannot use the PhysicalMaterial displacement if the range cannot be controlled in the regular way, which is from zero (black) to one (white). I get the images from designers who have set up complex gradients in AI, sometimes as jpg or exr, and there is no way to adjust the image using the Output Adjustment settings in Rhino to remap to the desired range. There is a clamp feature which would probably work, but it unfortunately only allows entering values from 0 to 1. Have you ever tried to create a displacement with high detail in Rhino and compared it with what is possible in Blender? How would you control the displacement density e.g. on a SubD object?

Using a displacement range from 0.5-1.0 limits the possibilities for 8-bit grayscale images to 128 steps, which is way too small to do something useful. Have you ever compared the results you get from a 32-bit image (e.g. exr) with the results of the same image saved as an 8-bit image? I do get unexpected results and different displacement distances for both.

Which RGB color value would you recommend for zero displacement? Is it 127.0, 127.5 or 128.0? How would you set this zero displacement in a procedural material if you can only set integers in Rhino's color fields?

Attached is an example of the reflections being ignored when a displacement is applied. Can you explain why the reflections are not shown on the displaced geometry but on the undisplaced geometry instead?

Problem.3dm (14.2 MB)

(Please switch the perspective viewport to rendered display).

You can use the example file below to repeat another bug. The display engine does not apply the bounding box properly and clips the displaced geometry.

Displacement.3dm (14.2 MB)

Btw. if you take one of the above 2 files, you can repeat another bug. Please run _ClearAllMeshes on them and then use _SaveAs with "Save small" enabled. It does not save small; instead it saves a 14.2 MB file which includes the render meshes…
_
c.

Let's ask @DavidEranen about the reflections in rendered mode when displacement is added. But a quick guess would be that the displacement is done after the reflection has been calculated.

In Raytraced it all works just fine, though.

I know this is currently not very optimal for Rhino users. It would be great if all our procedural textures were able to take and produce HDRI images and give data in floating point accuracy, not clamped between 0.0…1.0.

These are two separate issues, but yes, I have done comparisons many times. I suppose you could bump the level on SubD objects for Raytraced. But be mindful that the amount of geometry can grow very quickly when you increase the SubD level.

Yes, and no, 8-bit images generally aren't going to cut it for detailed displacement.

I guess 127 would be a safe start.

Back when we were first working on creating the PBR material I tried to get us to add a mid-level input for displacement, but IIRC it was deemed to add too much complexity.

That is a known bug; it is already logged as RH-63314 Objects with PBR displacement clip in the view.

There is no bug here. The size is due to the images you have added to the file. Look in the Textures tab, remove all but the fBM and preview studio textures and save again: 2 MB.

thanks @nathanletwory, my reports are only about the rendered viewport. Imho it should display the reflections the same as in Raytraced.

Yes, at least the ones which use the grayscale values. By changing the clamp values in the output adjustment section I was hoping I could shift the current range from -0.5 / 0.5 to 0.0 / 1.0. But the field inputs are limited to 0 to 1.

Again, my reports are only about the rendered display mode. Using the Physical Material and displacement, the level set for SubD objects does not seem to have any effect on the displaced mesh geometry. However, if I change the level and use the command _ExtractRenderMesh, the changes made to the level can be seen. Imho this is a bug. If users change the Properties by enabling "Custom mesh settings", it works for NURBS objects. But nothing happens in realtime display modes; it only affects Raytraced. Why is that, and how can you make the displaced mesh finer if the source object is a SubD object using a Physical Material?

True, and different bit depths seem to generate different results. There are displacement errors at the borders using the fBm texture in my above example Displacement.3dm. I am not sure why this happens, as this is a procedural. Any clues?

127 will displace inward while 128 displaces outward. This gets noticeable if you displace by a larger distance in the viewport. A floating point value of 127.5 would be required to keep the displacement at its zero level, but it cannot be entered in the UI or with an 8-bit image:

Zero.3dm (162.6 KB)
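
For illustration, the arithmetic behind this, assuming the usual 8-bit normalization v / 255 and a zero level of 0.5:

# neither 127 nor 128 hits the 0.5 zero level exactly; only 127.5 would
for v in (127, 127.5, 128):
    print(v, "->", v / 255.0)
# 127   -> 0.498...  (slightly inward)
# 127.5 -> 0.5       (exact zero level, not representable in an 8-bit image)
# 128   -> 0.501...  (slightly outward)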

The problem with accuracy at the zero level might not be important for average users and low displacement distances. But we extract displaced meshes and merge them with surrounding undisplaced meshes, e.g. for 3D printing.

_
c.

That is the domain of @DavidEranen.

To be honest I am not sure; I find this not very clearly defined myself, either. We'll have to ask @andy for clarification regarding which settings affect SubD density for which display modes and use cases.

I have logged: RH-70023 Ability to control mid-level for displacement

Thank you @nathanletwory, this will help us a lot, especially when it can be set via code.

_
c.

Hi @clement,

Ideally it would all “just work”, and the reflections (the object’s normals) would be updated according to the new geometry. You can get most of the way there by assigning the same texture also to the Bump / Normal slot:
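
A minimal sketch of that workaround (added for illustration, following the earlier example in this thread; the TextureType slot names PBR_Displacement and Bump are assumptions):

import Rhino
import scriptcontext as sc

# bitmap texture used both for displacement and for bump/normal
bmtex = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.BitmapTextureType)
bmtex.Filename = "C:\\path\\to\\height.png"  # placeholder path
simtex = bmtex.SimulatedTexture(Rhino.Render.RenderTexture.TextureGeneration.Allow)

# PBR material via the simulated material, as in the earlier script
pbr_rm = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.PhysicallyBasedMaterialType)
sim = pbr_rm.SimulatedMaterial(Rhino.Render.RenderTexture.TextureGeneration.Allow)
pbr = sim.PhysicallyBased

# same texture in both slots so the shading normals follow the displaced shape
pbr.SetTexture(simtex.Texture(), Rhino.DocObjects.TextureType.PBR_Displacement)
pbr.SetTexture(simtex.Texture(), Rhino.DocObjects.TextureType.Bump)

new_pbr = Rhino.Render.RenderMaterial.FromMaterial(pbr.Material, sc.doc)
new_pbr.Name = "Displaced with matching bump"
sc.doc.RenderMaterials.Add(new_pbr)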

I created a YouTrack item for the issue: https://mcneel.myjetbrains.com/youtrack/issue/RH-70056/Display-Displacement-texture-doesnt-modify-object-normals

-David

Hi @DavidEranen, i’ve tried that but get the same as in your image. Actually if i copy the map from the displacement slot and paste it as instance into the bump/normal slot, i can see the proper normals for a nanosecond, then it changes to what you posted. Imho it should look like the picture @nathanletwory posted above using Raytraced.

Btw. can you explain the result when you use _ExtractRenderMesh on this sphere? I see that the mapping changes and the render mesh is undisplaced (after assigning e.g. a new custom material). How would you extract the displaced mesh from the file Problem.3dm above?

Great, thanks!

_
c.

Hi @clement,

The reason it looks good at first and the bump effect gets weaker as time goes on is the following issue: https://mcneel.myjetbrains.com/youtrack/issue/RH-65842/Rendered-bump-texture-resolution-affects-bump-strength. It's a combination of the fact that we are baking progressively higher resolution textures and the fact that the method we use to convert these textures into normal maps is sensitive to the resolution of the texture.

There’s no way to extract the displaced render mesh in this case, because the displacement happens during the rendering of the frame and therefore is part of the GPU shader code. Also, the displacement in this case is part of the material, not the object, so in that sense one could say it works as expected. We would need to do something a bit more fancy during ExtractRenderMesh to be able to bring the displacement along with the object itself. Of course, there’s always the “Displacement” object mesh modifier that you can use in the object properties. This one works differently, and will propagate through the ExtractRenderMesh command.

-David

Hi @DavidEranen, this is highly appreciated. I think the user should be allowed to get (extract) what he sees on screen.

I would really like to use this feature; unfortunately it is no comparison to the displacement of the PhysicalMaterial. To reproduce what I see, please create a new sphere and apply the fBm texture using the Displacement modifier in Properties. I get this user experience:

  1. Set the displacement modifier to "On"
  2. Assign the fBm texture
  3. Wait 2 minutes or more
  4. Fail: "Displacement failed (mesh memory limit hit) - using default render mesh"
  5. Change the mesh memory limit to 2048
  6. Wait 2 minutes*

The benefit of using the PhysicalMaterial is that I get instant updates when changing e.g. the displacement distance. That's why I've been so excited to use this new feature. Doing the displacement on the GPU (including tessellation) has a lot of potential; unfortunately the current implementation limits all the benefits.

*I will make a separate topic about what happens after Step 6.

_
c.

Hi guys and @nathanletwory,
I have adapted and modified this slightly for my use and wonder how I can assign this new material to an object through Python. My issue is that sc.doc.RenderMaterials.Add(new_pbr) only returns True.
And if I run the same script multiple times it just adds more materials with (1), (2) etc. added to their names, so I can't use the name to identify the material either. Surely there is something simple I fail to understand, so any help would be great.

import rhinoscriptsyntax as rs
import scriptcontext as sc

import System
import System.Collections.Generic
import Rhino
import os

def makeMaterial(filepathname):
    # split the filepathname into path and name, then split out the name without prefix
    imageName = os.path.splitext(os.path.basename(filepathname))[0]
    bmtex = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.BitmapTextureType)
    bmtex.Filename = filepathname
    simtex = bmtex.SimulatedTexture(Rhino.Render.RenderTexture.TextureGeneration.Allow)
    # first create an empty PBR material
    pbr_rm = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.PhysicallyBasedMaterialType)
    # to get to a Rhino.DocObjects.PhysicallyBasedMaterial we need to simulate the render material first.
    sim = pbr_rm.SimulatedMaterial(Rhino.Render.RenderTexture.TextureGeneration.Allow)
    # from the simulated material we can get the Rhino.DocObjects.PhysicallyBasedMaterial
    pbr = sim.PhysicallyBased
    # now we have an instance of a type that has all the API you need to set the PBR properties. 
    pbr.BaseColor = Rhino.Display.Color4f.White
    pbr.SetTexture(simtex.Texture(), Rhino.DocObjects.TextureType.PBR_BaseColor)
    # convert it back to RenderMaterial
    new_pbr = Rhino.Render.RenderMaterial.FromMaterial(pbr.Material, sc.doc)
    # Set a good name
    new_pbr.Name = imageName
    # Add it to the document
    result = sc.doc.RenderMaterials.Add(new_pbr)
    print(result)

filepathname = "C:\\path\\image.png"
makeMaterial(filepathname)

Hi @Holo, just assign it to the RenderMaterial property of the RhinoObject:

obj_id = rs.GetObject("Object", 4+8+16+32, True, False)
if obj_id:
    rh_obj = rs.coercerhinoobject(obj_id, True, True)
    rh_obj.RenderMaterial = new_pbr
    rh_obj.CommitChanges()
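
Tying the two snippets together, a small sketch (added for illustration; it assumes makeMaterial is changed to return new_pbr after adding it to the document, which is not part of the original posts):

import rhinoscriptsyntax as rs

# assumes makeMaterial now ends with:
#     sc.doc.RenderMaterials.Add(new_pbr)
#     return new_pbr
new_pbr = makeMaterial("C:\\path\\image.png")

obj_id = rs.GetObject("Object", 4+8+16+32, True, False)
if obj_id and new_pbr:
    rh_obj = rs.coercerhinoobject(obj_id, True, True)
    rh_obj.RenderMaterial = new_pbr  # assign directly, no name lookup needed
    rh_obj.CommitChanges()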

_
c.

So… this is all confusing…
I don’t need to add the material to RenderMaterials and can coerce a Rhinoobject that I can add the material to and then all is good? Geez this stuff is complicated. I just want to add an image to a terrain :slight_smile: :slight_smile:

I’ll try it out, but I know I will forget all about it unless I understand this stuff. (I will forget it anyway, but it will make it easier to fix any issues down the road though)