How to create a Physically Based Material with Python

Hi @maxsoder,

thanks for the example, it is basically what I ended up doing, except that I skipped the simulated material creation, which is not required when the slots are filled as they appear in the rmtl.

Can you comment on the bug I’ve reported regarding reflections of the PhysicalMaterial when only a displacement is applied?

Is there a chance to make the displacement go in one direction only instead of both sides?

_
c.

Sorry, I do not know about the displacement issue. Maybe @nathanletwory might know more?

There is currently no way to do that other than using mid-gray as your zero level and using only the 0.5–1.0 range for positive displacement and the 0.0–0.5 range for negative displacement.
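That workaround can be expressed as a tiny remap, a minimal sketch assuming the usual value/255 normalization (the function name is made up, not a Rhino API):

```python
def remap_positive(value_8bit):
    """Map an 8-bit gray value (0-255) into the 0.5-1.0 half-range,
    so the whole image produces only positive displacement."""
    return 0.5 + 0.5 * (value_8bit / 255.0)

print(remap_positive(0))    # black -> 0.5, the zero level
print(remap_positive(255))  # white -> 1.0, full positive displacement
```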

Hi @nathanletwory,

unfortunately I cannot use the PhysicalMaterial displacement if the range cannot be controlled in the regular way, which is from zero (black) to one (white). I get the images from designers who have set up complex gradients in AI, sometimes as JPG or EXR, and there is no way to adjust the image using the Output Adjustment settings in Rhino to remap to the desired range. There is a clamp feature which would probably work, but it unfortunately only allows entering values from 0 to 1. Have you ever tried to create a displacement with high detail in Rhino and compared it with what is possible in Blender? How would you control the displacement density, e.g. on a SubD object?

Using a displacement range from 0.5–1.0 limits the possibilities for 8-bit grayscale images to 128 steps, which is way too small to do anything useful. Have you ever compared the results you get from a 32-bit image (e.g. EXR) with the results of the same image saved as an 8-bit image? I get unexpected results and different displacement distances for the two.
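The 128-step claim is easy to verify with quick arithmetic (illustrative only):

```python
# 8-bit gray values whose normalized value lands in the upper
# half-range 0.5..1.0 used for positive-only displacement
upper_half_levels = [v for v in range(256) if v / 255.0 >= 0.5]
print(len(upper_half_levels))  # 128 distinct displacement steps
```

A 32-bit float EXR, by contrast, keeps essentially continuous gradations over the same range.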

Which RGB color value would you recommend for zero displacement? Is it 127.0, 127.5 or 128.0? How would you set this zero displacement in a procedural material if you can only enter integers in Rhino’s color fields?

Attached is an example of the reflections being ignored when a displacement is applied. Can you explain why the reflections are shown on the undisplaced geometry instead of the displaced geometry?

Problem.3dm (14.2 MB)

(Please switch the perspective viewport to rendered display).

You can use the example file below to reproduce another bug. The display engine does not apply the bounding box properly and clips the displaced geometry.

Displacement.3dm (14.2 MB)

Btw, if you take one of the above two files, you can reproduce another bug. Please run _ClearAllMeshes on them and then use _SaveAs with “Save small” enabled. It does not save small; instead it saves a 14.2 MB file which includes the render meshes…
_
c.

Let’s ask @DavidEranen about the reflection in rendered mode when displacement is added. My quick guess would be that the displacement is done after the reflection has been calculated.

In Raytraced it all works just fine, though.

I know this is currently not very optimal for Rhino users. It would be great if all our procedural textures were able to take and produce HDR images and deliver data in floating-point accuracy, not clamped to 0.0…1.0.

These are two separate issues, but yes, I have done comparisons many times. I suppose you could bump the subdivision level on SubD objects for Raytraced, but be mindful that the amount of geometry can grow very quickly when you increase the SubD level.
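The growth warning can be quantified with a back-of-the-envelope estimate: each Catmull-Clark subdivision level roughly quadruples the quad count (a rough model, not Rhino’s exact mesher):

```python
def quad_count(base_quads, level):
    # Catmull-Clark: every quad splits into 4 per subdivision level
    return base_quads * 4 ** level

# a modest 1000-quad SubD explodes within a few levels
for level in range(5):
    print(level, quad_count(1000, level))
```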

Yes and no; 8-bit images generally aren’t going to cut it for detailed displacement.

I guess 127 would be a safe start.

Back when we were first working on the PBR material I tried to get us to add a mid-level input for displacement, but IIRC it was deemed to add too much complexity.

That is a known bug; it is already logged as RH-63314 Objects with PBR displacement clip in the view.

There is no bug here. The size is due to the images you have added to the file. See the Textures tab: remove all but the fBm and preview studio textures and save again; the file is then 2 MB.

Thanks @nathanletwory, my reports are only about the rendered viewport. Imho it should display the reflections the same as Raytraced does.

Yes, at least the ones which use grayscale values. By changing the clamp values in the Output Adjustment section I was hoping I could shift the current range from -0.5/0.5 to 0.0/1.0, but the field inputs are limited to 0 to 1.

Again, my reports are only about the rendered display mode. Using the Physical Material with displacement, the level set for SubD objects does not seem to have any effect on the displaced mesh geometry. However, if I change the level and use the command _ExtractRenderMesh, the changes made to the level can be seen. Imho this is a bug. If users change the Properties by enabling “Custom mesh settings”, it works for NURBS objects, but nothing happens in realtime display modes; it only affects Raytraced. Why is that, and how can you make the displaced mesh finer if the source object is a SubD object using a Physical Material?

True, and different bit depths seem to generate different results. There are displacement errors at the borders using the fBm shader in my above example Displacement.3dm. I am not sure why this happens, as it is a procedural. Any clues?

127 will displace inward while 128 displaces outward. This gets noticeable if you displace by a larger distance in the viewport. A floating-point value of 127.5 would be required to keep the displacement at its zero level, but it cannot be entered in the UI or stored in an 8-bit image:
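The off-by-half problem follows directly from 8-bit quantization, assuming the usual value/255 normalization:

```python
inward  = 127 / 255.0    # ~0.49804 -> slightly below mid-level, displaces inward
outward = 128 / 255.0    # ~0.50196 -> slightly above mid-level, displaces outward
zero    = 127.5 / 255.0  # exactly 0.5 -> true zero level, not storable in 8 bit
print(inward, outward, zero)
```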

Zero.3dm (162.6 KB)

The accuracy problem at the zero level might not matter to average users at low displacement distances, but we extract displaced meshes and merge them with surrounding undisplaced meshes, e.g. for 3D printing.

_
c.

That is the domain of @DavidEranen.

To be honest I am not sure; I find this not very clearly defined myself, either. We’ll have to ask @andy for clarification regarding which settings affect SubD density for which display modes and use cases.

I have logged: RH-70023 Ability to control mid-level for displacement

Thank you @nathanletwory, this will help us a lot, especially if it can be set via code.

_
c.

Hi @clement,

Ideally it would all “just work”, and the reflections (the object’s normals) would be updated according to the new geometry. You can get most of the way there by assigning the same texture to the Bump / Normal slot as well:

I created a YouTrack item for the issue: https://mcneel.myjetbrains.com/youtrack/issue/RH-70056/Display-Displacement-texture-doesnt-modify-object-normals

-David

Hi @DavidEranen, I’ve tried that but get the same result as in your image. Actually, if I copy the map from the displacement slot and paste it as an instance into the Bump/Normal slot, I can see the proper normals for a nanosecond, then it changes to what you posted. Imho it should look like the picture @nathanletwory posted above using Raytraced.

Btw, can you explain the result when you use _ExtractRenderMesh on this sphere? I see that the mapping changes and the render mesh is undisplaced (after assigning e.g. a new custom material). How would you extract the displaced mesh in the file Problem.3dm from above?

Great, thanks !

_
c.

Hi @clement,

The reason it looks good at first and the bump effect gets weaker over time is the following issue: https://mcneel.myjetbrains.com/youtrack/issue/RH-65842/Rendered-bump-texture-resolution-affects-bump-strength. It’s a combination of the fact that we are baking progressively higher-resolution textures and that the method we use to convert these textures into normal maps is sensitive to the resolution of the texture.
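The resolution sensitivity can be illustrated with a toy height-to-normal conversion: a naive per-texel finite difference, taken without accounting for texel spacing, reports a different gradient for the same ramp at different resolutions (a simplified sketch, not Rhino’s actual converter):

```python
def per_texel_gradient(heights):
    # naive forward difference between adjacent texels, ignoring
    # how wide a texel actually is in texture space
    return [b - a for a, b in zip(heights, heights[1:])]

# the same 0-to-1 linear ramp sampled at two resolutions
low_res  = [i / 4.0 for i in range(5)]   # 5 texels
high_res = [i / 8.0 for i in range(9)]   # 9 texels

grad_low  = per_texel_gradient(low_res)[0]   # 0.25
grad_high = per_texel_gradient(high_res)[0]  # 0.125
print(grad_low, grad_high)  # apparent bump strength halves at double resolution
```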

There’s no way to extract the displaced render mesh in this case, because the displacement happens during the rendering of the frame and is therefore part of the GPU shader code. Also, the displacement in this case is part of the material, not the object, so in that sense one could say it works as expected. We would need to do something a bit fancier during ExtractRenderMesh to be able to bring the displacement along with the object itself. Of course, there is always the “Displacement” object mesh modifier in the object properties. It works differently and will propagate through the ExtractRenderMesh command.
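For illustration, here is what pushing a mesh vertex along its normal by the sampled height would look like on the CPU; this is only a sketch of the underlying math, not how Rhino’s GPU shader or the Displacement modifier is implemented:

```python
def displace_vertex(vertex, normal, height, distance, mid_level=0.5):
    """Offset a vertex along its unit normal by the signed height
    (height above mid_level pushes outward, below pulls inward)."""
    offset = (height - mid_level) * distance
    return tuple(v + n * offset for v, n in zip(vertex, normal))

# mid-gray height leaves the vertex untouched; white pushes it outward
print(displace_vertex((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5, 2.0))  # (0.0, 0.0, 0.0)
print(displace_vertex((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 1.0, 2.0))  # (0.0, 0.0, 1.0)
```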

-David

Hi @DavidEranen, this is highly appreciated. I think users should be allowed to get (extract) what they see on screen.

I would really like to use this feature; unfortunately it doesn’t compare to the displacement of the PhysicalMaterial. To reproduce what I see, please create a new sphere and apply the fBm texture using the Displacement modifier in Properties. This is the user experience I get:

  1. Set the displacement modifier to “On”
  2. Assign the fbm texture
  3. Wait 2 minutes or more
  4. Fail: “Displacement failed (mesh memory limit hit) - using default render mesh”
  5. Change Mesh memory limit to 2048
  6. Wait 2 minutes*

The benefit of using the PhysicalMaterial is that I get instant updates when changing e.g. the displacement distance. That’s why I’ve been so excited to use this new feature. Doing the displacement on the GPU (including tessellation) has a lot of potential; unfortunately the current implementation negates all the benefits.

*I will make a separate topic about what happens after Step 6.

_
c.

Hi guys and @nathanletwory,
I have adopted and modified this slightly for my use, and wonder how I can assign this new material to an object through Python. My issue is that sc.doc.RenderMaterials.Add(new_pbr) only returns True.
And if I run the same script multiple times, it just adds more materials with (1), (2) etc. appended to their names, so I can’t use the name to identify the material either. Surely there is something simple I fail to understand, so any help would be great.

import rhinoscriptsyntax as rs
import scriptcontext as sc

import System
import System.Collections.Generic
import Rhino
import os

def makeMaterial(filepathname):
    # extract the file name without path or extension
    imageName = os.path.splitext(os.path.basename(filepathname))[0]
    bmtex = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.BitmapTextureType)
    bmtex.Filename = filepathname
    simtex = bmtex.SimulatedTexture(Rhino.Render.RenderTexture.TextureGeneration.Allow)
    # first create an empty PBR material
    pbr_rm = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.PhysicallyBasedMaterialType)
    # to get to a Rhino.DocObjects.PhysicallyBasedMaterial we need to simulate the render material first.
    sim = pbr_rm.SimulatedMaterial(Rhino.Render.RenderTexture.TextureGeneration.Allow)
    # from the simulated material we can get the Rhino.DocObjects.PhysicallyBasedMaterial
    pbr = sim.PhysicallyBased
    # now we have an instance of a type that has all the API you need to set the PBR properties
    pbr.BaseColor = Rhino.Display.Color4f.White
    pbr.SetTexture(simtex.Texture(), Rhino.DocObjects.TextureType.PBR_BaseColor)
    # convert it back to RenderMaterial
    new_pbr = Rhino.Render.RenderMaterial.FromMaterial(pbr.Material, sc.doc)
    # Set a good name
    new_pbr.Name = imageName
    # Add it to the document
    result = sc.doc.RenderMaterials.Add(new_pbr)
    print(result)

filepathname = "C:\\path\\image.png"
makeMaterial(filepathname)

Hi @Holo, just set it to the RenderMaterial property of the RhinoObject:

# new_pbr is the RenderMaterial created in makeMaterial above
obj_id = rs.GetObject("Object", 4+8+16+32, True, False)
if obj_id:
    rh_obj = rs.coercerhinoobject(obj_id, True, True)
    rh_obj.RenderMaterial = new_pbr
    rh_obj.CommitChanges()

_
c.

So… this is all confusing…
I don’t need to add the material to RenderMaterials, and I can coerce a RhinoObject that I can assign the material to, and then all is good? Geez, this stuff is complicated. I just want to add an image to a terrain :slight_smile: :slight_smile:

I’ll try it out, but I know I will forget all about it unless I understand this stuff. (I will forget it anyway, but it will make it easier to fix any issues down the road.)

I totally agree with this. We need some updated rs-like material handling methods.

Yes. I have a script which creates a render material and just assigns it to an object. This causes the render material to appear in the material browser.
_
c.


Thanks, that did it!
I did have to add it to the document with sc.doc.RenderMaterials.Add(new_pbr) though, but then I could use the fix you sent me (rh_obj.RenderMaterial = new_pbr).

So now I can download a georeferenced terrain and have it textured with an aerial from all over the country by a click of a button :smiley:
Christmas came early!

Now I save the aerial that is automatically downloaded, but could I just assign the downloaded bitmap directly, without having to save it as a PNG first?


Hi @Holo, I guess this is required if you want to save the Rhino file with the assigned texture residing somewhere on your hard drive. Is the downloaded bitmap in a different format which you cannot assign directly?

_
c.

I guess it would work the same way as pasting a clipboard bitmap into Rhino? That bitmap isn’t stored on the drive, is it? I just download a bitmap from a WMS and then store it as a PNG (or whatever other format I would like). I do the same for the terrain from another WMS, but then as a 32-bit TIFF, and then read that in with its transformation matrix. It was complicated, but now it works, so now off to the bug tracking and fixing.
Thanks for the help!

In this script https://jesterking.github.io/rhipy/create_specific_rendermaterial.html I show directly assigning a render material to the object without going through the render material table. I guess you used https://jesterking.github.io/rhipy/create_pbr_material_with_textures.html as your basis; I wrote that one to show you can add render materials without assigning them to objects, which is something you can’t do with the old-style materials.

Anyway, you figured it out :slight_smile:

I hope that the scripts at https://jesterking.github.io/rhipy/ give you more insight into how these things work.

Yes, there are some hoops to jump through with materials and all render content, unfortunately.
