Unfortunately I cannot use the PhysicalMaterial displacement if the range cannot be controlled in the regular way, which is from zero (black) to one (white). I get the images from designers who have set up complex gradients in AI, sometimes as jpg or exr, and there is no way to adjust the image using the Output Adjustment settings in Rhino to remap it to the desired range. There is a clamp feature which would probably work, but unfortunately it only allows entering values from 0 to 1. Have you ever tried to create a displacement with high detail in Rhino and compared it with what is possible in Blender? How would you control the displacement density, e.g. on a SubD object?
Using a displacement range from 0.5-1.0 limits the possibilities for 8 bit grayscale images to 128 steps, which is way too small to do anything useful. Have you ever compared the results you get from a 32 bit image (e.g. exr) with the results of the same image saved as an 8 bit image? I get unexpected results and different displacement distances for the two.
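To illustrate the step-count argument, here is a small numeric sketch. The distance and the linear gray-to-height mapping are my assumptions, not anything taken from the Rhino docs:

```python
# Quantization of an 8-bit displacement map restricted to the 0.5-1.0 range.
# Assumption: gray values map linearly to displacement height.
levels = 256 // 2         # only half of the 256 gray levels remain usable
distance = 10.0           # hypothetical displacement distance in mm
step = distance / levels  # smallest height difference the map can encode
print(levels, step)       # → 128 0.078125
```

So a 10 mm displacement would be terraced into bands of roughly 0.08 mm, which is exactly the banding you see with 8 bit sources.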
Which RGB color value would you recommend for zero displacement? Is it 127.0, 127.5 or 128.0? How would you set this zero displacement in a procedural material if you can only enter integers in Rhino’s color fields?
Attached is an example of the reflections being ignored when a displacement is applied. Can you explain why the reflections are not shown on the displaced geometry but on the undisplaced geometry instead?
Btw. if you take one of the above 2 files, you can reproduce another bug. Please run _ClearAllMeshes on them, then use _SaveAs and enable “Save small”. It does not save small; instead it saves a 14.2 MB file which includes the render meshes…
I know this is currently not very optimal for Rhino users. It would be great if all our procedural textures were able to take and produce HDR images and deliver data in floating point accuracy, not clamped between 0.0…1.0.
These are two separate issues, but yes, I have done comparisons many times. I suppose you could bump the level on SubD objects for Raytraced. But be mindful that the amount of geometry can grow very quickly when you increase the SubD level.
Yes and no: 8 bit images generally aren’t going to cut it for detailed displacement.
I guess 127 would be a safe start.
Back when we were first working on creating the PBR material I tried to get us to add a mid-level input for displacement, but IIRC it was deemed to add too much complexity.
That is a known bug; it is already logged: RH-63314 Objects with PBR displacement clip in the view.
There is no bug here. The size is due to the images you have added to the file. See the Textures tab: remove all but the fBM and preview studio textures and save again, and you get 2 MB.
Thanks @nathanletwory, my reports are only about the Rendered viewport. Imho it should display the reflections the same as in Raytraced.
Yes, at least the ones which use the grayscale values. By changing the clamp values in the Output Adjustment section I was hoping I could remap the current range from -0.5 / 0.5 to 0.0 / 1.0. But the field inputs are limited to 0 to 1.
Again, my reports are only about the Rendered display mode. Using the Physical Material and displacement, the level set for SubD objects does not seem to have any effect on the displaced mesh geometry. However, if I change the level and use the command _ExtractRenderMesh, the changes made to the level can be seen. Imho this is a bug. If users change the Properties by enabling “Custom mesh settings”, it works for NURBS objects. But nothing happens in realtime display modes; it only affects Raytraced. Why is that, and how can you make the displaced mesh finer if the source object is a SubD object using a Physical Material?
True, and different bit depths seem to generate different results. There are displacement errors at the borders using the fBM shader in my example Displacement.3dm above. I am not sure why this happens, as this is a procedural. Any clues?
127 will displace inward while 128 displaces outward. This gets noticeable if you displace by a larger distance in the viewport. A floating point value of 127.5 would be required to keep the displacement at its zero level, but it cannot be entered in the UI or stored in an 8 bit image:
The problem with accuracy at the zero level might not be important for average users and low displacement distances. But we extract displaced meshes and merge them with surrounding undisplaced meshes, e.g. for 3d printing.
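Under the same assumed linear mapping, the seam error when merging is easy to quantify: with a mid-gray of 127 instead of an exact 0.5, the “zero” border of the displaced patch sits off the undisplaced mesh by distance × (127/255 − 0.5). The 10 mm distance here is hypothetical:

```python
# Gap between a displaced patch (mid-gray 127) and the surrounding
# undisplaced mesh, for a hypothetical displacement distance of 10 mm.
distance = 10.0                         # mm, hypothetical
offset = distance * (127 / 255.0 - 0.5)
print(round(offset, 4))                 # small but nonzero inward shift
```

For 3d printing even such a sub-0.02 mm step along the seam can show up as a visible crease.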
To be honest I am not sure; I don’t find this very clearly defined myself, either. We’ll have to ask @andy for clarification regarding which settings affect SubD density for which display modes and use cases.
I have logged: RH-70023 Ability to control mid-level for displacement
Ideally it would all “just work”, and the reflections (the object’s normals) would be updated according to the new geometry. You can get most of the way there by assigning the same texture also to the Bump / Normal slot:
Hi @DavidEranen, I’ve tried that but get the same as in your image. Actually, if I copy the map from the displacement slot and paste it as an instance into the bump/normal slot, I can see the proper normals for a nanosecond; then it changes to what you posted. Imho it should look like the picture @nathanletwory posted above using Raytraced.
Btw. can you explain the result when you use _ExtractRenderMesh on this sphere? I see that the mapping changes and the render mesh is undisplaced (after assigning e.g. a new custom material). How would you extract the displaced mesh in the file Problem.3dm from above?
There’s no way to extract the displaced render mesh in this case, because the displacement happens during the rendering of the frame and therefore is part of the GPU shader code. Also, the displacement in this case is part of the material, not the object, so in that sense one could say it works as expected. We would need to do something a bit more fancy during ExtractRenderMesh to be able to bring the displacement along with the object itself. Of course, there’s always the “Displacement” object mesh modifier that you can use in the object properties. This one works differently, and will propagate through the ExtractRenderMesh command.
Hi @DavidEranen, this is highly appreciated. I think the user should be allowed to get (extract) what he sees on screen.
I would really like to use this feature; unfortunately it does not compare to the displacement of the PhysicalMaterial. To reproduce what I see, please create a new sphere and apply the fBM texture using the Displacement modifier in Properties. I get this user experience:
The benefit of using the PhysicalMaterial is that I get instant updates when changing e.g. the displacement distance. That’s why I’ve been so excited to use this new feature. Doing the displacement on the GPU (including tessellation) has a lot of potential; unfortunately the current implementation limits all the benefits.
*I will make a separate topic about what happens after Step 6.
Hi guys and @nathanletwory ,
I have adopted and modified this slightly for my use and wonder how I can assign this new material to an object through Python. My issue is that sc.doc.RenderMaterials.Add(new_pbr) only returns True.
And if I run the same script multiple times, it just adds more materials with (1), (2) etc. added to their names, so I can’t use the name to identify the material either. Surely there is something simple I fail to understand, so any help would be great.
import os
import scriptcontext as sc
import Rhino

filepathname = "C:\\path\\image.png"
# split the file name out of the path, then drop the extension
imageName = os.path.splitext(os.path.basename(filepathname))[0]

bmtex = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.BitmapTextureType)
bmtex.Filename = filepathname
simtex = bmtex.SimulatedTexture(Rhino.Render.RenderTexture.TextureGeneration.Allow)

# first create an empty PBR material
pbr_rm = Rhino.Render.RenderContentType.NewContentFromTypeId(Rhino.Render.ContentUuids.PhysicallyBasedMaterialType)

# to get to a Rhino.DocObjects.PhysicallyBasedMaterial we need to simulate the render material first
sim = pbr_rm.SimulatedMaterial(Rhino.Render.RenderTexture.TextureGeneration.Allow)

# from the simulated material we can get the Rhino.DocObjects.PhysicallyBasedMaterial
pbr = sim.PhysicallyBased

# now we have an instance of a type that has all the API you need to set the PBR properties
pbr.BaseColor = Rhino.Display.Color4f.White

# convert it back to a RenderMaterial
new_pbr = Rhino.Render.RenderMaterial.FromMaterial(pbr.Material, sc.doc)
# set a good name
new_pbr.Name = imageName
# add it to the document
result = sc.doc.RenderMaterials.Add(new_pbr)
So… this is all confusing…
I don’t need to add the material to RenderMaterials and can coerce a RhinoObject that I can add the material to, and then all is good? Geez, this stuff is complicated. I just want to add an image to a terrain.
I’ll try it out, but I know I will forget all about it unless I understand this stuff. (I will forget it anyway, but it will make it easier to fix any issues down the road though)
Hi @Holo, I guess this is required if you want to save the Rhino file with the assigned texture residing somewhere on your harddrive. Is the downloaded bitmap in a different format which you cannot assign directly?
I guess it would work the same way as pasting a clipboard bitmap into Rhino? That bitmap isn’t stored on the drive, is it? I just download a bitmap from a WMS and then store it as a png (or whatever other format I would like). I do the same for the terrain from another WMS, but then as a 32 bit tiff, and then read that in with its transformation matrix. It was complicated, but now it works, so now off to the bug tracking and fixing.
Thanks for the help!