Same layer, same material - Different appearance

These solids are on the same layer and have their material set by layer. As far as I can tell, there are no differences between them (this is part of a much larger file where there are multiple cases of this “duality”).
[image]
I suspected a slight torsion in one of the surfaces, but when the solids are exploded the surfaces look exactly the same.

[image]

Here is the file:
240626 SameLayerDifferentRender.3dm (211.6 KB)

Thanks, N

Hi @nsgma ,

It looks like this is due to the Anisotropy channel in the applied material. The Sun is highlighting the pillars differently depending on the view as a result of this material feature. Anisotropy can be finicky to get right and involves many variables, such as the geometry, lights, and camera view. I believe this feature is working correctly here, but you may choose to reduce the amount of it or remove it.

Hi Brian,
That is simply not correct. If you rotate around, they will change color due to the anisotropy, and at some angles they will look similar, but as I explained, this is part of a larger model where the behavior is inconsistent with the point of view.
That was why I copied them 1-2-1-2, and you can see that it is not due to the point of view.
Actually, if you switch their places, the darker one stays darker and the lighter one stays lighter.


It apparently is due to the mapping being by surface, the anisotropy being governed by the UV space, and the UV spaces being different in the two solids.
What this actually means is not really clear to me :rofl:
But it seems to go away if you apply a box mapping, which would work in this case.
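For reference, something like this Python sketch (run in Rhino's script editor; the box size and mapping channel 1 here are just assumptions to adjust) applies one shared box mapping to the selected solids, so the anisotropy is evaluated in the same UV space on each:

```python
import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs

# Pick the solids that should render the same way
ids = rs.GetObjects("Select solids to receive the same box mapping", rs.filter.polysurface)
if ids:
    # One shared box mapping: same plane and same size for every object
    plane = Rhino.Geometry.Plane.WorldXY
    size = Rhino.Geometry.Interval(0.0, 10.0)  # adjust to roughly match the pillar dimensions
    mapping = Rhino.Render.TextureMapping.CreateBoxMapping(plane, size, size, size, True)
    for obj_id in ids:
        obj = sc.doc.Objects.FindId(obj_id)
        obj.SetTextureMapping(1, mapping)  # channel 1 is the default mapping channel
    sc.doc.Views.Redraw()
```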

However, @nathanletwory [or someone who understands what that means]:
Is there a way to unify/match UV spaces?
Why do the exploded surfaces seem to match, but when you rebuild the solid it goes back?
Thanks, N

I do not know how surface parametrization works between the separate surfaces and the joined whole. But as said, applying the same mapping to each object in question will ensure your UV spaces match.
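If you want to see what each object currently uses, a quick Python sketch along these lines (assuming RhinoCommon from the script editor and the default mapping channel 1) reports it; no explicit mapping means the object is still relying on its own surface parameterization, which is where the mismatch comes from:

```python
import scriptcontext as sc
import rhinoscriptsyntax as rs

ids = rs.GetObjects("Select objects to inspect")
if ids:
    for obj_id in ids:
        obj = sc.doc.Objects.FindId(obj_id)
        mapping = obj.GetTextureMapping(1)  # channel 1, the default mapping channel
        # If nothing is assigned on this channel, the object falls back to
        # its own surface parameterization.
        kind = mapping.MappingType if mapping else "surface parameterization (default)"
        print("{}: {}".format(obj_id, kind))
```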

For each object you have, even if it's a NURBS object, Rhino generates a render mesh for it. Those meshes tend to be quite a bit different, and small imperceptible differences, like joined geometry, might already make them look different in renderings.
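A small sketch like this (assuming a Python script using RhinoCommon; the render meshes must already exist, e.g. after a shaded or rendered view) can show that the render meshes of two seemingly identical solids differ:

```python
import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs

ids = rs.GetObjects("Select objects to compare render meshes")
if ids:
    for obj_id in ids:
        obj = sc.doc.Objects.FindId(obj_id)
        # Render meshes Rhino has generated for this object (may be empty if not yet meshed)
        meshes = obj.GetMeshes(Rhino.Geometry.MeshType.Render)
        verts = sum(m.Vertices.Count for m in meshes)
        faces = sum(m.Faces.Count for m in meshes)
        print("{}: {} render mesh(es), {} vertices, {} faces".format(obj_id, len(meshes), verts, faces))
```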

UPDATE:
Three seconds after making this post I noticed that it depends on the surface normal; it appears that the transparency isn't applied to both sides of the plane.

I have the same exact problem, but it's more apparent:


I'm sorry, but I'm somewhat new to Rhino and I didn't understand a thing from the previous discussion. Can someone clarify how to solve this? Thanks… :sweat_smile: :cry:

Turn off all the environments, set the background to solid for instance, and turn off the custom environment in the render properties. Maybe it is reflecting these and naturally looks different than every other geometry.

Edit: after checking thoroughly, I believe it indeed may be the anisotropy.