Image Sampler control domain

Hello all,
I am currently trying to implement an image sampler in a project.
The starting point is a curved (untrimmed) surface that is subdivided into points in the u and v directions. These points should then be moved upwards in the z-direction based on a brightness distribution.
Based on information from this forum I managed to put together two different approaches, but they differ slightly in how the image affects the points.

I would like to have more control over how exactly the image is arranged on the surface. My image sampler domain uses parametrised u,v values from 0 to 1 - does this mean the image is stretched in u and v to match the surface?
Ultimately I want to manipulate the surface with a specific brightness distribution, but at the moment arranging the grayscale values beforehand is more a matter of feeling, or trial and error.
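To make my mental model explicit, this is roughly how I imagine the sampling works (a rough Python sketch of the idea, not the Image Sampler's actual internals; the function names are made up):

```
def sample_brightness(image, u, v):
    """Map normalized (u, v) in [0, 1] onto the full pixel grid.
    'image' is a list of rows of brightness values in [0, 1]."""
    h, w = len(image), len(image[0])
    px = int(round(u * (w - 1)))   # u spans the whole image width...
    py = int(round(v * (h - 1)))   # ...and v the whole height, so the image
    return image[py][px]           # is stretched to fit the surface.

def displaced_z(image, u, v, max_height):
    """Brightness 0..1 scaled to a z-displacement of 0..max_height."""
    return sample_brightness(image, u, v) * max_height

# Example: a tiny 2x3 "image"; the corner (u=1, v=1) always lands on the
# last pixel, whatever the surface's aspect ratio is.
img = [[0.0, 0.5, 1.0],
       [0.2, 0.4, 0.8]]
print(displaced_z(img, 1.0, 1.0, 10.0))  # -> 8.0
```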

Image Sampler Domain.gh (168.6 KB)

Looks like in version 1, the shorter side of your surface is the “U” direction (which means the “X” domain of the image sampler) and the longer side is the “V” direction (which means the “Y” domain of the image sampler).
But if you feed the points from version 1 through Surface Closest Point like you did in version 2, the U & V directions may be completely opposite for the Image Sampler… If I were you, I’d stick with version 1, and if you’d like to swap your surface’s U & V directions, try the Lunchbox plugin’s Reverse Surface Directions or the Pufferfish plugin’s Swap Surface Directions.
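Just to illustrate why the two versions can look “opposite” (a toy Python sketch, nothing Grasshopper-specific): swapping u and v before sampling transposes how the image lies on the surface.

```
# The same non-square "image" read with (u, v) vs (v, u).
img = [[1, 2, 3],
       [4, 5, 6]]          # 2 rows (Y) x 3 columns (X)

def sample_uv(u, v):
    """Sampler X domain driven by u, Y domain by v."""
    return img[int(v * (len(img) - 1))][int(u * (len(img[0]) - 1))]

def sample_vu(u, v):
    """Same image, but u and v swapped before sampling."""
    return sample_uv(v, u)

print(sample_uv(1.0, 0.0))  # -> 3 : image X runs along the surface's U
print(sample_vu(1.0, 0.0))  # -> 4 : image X now runs along the surface's V
```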

It looks like you have a good idea of what you’re doing. Bravo!

A few comments… Your ‘Version 2’ looks a little too clever and introduces some distortion; I’d skip it.

Suggestions for improving ‘Version 1’:

  1. You don’t need Srf CP since you already have ‘uvP’ from SDivide.
  2. Using Bounds as the ‘S’ (Source) input for ReMap can distort the Z values, making the peaks appear slightly larger than if you just skip it and use the default source domain of ‘0.0 To 1.0’ (see the sketch after this list). Bounds is optional, caveat emptor.
  3. The sliders that define the target domain can be Reals (Floating Point) instead of Integers.
  4. You can effectively swap your surface UVs, as suggested by @HS_Kim, by simply swapping the X and Y values of the ‘uvP’ points.
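
To spell out point 2, here is the ReMap arithmetic (a quick Python sketch of the math, not the component itself): if the sampled brightness only spans, say, 0.2 to 0.7, using Bounds as the source remaps that narrower interval onto the full target domain, so the peaks get stretched; feeding the default ‘0.0 To 1.0’ as the source keeps the original proportions.

```
def remap(value, source, target):
    """Linear remap of 'value' from the 'source' interval to 'target'."""
    s0, s1 = source
    t0, t1 = target
    return t0 + (value - s0) / (s1 - s0) * (t1 - t0)

brightness = [0.2, 0.45, 0.7]          # sampled values that don't span 0..1
bounds = (min(brightness), max(brightness))
target = (0.0, 5.0)                    # desired Z range

print([remap(b, bounds, target) for b in brightness])      # [0.0, 2.5, 5.0] - stretched
print([remap(b, (0.0, 1.0), target) for b in brightness])  # [1.0, 2.25, 3.5] - proportional
```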

Before:


After:

Image Sampler Domain_2019Jul18a.gh (24.5 KB)

Unfortunately, a long-neglected bug causes the Image Sampler to be lost when saving a modified copy of your GH file. You will have to copy/paste it from your original and rewire it as shown above. :frowning:

@HS_Kim: Thank you!! Sometimes it’s fairly easy but quite effective :slight_smile:

@Joseph_Oster: Great input! Thank you! Luckily, skipping Bounds slightly improves the results from Surface From Points. I think the quality of the output is directly affected by the accuracy of the greyscale values within the image: more homogeneous gradients would result in smoother topology adaptations.
But instead of adjusting the image, is there a smart way of “smoothing” the surface so that it does not depend on influences such as image resolution?

I can’t speak to smoothing off hand, but resolution is semi-independent of the UV dimensions. Your image appears to be 1075 X 480 pixels, yet the UV resolution as written is only 299 X 124. That can be increased by simply multiplying the UV values by a “factor”. This preserves the aspect ratio and allows arbitrary resolution, though it does no good to go beyond the number of pixels.
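In other words (a back-of-the-envelope Python sketch using the numbers above; the factor is arbitrary):

```
# Increase the SDivide counts while keeping the current UV aspect ratio.
img_w, img_h = 1075, 480     # pixel dimensions of the image
u_count, v_count = 299, 124  # current division counts

factor = 2.5                 # any factor you like...
u_new = min(round(u_count * factor), img_w)  # ...but going past the pixel
v_new = min(round(v_count * factor), img_h)  # count gains nothing

print(u_new, v_new)          # -> 748, 310
```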

By the way, you can also add an ‘X-1’ expression to the X and/or Y outputs of pDecon to flip the image horizontally or vertically.
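
For what it’s worth, the arithmetic behind such a flip is just mirroring the coordinate within whatever domain that output actually covers (a generic Python sketch; the 0-to-1 default here is my assumption, so the exact expression you need depends on your pDecon values):

```
def mirror(x, d0=0.0, d1=1.0):
    """Mirror a coordinate within its domain [d0, d1]."""
    return d0 + d1 - x

print(mirror(0.25))  # -> 0.75 : the image reads back-to-front along that axis
```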

Yes, I see! I think I have to dig deeper into the provided image, as these unevennesses in the elevations look to me like inaccuracies in the grayscale.
I played around with Weaverbird’s Laplacian Smoothing and could achieve a smoother surface. Depending on the number of iterations the surface becomes quite smooth - at the cost of a lot of computing power.
Unfortunately, the details are only slightly visible in the rendering…
Red surface - level 3
Grey surface - level 1
Blue surface - no Wb
[Image: Smoothing_WB]
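
For reference, this is the basic idea behind the Laplacian step as I understand it (a crude Python sketch on a plain grid of z values, not Weaverbird’s actual implementation):

```
def laplacian_smooth(z, iterations=1, weight=0.5):
    """Blend each interior z value towards the average of its 4 neighbours.
    'z' is a list of rows; borders are left untouched for simplicity."""
    rows, cols = len(z), len(z[0])
    for _ in range(iterations):
        new_z = [row[:] for row in z]
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                avg = (z[i - 1][j] + z[i + 1][j] + z[i][j - 1] + z[i][j + 1]) / 4.0
                new_z[i][j] = (1 - weight) * z[i][j] + weight * avg
        z = new_z
    return z

grid = [[0, 0, 0],
        [0, 9, 0],   # a single spike...
        [0, 0, 0]]
print(laplacian_smooth(grid, iterations=1)[1][1])  # -> 4.5, the spike is halved
```

Every extra iteration pushes each value further towards its neighbours’ average, which matches the smoothness-versus-computing-time trade-off I am seeing.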