From procedural texture to image texture (How To)

Here is a way to transform a procedural texture into a texture file.

In this description, a procedural texture is a function that transforms a coordinate in Euclidean space (X, Y, Z) into a color (Red, Green, Blue and, why not, Alpha).
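
As a minimal illustration of what such a function can look like inside a Grasshopper C# component (this is just a sketch, not the function used in the script; the sine stripes and the period of 10 units are arbitrary choices):

```csharp
using System;
using System.Drawing;
using Rhino.Geometry;

// Sketch of a procedural texture: Color = f(x, y, z).
// Grey-level stripes along Z; the period of 10 units is arbitrary.
Color ProceduralColor(Point3d p)
{
  double t = 0.5 + 0.5 * Math.Sin(p.Z * Math.PI / 10.0); // value in [0, 1]
  int v = (int)(t * 255.0);
  return Color.FromArgb(255, v, v, v); // opaque (Alpha = 255)
}
```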

  1. Unwrap the object(s) you want to texture. This can also be an object that is used only as a support for UV mapping (used with Texture Matching).

  1bis. You can edit the UV map in the UV editor in order to optimize the placement of the objects in UV space.

  2. Extract the render mesh(es) using ExtractRenderMesh.

  3. Open the Grasshopper script:
    Set the render mesh(es) into the Mesh component; if there are several meshes, use Mesh Join.
    Choose the number of pixels for the texture file (512, 1024, …).
    Click on “Do The Work”.
    The component will output a list of points (XYZ).

  4. Transform these points into colors (a C# sketch of this step follows the list).

  5. Give a path for the texture file. Only PNG output is supported at the moment; if the extension is not .png it will be changed to PNG.
    Click on “Do The Work”.

  6. Apply the texture to your material.

  7. You now have your procedural texture in a texture file that can be exported.
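
To make steps 4 and 5 more concrete, here is a hedged C# sketch of the idea (not the code of the actual component): it assumes the output points are ordered row by row for an N × N texture, colors each one with a function like the one above, and saves a PNG with System.Drawing.

```csharp
using System;
using System.Collections.Generic;
using System.Drawing;
using Rhino.Geometry;

// Sketch only: bake sample points (assumed ordered row by row, N * N of them
// for an N x N texture) into a PNG file on disk.
void BakeToPng(List<Point3d> samples, int n, Func<Point3d, Color> f, string path)
{
  using (var bmp = new Bitmap(n, n))
  {
    for (int y = 0; y < n; y++)
      for (int x = 0; x < n; x++)
        bmp.SetPixel(x, y, f(samples[y * n + x])); // Color = f(x, y, z)
    bmp.Save(path, System.Drawing.Imaging.ImageFormat.Png);
  }
}
```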


    The 2048-pixel version (a bit slow, ~1 minute)

A project I am doing with some camo. Here is the component used to “support” the unwrapping.

And the object with the raw texture (I have to soften it a bit)

texture_mesh_legacy.gh (183.1 KB)

In this link you’ll find some nice textures

15 Likes

Wrapping textures on meshes was something I always saw in other software but completely missed in Rhino.
I can see many situations where this might come in handy for creating ultra-light models with an alpha channel and disabled filtering.
Thanks for sharing this, Laurent!
Top!

1 Like

I had some fun working a bit more on the texture part. In order to go faster I used a noise method inside a C# component.
I used this one:

My camouflage texture maker has some options: four colors, and it can pixelate the output.
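
The actual noise and camouflage logic isn't shown here, but the four-color idea can be sketched as thresholding a noise value into bands (the palette and the getNoise delegate below are placeholders, not the real component):

```csharp
using System;
using System.Drawing;
using Rhino.Geometry;

// Sketch: pick one of four camo colors from a 3D noise value assumed in [0, 1].
// getNoise stands in for whatever noise method is used in the C# component.
Color CamoColor(Point3d p, Func<Point3d, double> getNoise)
{
  Color[] palette =
  {
    Color.FromArgb(60, 70, 40),    // dark green
    Color.FromArgb(110, 100, 60),  // khaki
    Color.FromArgb(80, 60, 40),    // brown
    Color.FromArgb(40, 40, 35)     // near black
  };
  double n = getNoise(p);
  int i = Math.Min(palette.Length - 1, (int)(n * palette.Length));
  return palette[i];
}
```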



The main advantage is that if the unwrapping/Texture Matching is well done, there are no visible seams.

@nathanletwory it could be interesting to do the same with the texture generators in Rhinoceros/Cycles/Grasshopper. Do you know if it is possible to get the color in physical space (XYZ) rather than in UV space? At the moment all the tools in Rhinoceros work well, thanks for that. I had some problems with Texture Matching but I wasn’t able to reproduce the error simply enough to post about it!

3 Likes

You’d determine the UV coordinate based on the XYZ coordinate. Let’s ask @Jussi_Aaltonen what SDK tools Rhino has for this; he maintains the texture baking code in Rhino.

It’d be quite interesting to bake Cycles textures indeed. Cycles has bake functionality where, for one or more objects, the entire raytracing process is baked to a texture, but I haven’t hooked that up yet. So in addition to the normal albedo pass you’d get everything you choose: shadows, highlights, etc. That is commonly used to speed up rendering, especially for animations and game models.

2 Likes

That’s called WCS mapping. And in the C++ SDK you can access this via CRhRdkTexture::ProjectionMode.

Continuing this discussion, I recently implemented an OBJ importer for Nautilus. As I tested the texture coordinates I wanted to go a bit further. I will add to Nautilus a tool that outputs the mesh in UV space as a mesh; the tool will also transform coordinates.
Here is Spot from Keenan Crane, textured using Human.

So with a mesh (no quads!) that has texture coordinates, you generate points in UV space [0, 1] and the tool converts them into 3D space.
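
A rough sketch of that UV-to-3D step (my own simplification, not the Nautilus code): build a flat copy of the mesh whose vertices are the texture coordinates, locate the UV point on it, and reuse the face index and barycentric coordinates on the original mesh.

```csharp
using Rhino.Geometry;

// Sketch: map a UV point in [0,1]^2 back onto the 3D mesh, assuming a
// triangulated mesh with one texture coordinate per vertex.
Point3d UvToWorld(Mesh mesh, double u, double v)
{
  // Flat "UV mesh": same faces, vertices placed at (u, v, 0).
  var uvMesh = new Mesh();
  foreach (var tc in mesh.TextureCoordinates)
    uvMesh.Vertices.Add(tc.X, tc.Y, 0.0);
  uvMesh.Faces.AddFaces(mesh.Faces);

  // Locate the UV point on the flat mesh, then evaluate the same face and
  // barycentric coordinates on the original mesh.
  MeshPoint mp = uvMesh.ClosestMeshPoint(new Point3d(u, v, 0.0), 1e-6);
  if (mp == null) return Point3d.Unset; // the pixel falls between UV islands
  return mesh.PointAt(mp.FaceIndex, mp.T[0], mp.T[1], mp.T[2], mp.T[3]);
}
```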

With a procedural tool you convert these 3D points into colors; I use a noise tool here.


I use the Bitmap plugin to convert a list of colors into an image.
Then the image is copied to the hard disk and applied to the mesh.
Here is the difference between a mesh with vertex colors and a mesh with a texture.

I will surely add these tools to Nautilus with some helpers and some tools that output a color (like a camouflage).

2 Likes

Continuing on this, I made some components to generate a texture that is similar to Zebra Analysis.
The red one is done by Rhinoceros and changes when the position of the viewer changes; the blue one is a model with a texture.

From what I understood, the stripes lie along an infinite tube, so at some points of the object, where the reflection is aligned with the tube, there is a convergence point. Rhino must not use an infinite tube but rather a sort of dome with stripes.
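
As a rough sketch of the “static zebra” idea (my own simplification, not the actual component): for a fixed camera position, compute the reflection vector at each sample point and band it by its angle to a stripe axis, which gives that “dome with stripes” behaviour. The Z axis and the 16 bands below are arbitrary choices.

```csharp
using System;
using System.Drawing;
using Rhino.Geometry;

// Sketch: black/white stripe for a fixed viewer, banding the reflection
// vector by its angle to a chosen stripe axis (here world Z, 16 bands).
Color StaticZebra(Point3d pt, Vector3d normal, Point3d camera)
{
  Vector3d d = pt - camera; d.Unitize();                  // view direction
  Vector3d n = normal;      n.Unitize();
  Vector3d r = d - 2.0 * (d * n) * n;                     // mirror reflection
  double angle = Vector3d.VectorAngle(r, Vector3d.ZAxis); // 0..pi
  int band = (int)Math.Floor(angle / Math.PI * 16.0);
  return (band % 2 == 0) ? Color.Black : Color.White;
}
```
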
zebra analysis.png


stripes on UV mesh

Texture for Spot

https://www.cs.cmu.edu/~kmcrane/Projects/ModelRepository/

1 Like

I think I would just do a planar projection of a stripe texture. The waves texture in Rhino will work just fine for that:

Short video showing how you can alter the planar projection through the widget. Otherwise the mapping stays with the object.

Thanks for looking into this, but here the aim is to reproduce Zebra Analysis and make a texture (a bitmap) that could then be used to draw on, something like a “Static Zebra”.

Yes, my point is just that a planar projection should suffice to get started, I don’t think a dome is needed. You can project an image texture with straight lines, then bake that to an unwrapped image texture.

Note though that getting good unwrapping that doesn’t break can be hard, especially when you don’t have a lot of geometry to work with. You’ll get mapping artifacts.

Some new tools are now in Nautilus 1.9.5. They allow you to calculate a texture and look at the result inside Grasshopper. I am not sure I will keep everything the way it works now, but it seems quite useful. As I wanted to keep my plugin compatible with Rhino 7, I use meshes to “transport” clouds of points, clouds of colors and/or clouds of points and normals. So my tools output many INVALID MESH warnings, but they still contain useful data.

Let’s begin with a simple example.
Let’s take a hollow tube.


Here is the unwrapped UV; it is important that there is some space between the UV islands.

Then this mesh is plugged into Mesh To UV Points; this component needs a mesh with UV coordinates and a size (width and height) for the square texture.


This component outputs points in 3D on the initial mesh.

Then you can plug this mesh (or the points from the mesh) into your procedural tool (a component that gives a color or a weight depending on the XYZ coordinate, Color = f(x, y, z)).
Here I used a noise tool that outputs noise as weights and colors.

This mesh containing colors and points is transformed into an image by a little component that writes a PNG to a temporary folder.
You can see the result live


If you are happy with it, copy the image wherever you want and you can use it as a texture.
Rhino Render


Raytraced

Vray

This tool is quite fast: ~5 to 10 s for a 2048 × 2048 pixel texture.

Requires Nautilus 1.9.5
texture extraction.gh (22.6 KB)

4 Likes

Hi Laurent, very cool new addition! This could be useful in many ways, one of them being the ability to generate masks for various advanced materials, allowing interesting procedural texturing. For example, by emphasizing the edges of a mesh, you can generate a mask for an edge-wear effect. Your tool will show its full potential when McNeel works out some missing components for creating materials and texturing in Grasshopper.

  1. I really hope that adding or preserving existing UVs will sooner or later be part of the native tools in Grasshopper within the Model Object component, but is there a way, for now, to transfer UVs from one mesh to another with the same topology? In this case, all I do is modify the vertex colors.

  2. Could you write something more about how to define the input mC (Mesh with colors)? I would like to use an arbitrary mesh with color information defined in any way and get a texture from it. There are potentially many different colored meshes obtained with different tools created by Daniel Piker or yourself, which it would be nice to be able to process into a texture (note: a normal unwrapped texture, not a special one like 1 × 256 px). With such a regular texture it is easier to continue working on defining materials.

I have one mesh with vertex colors here (which I unfortunately had to unwrap manually); could you show me how to process it so that I can convert its colors to a texture?

mesh colors unwrap.3dm (174.7 KB)


Some links and questions:

We could have a nice edges texture if we could translate it into an image texture.

This method uses special elongated textures; I wonder if we can transform them into regular ones based on the Mesh/Brep UVs?

Hi,
I am happy you find this useful. To keep things simple I only handle square images, because most textures are/were square.
But my concern with this tool is that it generates many images in the Temp folder, one with a random name at each iteration; I added Nautilus at the beginning of the name in order to be able to clean the folder. If you have some thoughts on the workflow, I am interested.

Here is a way to use mesh vertex colors and transform them into a square texture.
Really, see mC as a point cloud (but with a number of colors = N × N, as the texture is square). A sketch of the idea follows.
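
The core of that step can be sketched like this (a simplification, not the Nautilus component): for each 3D sample point coming from the UV grid, find the containing face and blend its vertex colors with the barycentric coordinates.

```csharp
using System;
using System.Drawing;
using Rhino.Geometry;

// Sketch: sample the vertex colors of a colored mesh at a point lying on it,
// using the barycentric coordinates of the closest mesh point.
Color SampleVertexColor(Mesh mesh, Point3d sample)
{
  MeshPoint mp = mesh.ClosestMeshPoint(sample, 0.0);
  MeshFace face = mesh.Faces[mp.FaceIndex];
  int[] vi = { face.A, face.B, face.C, face.D };   // D == C for triangles
  double r = 0, g = 0, b = 0;
  for (int i = 0; i < 4; i++)
  {
    Color c = mesh.VertexColors[vi[i]];
    r += mp.T[i] * c.R; g += mp.T[i] * c.G; b += mp.T[i] * c.B;
  }
  return Color.FromArgb(255,
    Math.Min(255, (int)r), Math.Min(255, (int)g), Math.Min(255, (int)b));
}
```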


mesh color on vertex to texture.gh (21.5 KB)

You can apply this technique to any colored mesh.

And yes, this tool was intended to make textures, for Bump, Opacity, Emap …



For the other question I have to look more closely, but yesterday I wanted to see if I was able to transfer between different UV maps, and for this I must add other tools.

Using an elongated texture is not so simple, but it is doable, with Daniel's tools or mine that output weights (distances, really, if we speak of geodesic distance).
You first need a distance at each vertex of the mesh that has UVs.
Here is a simple value from 0 to 1 depending on Z.

We need to generate a gradient from the elongated texture.
Here I use one of my tools that takes an internet address of the image.

Then this file is copied to the temp folder and read by a tool that uses an image to generate a gradient. By default a curve is drawn on the image and goes from one corner of the bounding box to the other.
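
The gradient lookup itself can be sketched like this (assumptions: the strip is 1 pixel high and the weight is already mapped to [0, 1]; the file path in the usage comment is only an example):

```csharp
using System;
using System.Drawing;

// Sketch: use an elongated, 1-pixel-high texture as a gradient and evaluate
// it at a weight t in [0, 1].
Color EvaluateStripGradient(Bitmap strip, double t)
{
  t = Math.Max(0.0, Math.Min(1.0, t));            // clamp the weight
  int x = (int)Math.Round(t * (strip.Width - 1)); // pick the column
  return strip.GetPixel(x, 0);                    // the strip's only row
}

// Usage idea:
//   var strip = new Bitmap(@"C:\Temp\Nautilus_woodgrain_strip.png"); // example path
//   Color c = EvaluateStripGradient(strip, weight);
```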

Then the points that represent the pixels of the desired texture are placed on the initial mesh.

The weight for each of these points is evaluated using a component (which has no icon at the moment).

Then a component that is still to be made is used to evaluate the color of the gradient at each value.


texture sqared from elongated texture Method 2 Better.gh (28.5 KB)

There is another solution, but it is not good here: if we just use the vertex colors, the result is poor when the mesh does not have enough resolution. The color is interpolated from red at the bottom to white at the top.

A new component has been added and will be in 1.9.6.

Difference in renders:
mesh with vertex colors vs. the same mesh with a 2048-pixel texture


ColorOnVertexVsTexture.gh (14.6 KB)

2 Likes

I don’t really know what your requirements are for this, but there are other methods than the one proposed by Daniel (it is this one).

You can limit the weighting to close points and use kernel weighting; this tool puts weights on points with a kernel function depending on the distance.

For my tool I cap the weight per kernel at 1. You can choose whether or not to add the weights.
For what you want it seems better not to add them and just keep the maximum. The closest points are chosen using an RTree.
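
A minimal sketch of that kernel weighting (a simplification of the tool: brute-force neighbour search instead of the RTree, a Gaussian kernel, and the "keep the maximum, capped at 1" rule described above):

```csharp
using System;
using System.Collections.Generic;
using Rhino.Geometry;

// Sketch: weight of a sample point from a set of source points, using a
// Gaussian kernel limited to a given radius; weights are combined by keeping
// the maximum, capped at 1 (no summing).
double KernelWeight(Point3d sample, List<Point3d> sources, double radius)
{
  double sigma = radius / 3.0;                   // arbitrary falloff choice
  double best = 0.0;
  foreach (Point3d s in sources)                 // an RTree would speed this up
  {
    double d = sample.DistanceTo(s);
    if (d > radius) continue;                    // only close points count
    double w = Math.Exp(-(d * d) / (2.0 * sigma * sigma));
    best = Math.Max(best, Math.Min(1.0, w));
  }
  return best;
}
```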


The result is less smooth and the falloff is different.

mesh_boundary_shade - triremesh.gh (153.0 KB)

If someone needs to output a “classic” bitmap file from the output of Isopod, here is a way with Nautilus.

It will only work with 1.9.7, as my tool didn’t like 1-pixel-wide images (the Grasshopper Import Image tool has the same problem as mine, as it puts pixels on mesh vertices).
The idea is the same as what was proposed before: I use ValueAt to measure the field at the pixels. Then I tried to find the same function as Daniel to convert the image.

The main problem with the texture is the transition of the gradient from 1 to 0; as there is no interpolation there, it leads to some artifacts.

I added this little tool that allows copying a file from the internet to the temporary folder. Here it is with the 1 × 200 pixel wood grain texture.
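
A minimal version of such a helper could look like this (the "Nautilus_" prefix matches the cleanup idea mentioned earlier; everything else is an assumption about how the tool works):

```csharp
using System.IO;
using System.Net;

// Sketch: download a file from the internet into the system temp folder and
// return the local path, prefixing the name so the folder is easy to clean.
string DownloadToTemp(string url)
{
  string local = Path.Combine(Path.GetTempPath(),
                              "Nautilus_" + Path.GetFileName(url));
  using (var client = new WebClient())
    client.DownloadFile(url, local);             // blocking download
  return local;
}
```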

And if, instead of the gradient from the object, you feed your own gradient:



woodgrain2_LD.gh (28.3 KB)

1 Like