Here is a way to bake a procedural texture into a texture file.
In this description a procedural texture is a function that transforms a coordinate in Euclidean space (X, Y, Z) into a color (Red, Green, Blue and, why not, Alpha).
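As a minimal sketch of such a function (in Python, with hypothetical names, not the actual Grasshopper component), any f(x, y, z) → color will do; here a simple stripe pattern along Z:

```python
import math

def procedural_texture(x, y, z):
    """Hypothetical procedural texture: map an XYZ point to an RGBA color.
    Stripes along Z as an example; any f(x, y, z) -> color works."""
    t = 0.5 + 0.5 * math.sin(10.0 * z)   # stripe value in [0, 1]
    return (int(255 * t), int(255 * (1 - t)), 128, 255)  # R, G, B, A

color = procedural_texture(0.0, 0.0, 0.0)
```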
Unwrap the object(s) you want to texture; this could also be an object that is just used as a support for UV mapping (used with Texture Matching)
Open the Grasshopper script
Set the render mesh(es) in the Mesh component; if there are many meshes, use Mesh Join
Choose the number of pixels for the texture file (512, 1024 …)
Click on "do the work"
The component will output a list of points (XYZ)
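The steps above can be sketched as a simple baking loop (my assumption of the data layout: one XYZ sample per texture pixel, row-major; `bake` and `gray` are illustrative names, not part of the plugin):

```python
def bake(points, size, f):
    """points: one (x, y, z) per pixel, row-major; size: texture width/height;
    f: procedural color function. Returns a size x size grid of colors."""
    assert len(points) == size * size
    pixels = [[None] * size for _ in range(size)]
    for i, (x, y, z) in enumerate(points):
        row, col = divmod(i, size)
        pixels[row][col] = f(x, y, z)
    return pixels  # ready to be written out as a bitmap

# tiny 2x2 usage example with a grayscale function of Z
gray = lambda x, y, z: (int(255 * z),) * 3
img = bake([(0, 0, 0.0), (0, 0, 0.25), (0, 0, 0.5), (0, 0, 1.0)], 2, gray)
```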
Wrapping textures on meshes was something I had always seen in other software but completely missed in Rhino.
I can see many situations where this might come in handy, for example creating ultra-light models with an alpha channel and disabled filtering.
Thanks for sharing this, Laurent!
Top!
@nathanletwory it could be interesting to do the same with the texture generators in Rhinoceros/Cycles/Grasshopper. Do you know if it is possible to get the color in physical space (XYZ), not UV space? At the moment all the tools in Rhinoceros work well, thanks for that. I had some problems with Texture Matching but I wasn't able to reproduce the error simply enough to post about it!
You'd determine the UV coordinate based on the XYZ coordinate. Let's ask @Jussi_Aaltonen what the SDK tools in Rhino are to do this; he maintains the texture-baking code in Rhino.
It'd be quite interesting to bake Cycles textures indeed. Cycles has bake functionality where, for one or more objects, the entire raytracing process is baked to a texture, but I haven't hooked that up. So in addition to the normal albedo pass you'd get everything you choose: shadows, highlights, etc. That is commonly used to speed up rendering, especially for animations and game models.
Continuing this discussion: I recently implemented an OBJ importer for Nautilus. As I tested the texture coordinates I wanted to go a bit further. I will add to Nautilus a tool that outputs the mesh in UV space; the tool will also transform coordinates.
Here is Spot from Keenan Crane, textured using Human.
I use the Bitmap plugin to convert a list of colors to an image.
Then the image is copied to the hard disk and applied to the mesh.
Here is the difference between a mesh with vertex colors and one with a texture.
Continuing on this, I made some components to generate a texture that is similar to Zebra Analysis.
The red one is done by Rhinoceros and changes when the position of the viewer changes; the blue one is a model with a texture.
From what I understood, the stripes lie along an infinite tube, so at some points of the object where the reflection is aligned with the tube there is a convergence point. Rhino must not use an infinite tube but a sort of dome with stripes.
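For a view-independent "static" variant, one option is to drive the stripes from the surface normal instead of the reflected view direction. This is a hedged sketch of that idea (my assumption; it is not how Rhino's Zebra Analysis actually computes its stripes), banding the normal's angle around one axis:

```python
import math

def static_zebra(nx, ny, nz, stripes=12):
    """Hypothetical static-zebra shading: black/white bands driven by the
    surface normal (view-independent), not by the reflected view direction.
    The band index comes from the normal's angle around the Z axis."""
    angle = math.atan2(ny, nx)                        # in [-pi, pi]
    band = int((angle + math.pi) / (2 * math.pi) * stripes)
    return (255, 255, 255) if band % 2 == 0 else (0, 0, 0)
```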
Thanks for looking into this, but here the aim is to reproduce Zebra Analysis and make a texture (a bitmap) that could be further used to draw on, something that could be named "Static Zebra".
Yes, my point is just that a planar projection should suffice to get started, I don’t think a dome is needed. You can project an image texture with straight lines, then bake that to an unwrapped image texture.
Note though that getting good unwrapping that doesn’t break can be hard, especially when you don’t have a lot of geometry to work with. You’ll get mapping artifacts.
Some new tools are now in Nautilus 1.9.5. They allow calculating a texture and viewing the result inside Grasshopper. I am not sure I will keep everything about the way it works, but it seems quite useful. As I wanted to keep my plugin compatible with Rhino 7, I use meshes to "transport" clouds of points, clouds of colors and/or clouds of points and normals. So my tools output many an INVALID MESH, but no matter, they contain useful data.
Let's begin with a simple example.
Let's take a hollow tube.
Then you can plug this mesh (or the points from the mesh) into your procedural tool (a component that gives a color or a weight depending on the XYZ coordinate, Color = f(x, y, z)).
Here I used a noise tool that outputs noise as weights and colors.
This mesh containing colors and points is transformed into an image by a little component that writes a PNG to a temporary folder.
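A hedged sketch of that last "colors → image file" step: the real component writes a PNG to the temp folder; here a plain PPM file is written instead for brevity (stdlib only), and random gray values stand in for the noise output. The `write_image` name and the `Nautilus_` prefix convention are taken from the description above, not from the plugin's code:

```python
import os
import random
import tempfile

def write_image(colors, size, prefix="Nautilus_"):
    """Write a flat list of (r, g, b) tuples (length size*size) to a binary
    PPM file in the temp folder, and return the path."""
    fd, path = tempfile.mkstemp(suffix=".ppm", prefix=prefix)
    with os.fdopen(fd, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (size, size))   # PPM header
        for r, g, b in colors:
            f.write(bytes((r, g, b)))
    return path  # this file can then be applied to the mesh as its texture

# stand-in for the noise tool: one random gray per pixel of a 4x4 texture
size = 4
colors = [(v, v, v) for v in (random.randrange(256) for _ in range(size * size))]
path = write_image(colors, size)
```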
Hi Laurent, very cool new addition! This could be useful in many ways, one of them is the ability to generate masks for various advanced materials allowing interesting procedural texturing. For example, by emphasizing the edges of a mesh, you can generate a mask for an edge-wear effect. Your tool will show its full potential when McNeel works out some missing components for creating materials and texturing in Grasshopper.
I really hope that adding or preserving existing UV will sooner or later be part of the native tools in Grasshopper within the Model Object component, but is there a way for now to be able to transfer UVs from one mesh with the same topology to another? In this case, all I do is modify the vertex colors.
Could you write something more about how to define the input mC (Mesh with colors)? I would like to use some arbitrary mesh with color information defined in any way and get a texture from it. There are potentially many different colored meshes obtained with different tools created by Daniel Piker or yourself, which would be nice to be able to process into a texture (note: a normal unwrapped texture, not a special one like 1x256 px). With such a regular texture it is easier to continue working on defining materials.
I have one mesh with vertex colors here (which I unfortunately had to unwrap manually); could you show me how to process it so that I can convert its colors to a texture?
Hi,
I am happy you find this useful. To keep things simple I have just square images, because most textures are/were square.
But my concern with this tool is that it generates many images in the Temp folder, with one random name at each iteration; I added "Nautilus" at the beginning of the name in order to be able to clean the folder. If you have some thoughts on the workflow, I am interested.
Here is a way to use mesh vertex colors and transform them to a square texture.
Really, see mC as a point cloud (but with a number of colors = N×N, as the texture is a square).
For the other question I have to look more closely, but yesterday I wanted to see if I was able to transfer between different UV maps, and for this I must add other tools.
Using an elongated texture is not so simple, but doable, with Daniel's tools or mine that output weights (distances, indeed, if we speak of geodesic distance).
You first need a distance at each vertex of the mesh that has UVs.
Here, a simple value from 0 to 1 depending on Z.
We need to generate a gradient with the elongated texture.
Here I use one of my tools that takes an internet address for the image.
Then this file is copied to the temp folder and read by a tool that uses an image to generate a gradient. By default a curve is drawn on the image, going from one corner of the bounding box to the other.
There is another solution, but it is not good here, because if we just use the vertex colors the result is poor when the mesh doesn't have enough resolution. The color is interpolated from red at the bottom to white at the top.
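The "value from Z, then gradient" idea above can be sketched as follows (a minimal stand-in: `z_gradient` is an illustrative name, and a two-stop red-to-white ramp replaces the image-based gradient tool):

```python
def lerp(a, b, t):
    """Linear interpolation between two RGB colors, t in [0, 1]."""
    return tuple(int(a[i] + (b[i] - a[i]) * t) for i in range(3))

def z_gradient(vertices, bottom=(255, 0, 0), top=(255, 255, 255)):
    """Map each vertex's Z to t in [0, 1], then to a color along a
    two-stop gradient (red at the bottom, white at the top)."""
    zs = [v[2] for v in vertices]
    zmin, zmax = min(zs), max(zs)
    span = (zmax - zmin) or 1.0          # avoid division by zero on flat meshes
    return [lerp(bottom, top, (v[2] - zmin) / span) for v in vertices]

cols = z_gradient([(0, 0, 0.0), (0, 0, 1.0), (0, 0, 2.0)])
```

Note that this interpolates per vertex, which is exactly the limitation mentioned above: on a coarse mesh the bands will look blocky, which is why baking to a texture gives a better result.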
I don't really know what your requirements are for this, but there are other methods than the one proposed by Daniel (it is this one).
You can limit the weighting to close points and use kernel weighting; this tool puts weights on points with a kernel function depending on the distance.
For my tool I chose a maximum weight per kernel of 1. You can choose to add the weights or not.
For what you want it seems better not to add them and just keep the maximum. The closest points are chosen using an RTree.
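A hedged sketch of this kernel-weighting idea (with two assumptions of mine: a brute-force neighbor search stands in for the RTree, and a simple linear "tent" kernel stands in for whatever kernel the tool actually uses):

```python
import math

def kernel_weights(samples, sources, radius, combine="max"):
    """Spread a weight from each source point onto nearby sample points
    through a kernel of the distance; each kernel contributes at most 1.
    combine="max" keeps the maximum weight, "sum" adds them up."""
    weights = []
    for sx, sy, sz in samples:
        w = 0.0
        for px, py, pz in sources:
            d = math.dist((sx, sy, sz), (px, py, pz))
            if d < radius:
                k = 1.0 - d / radius          # linear (tent) kernel in [0, 1]
                w = max(w, k) if combine == "max" else w + k
        weights.append(w)
    return weights
```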
If someone needs to output a "classic" bitmap file from the output of Isopod, here is a way with Nautilus.
It will only work with 1.9.7, as my tool didn't like 1-pixel-wide images (the Grasshopper Import Image tool has the same problem as mine, as it puts pixels on mesh vertices).
The idea is the same as what was proposed before: I use ValueAt to measure the field at the pixels. Then I tried to find the same function as Daniel's to convert the image.