Hello. I am trying to find out how to take 3D scan data and use the textures from the scanner to create a 2D texture file. Basically I want to take the scan data and create a larger 3D print of the object, then use the textures to print on vinyl and wrap the 3D printed part. The textures come in as a UV texture map, not as a single view. Does anyone know how to do this? PS: the data was taken with our Artec Space Spider 3D scanner.
Here is the link to the data, feel free to use it. https://www.dropbox.com/s/7cyxguqnu3thfth/vans_RW.zip?dl=0
I had a lengthy discussion with another Rhino user offline about a very similar project. One issue is that you cannot unfold a mesh together with its texture in Rhino.
EDIT: Added another issue below concerning UVs
The other issues:
- Unfolding the scan topology directly would likely produce a mess, so creating a simplified version of the sneaker would be needed
- Changing the scan geometry will destroy the UVs, so the original UVs would need to be re-projected onto the new or edited model so the texture is restored.
- Understanding of developable surfaces will help in remodeling/remeshing
- Mesh would have to be split up/seamed to produce separate areas that correspond to features of the sneaker.
There are several mesh-based applications that could handle the unfolding. UltimateUnwrap 3D is one that comes to mind, but there are others.
Also note the distinction between unfolding and unwrapping in mesh texturing applications:
- Unwrapping usually refers to producing a flat pattern where the triangles are distorted and DO NOT match the model.
- Unfolding usually refers to creating a flat pattern where the triangles match the 3D model exactly, with no distortion/stretching. This is what is needed for the printed texture to match the 3D model (see the sketch below).
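As a rough way to tell which one you have, you could compare edge lengths in 3D against edge lengths in the texture layout: for a true unfold the ratio is the same for every edge, for a stretched unwrap it varies. This is only a sketch (Rhino Python), and it ignores seam edges where UVs are split:

```python
import Rhino
import rhinoscriptsyntax as rs

obj_id = rs.GetObject("Select textured mesh", rs.filter.mesh)
mesh = rs.coercemesh(obj_id)

# needs per-vertex texture coordinates (Rhino render meshes have these)
if mesh.TextureCoordinates.Count == mesh.Vertices.Count:
    ratios = []
    for i in range(mesh.TopologyEdges.Count):
        pair = mesh.TopologyEdges.GetTopologyVertices(i)
        va = mesh.TopologyVertices.MeshVertexIndices(pair.I)[0]
        vb = mesh.TopologyVertices.MeshVertexIndices(pair.J)[0]
        pa = Rhino.Geometry.Point3d(mesh.Vertices[va])
        pb = Rhino.Geometry.Point3d(mesh.Vertices[vb])
        len3d = pa.DistanceTo(pb)
        ta = mesh.TextureCoordinates[va]
        tb = mesh.TextureCoordinates[vb]
        len2d = ((ta.X - tb.X) ** 2 + (ta.Y - tb.Y) ** 2) ** 0.5
        if len3d > 1e-9 and len2d > 1e-9:
            ratios.append(len2d / len3d)
    if ratios:
        # a true unfold keeps this spread nearly constant; a distorted
        # unwrap shows a wide range of ratios
        print("edge length ratio: min {:.3f}, max {:.3f}".format(min(ratios), max(ratios)))
else:
    print("Mesh has no per-vertex texture coordinates.")
```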
You would like to 3D print the object and manually wrap the result with stickers that show the texture.
Your problem is that the texture has such a chaotic layout. Have I understood correctly?
The chaotic layout of the texture is common to 3D scanning, whether by laser or structured-light scanner or photogrammetry (e.g. Photoscan).
[quote=“cdordoni, post:6, topic:45003”]
The chaotic layout of the texture is common to 3d scanning,
[/quote]
Yup, when matching pointclouds and baking vertex colours to texture, those silly scanners unfortunately don’t spend a single thought on how the part is actually put together :o)
I just wanted to make sure that I had understood the OP’s problem.
You can do this in Rhino since version 5:
- Use _ExtractUVMesh to unfold
- Apply the texture to the UVMesh using a planar projection from the top
In both steps, use the same two points to define (1) the UVMesh size and (2) the extremes of the planar projection.
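A minimal sketch of how those two steps could be scripted (Rhino Python with rhinoscriptsyntax). The exact command-line prompts of _ExtractUVMesh and _ApplyPlanarMapping differ a bit between Rhino versions, so take this as an outline of "pick two points once, use them for both steps" rather than a finished tool:

```python
import rhinoscriptsyntax as rs

mesh_id = rs.GetObject("Select scanned mesh", rs.filter.mesh)
p0 = rs.GetPoint("First corner (defines UV mesh size and planar mapping)")
p1 = rs.GetPoint("Other corner")
corners = "{},{},{} {},{},{}".format(p0.X, p0.Y, p0.Z, p1.X, p1.Y, p1.Z)

# step 1: unfold the textured mesh to a flat UV mesh using the picked extents
rs.SelectObject(mesh_id)
rs.Command("_ExtractUVMesh " + corners + " _Enter", echo=True)
uv_mesh = rs.LastCreatedObjects()

# step 2: planar texture projection from the top onto the UV mesh,
# using the same two points so image and flat mesh line up
if uv_mesh:
    rs.SelectObjects(uv_mesh)
    rs.Command("_ApplyPlanarMapping " + corners + " _Enter", echo=True)
```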
To get an output with more control over mesh edges, borders, transparency etc., you might want to check this plugin. Note that this is 10 years old and creates just the image, which the OP already has, but without the mesh borders.
I think the problem is not that the process can’t be done in Rhino; it is more the quality of the mesh texturing from Artec, which needs to blend islands to avoid visible gaps in the 3D model once the texture is assigned.
The process to clean this up is a bit tedious and requires vertex colors…
c.
Clement, maybe I am missing something … it appears _ExtractUVMesh unwraps (distorts) the geometry. What would be needed is a true unfold, where there is no distortion.
In order to fold the pattern and texture back up to produce a physical 3D object, the flattened triangles’ edges must be proportional to those of the original mesh.
I have not yet tried the Txtemplate plugin you referenced … will give it a go within a day or so.
Thanks
The _ExtractUVMesh command creates a flat mesh with the vertices at 3D locations read from the 2D texture coordinates, scaled by the size the user picks. Basically it creates the same mesh shown in the UVEditor. It does not, however, assign the texture used to the resulting UV mesh.
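For reference, what that amounts to can be sketched in a few lines of Rhino Python. This is just an illustration of the idea (a flat mesh built from the texture coordinates, scaled by a chosen size), not the command’s actual implementation:

```python
import Rhino
import rhinoscriptsyntax as rs
import scriptcontext as sc

obj_id = rs.GetObject("Select textured mesh", rs.filter.mesh)
mesh = rs.coercemesh(obj_id)
scale = 100.0  # stands in for the size the user would pick interactively

flat = Rhino.Geometry.Mesh()
# place each vertex at its (u, v) texture coordinate, z = 0
for tc in mesh.TextureCoordinates:
    flat.Vertices.Add(tc.X * scale, tc.Y * scale, 0.0)
# reuse the original faces (works because render meshes carry one
# texture coordinate per vertex)
for f in mesh.Faces:
    flat.Faces.AddFace(f)
flat.Normals.ComputeNormals()

sc.doc.Objects.AddMesh(flat)
sc.doc.Views.Redraw()
```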
If you only need that, you may also check out the UnwrapRenderMesh plugin from my wiki page; it does the same and assigns the texture. I’ve tested it with the above mesh.
Yes, the proportions are preserved by unwrapping with something like LSCM or ABF. But neither method would perform well given the way the UV atlas was created in the above example. It requires a clean edge layout, which would result in less stretch during flattening.
c.
Topology from any 3D scanning will be less than ideal, although some will be worse than others.
It would probably be best to just use the scan as a reference and completely rebuild the object so it unfolds nicely. It’s far from a push-button job.
I know that in general it’s possible to take an existing set of UVs and reproject them onto modified or new topology. I’m not entirely sure how well that would work here. Higher mesh density would be better as far as the texture is concerned, but it makes the flat pattern more complicated.
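One way such a re-projection could be sketched (Rhino Python, purely illustrative): for each vertex of the rebuilt mesh, find the closest point on the original scan and interpolate its texture coordinates with the barycentric weights of the containing face. Seams/islands and blending are not handled here, so expect artifacts near UV borders:

```python
import Rhino
import rhinoscriptsyntax as rs
import scriptcontext as sc

old_id = rs.GetObject("Select original textured scan mesh", rs.filter.mesh)
new_id = rs.GetObject("Select rebuilt mesh (no texture coordinates yet)", rs.filter.mesh)
old_mesh = rs.coercemesh(old_id)
new_mesh = rs.coercemesh(new_id)

for v in new_mesh.Vertices:
    # closest point on the old mesh, with face index and barycentric weights
    mp = old_mesh.ClosestMeshPoint(Rhino.Geometry.Point3d(v), 0.0)
    face = old_mesh.Faces[mp.FaceIndex]
    idx = [face.A, face.B, face.C, face.D]
    # blend the old texture coordinates of that face's corners
    u = sum(mp.T[i] * old_mesh.TextureCoordinates[idx[i]].X for i in range(4))
    w = sum(mp.T[i] * old_mesh.TextureCoordinates[idx[i]].Y for i in range(4))
    new_mesh.TextureCoordinates.Add(u, w)

sc.doc.Objects.Replace(new_id, new_mesh)
sc.doc.Views.Redraw()
```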