Need to move mesh points in relation to the attached texture

I imagine this is fairly simple, but I haven't been able to glean from the documentation which setting to adjust to accomplish it:

I have a plate object with a curve between the edge and centre that doesn’t match the position of that feature in the texture applied to it. When I scale the points associated with that feature in the mesh, the texture distorts to maintain the relationship between texture and points. I can see this would be valuable in many cases, but in this one I’d like the texture to remain unchanged while I move the points in relation to it.

Hello - take a look at the UVEditor or applying different mapping types. (Object properties > Mapping page)


Thanks Pascal. I see how these controls can be used to adjust a texture to match an object or surface in many ways, especially with the UV editor. This is helpful, but can you point me to an explanation of how the object can be adjusted to match the texture? I can imagine importing the texture as a picture object and editing the object on top of it in an orthographic transparent view mode. But it would be simpler, more accurate, and more versatile to edit the object with its applied texture while the texture’s projection is freed from any attachment to the surface: the texture keeps the same position in space while the features of the object adjust to align with the image.

Let’s say the image and model are both faces, but the model is crudely built and the nose is in the wrong spot according to the texture we are applying, which is a photograph of an actual face. Say the eyes, mouth, and chin line up, but the nose of the object aligns with the texture’s left cheek. We want to select the points forming the nose part of the object mesh and drag them so they line up with the texture’s nose. The way things are now, if we select the object’s nose points and drag them, the texture’s left cheek will warp with them, dragging the cheek of the image to the former nose position (and warping the texture’s nose out of position toward the right cheek). What we want is not to change the texture image at all, but rather to move the protruding nose part of the object so that it bulges where the texture’s nose is. I don’t yet see a means of doing that in the texture and mapping tools, all of which seem to offer a fixed connection between every point of a texture and every point of an object, albeit in a wide variety of orientations.

Moving the nose via control points moves the NURBS UV spans. The texture maps to those spans, so the image will morph. You have to remodel the nose at the correct location: pound the bad nose back into the cheek, then grow the new nose in the new spot by pulling points there.
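The "sticky" behaviour can be sketched in plain Python (an illustrative model only, not Rhino's actual implementation): with surface-style mapping each vertex stores a fixed UV coordinate, so moving the vertex leaves its UV untouched and the texel assigned to it travels along, warping the image between neighbouring vertices.

```python
# Illustrative sketch (not Rhino's code): with surface-style mapping,
# each vertex carries a stored UV coordinate. Moving the vertex does not
# change its UV, so the spot in the image assigned to it is dragged to
# the new position, warping the texture in between.

# A tiny "mesh": vertex positions paired with stored UVs.
mesh = [
    {"pos": (0.0, 0.0), "uv": (0.0, 0.0)},   # corner
    {"pos": (1.0, 0.0), "uv": (0.5, 0.0)},   # the "nose" vertex
    {"pos": (2.0, 0.0), "uv": (1.0, 0.0)},   # corner
]

def move_vertex(mesh, index, new_pos):
    """Move a vertex; its stored UV stays attached to it."""
    mesh[index]["pos"] = new_pos

move_vertex(mesh, 1, (1.5, 0.0))  # drag the nose toward the cheek

# The vertex moved, but it still references the same spot in the image,
# so that part of the texture moves with it.
print(mesh[1])  # pos is now (1.5, 0.0), uv is still (0.5, 0.0)
```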

Hello - if you can find a mapping that projects the image as you like it - say, Planar mapping - then editing the internal bits of the object will move them relative to the mapping.
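A minimal sketch of why this works (plain Python, illustrative rather than Rhino's actual code): a planar mapping derives each vertex's UV from its position in space, so moving a vertex changes which part of the image lands on it, while the image itself stays anchored to the mapping plane.

```python
# Illustrative sketch (not Rhino's code): planar mapping recomputes UVs
# by projecting vertex positions onto a fixed plane, so the texture stays
# anchored in space while points move through it.

def planar_uv(pos, origin=(0.0, 0.0), size=(2.0, 2.0)):
    """Project a 2D vertex position onto a fixed mapping plane."""
    u = (pos[0] - origin[0]) / size[0]
    v = (pos[1] - origin[1]) / size[1]
    return (u, v)

nose = (1.0, 0.0)
print(planar_uv(nose))    # (0.5, 0.0)

nose = (1.5, 0.0)         # drag the point to where the texture's nose is
print(planar_uv(nose))    # (0.75, 0.0) -- the point now samples a
                          # different part of the unchanged image
```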


Oh Sweet!! That works nicely!

I thought there’d be something simple. Thanks. When I first dealt with texture mapping 25 years ago, using Imagine on an Amiga 3000, that was the only way it worked: a bitmap never stuck to a surface. Funny now to have to search for the option.

But I can’t get it to work. Here I am attempting to create Planar mapping and then editing a point on the object, but the texture remains attached to the point. What am I doing incorrectly?

The application of planar mapping (the gifs become too large if I take the time to position it correctly, but you can see it’s happening by the misalignment):

Moving a point here shows that it’s still attached to the texture:

Hello - is that embossed image the one that is being projected by the plane? Here’s what I get


That image is the only one attached. It’s attached as a bump map too, but that’s a different matter if I’m understanding this correctly. And the texture here is visibly remapped (out of alignment, as it happens) by the planar mapping process, so it’s the one staying stuck to the object in the second gif.
You can see what I am doing by my commands in the CLI. Are we doing the same thing?
I have tried unchecking the bump map but the process still doesn’t work.

I have detached the texture from the model, loaded it as a picture beneath the platter object, edited the points as desired, and then reloaded the texture. That’s easy in this case because the picture plane of the texture map is the same as that of the view used for editing. But in most complex modelling tasks that won’t work out: for visual reference when moving the points, the editing would need to happen with the texture mapped on the object. I’d like to be able to do that if it’s in fact possible.

This should work if the texture is mapped using one of the mapping styles other than Surface. As far as I can see, the texture should only move with the points when it is not mapped using Planar, Box, Spherical, or Cylindrical mapping.

This file has planar mapping on the mesh - if you move the internal mesh points, the mapping does not change. Does that make sense?
Planarmapping.3dm (206.7 KB)


I’m still confused.

If the UV editor is invoked on the file you just sent, the only texture movement registered by the previous editing is in the z axis. This result is what I’d like to accomplish by moving points around, but with continuous feedback. In other words, the texture remains projected onto the surface (along the z axis) rather than stuck to it.
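The behaviour I’m after can be sketched like this (plain Python, an illustrative model and not Rhino's code): a top-view planar mapping projects along the z axis, so a vertex’s UV depends only on x and y, and moving points in z leaves the texture untouched in the Top view.

```python
# Illustrative sketch (not Rhino's code): a top-view planar mapping
# projects along the z axis, so a vertex's UV depends only on x and y.
# Pulling points up or down in z therefore does not shift the texture.

def top_planar_uv(pos, size=(2.0, 2.0)):
    """UV from a planar map looking down the z axis: z is ignored."""
    x, y, z = pos
    return (x / size[0], y / size[1])

before = top_planar_uv((1.0, 0.5, 0.0))
after = top_planar_uv((1.0, 0.5, 3.0))   # pull the point up in z
print(before == after)  # True -- the texture is unaffected in Top view
```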

Just open my file as is, turn on points for the mesh and move them around - the checkers should not be affected at all in the Top view…?