Align plane to nearest two points on a mesh

I’m trying to align a plane to two points on a mesh. First, I have aligned the anatomical mesh in a desired orientation. I want to move the plane along a principal axis (e.g., the Y axis) until it first makes contact with the mesh at two points tangent to the plane (note the blue plane just touching the mesh in the attached file). Note that this requires the blue plane to rotate around the Z axis (see attached file).

In the attached file, I approximate the desired behavior by manually adjusting the blue plane with the sliders. If you look closely, you will notice the red points of the femur just piercing the blue plane. The goal is to automate the plane alignment in GH to both improve accuracy and workflow efficiency.

With MoveToPlane, the mesh/object only translates until it contacts the plane at a single point (note the red plane in the attached file), so it is not a solution.

With Plane 3Pt, there is no way to know in advance which two points on the mesh are tangent to the same line/plane, so the result is inaccurate.

With Kangaroo (using gravity simulation), the mesh will lose its alignment as it will want to settle on the ground plane (XY) with three points of contact, not just the first two. I’m not aware of a way to confine the effect of gravity in Kangaroo to just one plane with a 3D rigid body.

Generally, I try to find a solution by first studying documentation, forums, and Vimeo/YouTube, but after a couple of days, I’m stuck. Your help is appreciated…

Mesh with 2-points of contact.gh (550.7 KB)

Hi @colorado1876, without a picture of your goal I wasn’t 100% sure I got it - then in your file, given the plane manipulations you were doing, it wasn’t clear to me which plane was the important one. Nonetheless, it seems you wish to rotate the plane, not the mesh, as your topic title suggests.

In this case I thought a way to do it is like this:
Mesh with 2-points of contact.gh (563.9 KB)

In this method the mesh is ‘segmented’ into two ‘halves’ to begin with, then the lowest point of each half, relative to the ‘base’ (red) plane, is found. Then the lowest of these two points is also determined, a line is formed, and the base plane is thus ‘aligned’ via Rotate Direction.
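Outside Grasshopper, the same idea can be sketched in a few lines of Python (the function and names are my own, working on points already expressed in the base plane’s coordinates):

```python
import math

def align_angle(points, split_x):
    """Rotation angle (radians, about the base plane's normal) that tilts
    the base X axis onto the line through the lowest point of each half.
    `points` are (x, y) pairs measured in the base plane; `split_x` is
    where the mesh is segmented into its two halves."""
    left = [p for p in points if p[0] < split_x]
    right = [p for p in points if p[0] >= split_x]
    a = min(left, key=lambda p: p[1])    # lowest point of the left half
    b = min(right, key=lambda p: p[1])   # lowest point of the right half
    return math.atan2(b[1] - a[1], b[0] - a[0])

# e.g. lowest points (1, 1) on the left and (3, 0) on the right give a
# small clockwise tilt of atan2(-1, 2)
```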

Check if it’s what you wanted.

Best,
RC

Hi René, I apologize for not being more clear about the goal. Yes, the preferred approach is to move the plane to the mesh, not move the mesh.

Essentially, the plane (XY, YZ, or XZ) is constrained to movement along the X, Y, or Z axis and rotation about a principal axis.

There may be another way of finding these two points, but this is how I think of it mechanically.

Thank you for the file! I’ll take a look at it now.


There’s an Evaluate Box component determining the plane from which you find said lowest points. This box evaluation uses a value of ‘zero’ (it’s a panel with a 0 in it), but if you replace that W coordinate with a 0 to 1 decimal slider, you can evaluate the box on its Z axis, thus moving the ‘local’ coordinate system. Anyway I still don’t know if I grasped the whole thing :stuck_out_tongue:

Here is a 2D representation of the goal. The two desired points are in the red circles. The only difference is that in 3D, the mesh’s points of contact will not lie in the same XY plane (a plane parallel to the world coordinates). Does the image help?

Edit: I guess in 2D, I should have labeled the blue line as a “line”, not a “plane”. But it represents the plane in my application.


The Plane Coordinates component is used here to take care of that.

In other words, the two low points are found with respect to the ‘base plane’ of the mesh’s bounding box. This base plane isn’t the same as Rhino’s XY - instead the bounding box is given an XZ plane for its orientation.
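For what it’s worth, the “relative to a base plane” part boils down to dot products; a minimal sketch of my own (not the internals of the Plane Coordinates component):

```python
def to_plane_coords(p, origin, xaxis, yaxis, zaxis):
    """Express world point p in the local frame of a plane given by its
    origin and unit, mutually perpendicular axes."""
    d = tuple(p[i] - origin[i] for i in range(3))
    def dot(a, b):
        return sum(a[i] * b[i] for i in range(3))
    return (dot(d, xaxis), dot(d, yaxis), dot(d, zaxis))

# In an XZ-oriented frame (x=(1,0,0), y=(0,0,1), z=(0,-1,0)), "lowest"
# means smallest local y, i.e. evaluating the box back to front rather
# than bottom to top.
```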

*I did notice you were creating plane surfaces with specific dimensions - that I didn’t do; I simply generated planes through the shape to visualize them quickly. However, you can now use the centers of these planes to generate your plane surfaces with specified X and Y dimensions, in case those matter.

Sorry, I’m not sure I understand. I do understand using evaluate box, but how would the box be aligned to the mesh - manually? In my GH file, I do align the blue plane manually, but the goal is to automate that task to improve accuracy.

Perhaps I should try and state the goal another way, again using the 2D image for reference and simplification. Imagine the mesh (femur bone) is fixed in its current position. Imagine the blue line fell from the sky (Y axis). After bouncing initially, the blue line (plane) would rest on two points (in red circles). The goal is to find the coordinates of those two points.
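Trying to put that “falling line” picture into code terms (a rough 2D sketch of my own, no Rhino involved, and assuming the line stays near its starting horizontal orientation): the line comes to rest on an edge of the lower convex hull of the projected points, specifically the hull edge at the globally lowest point that requires the least rotation from horizontal:

```python
import math

def cross(o, a, b):
    """Twice the signed area of triangle o-a-b (> 0 for a left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def lower_hull(points):
    """Lower convex hull, left to right (Andrew's monotone chain)."""
    hull = []
    for p in sorted(set(points)):
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) <= 0:
            hull.pop()
        hull.append(p)
    return hull

def resting_edge(points):
    """The two points a line dropped along -Y comes to rest on: the line
    touches the lowest point first, then rotates until it hits that
    point's nearest hull neighbour (the adjacent lower-hull edge tilted
    least from horizontal)."""
    hull = lower_hull(points)
    i = min(range(len(hull)), key=lambda k: hull[k][1])  # lowest point
    edges = [(hull[j], hull[j + 1]) for j in (i - 1, i)
             if 0 <= j < len(hull) - 1]
    return min(edges, key=lambda e: abs(math.atan2(e[1][1] - e[0][1],
                                                   e[1][0] - e[0][0])))
```

In 3D you would project the mesh vertices into the plane of motion first, then lift the two returned points back to their original coordinates.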


Did you check the file?

The bounding box looks like a default one except its orientation is changed by specifying a plane for it, in this case the XZ plane - this means you evaluate the box from ‘back to front’, or vice versa, as opposed to bottom to top. Looking at your sketch it seems I got what you wanted, but maybe not? Can’t know yet. If you intend to find the mesh’s plane you can try using Plane Fit on the mesh points, then feed this plane into the bounding box.

Yes, but it didn’t seem different than the file I posted. I’ll download and check again. :grinning:


?

It was definitely different :sweat_smile: unless nothing came in the file? LoL - paste a screenshot of it?

I did use the same file name, though.

In your file the mesh was still crossing your manipulated plane - I meant to avoid this using the lowest points approach.

OK, I downloaded it again. It is definitely different. Will take me some time to study it.

I thought I had closed all GH files, but perhaps I opened the wrong file. My mistake!


Cool - by the way in the Rotate Direction component you can swap the source and target directions, and feed it the mesh instead of the plane, in case you want to align the mesh instead of the plane.

Nice! That is a great feature.

It is close…but if you look at the yellow highlighted area of the attached image, you’ll notice that is approximately the point of contact, not the solved point.

On anatomy where the curvature has a small effective radius, it may be “close enough”. But on anatomy with a very large effective radius (flatter), the point of contact may be off by several centimeters. In simple geometry terms with lines and circles, the solved point is not where the line/plane is actually tangent - I hope this makes sense.
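To make the “large effective radius” point concrete, here is a toy Python check of my own: on a coarse sampling of a large-radius arc, the sample picked as the support point under a tilted line can land well away from the true tangent point, and the error shrinks as the sampling gets denser:

```python
import math

def support_error(radius, tilt, n):
    """X-distance between the true tangent point of a support line tilted
    by `tilt` radians under a circle of the given radius, and the support
    point picked from n samples on the circle's lower half."""
    ts = [math.pi + k * math.pi / (n - 1) for k in range(n)]
    pts = [(radius * math.cos(t), radius * math.sin(t)) for t in ts]
    # sampled support point: minimises height above the tilted line
    sx, _ = min(pts, key=lambda p: p[1] - p[0] * math.tan(tilt))
    return abs(sx - radius * math.sin(tilt))  # exact tangent x = R*sin(tilt)

# With R = 100 and a 0.1 rad tilt, the error is on the order of the
# sample spacing, so it drops by roughly 100x going from 20 to 2000 samples.
```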

FWIW, I used the femur as an example, but I plan to use such a tool for other anatomical analyses.


Thank you, that makes a lot of sense! *I think… In this case, then, using two points of contact isn’t going to give us the actual ‘automatic’ solution, right? I’m thinking more along the lines of two ‘lowest regions’ of points. Then with a considerably larger radius, could you need more than 3 points of contact? And the anatomy object shouldn’t really change its position, correct?

Yes, that is another way of thinking about it. And yes, the anatomy shouldn’t change its position.

BTW, thank you very much for your assistance! I have learned a lot from you, not just in this thread, but in other threads as well.


Yes and no. It should not change relative to the global XYZ axes arbitrarily, but if it is easier to accomplish in a different known orientation, that is fine. In other words, if there were a way to constrain a gravity simulator (e.g., Kangaroo) to two points of contact instead of three, that would be fine, since the change of orientation would be a known 90 degrees from its current position. But an arbitrary change to the anatomy’s position is not desired.


Thanks, I am learning as well!

I don’t foresee the need for that. In my application, there should always be a valley between the two “lowest regions” or points of contact.
