I’m trying to align a plane to two points on a mesh. First, I have aligned the anatomical mesh in a desired orientation. I want to move the plane along a single world axis (e.g., the Y axis) until it makes tangential contact with the first two points it meets (note the blue plane just touching the mesh in the attached file). Note that this requires the blue plane to rotate around the Z axis (see attached file).
In the attached file, I approximate the desired behavior by manually adjusting the blue plane with the sliders. If you look closely, you will notice the red points of the femur just piercing the blue plane. The goal is to automate the plane alignment in GH to both improve accuracy and workflow efficiency.
With MoveToPlane, the mesh/object only translates until it contacts the plane at a single point (note the red plane in the attached file). Thus, it is not a solution.
With Plane_3pt, there is no way to know in advance which two points on the mesh are tangent to the same line/plane, so the result is inaccurate.
With Kangaroo (using gravity simulation), the mesh will lose its alignment as it will want to settle on the ground plane (XY) with three points of contact, not just the first two. I’m not aware of a way to confine the effect of gravity in Kangaroo to just one plane with a 3D rigid body.
Generally, I try to find a solution by first studying documentation, forums, and Vimeo/YouTube, but after a couple of days, I’m stuck. Your help is appreciated…
Hi @colorado1876, without a picture of your goal I wasn’t 100% sure I got it - then in your file, given the plane manipulations you were doing, it wasn’t clear to me which plane was the important one. Nonetheless, it seems you wish to rotate the plane, not the mesh, as your topic title suggests.
In this method the mesh is ‘segmented’ into two ‘halves’ to begin with, then the lowest point of each half, relative to the ‘base’ (red) plane, is found. The lower of these two points is also identified, a line is formed through both, and the base plane is then ‘aligned’ to it via Rotate Direction.
There’s an Evaluate Box component determining the plane from which you find said lowest points. This box evaluation uses a value of ‘zero’ (it’s a panel with a 0 in it), but if you replace that W coordinate with a 0 to 1 decimal slider, you can evaluate the box along its Z axis, thus moving the ‘local’ coordinate system. Anyway, I still don’t know if I grasped the whole thing.
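In case a scripted version is easier to pick apart than the canvas, here’s a rough GhPython sketch of the same idea - treat it as a sketch only, since the split at the middle of the box’s X range, the input names (`mesh`, `base_plane`) and the final rotation step are my assumptions about the definition, and the rotation sign isn’t resolved the way Rotate Direction does it:

```python
import Rhino.Geometry as rg

pts = mesh.Vertices.ToPoint3dArray()

# Express every vertex in the base plane's coordinate system
# (this is what the Plane Coordinates component does on the canvas)
local_pts = []
for p in pts:
    ok, q = base_plane.RemapToPlaneSpace(p)
    local_pts.append(q)          # q.Z is the height above the base plane

# Split the vertices into two 'halves' at the midpoint of the X range
# (my assumption for the split rule)
xs = [q.X for q in local_pts]
mid_x = 0.5 * (min(xs) + max(xs))
half_a = [i for i, q in enumerate(local_pts) if q.X < mid_x]
half_b = [i for i, q in enumerate(local_pts) if q.X >= mid_x]

# Lowest vertex of each half, measured as height above the base plane
low_a = min(half_a, key=lambda i: local_pts[i].Z)
low_b = min(half_b, key=lambda i: local_pts[i].Z)

# Line through the two low points, then rotate the base plane so its X axis follows it
contact_line = rg.Line(pts[low_a], pts[low_b])
angle = rg.Vector3d.VectorAngle(base_plane.XAxis, contact_line.Direction)
aligned = rg.Plane(base_plane)
aligned.Rotate(angle, base_plane.ZAxis)   # unsigned angle; flip it if the plane tilts the wrong way
```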
Here is a 2D representation of the goal. The two desired points are in the red circles. The only difference is that in 3D, the mesh points of contact will not lie in the same XY plane (a plane parallel to the world axes). Does the image help?
The Plane Coordinates component is used here to take care of that.
In other words, the two low points are found with respect to the ‘base plane’ of the mesh’s bounding box. This base plane isn’t the same as Rhino’s XY - instead the bounding box is given an XZ plane for its orientation.
*I did notice you were creating plane surfaces with specific dimensions - that I didn’t do; I simply generated planes through the shape to visualize them quickly. However, you can now use the centers of these planes to generate your plane surfaces with specified X and Y dimensions, in case those matter.
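For example, if those dimensions matter, something along these lines should rebuild a plane surface of a set size centered on whichever plane you end up with (`found_plane`, `width` and `height` are made-up input names):

```python
import Rhino.Geometry as rg

# Plane surface of a given width/height, centered on the found plane
srf = rg.PlaneSurface(found_plane,
                      rg.Interval(-0.5 * width, 0.5 * width),
                      rg.Interval(-0.5 * height, 0.5 * height))
```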
Sorry, I’m not sure I understand. I do understand using Evaluate Box, but how would the box be aligned to the mesh - manually? In my GH file, I do align the blue plane manually, but the goal is to automate that task to improve accuracy.
Perhaps I should try and state the goal another way, again using the 2D image for reference and simplification. Imagine the mesh (femur bone) is fixed in its current position. Imagine the blue line fell from the sky (Y axis). After bouncing initially, the blue line (plane) would rest on two points (in red circles). The goal is to find the coordinates of those two points.
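To pin the goal down, here is how I picture the computation - just a sketch of the idea, not something I have working in GH, and it assumes the line approaches from the +Y side and only ever rotates about world Z, so everything can be done in the XY projection of the mesh vertices:

```python
import math
import Rhino.Geometry as rg

pts = mesh.Vertices.ToPoint3dArray()

first = max(pts, key=lambda p: p.Y)          # first vertex the falling line would touch

# Second contact: pivot the (initially X-parallel) line about 'first' until it hits
# another vertex; keep one candidate per pivot direction.
cw_best, cw_tilt = None, math.pi
ccw_best, ccw_tilt = None, math.pi
for q in pts:
    dx, dy = q.X - first.X, q.Y - first.Y    # dy <= 0 for every other vertex
    if abs(dx) < 1e-9 and abs(dy) < 1e-9:
        continue
    if dx > 0:                                # clockwise pivot reaches +X-side vertices first
        tilt = math.atan2(-dy, dx)
        if tilt < cw_tilt:
            cw_best, cw_tilt = q, tilt
    elif dx < 0:                              # counter-clockwise pivot reaches -X-side vertices
        tilt = math.atan2(-dy, -dx)
        if tilt < ccw_tilt:
            ccw_best, ccw_tilt = q, tilt

# Which pair the line really settles on depends on where its weight sits,
# so both candidate pairs are kept and one can be chosen.
pair_cw = [first, cw_best]
pair_ccw = [first, ccw_best]
```

From whichever pair is the right one, the blue plane would then just be the vertical plane through the two points (e.g. `rg.Plane(first, second - first, rg.Vector3d.ZAxis)`).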
The bounding box looks like a default one except its orientation is changed by specifying a plane for it, in this case the XZ plane - this means you evaluate the box from ‘back to front’, or vice versa, as opposed to bottom to top. Looking at your sketch it seems I got what you wanted, but maybe not? Can’t know yet. If you intend to find the mesh’s plane you can try using Plane Fit on the mesh points, then feed this plane into the bounding box.
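A scripted version of that last suggestion would be roughly this (assuming the fit succeeds; the Box constructor that takes a plane plus points is what orients the box to it):

```python
import Rhino.Geometry as rg

pts = mesh.Vertices.ToPoint3dArray()

# Fit a plane through all mesh vertices...
result, fit_plane = rg.Plane.FitPlaneToPoints(pts)

# ...and use it to orient the bounding box instead of the world XZ plane
if result == rg.PlaneFitResult.Success:
    oriented_box = rg.Box(fit_plane, pts)
```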
Cool - by the way in the Rotate Direction component you can swap the source and target directions, and feed it the mesh instead of the plane, in case you want to align the mesh instead of the plane.
It is close… but if you look at the yellow highlighted area of the attached image, you’ll notice that it marks the approximate point of contact, which is not the same as the solved point.
On anatomy where the curvature has a small effective radius, it may be “close enough”. But on anatomy with a very large effective radius (flatter), the point of contact may be off by several centimeters. In simple geometry terms with lines and circles, the solved point is not where the line/plane is actually tangent - I hope this makes sense.
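A rough way I think about the radius effect (the numbers below are made up, just to show the scale): if the local profile is close to a circular arc of radius R and the final plane ends up tilted by an angle a relative to the direction the ‘lowest’ point was measured in, the true tangency point slides roughly R·sin(a) along the surface, even though that lowest vertex only sits about R·(1 − cos a) off the tilted plane:

```python
import math

R = 200.0              # assumed local radius of a fairly flat region, in mm
a = math.radians(5.0)  # assumed tilt of the final plane vs. the measurement direction

shift = R * math.sin(a)          # ~17 mm: how far the true tangency point slides along the surface
gap = R * (1.0 - math.cos(a))    # ~0.8 mm: how far the 'lowest' vertex sits off the tilted plane
```

So a plane that looks like it is only a fraction of a millimetre off can still be resting on a vertex that is centimetres away from the true tangency point on a flat region.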
FWIW, I used the femur as an example, but I plan to use such a tool for other anatomical analyses.
Thank you, that makes a lot of sense! *I think… In this case, then, using two points of contact isn’t going to give us the actual ‘automatic’ solution, right? I think it’s more like two ‘lowest regions’ of points. And with a considerably larger radius, could you need more than three points of contact? Also, the anatomy object shouldn’t really change its position, correct?
Yes and no. It should not change relative to the global XYZ axes arbitrarily, but if it is easier to accomplish in a different known orientation, that is fine. In other words, if there were a way to constrain a gravity simulator (e.g., Kangaroo) to two points of contact instead of three, that would be fine, since I know the change of orientation is 90 degrees from its current position. But an arbitrary change of the anatomy’s position is not desired.