I have a bunch of curves (“rays”) pointing out from the center of a mesh sphere with a somewhat irregular surface. I want the mesh sphere to cut the “rays”. But a mesh doesn’t seem to be capable of cutting curves(?).
What I am trying to do is to measure the TOTAL length of all the cut curves, and divide that length by the number of curves (in this case 19). And what would the result represent? (Hint: The Ave. Rage. Ra. Dius. Of the irregular spherical surface. Ahum.)
So how do I achieve my goal, if I absolutely cannot cut my “rays”?
(If not obvious, the green paper plane is only there to illustrate how the rays are distributed towards the mesh surface (from a center point). Like a cross, that is.)
@Chuck & @clement,
Unfortunately the mesh sphere is full of inner junk, so running the _Sphere _FitPoints command pushes the result out of center (the same goes for a volume-based approach, which gives just as arbitrary results). So I would need only the outermost surface, without all the crap inside, in order to get the center point and the radius. (This is actually a bone structure, with cavities and bone marrow and… “junk” in this context.)
This is the result of running the command _Sphere _FitPoints :
… and the inside (bottom perspective view) of the spherical point cloud (which explains why the sphere was pushed out) :
It seems I will have to resort to intersecting/cutting with a bunch of surfaces, resulting in those ugly curves, but they can be fixed (as I learned in the other post today), and from those curves I can reconstruct a surface, and so on, until I have my sphere center.
This looks more and more like a job for Grasshopper, since this task will be repeated many times, so I hope that the commands that work for the manual task are also available in GH…
if scripting is an option you might also try Intersection.MeshLine. You’ll get many points; then find the farthest point from your ray origin. Before doing this, avoid measuring points which are “behind” the origin and filter these out first. You might create a plane from the ray vector (so that the vector is the z-axis of the plane) and measure only points which have a positive z-value in that plane.
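Outside of Rhino, the filtering step described above can be sketched in plain numpy (`farthest_forward_point` is a hypothetical helper name; inside Rhino you would feed it the points returned by a MeshLine intersection):

```python
import numpy as np

def farthest_forward_point(origin, direction, points):
    """Of the intersection points, keep only those lying 'in front of'
    the ray origin (positive component along the ray direction) and
    return the farthest one, or None if none qualify."""
    origin = np.asarray(origin, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    pts = np.asarray(points, float)
    # signed distance of each point along the ray direction
    t = (pts - origin) @ d
    forward = pts[t > 0]
    if len(forward) == 0:
        return None
    return forward[np.argmax(t[t > 0])]
```

The dot product with the unit ray direction plays the role of the “z-value in the plane”: points behind the origin get a negative value and are dropped before taking the farthest hit.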
What if you set the CPlane perpendicular to the ray, and project the endpoint to the mesh. That should give you all of the intersections of the line with the mesh. Then take the point closest to the end of the ray, in the right direction.
When I put the starting point for the rays approximately near the center of the mesh, I get the results mentioned in the previous post. But when I put the starting point like this (only a short arbitrary distance inside the mesh), the calculated radius is only about half the true radius :
As you can see I have “spread out” the rays to cover most of the spherical area that seems meaningful to scan, but the method fails to arrive at a correct radius when not starting from near the center. So I modified the code to specify a starting point defined by the center of a “guide sphere”. That forces the algorithm to always “look towards the center”, as long as that guide is inside the mesh.
Still have not found a good way to determine the final (Mesh) sphere center (I also tried to add a plane surface based on the “point cloud”, but that surface tends to end up being placed way out in cyberspace, so apparently I don’t understand how AddPlaneSurface works).
the ray lines in my file were drawn by hand of course, so some user interaction is required to find out which area of the bone is spherical. If this user interaction is allowed, you might get a useful result by going with my first advice. Let the user select a few mesh faces which visually contribute to a sphere using:
then from the extracted mesh faces, get the topology vertices using _ExtractPoints and finally fit a sphere using _Sphere _FitPoints.
The only way i know finding the sphere without any user interaction from the mesh is to use RANSAC and provide the number of points to search for and a tolerance value. You can do that in cloudcompare using the bone mesh vertices or the convex hull of the bone mesh vertices.
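For readers without CloudCompare at hand, the RANSAC idea can be sketched in plain numpy: repeatedly fit a sphere through a minimal sample of 4 points, count how many of all points lie within a tolerance of that sphere’s surface, and keep the best fit. Function names and parameters here are illustrative, not CloudCompare’s API:

```python
import numpy as np

def sphere_through_points(sample):
    """Exact sphere through 4 points: solve |p|^2 = 2 c.p + (r^2 - |c|^2),
    which is linear in the center c and the constant term."""
    A = np.c_[2 * sample, np.ones(4)]
    b = (sample ** 2).sum(axis=1)
    x = np.linalg.solve(A, b)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius

def ransac_sphere(points, tol=0.05, iters=500, seed=0):
    """Find the sphere supported by the most points, ignoring outliers
    (cavities, marrow, junk) that would bias a plain fit."""
    pts = np.asarray(points, float)
    rng = np.random.default_rng(seed)
    best = (None, None, -1)
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 4, replace=False)]
        try:
            c, r = sphere_through_points(sample)
        except np.linalg.LinAlgError:
            continue  # degenerate (coplanar) sample, skip it
        inliers = np.abs(np.linalg.norm(pts - c, axis=1) - r) < tol
        n = inliers.sum()
        if n > best[2]:
            best = (c, r, n)
    return best[0], best[1]
```

This is why RANSAC suits the bone mesh: the inner “junk” vertices simply never become inliers of the winning sphere, so they don’t pull the center off like they do in a plain fit.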
I get fairly good results generating the points from my rays, now that I introduced a point that is guaranteed to be inside the mesh for the intersections to orient towards. But it doesn’t seem like the command _Sphere _FitPoints is available from Python? I can’t find it in the help file. (And the rs.AddSphere command doesn’t have a _FitPoints option.)
If I had _FitPoints in Py the problem would be solved (I have a Grasshopper def to orient the curves to relevant parts of the mesh sphere, so no problem there either).
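There is no _FitPoints wrapper in rhinoscriptsyntax as far as I can tell, but the fit itself is a small linear least-squares problem that can be done directly. A minimal sketch in numpy (the same approach would run on the ray/mesh intersection points collected in the script):

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit, a stand-in for the
    _Sphere _FitPoints command. Rewrites |p - c|^2 = r^2 as the
    linear equation |p|^2 = 2 c.p + (r^2 - |c|^2) and solves for
    the center c and the constant term in a least-squares sense."""
    pts = np.asarray(points, float)
    A = np.c_[2 * pts, np.ones(len(pts))]
    b = (pts ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius
```

Note this minimizes an algebraic residual, not the true geometric distance, but for points that roughly cover a spherical cap it gives a usable center and radius; RhinoCommon may also expose a sphere-fitting method on Rhino.Geometry.Sphere that a Python script in Rhino could call instead (worth checking the RhinoCommon docs).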
I may make a GH Script component for this task, and I just found that GH has a component for this (result pictured below). I have concluded that I will go with using the points generated from the ray/mesh intersections in the code.
The generation of points by selecting polygons on the surface is too slow. I will instead make the GH rays cover more area, and skip intersection points whose distance changes too abruptly (grooves must not be included anyway). That takes manual interaction, which is now done with an MD slider in GH, allowing the user to rotate the rays away from grooves and shrink the spread angle of the rays to stay only within the spherical area.
Thank you very much for your help so far.
With this I have found the optimal solution with the least demand for precision from the user. The groove on the mesh sphere is a lesion which has to be located separately and manually anyway, and the ray/mesh intersections give all the control that the user needs to semi-automagically locate whatever needs to be located.
Fig.1. Results similar to, and even better than, the sphere below can be achieved with denser patterns of ray/mesh-intersection points :