How to split curves with a mesh?

OK, here’s the next silly attempt.

I have a bunch of curves (“rays”) pointing out from the center of a mesh sphere with a somewhat irregular surface. I want the mesh sphere to cut the “rays”. But a mesh doesn’t seem to be capable of cutting curves(?).

What I am trying to do is to measure the TOTAL length of all the cut curves, and divide that length by the number of curves (in this case 19). And what would the result represent? (Hint: The Ave. Rage. Ra. Dius. Of the irregular spherical surface. Ahum.)

So how do I achieve my goal, if I absolutely cannot cut my “rays”? :slight_smile:

(If not obvious, the green paper plane is only there to illustrate how the rays are distributed towards the mesh surface (from a center point). Like a cross, that is.)

I would like to become a very, very, very happy Rhino user. :innocent:

// Rolf

Hi! MeshToNURB will create a polysurface version of the mesh.

You might try this; it uses all the sphere points, with no line intersections involved:

  1. _ExtractPt the deformed mesh sphere points
  2. Create a NURBS sphere using _Sphere _FitPoints
  3. Measure the radius of the NURBS sphere
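
If you are curious what the fit in step 2 amounts to, here is a minimal, dependency-free Python sketch of the standard algebraic least-squares sphere fit (illustrative only; not necessarily what Rhino runs internally):

```python
import math

def fit_sphere(points):
    """Algebraic least-squares sphere fit.
    Uses the linearization |p|^2 = 2*c.p + (r^2 - |c|^2), so the
    unknowns (2cx, 2cy, 2cz, k) solve a plain linear system."""
    n = 4
    # Accumulate the 4x4 normal equations M x = v
    M = [[0.0] * n for _ in range(n)]
    v = [0.0] * n
    for (x, y, z) in points:
        row = [x, y, z, 1.0]
        f = x * x + y * y + z * z
        for i in range(n):
            for j in range(n):
                M[i][j] += row[i] * row[j]
            v[i] += row[i] * f
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            t = M[r][col] / M[col][col]
            for j in range(col, n):
                M[r][j] -= t * M[col][j]
            v[r] -= t * v[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = v[r] - sum(M[r][j] * x[j] for j in range(r + 1, n))
        x[r] = s / M[r][r]
    cx, cy, cz = x[0] / 2.0, x[1] / 2.0, x[2] / 2.0
    r = math.sqrt(x[3] + cx * cx + cy * cy + cz * cz)
    return (cx, cy, cz), r
```

Feed it the extracted point coordinates and it returns the fitted center and radius.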

c.

Or you could find the volume, divide by 4pi/3, and take the cube root. Should be pretty close.
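
In plain Python that is just the sphere volume formula inverted (the function name is mine):

```python
import math

def sphere_radius_from_volume(volume):
    """Invert V = (4/3) * pi * r**3 to recover the radius."""
    return (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)
```

Feed it whatever volume Rhino reports for the closed mesh.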

@Chuck & @clement,
Unfortunately the mesh sphere is full of inner junk, so running the _Sphere _FitPoints command pushes the result out of center (same thing with the volume, which gives just as arbitrary a result). So I would need only the outermost surface, without all the junk inside, in order to get the center point and the radius. (This is actually a bone structure, with cavities and bone marrow and… “junk”, in this context :slight_smile: )

This is the result of running the command _Sphere _FitPoints :

… and the inside (bottom perspective view) of the spherical point cloud (which explains why the sphere was pushed out) :

It seems I will have to resort to intersecting/cutting with a bunch of surfaces, resulting in those ugly curves, but they can be fixed (as I learned in the other post today); from those curves I can reconstruct a surface, and so on, until I have my sphere center.

This looks more and more like a job for Grasshopper, since this is a task that will be repeated many times, so I hope that the commands that work for the manual task are also available in GH…

// Rolf

Hi Rolf,

if scripting is an option, you might also try Intersection.MeshLine. You’ll get many points; then find the farthest point from your ray origin. Before doing this, filter out the points which are “behind” the origin. You might create a plane from the ray vector (so that the vector is the z-axis of the plane) and measure only points which have a positive z-value in that plane.
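
The filtering boils down to a signed distance along the ray direction. A minimal plain-Python sketch (a hypothetical helper; you would feed it the points returned by Intersection.MeshLine as `hits`):

```python
import math

def farthest_hit(origin, direction, hits):
    """From mesh/line intersection points, keep only those in front of
    the ray origin (positive component along the ray direction) and
    return the farthest one, or None if nothing lies in front."""
    dlen = math.sqrt(sum(d * d for d in direction))
    unit = tuple(d / dlen for d in direction)
    best, best_t = None, 0.0
    for p in hits:
        v = tuple(p[i] - origin[i] for i in range(3))
        t = sum(v[i] * unit[i] for i in range(3))  # signed distance along the ray
        if t > 0.0 and t > best_t:
            best, best_t = p, t
    return best
```

The dot product plays the role of the positive z-value in the plane described above.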

c.

Hm, I just started to think that perhaps Drape can be used?

I’ve never used it, but it “sounds” like a solution, if it stops hanging on the first points it encounters (thus avoiding the inner-junk-problem).

// Rolf

Hi Rolf,

if you can post a small partial mesh and a few of the ray lines, I might try something as described above.

c.

Here,

This is a low quality mesh, but within the tolerance requirements so no problem.

test_humerus_head_000.3dm (1.3 MB)

// Rolf

What if you set the CPlane perpendicular to the ray, and project the endpoint to the mesh. That should give you all of the intersections of the line with the mesh. Then take the point closest to the end of the ray, in the right direction.

@chuck, this is what I suggested above.
@RIL, see below example script and file containing some test lines…

test_humerus_head_cg.3dm (1.4 MB)

…wait, found a bug…

c.

@RIL, as long as your ray lines are long enough, this should work:

Rolf_BoneSphere.py (1.2 KB)

btw. you might also try RANSAC to find a sphere in the mesh, without any ray lines…

c.

@clement,
Big smile. :grin:

I think it works pretty well. I had tried eye-balling earlier and made a note of a radius: R=20.69. Your script lands at R=20.77. Pretty close.

Perfetto. I let Grasshopper generate the rays, like so:

Now it remains to find the mesh sphere center and orient the new sphere to that location.

Thank you very much, this script made things much easier for me.

// Rolf

@clement,
When I put the starting point for the rays approximately at the center of the mesh, I get the results mentioned in the previous post. But when I put the starting point like this (only a short, arbitrary distance inside the mesh), the calculated result is only about half the radius :

As you can see, I have “spread out” the rays to cover most of the spherical area that seems meaningful to scan, but the method fails to arrive at a correct radius when not starting from near the center. So I modified the code to specify a starting point defined by the center of a “guide sphere”. That forces the algorithm to always “look towards the center”, as long as the guide is inside the mesh.
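
The bias is easy to reproduce with a little 2D arithmetic. A sketch in plain Python (a circle standing in for the mesh sphere; all numbers purely illustrative):

```python
import math

R, d = 20.0, 15.0   # true radius, and ray-origin offset from the real center
# Rays fan out +/-60 degrees from an origin at (d, 0) inside the circle
# x^2 + y^2 = R^2. Each hit distance t is the positive root of
#   t^2 + 2*d*t*cos(phi) + d^2 - R^2 = 0.
dists = []
for deg in range(-60, 61, 5):
    phi = math.radians(deg)
    t = -d * math.cos(phi) + math.sqrt((d * math.cos(phi)) ** 2 + R * R - d * d)
    dists.append(t)
avg = sum(dists) / len(dists)
# avg comes out far below R = 20: rays fanning toward the near surface
# report short distances, which drags the averaged "radius" way down.
```

Aiming the rays at a guide point near the true center, as described above, removes exactly this bias.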

I still have not found a good way to determine the final (mesh) sphere center (I also tried to add a planar surface based on the “point cloud”, but that surface tends to end up way out in cyberspace, so apparently I don’t understand how AddPlaneSurface works).

Latest version of the code :
OrientHumeralHead.py (2.9 KB)

// Rolf

Hi Rolf,

the ray lines in my file were drawn by hand, of course, so some user interaction is required to find out which area of the bone is spherical. If this user interaction is allowed, you might get a useful result by going with my first advice. Let the user select a few mesh faces which visually contribute to a sphere using:

_-ExtractConnectedMeshFaces _SelectFaces=LessThan _AngleBetweenFaces 4

then from the extracted mesh faces, get the topology vertices using _ExtractPoints and finally fit a sphere using _Sphere _FitPoints.

The only way I know of finding the sphere from the mesh without any user interaction is to use RANSAC, providing the number of points to search for and a tolerance value. You can do that in CloudCompare using the bone mesh vertices, or the convex hull of the bone mesh vertices.
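
For illustration, a minimal RANSAC sphere search in plain Python (my own sketch, not CloudCompare's implementation): repeatedly sample 4 points, build the exact sphere through them, and keep the candidate supported by the most inliers within tolerance:

```python
import math, random

def solve4(M, b):
    """Gaussian elimination for a 4x4 system, with partial pivoting."""
    n = 4
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        b[c], b[p] = b[p], b[c]
        if abs(M[c][c]) < 1e-12:
            return None  # degenerate (e.g. coplanar) sample
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n):
                M[r][j] -= f * M[c][j]
            b[r] -= f * b[c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def sphere_through(pts):
    """Exact sphere through 4 points, or None if degenerate."""
    M = [[2 * x, 2 * y, 2 * z, 1.0] for (x, y, z) in pts]
    b = [x * x + y * y + z * z for (x, y, z) in pts]
    s = solve4(M, b)
    if s is None:
        return None
    c = s[:3]
    r2 = s[3] + sum(v * v for v in c)
    return (tuple(c), math.sqrt(r2)) if r2 > 0 else None

def ransac_sphere(points, tol, iters=200, seed=0):
    """Return the candidate sphere with the most points within tol."""
    rng = random.Random(seed)
    best, best_inliers = None, 0
    for _ in range(iters):
        cand = sphere_through(rng.sample(points, 4))
        if cand is None:
            continue
        (cx, cy, cz), r = cand
        inl = sum(1 for (x, y, z) in points
                  if abs(math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) - r) <= tol)
        if inl > best_inliers:
            best, best_inliers = cand, inl
    return best
```

Because inner-junk points rarely agree with any one sphere, they simply never win the inlier vote, which is exactly why RANSAC copes with the bone marrow problem.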

c.

I get fairly good results generating the points from my rays, now that I have introduced a point that is guaranteed to be inside the mesh for the intersections to orient towards. But the command _Sphere _FitPoints doesn’t seem to be available from Python? I can’t find it in the help file (and the command rs.AddSphere doesn’t have a _FitPoints option).

If I had _FitPoints in Py the problem would be solved (I have a Grasshopper def to orient the curves to relevant parts of the mesh sphere, so no problem there either).

// Rolf

It should be available through RhinoCommon if you look here.

c.

Ah, thanks!

I may make a GH Script component for this task, and I just found that GH has a component for this (result pictured below). I have concluded that I will go with using the points generated from the ray/mesh intersections in the code.

The generation of points by selecting polygons on the surface is too slow. I will instead make the GH rays cover more area and skip intersection points whose distance changes too abruptly (grooves must not be included anyway). That takes manual interaction, which is now done via an MD slider in GH, allowing the user to rotate the rays away from grooves and shrink the spread angle of the rays to stay within the spherical area.
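
That "abrupt distance" filter can be sketched like this (a hypothetical helper of mine, assuming consecutive rays give neighbouring intersection points):

```python
def drop_abrupt(points, distances, max_jump):
    """Keep intersection points whose distance-from-origin does not jump
    more than max_jump relative to the previously kept point, so that
    groove hits are excluded before fitting the sphere."""
    kept = []
    last = None
    for p, d in zip(points, distances):
        if last is None or abs(d - last) <= max_jump:
            kept.append(p)
            last = d
    return kept
```

Points that plunge into the groove simply fall outside the jump tolerance and never reach the fit.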

Thank you very much for your help so far.

With this I have found the optimal solution with the least demand for precision from the user. The groove on the mesh sphere is a lesion which has to be located separately and manually anyway, and the ray/mesh intersections give all the control the user needs to semi-automagically locate whatever needs to be located.

Fig.1. Results similar to, and even better than, the sphere below can be achieved with denser patterns of ray/mesh-intersection points :


Edit: Workflow so far (a bit messy still, but will be cleaned up and automated as much as possible after enough accuracy is achieved) :

https://drive.google.com/file/d/0B2OlFpI0gNEGOTlkYUZhdmFtdzg/view

// Rolf