There are various ways to do that, but in cases like the captured mesh, use this (add the obvious checks etc.):
BTW: Do you know all things related to mesh V, E, F connectivity? Can you map vertex indices to topology vertex indices (and the other way around)? Do you have any decent Ball Pivot C#? (Or Delaunay, for some cases.)
OK, I can get the inner loops. But since they are Polylines whose segments have lost their relation to the edge indices, I need to do some drilling to get hold of the edge indices somehow.
It seems an index is always needed - to get hold of any of the other indices! Hm.
I bet there must be a way to get the edge index, but so far the documentation has only gotten me to -“you need an index for the index you are looking for”, kind of.
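For the record, RhinoCommon does have a direct lookup for this: `Mesh.TopologyEdges.GetEdgeIndex(topoVertexA, topoVertexB)` returns the edge index for a pair of topology vertex indices (or -1 if no such edge exists). The underlying idea is just a dictionary keyed on sorted vertex pairs; here is a plain-Python sketch of that idea with made-up edge data (no Rhino involved, all names hypothetical):

```python
# Edge indices keyed by their (sorted) topology-vertex pair, so a naked-edge
# segment's endpoints can be looked up directly -- the same idea behind
# RhinoCommon's Mesh.TopologyEdges.GetEdgeIndex(tvA, tvB).
def build_edge_lookup(edges):
    # edges: list of (topoA, topoB) pairs; list position = edge index
    return {tuple(sorted(e)): i for i, e in enumerate(edges)}

def edge_index(lookup, tv_a, tv_b):
    # Order-independent lookup; -1 means "no such edge" (as in RhinoCommon)
    return lookup.get(tuple(sorted((tv_a, tv_b))), -1)

edges = [(0, 1), (1, 2), (2, 0)]     # a single triangle
lookup = build_edge_lookup(edges)
print(edge_index(lookup, 2, 1))      # 1
print(edge_index(lookup, 0, 3))      # -1
```

So the drilling in Rhino reduces to: map each segment endpoint to its topology vertex index, then ask `TopologyEdges` for the edge between the two.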
All things related? Hm, that would be an exaggeration. I have read up on the half-edge concept, but I have not drilled into Rhino's Mesh. Without a class diagram for all the TopologyLists I feel kind of lost. I always arrive at yet another index which I need in order to get the index I'm searching for. And so on.
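For what it's worth, the vertex/topology-vertex mapping is conceptually simple: a topology vertex is one geometric position shared by one or more mesh vertices (unwelded seams produce several mesh vertices at the same position). A plain-Python sketch of building both maps, with made-up vertex data and hypothetical names:

```python
# Build vertex-index <-> topology-vertex-index maps by grouping mesh
# vertices that share the same (tolerance-rounded) 3D position.
def build_topology_maps(vertices, tol=1e-9):
    key_to_topo = {}    # rounded position -> topology vertex index
    vert_to_topo = []   # mesh vertex index -> topology vertex index
    topo_to_verts = []  # topology vertex index -> [mesh vertex indices]
    for vi, (x, y, z) in enumerate(vertices):
        key = (round(x / tol), round(y / tol), round(z / tol))
        ti = key_to_topo.get(key)
        if ti is None:
            ti = len(topo_to_verts)
            key_to_topo[key] = ti
            topo_to_verts.append([])
        vert_to_topo.append(ti)
        topo_to_verts[ti].append(vi)
    return vert_to_topo, topo_to_verts

# Two mesh vertices sit at the same corner (an unwelded seam):
verts = [(0, 0, 0), (1, 0, 0), (1, 0, 0), (0, 1, 0)]
v2t, t2v = build_topology_maps(verts)
print(v2t)   # [0, 1, 1, 2]
print(t2v)   # [[0], [1, 2], [3]]
```

In RhinoCommon the corresponding calls are `Mesh.TopologyVertices.TopologyVertexIndex(vertexIndex)` for one direction and `Mesh.TopologyVertices.MeshVertexIndices(topologyVertexIndex)` for the other.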
Nope. I have no idea about that concept. Is the concept documented somewhere?
The right thing to do is (a) find all inner holes/topo vertices/blah blah, and then (b) do a Delaunay (requires cleaning/fitting) or a Ball Pivot (requires measuring edges in order to provide the correct search R). Of course for planar (or almost planar) meshes - where Delaunay works OK - this is easy (Mesh Machine plays ball as well in such cases) … but the more the mesh is like a blob (or a very hilly terrain) and the holes are non-planar, things become quite complicated. That said, all my BP things are strictly internal.
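As a hint of what "measuring edges in order to provide the correct search R" can mean in practice: one common heuristic (an assumption here, not Peter's internal code) is to take the median edge length and scale it up a bit, so the pivoting ball bridges typical gaps without rolling through the mesh:

```python
# Heuristic ball-pivoting search radius from edge-length statistics:
# a ball somewhat larger than the typical edge. The 1.5 factor is an
# assumption for illustration, not a universal constant.
def search_radius(edge_lengths, factor=1.5):
    lengths = sorted(edge_lengths)
    median = lengths[len(lengths) // 2]  # robust against a few long edges
    return factor * median

print(search_radius([1.0, 1.1, 0.9, 1.0, 4.0]))  # 1.5
```

The median (rather than the mean) keeps a few long stray edges from inflating R.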
You guys are seriously overcomplicating things here; the method (i.e. get naked edge loops and triangulate these) I linked to above will likely work. Although there might be situations such as these that require one to further break down the polylines forming the edge loops to ensure valid meshing (in this case, these three holes might be returned as one polyline which will have overlapping vertices):
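That "further break down" step can be sketched like this: walk the loop and emit a closed sub-loop each time a vertex repeats (plain Python, hypothetical 2D points; a real version would compare within tolerance):

```python
# Split a naked-edge loop that visits some vertex more than once
# (e.g. several holes meeting at a shared vertex) into separate sub-loops.
def split_at_repeats(points):
    loops, stack, seen = [], [], {}
    for p in points:
        if p in seen:
            i = seen[p]
            loops.append(stack[i:] + [p])  # emit the closed sub-loop
            for q in stack[i + 1:]:
                del seen[q]
            del stack[i + 1:]              # keep the shared vertex on the stack
        else:
            seen[p] = len(stack)
            stack.append(p)
    if len(stack) > 2:                     # leftover open run: close it
        loops.append(stack + [stack[0]])
    return loops

# Two triangular holes sharing the vertex (0, 0):
pts = [(0, 0), (1, 0), (0, 1), (0, 0), (-1, 0), (0, -1), (0, 0)]
print(split_at_repeats(pts))
# [[(0, 0), (1, 0), (0, 1), (0, 0)], [(0, 0), (-1, 0), (0, -1), (0, 0)]]
```

Each resulting sub-loop is then a simple closed polyline that can be triangulated on its own.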
As for distinguishing between outer/inner loops, I think I’d simply add a MaxLoopSize input parameter, and then only triangulate/mesh edge loops below a certain size. That way you’d sidestep the whole surface-topology concern entirely.
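A minimal sketch of that MaxLoopSize idea, assuming loops come in as point lists (the names and the 2D data are made up for illustration):

```python
# Keep only naked-edge loops short enough to be holes; the outer boundary,
# being (much) longer, is filtered out by the MaxLoopSize threshold.
def loop_length(points):
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

def hole_loops(loops, max_loop_size):
    return [lp for lp in loops if loop_length(lp) <= max_loop_size]

outer = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]   # perimeter 40
hole = [(4, 4), (5, 4), (5, 5), (4, 5), (4, 4)]        # perimeter 4
print(len(hole_loops([outer, hole], max_loop_size=20.0)))  # 1
```

Total loop length is used here rather than vertex count, since a dense outer boundary and a coarse hole can have similar vertex counts but rarely similar lengths.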
Indeed it works for some cases. But the general case (“big” holes + some way to fill the holes [or some way to make the mesh solid] with “as - on average - same looking as possible” faces) is not that easy.
Anyway, for Rolf, that one (freaky MTV, MV connectivity matters etc.):
Sweet Jesus: that thing is not only a scooter but a Suzuki as well. Plus it is painted in Suzuki racing colors (hope dies last: the last time that S won the top class was 2.5+M years ago). But I do admit that the greatest racer ever was a Suzuki rider (2.5+M years ago).
Moral: Ducatium Amamus Dum Spiramus (we love Ducati as long as we breathe).
Other than this very serious matter … that poly.TriangulateClosedPolyline() method produces meshes that are even uglier than the Suzuki (any Suzuki, to be honest). This raises the 1GZ question: where is this world heading? (With regard to Suzuki matters; forget meshes.)
Shameless plug: For anyone interested in high-performance computing (HPC) and my son’s contributions to the field - among other things making Spotify’s workflow system Luigi fit for research in bioinformatics (simplified use), as well as his own creation, the fast & close-to-the-metal slim alternative workflow platform “SciPipe” - have a look at the following links. One of Pachyderm’s developers (Whitenack) compares three current top workflow systems for HPC & streaming data, listing Pachyderm, my son’s version of Luigi (SciLuigi) and his close-to-the-metal SciPipe as the most noteworthy.
Main objectives: simplification (ease of use, restored verifiability and reproducibility, stability, restartability after failed week-long runs, etc.) and speed, by going closer to the metal.
For researchers in the field of bioinformatics and anyone with a need for HPC:
Problems for which all the above serve as partial solutions: recent research shows the increasing technical complexity in bioinformatics ruining the classic approach of independent verification & reproducibility of earlier research.