I am trying to run geometry recognition on a point cloud scan. So far I’ve been able to clean my mesh, separating the desired shapes from unnecessary parts (like the ground plane). I now want to match previously defined breps to the places in the mesh where they are represented. I tried doing this with the Hausdorff distance but found it very difficult since the pieces are too similar. Training a neural network also poses difficulties, because the breps consist of different numbers of faces, which makes it hard to define training data (so far I was only able to use the volume). The separation of the mesh into individual geometries was done with k-means, but I don’t know how to use that for this matching step, since the breps and mesh segments are not sorted in the same way and also have very different numbers of faces.
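One way around the different face counts and orderings might be to compare tessellation-independent shape descriptors instead of the geometry itself. Below is a minimal GhPython-style sketch of that idea; `mesh_segments` (the k-means clusters) and `candidate_breps` (your predefined breps) are assumed input names, and volume/area are just two possible descriptors, not a definitive choice.

```python
# Minimal GhPython-style sketch (assumed inputs: mesh_segments = list of
# Mesh from the k-means step, candidate_breps = list of predefined Brep).
import Rhino.Geometry as rg

def descriptor(geo):
    # Volume and surface area do not depend on face count, ordering or
    # orientation; the volume term assumes the piece is (roughly) closed.
    vol = rg.VolumeMassProperties.Compute(geo).Volume
    area = rg.AreaMassProperties.Compute(geo).Area
    return (vol, area)

def distance(d1, d2):
    # Relative differences, so large and small parts are weighted alike.
    return sum(abs(a - b) / max(abs(a), abs(b), 1e-9) for a, b in zip(d1, d2))

seg_desc = [descriptor(m) for m in mesh_segments]
brep_desc = [descriptor(b) for b in candidate_breps]

# For every segment, pick the predefined brep whose descriptor is closest.
matches = []
for i, sd in enumerate(seg_desc):
    j = min(range(len(brep_desc)), key=lambda k: distance(sd, brep_desc[k]))
    matches.append((i, j))  # segment i is most similar to candidate brep j
```

If the parts really are very similar, two descriptors may not separate them well, but you can extend the tuple with anything else that is independent of the tessellation.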
I’m really lost on how to proceed from here. Originally I thought about doing some kind of packing of the training geometries inside the mesh and optimizing their positions with Galapagos until they fit best. The real challenge, I guess, is that the mesh I want to fit my breps into is a composition of multiple breps, which all need to be arranged inside it.
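Instead of packing everything at once with Galapagos, a per-segment rigid registration (e.g. ICP) might be more direct: once a segment has been matched to a brep, ICP estimates the rotation and translation that fits sample points of that brep onto the segment’s points. A rough sketch, assuming a CPython environment with open3d and numpy available, and two assumed Nx3 arrays `brep_points` (sampled from the candidate brep) and `segment_points` (vertices of one mesh segment) exported from Grasshopper:

```python
import numpy as np
import open3d as o3d

def to_cloud(pts):
    pc = o3d.geometry.PointCloud()
    pc.points = o3d.utility.Vector3dVector(np.asarray(pts, dtype=float))
    return pc

source = to_cloud(brep_points)      # geometry we want to move
target = to_cloud(segment_points)   # where the scan says it is

# Crude initial guess: shift the brep's centroid onto the segment's centroid.
init = np.eye(4)
init[:3, 3] = target.get_center() - source.get_center()

# Refine with point-to-point ICP; the distance threshold is in model units
# and is only a guess here.
result = o3d.pipelines.registration.registration_icp(
    source, target, max_correspondence_distance=0.05, init=init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

print(result.fitness, result.inlier_rmse)
transform = result.transformation   # 4x4 matrix mapping brep -> scan position
```

Note that plain ICP only refines an alignment: if the parts can be rotated arbitrarily, you would need a better initial guess (or a global registration step) than the simple centroid shift used here, and `result.fitness` can tell you how well each candidate actually fits.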
Output goal:
The predefined breps oriented to the positions defined by the point cloud scan/mesh. This would give me their locations in space and, as individual pieces, all the parts the mesh is made of.
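If a registration step like the one above returns a 4x4 matrix, it can be applied back to the predefined brep in Rhino to get exactly that output. A small sketch, where `matrix` (a nested 4x4 list from the registration result) and `brep` are assumed inputs:

```python
import Rhino.Geometry as rg

xform = rg.Transform(1.0)           # start from the identity transform
for r in range(4):
    for c in range(4):
        xform[r, c] = matrix[r][c]  # copy the registration result row by row

placed = brep.DuplicateBrep()       # keep the original library brep untouched
placed.Transform(xform)             # 'placed' now sits where the scan shows it
```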
Please see the screenshots for further explanation.