Working with survey data

Hi guys,
How do you manage big survey data sets?
Rhino is, in general, very slow when working with big meshes and large numbers of points compared to, say, Blender.

For example, if you have something like this (image below) and you need to add roads, make cut and fill, or use Kangaroo for path finding, it just does not work.

Are there smart ways to split the data or process it? Or maybe reduce the point cloud and work on a simpler version?

What do you do in practice?

Thanks

This is about 300,000 points

Go scripting.
Like C# (best in a GH context) or Python (still very good, sure).

OK, I am familiar with C#, so that should not be a problem.
But how should I approach this?
Even Cull Duplicates takes ages :open_mouth:

What is that?

You talked about “add roads” or “make cut and fill” or path finding… “smart ways to split the data or process it”… “reduce the point cloud and work on simple version” …
… and you have 300k points, not a mesh.
The context is totally vague, but generally speaking, with big numbers a dedicated, purpose-specific script usually works best in GH.
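For example, culling duplicate points, which a generic component may do with an expensive all-pairs search, can be scripted in roughly linear time by hashing rounded coordinates. A minimal sketch (the tolerance-grid trick is an assumption, and it can miss near-duplicates that straddle a cell boundary):

```csharp
// Rough O(n) duplicate cull: quantize coordinates to a tolerance grid and
// keep the first point seen in each cell. Caveat: two points closer than
// the tolerance can still land in different cells and both survive.
using System;
using System.Collections.Generic;
using Rhino.Geometry;

static List<Point3d> CullDuplicates(IEnumerable<Point3d> pts, double tol)
{
  var seen = new HashSet<(long, long, long)>();
  var kept = new List<Point3d>();
  foreach (var p in pts)
  {
    var cell = ((long)Math.Round(p.X / tol),
                (long)Math.Round(p.Y / tol),
                (long)Math.Round(p.Z / tol));
    if (seen.Add(cell)) kept.Add(p); // Add returns false for repeats
  }
  return kept;
}
```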

Also, from the picture it seems the map (is it a map?) was already a low-detail mesh (we can see triangles/polygons), but then a much denser grid of points was projected over it, and then some points were removed.
If I were you, I would try to get my hands on the original data, the original mesh and such…


OK, the problem is how to prepare a point cloud so it can be worked with in Grasshopper.
You have one .CSV file with 300K points in it (or more), and any operation on it takes ages to calculate.
Every Grasshopper component takes time to compute.
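For what it's worth, just getting the points out of such a CSV is fast with a streaming reader; a hypothetical sketch (the path and the x,y,z column layout are assumptions):

```csharp
// Hypothetical streaming CSV reader: parses "x,y,z" rows straight into
// Point3d without loading the whole file into memory. Column order is an
// assumption; adjust to the actual survey format.
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using Rhino.Geometry;

static List<Point3d> ReadSurveyCsv(string path)
{
  var pts = new List<Point3d>();
  foreach (string line in File.ReadLines(path)) // streams line by line
  {
    string[] f = line.Split(',');
    if (f.Length < 3) continue; // skip headers and malformed rows
    if (double.TryParse(f[0], NumberStyles.Float, CultureInfo.InvariantCulture, out double x)
      && double.TryParse(f[1], NumberStyles.Float, CultureInfo.InvariantCulture, out double y)
      && double.TryParse(f[2], NumberStyles.Float, CultureInfo.InvariantCulture, out double z))
      pts.Add(new Point3d(x, y, z));
  }
  return pts;
}
```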

Yes, this is a map. The origin of the data is not important, I think.

The only way I can think of (as you say, and it is what I was doing) is to project a grid of points to construct a simple mesh to work with.

But:

  1. Sometimes you can’t construct a detailed mesh at all, because it is so huge
  2. And if you do get it to work, you need a way to apply all your changes back to the original point cloud

Here (C# Create a mesh with a hole inside - #31 by maje90) is a script that uses the RhinoCommon Delaunay method to build a mesh and then remove unwanted edges: 100k vertices in less than a second.
There are other examples that are even better, from experts on the forum.
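For reference, the pattern such scripts typically follow in a GH C# component looks like this (Grasshopper SDK Delaunay solver; signatures as commonly seen in forum scripts, so verify against your Rhino/GH version):

```csharp
// Typical fast-Delaunay pattern: triangulate the points in plan (XY) with
// the Grasshopper SDK solver, then keep the original Z values.
using System.Collections.Generic;
using Rhino.Geometry;
using Grasshopper.Kernel.Geometry;
using Grasshopper.Kernel.Geometry.Delaunay;

static Mesh DelaunayTerrain(IList<Point3d> pts)
{
  var nodes = new Node2List();
  foreach (var p in pts) nodes.Append(new Node2(p.X, p.Y)); // flatten to 2D

  var faces = Solver.Solve_Faces(nodes, 1.0); // 1.0 = jitter amount

  var mesh = new Mesh();
  foreach (var p in pts) mesh.Vertices.Add(p); // vertices keep original Z
  foreach (var f in faces) mesh.Faces.AddFace(f.A, f.B, f.C);
  mesh.Normals.ComputeNormals();
  return mesh;
}
```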

I still have no idea what you need to do… so…

Why so? Can you post your data? Your actual code? Anything?

…I’m not understanding this. Please elaborate… and attach something.

In fact, for road routing in real life you’ll need a MOA algo (Multi-Objective A-Star). Take the traditional road objectives (and their weights): distance, slope, path curvature and some more (obstacles, non-walkable nodes and the like), and try to combine them into a single value for a classic single-objective (i.e. distance) min-cost routing solution. Note: compute the value on the fly using the parent, current and target nodes, as in any A-Star algo (an Adjacency Matrix is a big waste of time). That said, avoid Dijkstra at any cost: it is old and very slow (since it lacks the “heuristics”, so to speak, of the newer algos).
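To make the idea concrete, here is a minimal sketch of folding two of those objectives (distance and slope) into a single edge cost inside a plain weighted A-Star over a height grid. This is not a full multi-objective solver; the grid representation, the weights and every name in it are assumptions, and `PriorityQueue` needs .NET 6+ (e.g. Rhino 8):

```csharp
// Weighted A-Star over a terrain height grid, combining distance and
// slope into one scalar edge cost computed on the fly.
using System;
using System.Collections.Generic;

static class TerrainRouting
{
  public static List<(int x, int y)> FindPath(
    double[,] h,                       // elevation samples on a regular grid
    (int x, int y) start, (int x, int y) goal,
    double cell,                       // grid spacing in world units
    double wDist, double wSlope)       // objective weights
  {
    int nx = h.GetLength(0), ny = h.GetLength(1);
    var open = new PriorityQueue<(int x, int y), double>();
    var g = new Dictionary<(int, int), double> { [start] = 0.0 };
    var prev = new Dictionary<(int, int), (int, int)>();
    open.Enqueue(start, 0.0);

    while (open.TryDequeue(out var cur, out _))
    {
      if (cur == goal)
      {
        var path = new List<(int x, int y)> { cur }; // walk parents back
        while (prev.TryGetValue(cur, out var p)) { cur = p; path.Add(cur); }
        path.Reverse();
        return path;
      }
      for (int dx = -1; dx <= 1; dx++)
        for (int dy = -1; dy <= 1; dy++)
        {
          if (dx == 0 && dy == 0) continue;
          var nb = (x: cur.x + dx, y: cur.y + dy);
          if (nb.x < 0 || nb.y < 0 || nb.x >= nx || nb.y >= ny) continue;

          // Edge cost computed on the fly (no adjacency matrix): horizontal
          // run plus weighted climb. More objectives (curvature, obstacles,
          // non-walkable nodes) would be extra weighted terms here.
          double run = cell * Math.Sqrt(dx * dx + dy * dy);
          double rise = Math.Abs(h[nb.x, nb.y] - h[cur.x, cur.y]);
          double gNew = g[cur] + wDist * run + wSlope * rise;

          if (!g.TryGetValue(nb, out double gOld) || gNew < gOld)
          {
            g[nb] = gNew;
            prev[nb] = cur;
            // Heuristic: weighted straight-line distance; admissible because
            // the true cost only ever adds slope penalties on top of it.
            double ex = goal.x - nb.x, ey = goal.y - nb.y;
            open.Enqueue(nb, gNew + wDist * cell * Math.Sqrt(ex * ex + ey * ey));
          }
        }
    }
    return null; // goal unreachable
  }
}
```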


Hi @maje90, apologies for the late response.

Just to give more context: say you have a CSV file with hundreds of thousands of points, a survey of the land, and you need to propose some road routing and plot placement. The land is not flat, so it would be good to have conceptual cut and fill.

My process:

  1. Construct a mesh from the original CSV. Problem: because the point cloud is big, it takes ages to construct using the Grasshopper Delaunay component. OK, potentially this could be solved with a C# script using RhinoCommon, but then it would have to process the information in parts, since I am sure Grasshopper uses RhinoCommon under the hood anyway. So it is not just a simple Delaunay algorithm.

  2. OK, I finally got the mesh. It is so huge that I struggle to navigate the viewport.
    I project a grid of points and construct a simplified mesh to work with.

  3. I use the simple mesh for everything I need: proposing roads, placing plots, making conceptual cut and fill.

  4. Now I need to take parts of the new mesh (road, cut, fill, plots) and merge them back into the original one.
    And I simply can’t see how that is possible (see the sketch after this list for one idea).
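On point 4, one conceivable route (a rough sketch of my own, not a tested workflow; the helper name and the vertical-ray idea are assumptions) is to keep the original survey points and re-sample their elevations wherever the edited mesh covers them:

```csharp
// Rough sketch: keep the original dense survey points, and wherever the
// edited/simplified mesh covers them in plan, re-sample their elevation by
// shooting a vertical ray onto it. Points outside the redesigned areas
// keep their surveyed Z.
using System.Collections.Generic;
using Rhino.Geometry;
using Rhino.Geometry.Intersect;

static List<Point3d> ReprojectSurvey(IEnumerable<Point3d> surveyPts, Mesh editedMesh)
{
  var result = new List<Point3d>();
  foreach (var p in surveyPts)
  {
    // Cast a ray straight down from well above the point.
    var ray = new Ray3d(new Point3d(p.X, p.Y, p.Z + 1e4), -Vector3d.ZAxis);
    double t = Intersection.MeshRay(editedMesh, ray);
    // t >= 0 means the ray hit the edited mesh: adopt the new elevation.
    result.Add(t >= 0 ? ray.PointAt(t) : p);
  }
  return result;
}
```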

Unless there is another way?

I do not have a specific file I can share; it is just a frustration I have with Rhino every time I need to work with big survey files. I am sure I am not alone, and I thought someone might have experience with surveys and could advise.

I know there is Rhino Terrain. Has anyone tried it?

Given the opportunity: Delaunay is only suitable for “flat” collections (because it is written for flat stuff). For real-life terrains (i.e. the general case), AND provided that your survey data are “evenly” distributed in space, use some parallel Ball Pivoting thingy (but it is unlikely you’ll find any pro-level BP solution out there, for more than obvious reasons). That said, parallel computing is not what most people believe: without a robust thread-safe policy it is a waste of hope and effort.

PS: mastermind a way to interactively reduce the Mesh data for your first approximations. For instance, assume that you want to preview (so to speak) some routing in a mountain. Chances are that a solution that works (or appears to work) on a far less dense Mesh works for the real thing as well (Karma is a must). Other than that, a “mild” landscape… well… doesn’t need a zillion Mesh vertices.
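In RhinoCommon terms, that first-approximation pass can be as simple as Mesh.Reduce; `denseMesh` below is assumed to exist and the numbers are purely illustrative:

```csharp
// Light preview copy of a dense terrain mesh via RhinoCommon's Mesh.Reduce.
Mesh preview = denseMesh.DuplicateMesh();
bool ok = preview.Reduce(20000, true, 5, true);
// arguments: target polygon count, allow distortion (faster, less faithful),
// accuracy 1..10 (higher = slower), normalize size (scale-independent).
```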

PS: IF R were a solid modeller (it isn’t) you could try Plan B as well: get a Brep out of some “reduced” state of your Mesh… and play solid-ops games for your routing.

I’m not a fan of working with heavy meshes.
But I’m also not a fan of threads where, after 12 posts, there is not a single attached file with a real scenario/example.

If I were in your position, I would try to work with the original data, not that much denser projected grid (which is nonsense).
That point cloud of yours is a projected grid, so Delaunay triangulation should work, imho (still going by the single picture you posted).
From your picture it is obvious that a much lighter mesh existed, and that the dense grid was then projected over it. Use that mesh; don’t build a new one.

Or, whatever. It is impossible to say anything more relevant until you attach something or give a big and exhaustive description of the situation.
