PointCloud Analysis

Hi Peter,

I’m not an expert, but I think you’ll have to call this parallel task with a range of integers, and create a list to hold each result, for example:

Result = range(Cloud.Count)

You cannot append the results to a list, as you will not get a meaningful ordering from inside a parallelized task.
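The pattern being suggested, sketched outside Rhino: in GhPython one would presumably use System.Threading.Tasks.Parallel.ForEach; here concurrent.futures stands in for it, and the function name parallel_map_indices is my own illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_map_indices(fn, count, workers=4):
    """Run fn(i) for i in range(count); each task writes to its own slot."""
    results = [None] * count            # preallocated: no appends from threads
    def worker(i):
        results[i] = fn(i)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(worker, range(count)))   # drain to propagate exceptions
    return results
```

Because each index owns exactly one result slot, no locking is needed and the results come back in input order, which is exactly what appending from multiple threads would not give you.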

If I understand correctly what you are trying to do (filtering points that fall inside a box-like volume), I would probably just check the coordinates numerically. You can transform the points to fit the box orientation. This may be faster than Brep.IsPointInside (if that method does not have a special case for boxes).
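A minimal numeric version of that check, in pure Python: the point is projected onto the box's own axes and compared coordinate by coordinate. The parameter names (origin as the box centre, axes as unit-length box axes, half_sizes as half-extents) are illustrative assumptions.

```python
def point_in_oriented_box(p, origin, axes, half_sizes):
    """True if p lies inside the oriented box (centre `origin`, unit `axes`)."""
    d = [p[i] - origin[i] for i in range(3)]          # vector from box centre
    for a, h in zip(axes, half_sizes):
        t = sum(d[i] * a[i] for i in range(3))        # coordinate along this axis
        if abs(t) > h:
            return False
    return True
```

This is three dot products per point, with early exit, so it avoids any Brep machinery for the common box case.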

Thanks for the replies. I’ll code up both options, do some testing with larger point cloud samples, and see how it goes.

I have assembled the following script using GhPython.

import rhinoscriptsyntax as rs
import scriptcontext as sc
import System.Threading.Tasks as tasks

def Threaded_PointTest(Brep, Cloud):
    ## Make Sure Test Polysurface is Closed ##
    if not rs.IsPolysurfaceClosed(Brep):
        print("Test brep is not closed")
        return
    ## Get a Private Copy for Multi-Threading ##
    brep = rs.coercebrep(Brep).DuplicateBrep()
    ## Turn Object into a PointCloud ##
    points = rs.coercegeometry(Cloud).GetPoints()
    tol = sc.doc.ModelAbsoluteTolerance
    ## Holder for Points inside Surface ##
    inside = [False] * len(points)
    def TestPoints(i):
        if brep.IsPointInside(points[i], tol, True):
            inside[i] = True
    tasks.Parallel.ForEach(range(len(points)), TestPoints)
    return [pt for pt, r in zip(points, inside) if r]

The script runs fine; I am able to scan through and evaluate a set of 20 million points in around 80 seconds. I am going to explore some alternative methods for accelerating this calculation. It looks to me like Cloud.GetEnumerator() for a large cloud is a rather slow way to pass the points into the parallel run.

Also, I think I am having issues with garbage collection. In the script I am creating 20 million Point3d objects in memory, some of which are used for calculating a best-fit plane and never added to the document. Other Rhino objects have a .Dispose() method that addresses this, but I don’t see that option for a Point3d. After running the script I’m seeing around 4 GB of RAM in use. After panning around in Rhino a bit, garbage collection seems to kick in, realize the Point3d objects are no longer used, and RAM returns to the expected level. I have also tried the ClearUndo command to clear the history and seen no drop in RAM consumption.

If anyone can point me the right direction that would be great.

The reason that Point3d does not have Dispose is that it is a value type, meaning it is allocated on the stack rather than the heap. When these go out of scope, the memory is reclaimed. Any Point3d in a collection (list, dictionary, …) will need to be garbage collected when the collection is collected. You can force garbage collection using System.GC.Collect().
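In IronPython/GhPython that call is literally System.GC.Collect(); the sketch below illustrates the same idea with CPython's stdlib gc module instead (the class Blob and the helper name are my own, purely for illustration).

```python
import gc

class Blob(object):
    pass

def drop_cycle_and_collect():
    """Build a reference cycle, drop it, then force a full collection."""
    a, b = Blob(), Blob()
    a.other, b.other = b, a    # cycle: reclaimed only by the garbage collector
    del a, b
    return gc.collect()        # returns the number of unreachable objects found
```

Forcing a collection right after a large script finishes is the manual version of what the poster observed happening lazily while panning around the viewport.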

The way I would approach this is with an RTree data structure over the point cloud. Unfortunately, the RTree in RhinoCommon is too limited to support searching at different levels of the tree, but there are Python packages with RTree functionality, so you could use those, I guess.

The idea is that an RTree stores hierarchical data on bounding boxes that contain points. So you would traverse the tree down from the root and test whether the bounding box at each level is contained within the Brep. If at one level a box is contained within the Brep, you don’t need to test the levels below it: all of its points are inside. This way you can more quickly decide whether a (possibly large) set of points is inside the Brep. The overhead of creating the RTree may well outweigh the savings over a brute-force is-inside-brep test of 20M points, though.
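The traversal can be sketched like this in pure Python. The Brep containment test is simplified here to an axis-aligned region [lo, hi] so the sketch is self-contained; in Rhino you would substitute Brep.IsPointInside plus a box-inside-brep test, and Node is a stand-in for a real R-tree node, not RhinoCommon's RTree.

```python
class Node(object):
    """Very simplified R-tree node: leaves hold points, internal nodes hold children."""
    def __init__(self, points=None, children=None):
        self.points = points or []
        self.children = children or []
        pts = self.points or [p for c in self.children for p in c.all_points()]
        self.bbox = (tuple(min(c) for c in zip(*pts)),
                     tuple(max(c) for c in zip(*pts)))

    def all_points(self):
        if self.points:
            return list(self.points)
        return [p for c in self.children for p in c.all_points()]

def box_inside(bbox, lo, hi):
    return all(lo[i] <= bbox[0][i] and bbox[1][i] <= hi[i] for i in range(3))

def box_outside(bbox, lo, hi):
    return any(bbox[1][i] < lo[i] or hi[i] < bbox[0][i] for i in range(3))

def collect_inside(node, lo, hi):
    """Collect points inside [lo, hi], pruning whole boxes where possible."""
    if box_inside(node.bbox, lo, hi):
        return node.all_points()     # whole box inside: accept all, no point tests
    if box_outside(node.bbox, lo, hi):
        return []                    # whole box outside: reject all
    if node.points:                  # leaf straddling the boundary: test each point
        return [p for p in node.points
                if all(lo[i] <= p[i] <= hi[i] for i in range(3))]
    return [p for c in node.children for p in collect_inside(c, lo, hi)]
```

Only the boxes that straddle the boundary ever fall through to per-point tests, which is where the speed-up over testing all 20M points individually comes from.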

I realise this is an old thread, but is it possible to edit the code to return just the points inside the Brep? I have tried to edit the code and commented out the last portion regarding the plane, but not had much luck.

I have been doing this in a Grasshopper Python component, mind. So maybe that is why it’s failing.

Thanks Matt

Hi Matt,

Please note my 2 scripts posted in this thread. Both use multi-processing; the second is a bit more complex.


Hi @5chmidt, I really do not have much experience with scripting, but this script looks really good and even I can use it. I am currently writing my master’s thesis, in which I have to do some progress monitoring: comparing a BIM as-built model (i.e., a point cloud) against a BIM as-planned model. I was also trying the same thing with IsPointInside on the brep, so that a brep would change color to indicate that it contains some points, but in progress monitoring it is somehow not working.

Could you please tell me what this script is about? Also, regarding “Turn Object into a PointCloud”, which object should we turn into a point cloud? My questions might seem silly, but it would be great if you could reply.
Many thanks


It is used to cast the point cloud, which is stored as a more memory-efficient object, into an object with enumerable points that we can iterate through for the purposes of our test.
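Roughly, the trade-off looks like this in plain Python. FlatCloud is an illustrative stand-in for a compact point cloud store (one flat array of floats), not RhinoCommon's actual PointCloud layout; its generator plays the role of GetEnumerator().

```python
class FlatCloud(object):
    """Compact storage: one flat list of floats instead of per-point objects."""
    def __init__(self, xyz):
        self.xyz = list(xyz)              # x0, y0, z0, x1, y1, z1, ...

    def count(self):
        return len(self.xyz) // 3

    def points(self):
        """Enumerate (x, y, z) tuples without materializing them all up front."""
        for i in range(0, len(self.xyz), 3):
            yield (self.xyz[i], self.xyz[i + 1], self.xyz[i + 2])
```

The cost the earlier post noticed is that each enumerated point is constructed on the fly, which for tens of millions of points is a lot of object creation.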

It will take a lot of work to make this script compare a BIM model with a point cloud. The script as it currently stands uses a hand-drawn brep to section out a segment of points from within a larger cloud. To do this efficiently you would need a prep script to make rough boundaries around areas of the model where you expect to see shapes. If the BIM model specifies a pipe location, you would want to grab only the points in the area where you expect the pipe to be, so you would need a brep with which to test them.

Once you have the points you want to analyze, you can run a best-fit algorithm. Rhino includes rs.PlaneFitFromPoints, which is helpful, but you will probably need to fit other types of shapes. Here is some documentation on fitting a pipe: cylinder best fit. You would presumably need to develop other methods to fit I-beams and other architectural features.
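For reference, the plane fit itself is a small computation. Outside Rhino the effect of rs.PlaneFitFromPoints can be sketched with numpy (this is a generic least-squares fit, not Rhino's implementation, and fit_plane is my own name):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points: returns (centroid, unit normal)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The normal is the direction of least variance: the right singular vector
    # belonging to the smallest singular value of the centered point matrix.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]
```

The same centroid-plus-SVD pattern generalizes to lines (take the largest singular vector instead), which is often a first step in cylinder fitting.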

I hope this helps. Good luck.

@5chmidt Thank you so much for your reply. I am testing on the road model, and with a Grasshopper definition I have already merged the geometry of the road. It merged automatically since it is geo-referenced. The road has 6 layers, and I have control over each layer. I created this geometry using a Grasshopper definition, and I have also created the closed breps. Basically it is a layer-detection process, from geometry to point cloud.
Looking forward to your reply. Thank you so much.
Could you please suggest something? I am also sharing pictures. Can I implement this “fit plane” approach here?

It looks like you already have your target road segments from your Grasshopper model. You may not need to do a best-fit plane. Instead, you can create a bounding box around the segment of road you want to analyze and select the subset of points that are near the geometry you are matching against. Once you have a smaller subset of points, you can find the distance from each point to the geometry and color the points accordingly, e.g. green = within tolerance, orange = slightly out, red = way out of tolerance.
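That prefilter-then-color step can be sketched as follows. The tolerance bands (1x and 2x tol) and the padding parameter are assumptions of mine; in Rhino the point-to-geometry distance would come from something like Brep.ClosestPoint.

```python
def bbox_prefilter(points, lo, hi, pad=0.0):
    """Keep only points inside a padded axis-aligned bounding box."""
    return [p for p in points
            if all(lo[i] - pad <= p[i] <= hi[i] + pad for i in range(3))]

def tolerance_color(distance, tol):
    """Traffic-light color for a point-to-geometry distance."""
    if distance <= tol:
        return (0, 255, 0)        # green: within tolerance
    if distance <= 2.0 * tol:
        return (255, 165, 0)      # orange: slightly out
    return (255, 0, 0)            # red: way out of tolerance
```

The bounding-box prefilter is cheap (six comparisons per point), so it pays for itself before any expensive closest-point queries run.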

As an additional note, I’m not sure Grasshopper is the best tool for this task. If you are creating your road models parametrically, Grasshopper may be a good tool, but running large point cloud analysis in Grasshopper will really weigh you down. In my previous example it was taking me 80 seconds to segment 20 million points. Looking at your example, you will have dozens of road segments that would all need similar analysis. I generally try to keep Grasshopper scripts light enough to run an iteration in a few seconds. If something is going to remain static (like the as-planned road), bake it out so it is not part of the point cloud calculation script. Or, if you can write your program in C#, Python, or any of the programming languages Rhino supports, it could help to prevent your Grasshopper definition from becoming a giant spaghetti monster that takes 15 minutes to redraw.

@parimalvala5022 what are your inputs to do this analysis? Is it a text file containing every XYZ describing the point cloud from your BIM model? And the meshes or breps you want to test for containment?

@rawitscher-torres @5chmidt
Thank you for your reply.
I used the “Volvox” plugin to import the point cloud into Grasshopper. Volvox supports .E57 point cloud files.

For the comparison, I have to compare the point cloud with the road model. I have created a Grasshopper definition to create the road layers (i.e., asphalt layer, base, sub-base layer, etc.). In total there are 6 layers.
I have developed a script which iterates the “points in brep” test.
In total there are 6 road layers. In each layer there are currently 51 segments, and after the subsampling process the point cloud has 21,000 points. Now there is only one problem, about “threshold points”. Threshold points are simply the minimum number of points that must be found in each brep.
For example, I set the threshold value to 20. Then once the computation starts, if 19 points are found in a brep it will not change the brep’s color; otherwise it will change the color.

How should I set this threshold? (There should be a convincing reason for the value, because if I set the threshold to 25, anyone can ask: why 25?)
Am I going the right way? I kindly request your suggestions.

@5chmidt you are right, the computation takes almost 15 minutes for one iteration of one road layer.

I am also sharing the sample GH definition with you. If you could help me out with this, it would be a great help.

Looking forward to your response.

Rhino_scriptsyntax_pythonnode.gh (25.4 KB)

@5chmidt Thank you for your suggestion.

I would like to try this approach. Could you please show me something visually, by sharing some screenshots or a drawing? You mentioned “find the distance from point to geometry”; does geometry here mean a segment or the full road layer?
Looking forward to knowing more about this.
Thank you very much in advance. It would be great if you could show me by drawing something, or point me to any post you know of related to this kind of task.

This is a great thread; it’s great to know you can parse a 20M-point cloud in around 80 s.

There are a lot of interesting features in CloudCompare (free software) that are let down by a lack of scriptability (which GH has in spades). Have a look/read of their implementation of an octree (similar to an RTree?) for point cloud rendering; I don’t know if Rhino has anything comparable. The number of points in each box has implications for performance.
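For context on the comparison being made: an octree subdivides space into eight fixed octants per node, whereas an R-tree groups data by (possibly overlapping) bounding boxes. The octree's child-selection step is just three comparisons, sketched here (function name is my own):

```python
def octant(p, center):
    """Child index 0-7: one bit per axis, set when the point is on the high side."""
    return ((p[0] >= center[0])
            | ((p[1] >= center[1]) << 1)
            | ((p[2] >= center[2]) << 2))
```

Inserting a point then just means descending by octant index until a leaf with few enough points is reached, which is what makes octrees cheap to build for rendering-style queries.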

Below is a coloured comparison of the difference between two clouds, a very handy visual indicator of divergence.
I’m not sure about the fastest way of performing similar analysis in GH; something like the fast mesh ray analysis in Sasquash might be an appropriate method, if this approach helps your problem.


Hi @parimalvala5022

I did some quick tests on clipping a point cloud with given meshes.
The thing to clarify here is that I did not use the PointCloud class, but just a very large collection of points, as a proof of concept.

The nice thing as well is that I am not outputting the actual geometry in the viewport, so display is very fast.

It would be great if you could tell me, for example: what do you do with the output from Volvox, why is your GH file taking 15 minutes to update, and what are you trying to do?

Perhaps soon I can code a new PointCloud class with some new implementations. But it would be nice to hear your goals.

Below are some images of clipping 64 million points and only displaying those which are inside a geometry. For 64 million points the exact method takes 62 seconds and the whole code takes 1.1 minutes in total to execute.

Objects for clipping

64 million point cloud

Clipped cloud


I did not post the file, because it’s boilerplate code.


@rawitscher-torres Thanks for your reply.
Basically I used Volvox to import the point cloud (an .E57 file), and I have also used that plugin for the subsampling process; the plugin has many more capabilities. I am sharing a presentation with you to give some idea.

It is taking time because after the subsampling process there are 21,000 points, and the script checks all the breps.
There are 51 breps in total, and the iteration process in the script checks each brep for points.

@rawitscher-torres You asked what I am trying to do:

In my thesis, I have to do progress monitoring. In simple words: how much construction work was finished on a particular date (i.e., the scan date)?
I got scans for 2 different dates. So the idea was to create breps in the road geometry, and if points are found in a brep, that part is considered “built”. In my first scan there was excavation work. The road has 6 different layers, and I have created breps in each layer. Using my script, I found that the maximum number of points was found in the 4th layer, meaning the excavation is done down to that 4th layer.

The problem is that I ran the script with the threshold set to about 25 points. There is no concrete reason why 25; however, it is giving the correct result. There should be some relation to the point density per square metre.
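One defensible way to derive such a threshold from density, sketched below. The formula (expected points = scan density times brep footprint area, scaled by a minimum coverage fraction) and the coverage default are my assumptions, not an established rule.

```python
def threshold_points(density_per_m2, footprint_area_m2, coverage=0.5):
    """Minimum hit count for a brep, derived from scan density.

    expected = density (pts/m^2) * footprint area (m^2); `coverage` is the
    assumed fraction of that expectation required to call the layer 'built'.
    """
    expected = density_per_m2 * footprint_area_m2
    return max(1, int(round(expected * coverage)))
```

With this, "why 25?" has an answer: e.g. a density of 10 pts/m² over a 5 m² segment with 50% required coverage gives a threshold of 25, and the threshold rescales automatically when subsampling changes the density.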

If you have any other idea to implement in Grasshopper, I am looking forward to trying it.
Looking forward to your reply. If anything is still unclear, I can provide more details.
Point cloud detecttion_process.pdf (1.7 MB)

Thanks for your reply, but I have to use Grasshopper.
If you have any other idea, I am looking forward to implementing it.

I am not exactly up to date on the latest point cloud processing plugins for Grasshopper, but I suspect it will be rather difficult to implement the specific functionality you are looking for without coding a few of your own Grasshopper components.

Here is a great tutorial on creating a python component.

If you are not an experienced coder, I would recommend starting with Python: it is not strictly typed and the syntax is nicely readable.

Thank you for your suggestion. I will look at it.

Weird thread touching on ‘reverse engineering’ workflows.

I wish Rhino had this particular constituent ability, but sadly Rhino still can’t do much in this regard.

Maybe someday.

V7 is what V6 should have been. Just many years too late. Where to next?