Help needed to develop a pointcloud script

Background: We work a lot with pointclouds at my office, and our workflow is changing.
In our previous workflow, we could decimate the pointcloud when exporting from the software, so we output a “thinner” pointcloud for areas we didn’t need as dense, while other areas could be output at “full strength” to get a dense pointcloud.

The new workflow doesn’t allow us to thin out the pointcloud when exporting, so our output files are HUGE. This level of density is not needed in every area we are working on. We already have scripts to remove out-of-area points from the pointcloud by window-selecting them; this deletes ALL of the points contained within the window selection.

**What I am requesting** is a way to window select an area and have the script leave selected (or unselect) every other point, every 3rd point, or whatever, by an X-factor variable, then remove those points from the pointcloud so they can be deleted.
This would allow “thinning” of the pointcloud at desired locations while leaving certain areas at full density.

I prefer to do all that processing in CloudCompare first.

https://www.danielgm.net/cc/

Just to clarify for readers who might be able to help: In your “previous” and “new” workflows are you exporting from Rhino or some other software? Is Rhino the destination software? In other words, can you be a little more specific about what changed?

Yes, sorry.

Old: Faro Scene for processing and registration, exported as individual .pts files. I could reduce the output file size with an option in the Scene software when exporting. Rhino is the destination, where I clean up further, “trimming down” the pointcloud to just the area of the building I need to work with.

New: Leica Cyclone REGISTER 360 for processing and registration. It can export individual setups as .e57 files or the whole pointcloud as a .pts file, but each is at FULL density (this is a small job and it already has 430M points). Again, Rhino is the destination for cleanup, but now not only do I have to do mass cleanup, since selective exporting of points is not available, I also need some way to “thin out” areas that I don’t need at full density. We are already using a macro utilizing the native _Pointcloud command to remove unwanted points.

@pascal, Do you have any ideas for this?

@Mason, I am dealing with large clouds and would suggest getting familiar with the CloudCompare application first. Once you know how certain commands work, you may get away with using the command line interface to subsample or crop your clouds. It works with clouds having normals and colors, and supports .e57 files natively. The command line options you should look for are -CROP and -SS.

For insanely large clouds containing raw data, I would recommend avoiding opening them before the cleanup. E.g. when I get 4 GB of data which I know is raw, I just drop the cloud files on various *.bat files, depending on the purpose.

E.g. the line below will subsample the dropped clouds with a 2 mm distance and convert them into ASC format. The files are saved in the input folder(s) and are output with a timestamp suffix after the original file name (you can even overwrite the original files if desired). Additionally, a log file is created containing possible errors. This is a single line; you might have to adjust the paths to your liking and save it as a .bat file for testing:

```bat
for %%f in (%*) DO "D:\Software\CloudCompare\2.7.0\CloudCompare.exe" -SILENT -O %%f -C_EXPORT_FMT asc -SS SPATIAL 2.0 -LOG_FILE "D:\Temp\results.txt"
```

If I drop e.g. “Cloud1.asc” and “Cloud2.asc” on this bat file, the output is:

```
Cloud1_SPATIAL_SUBSAMPLED_2019-06-27_01h08_43.asc
Cloud2_SPATIAL_SUBSAMPLED_2019-06-27_01h08_40.asc
```

If you get a window showing the conversion options, make sure the correct fields (in the case of ASCII-based clouds) are set for coordinates, colors and normals, then choose Apply all and let it do its thing.

Of course you can control the CC command line tool using Python, but you have to deal with file I/O. Another option would be to use the MeshLab server, but it is slower and, at least for me, crashes a lot.
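To illustrate driving the CC command line from Python, here is a minimal sketch that builds the same call as the .bat file above via `subprocess`. The installation path, log path and function names are assumptions; adjust them to your setup.

```python
# Sketch: driving the CloudCompare command line from Python via subprocess.
# The executable and log paths below are assumed examples, not defaults.
import subprocess

CC_EXE = r"D:\Software\CloudCompare\2.7.0\CloudCompare.exe"  # assumed install path

def build_subsample_cmd(cloud_path, min_distance, log_path=r"D:\Temp\results.txt"):
    """Build the CloudCompare argument list for a spatial subsample,
    mirroring the .bat one-liner above."""
    return [
        CC_EXE, "-SILENT",
        "-O", cloud_path,
        "-C_EXPORT_FMT", "ASC",
        "-SS", "SPATIAL", str(min_distance),
        "-LOG_FILE", log_path,
    ]

def subsample(cloud_path, min_distance=2.0):
    """Run CloudCompare on one cloud; raises CalledProcessError on failure."""
    subprocess.run(build_subsample_cmd(cloud_path, min_distance), check=True)
```

From here you can loop over a folder of exports with `os.listdir` and feed the subsampled results into Rhino.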

If you really want to do all this in Rhino using custom code, you might read about cloud decimation, subsampling, octrees etc. It can be done multiprocessed.

_
c.


Yes, what I want is to know the commands/script functions so that I can work this into some custom code. I have searched but haven’t found other posts about cloud decimation specifically, but that is what I am after: to be able to decimate a pointcloud by a window or box selection.

I already use the native _Pointcloud command to separate and delete unwanted points, so I would imagine the new custom script operating similarly: the pointcloud is selected, then a window/box selection of points is made, a default decimation value is printed to the command line (changeable by the user), then the script downsamples the selection by the decimation factor, and the points are removed to either a pointcloud or points output, from where I can delete them or move them to another layer (if desired).

Are there scripting functions that can downsample a selected set of points within a pointcloud in order to make this work?
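The core of the behaviour described above can be sketched without any Rhino API at all: keep every point outside the selection box, but only every Nth point inside it. This is a minimal, hypothetical sketch (the function names and the tuple-based cloud are illustrative); in Rhino you would feed it the points from `PointCloud.GetPoints()` and rebuild the cloud from the result.

```python
# Illustrative sketch of "thin inside a box by factor X".
# Points are plain (x, y, z) tuples here, not a Rhino PointCloud.

def inside_box(pt, box_min, box_max):
    """True if pt lies within the axis-aligned box [box_min, box_max]."""
    return all(lo <= c <= hi for c, lo, hi in zip(pt, box_min, box_max))

def thin_in_box(points, box_min, box_max, factor):
    """Return (kept, removed): points outside the box are always kept;
    inside the box, only every `factor`-th point survives."""
    kept, removed = [], []
    i = 0
    for pt in points:
        if not inside_box(pt, box_min, box_max):
            kept.append(pt)
        else:
            (kept if i % factor == 0 else removed).append(pt)
            i += 1
    return kept, removed
```

The `removed` list is what your existing macro would then delete or move to another layer.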

@stevebaer, I saw you had posted on pointcloud topics before; do you have anything to suggest?

I don’t know of any “built in” functions to perform different reductions on point clouds. Depending on the type of cloud you are working with, I would imagine different algorithms could be used. For decimation, this could be grouping local points through sorting and then either averaging them or picking a single point to represent a set. I would imagine there may also be a need to eliminate “noise” from points that don’t belong at all. Neither of these are simple scripts that I can whip together in a discourse post.
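As a rough sketch of the grouping-and-averaging idea just described, one common approach is a voxel grid: bucket points into cubes of a fixed size and replace each bucket with the average of its points. The names below are illustrative, and the cell size directly controls the output density.

```python
# Sketch: voxel-grid downsampling -- group local points by grid cell and
# average each group down to one representative point.
from collections import defaultdict

def voxel_downsample(points, cell_size):
    """Average all (x, y, z) tuples falling in the same cube of side cell_size."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        buckets[key].append((x, y, z))
    out = []
    for pts in buckets.values():
        n = len(pts)
        out.append((sum(p[0] for p in pts) / n,
                    sum(p[1] for p in pts) / n,
                    sum(p[2] for p in pts) / n))
    return out
```

Picking the closest point to the cell average instead of the average itself would keep only original scan points, which may matter for measurement work.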

Clement is pretty spot on with his suggestions.

Hi @Mason, on how to window select points in a pointcloud see my answer and example script here.

For decimation, you might either try the above using Python and command line access, or learn how to import PCL, which has a Python binding.

Not really. There is Rhino.Geometry.Point3d.CullDuplicates, which lets you set a tolerance for discarding points, but it is not the fastest, especially when there are large amounts of points.
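If CullDuplicates is too slow on large clouds, a grid-hash pass is one faster alternative worth sketching: hash each point into a cell whose size equals the tolerance and keep only the first point seen per cell. This is an approximation, not the RhinoCommon method; points just across a cell boundary can both survive, but it runs in roughly linear time.

```python
# Illustrative sketch: approximate tolerance-based duplicate culling
# via a spatial hash (one surviving point per tolerance-sized cell).

def cull_near_duplicates(points, tolerance):
    """Keep the first (x, y, z) tuple seen in each tolerance-sized cell."""
    seen = set()
    out = []
    for x, y, z in points:
        key = (round(x / tolerance), round(y / tolerance), round(z / tolerance))
        if key not in seen:
            seen.add(key)
            out.append((x, y, z))
    return out
```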

_
c.