Remove duplicates?

Hi everyone!! I am working with these volumes, and I would like to ask if anyone knows the best way to cull the volumes that are overlapping, generating duplicated boxes. They form an entire column, as you can see in the screenshot below (the yellow squares show the overlaps I would like to cull), and I want to delete them, or at least cull them. Is this possible? Thank you all!!

olverlaying (448.5 KB)

Try the Cull Duplicates component.

Avoid creating the duplicates in the first place?

The purple group (below) is optional as SUnion is slow (~12 secs.). It organizes the small cubes into four branches.

olverlaying (427.2 KB)

Thank you! But how do I delete the volumes marked in this screenshot? I need to empty that column of volumes.

Is there a cull duplicates for breps?

Right-click on the CullPt component and change from ‘Leave One’ to ‘Cull All’:

Better yet, avoid duplicates in the first place.
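The difference between the two CullPt modes can be sketched in plain Python (this is not actual Grasshopper code, and the tolerance-based grouping via rounding is an assumption about how coincident points are detected):

```python
# Plain-Python sketch of CullPt's 'Leave One' vs 'Cull All' modes,
# using (x, y, z) tuples in place of Rhino points.
from collections import defaultdict

def group_duplicates(points, tol=1e-6):
    """Group indices of points that coincide within tolerance (via rounding)."""
    groups = defaultdict(list)
    for i, p in enumerate(points):
        key = tuple(round(c / tol) for c in p)
        groups[key].append(i)
    return list(groups.values())

def cull(points, mode="leave_one", tol=1e-6):
    keep = []
    for indices in group_duplicates(points, tol):
        if len(indices) == 1:
            keep.extend(indices)        # unique point: always kept
        elif mode == "leave_one":
            keep.append(indices[0])     # keep the first of each duplicate set
        # mode == "cull_all": drop every member of the duplicate set
    return [points[i] for i in sorted(keep)]

pts = [(0, 0, 0), (1, 0, 0), (0, 0, 0)]
print(cull(pts, "leave_one"))  # [(0, 0, 0), (1, 0, 0)]
print(cull(pts, "cull_all"))   # [(1, 0, 0)]
```

Note that 'Cull All' removes the entire duplicate set, which is why it empties the whole column rather than leaving one box behind.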

P.S. Making this change breaks the code in the purple group. It can be fixed but probably made no sense to you anyway?

OK, thank you very much! I will open it now. But what do you mean, it changes the code?

You mean you didn’t bother before? :man_facepalming: You won’t get far in any form of programming, including Grasshopper, if you can’t research, experiment and learn some things on your own.


Yes, of course I opened it before, and I used it. So thank you very much for your help, and please don’t be so rude to people. I thought you had sent a second GH file, but then I realised it was a screenshot. Again, thank you for your help.

no, the one @Joseph_Oster used.

Duplicate threads are rude:

Ignoring answers you have been given while scolding your teachers is very rude indeed.

Use Sort Duplicate Breps:
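A real Sort Duplicate Breps component compares actual geometry; the idea can be illustrated with a hedged sketch that fingerprints each solid by centroid and volume (here using axis-aligned boxes as `(min_corner, max_corner)` tuples — the fingerprint scheme is an assumption, not the component's actual method):

```python
# Sketch: split a list of boxes into (unique, duplicates) by a
# centroid + volume fingerprint, rounded to a tolerance.
def box_fingerprint(box, tol=1e-6):
    (x0, y0, z0), (x1, y1, z1) = box
    centroid = ((x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2)
    volume = (x1 - x0) * (y1 - y0) * (z1 - z0)
    return tuple(round(v / tol) for v in centroid + (volume,))

def sort_duplicates(boxes, tol=1e-6):
    """First occurrence of each fingerprint is unique; later matches are duplicates."""
    seen, unique, dupes = set(), [], []
    for box in boxes:
        fp = box_fingerprint(box, tol)
        (dupes if fp in seen else unique).append(box)
        seen.add(fp)
    return unique, dupes

a = ((0, 0, 0), (1, 1, 1))
b = ((1, 0, 0), (2, 1, 1))
unique, dupes = sort_duplicates([a, b, a])
print(len(unique), len(dupes))  # 2 1
```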

Is there any way to find the duplicates without flattening?

Where is the problem with flattening?

Is that a native GH component? I think it’s a plug-in; maybe you can mention the name for @ffeldsberg.


This way, you can preserve the data tree.
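The tree-preserving approach amounts to culling duplicates branch by branch, so the paths survive. A minimal sketch, modeling a Grasshopper data tree as a dict of path tuples to point lists (an assumption; real trees use `GH_Path` objects):

```python
# Sketch: cull duplicate points within each branch independently,
# leaving the tree's branch structure untouched.
def cull_per_branch(tree, tol=1e-6):
    result = {}
    for path, points in tree.items():
        seen, kept = set(), []
        for p in points:
            key = tuple(round(c / tol) for c in p)
            if key not in seen:        # keep first occurrence in this branch
                seen.add(key)
                kept.append(p)
        result[path] = kept            # branch path is unchanged
    return result

tree = {(0,): [(0, 0, 0), (0, 0, 0)], (1,): [(1, 0, 0)]}
print(cull_per_branch(tree))
# {(0,): [(0, 0, 0)], (1,): [(1, 0, 0)]}
```

Because duplicates are only compared within a branch, no flattening is needed and the original tree layout comes out the other side.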

olverlaying (455.3 KB)

Thank you @HS_Kim, you are always so helpful!!