Heron Plugin - Import LAZ - not retrieving points

I’m using the Heron plugin. I’ve added the “Import LAZ” node and connected the file path.
The panel shows zero points.


PLEASE NOTE: THE FORUM WILL NOT ALLOW THE UPLOAD OF AN *.LAZ FILE, SO I HAVE TEMPORARILY CHANGED THE EXTENSION TO *.X. JUST CHANGE IT BACK TO .LAZ TO REVIEW.
Points.x (642.5 KB)

Hi @Josh_Verran ,
Can you try saving the file out of CloudCompare as LAS or LAZ? I built ImportLAZ on laszip.net, which can be picky about the source software/formatting that created the LAZ.

-Brian

Hi Brian

Thanks for the reply, and apologies for not replying sooner.
Saving the file as LAS from CloudCompare worked as you suggested.

This has opened up a tonne of workflows; I really appreciate your efforts.

Hi @Brian_Washburn - I am also having the same issue. I tried opening and saving a copy with CloudCompare, but the issue still persists. Do you have any advice for me?

The .laz file can be downloaded from here : https://opentopography.s3.sdsc.edu/pc-bulk/NZ21_Taranaki/CL2_BJ29_2021_1000_0249.laz

Thank you :pray:

Take this with a grain of salt, as I’m mostly self-taught. I had CloudCompare open, selected the point cloud, and chose “Save File” with “LAS cloud” as the file type. Make sure the file extension is .las instead of .laz. This worked for me. More knowledgeable users might be able to elaborate further or explain why this method is effective.

Cheers @Josh_Verran - I’ll give that a go, but I’m trying to stream the geometry for an art project, so it would be great to be able to load directly from the Amazon S3 URL without having to go through CloudCompare.

If this is the only way, then I’ll have to process over 5,500 tiles (360 GB) from OpenTopography… which I’m trying to avoid… :sweat_smile:

:pray:

P.S. Re-saving the file in CloudCompare didn’t solve it


Hi @krishna.duddumpudi ,
I just published Heron v0.4.3-beta.1 to the pre-releases in the Package Manager.

With this pre-release I’ve allowed an HTTP input for the filePath input, which simply downloads the file to your temp folder and then uses that local file for the import. It also fixes a bug when reading the point count in LAZ v1.4 files, which may fix @Josh_Verran’s issue. Previously I wasn’t picking up the “extended” parameters in v1.4 headers, so some v1.4 files would report 0 points and never run through the import loop.
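
For illustration, here is a rough Python sketch of both behaviors (Heron itself is C#, so this is just the idea, not the plugin’s actual code); the byte offsets come from the LAS 1.4 public header layout:

```python
# Illustration only: download a remote LAZ to the temp folder, then read the
# point count from the public header, falling back to the 64-bit "extended"
# field that LAS 1.4 uses when the legacy 32-bit count is left at zero.
import os
import struct
import tempfile
import urllib.request

def fetch_to_temp(url):
    """Download a remote LAS/LAZ file into the temp folder and return its local path."""
    local = os.path.join(tempfile.gettempdir(), os.path.basename(url))
    urllib.request.urlretrieve(url, local)
    return local

def point_count(path):
    """Return the point count from a LAS/LAZ public header (stored uncompressed in both formats)."""
    with open(path, "rb") as f:
        header = f.read(375)                                 # full LAS 1.4 header size
    version = (header[24], header[25])                       # version major, minor
    legacy = struct.unpack_from("<I", header, 107)[0]        # legacy 32-bit count
    if version >= (1, 4) and legacy == 0:
        return struct.unpack_from("<Q", header, 247)[0]      # extended 64-bit count
    return legacy

url = "https://opentopography.s3.sdsc.edu/pc-bulk/NZ21_Taranaki/CL2_BJ29_2021_1000_0249.laz"
print(point_count(fetch_to_temp(url)))
```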

Let me know if this release fixes your issues.

-Brian


Note, you can also input multiple files and use the boundary to clip what you want from adjacent tiles.

-Brian


Amazing @Brian_Washburn - thank you very much, this is working great!

Do you know if .laz files could work with the classification data OpenTopography provides by any chance? It would be great to filter the points using these classifications.

I have only managed to do this with ASCII/text exports using the headers, but this is quite laborious for large datasets. If you have any advice on how to work with .laz files directly, that would be great!

Is there any way you can expose these parameters in your component?

Thanks again for the patch and the amazing plugin! :pray:


Hi @krishna.duddumpudi ,
The points output from the component are organized in a data tree whose branches correspond to the classification (if the classification is included in the scan data).
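
If you want to keep only certain classes, a small GhPython sketch along these lines might work (assuming, per the above, that the last index of each branch path is the classification code):

```python
# GhPython sketch: keep only selected classification branches from ImportLAZ.
# Assumption: the last index of each branch path is the LAS class code
# (common ASPRS codes: 1 unclassified, 2 ground, 3-5 vegetation, 6 building, 9 water).
# Inputs:  pts  (Tree Access)  - point tree from ImportLAZ
#          keep (List Access)  - class codes to keep, e.g. [2, 6]
# Output:  filtered            - tree containing only the requested classes
from Grasshopper import DataTree

filtered = DataTree[object]()
for i in range(pts.BranchCount):
    path = pts.Path(i)
    cls = list(path.Indices)[-1]          # assumed: last path index = class code
    if cls in keep:
        filtered.AddRange(pts.Branch(i), path)
```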

Here is a little cheat sheet from the LAZ 1.4 spec found here:

-Brian


Hi @Brian_Washburn

Thanks for the information above; I am about to try it myself.
With regard to the HTTP input, part of that address looks familiar.
However, when navigating the OpenTopography website, I can’t find anything similar myself.
It appears to be coming from the bulk download section.

I’m not sure how to navigate to it either; I was just using the link above (and for the multiple inputs, I guessed the file names, knowing they were tiles). @krishna.duddumpudi, can you share your methodology?

@Brian_Washburn - Ah thanks, this is great!

Hey @Josh_Verran & @Brian_Washburn,

Once you know which OpenTopography data set you are working with, you can search through the corresponding bulk downloads folder hosted on AWS S3. Within this folder, you will find a .zip file containing a .shp file with a polygon outline and index for each tile in that data set.


It can be hard to find the .zip file amongst the thousands of other files using the browser, so I used Cyberduck and connected to the Amazon server with the credentials suggested by the pop-up that appears when you click bulk-download.
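
If you’d rather script the search than use Cyberduck, an anonymous boto3 listing along these lines may work; note the endpoint and bucket names below are only my guesses from the tile URL earlier in this thread, so check them against the credentials pop-up:

```python
# Sketch: anonymously list the OpenTopography bulk bucket to locate the
# tile-index .zip. The endpoint and bucket names are guesses inferred from
# https://opentopography.s3.sdsc.edu/pc-bulk/... - verify against the
# bulk-download pop-up before relying on them.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client(
    "s3",
    endpoint_url="https://opentopography.s3.sdsc.edu",   # assumed endpoint
    config=Config(signature_version=UNSIGNED),           # public bucket, no keys needed
)

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="pc-bulk", Prefix="NZ21_Taranaki/"):
    for obj in page.get("Contents", []):
        if obj["Key"].endswith(".zip"):                  # the tile-index shapefile archive
            print(obj["Key"], obj["Size"])
```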

Once you read the .shp file in GH using a Heron component (or, in my case, an @IT component), the metadata (features) of each polygon contains the URL link to the point cloud for that tile, which you can now call directly into GH with Brian’s latest update :zap:

In the screenshot below, I’ve filtered the tiles to load only the ones within a certain radius. It seems only some datasets have RGB data.
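
If you want to reproduce that filter outside of GH, here is a rough pyshp sketch; the index file name, the point of interest, and the attribute name holding the URL are all placeholders, so check the actual field names in your shapefile:

```python
# Sketch: read the tile-index shapefile and collect download URLs for tiles
# whose bounding-box centre lies within a radius of a point of interest.
# Placeholders/assumptions: the file name, the point of interest, a projected
# CRS (so metres are meaningful), and a URL attribute literally named "URL".
import math
import shapefile  # pyshp

INDEX_SHP = "tile_index.shp"        # hypothetical name of the unzipped index
CENTRE = (1694000.0, 5676000.0)     # hypothetical point of interest (x, y)
RADIUS = 2000.0                     # metres

sf = shapefile.Reader(INDEX_SHP)
fields = [f[0] for f in sf.fields[1:]]   # skip the DeletionFlag entry
url_col = fields.index("URL")            # assumed attribute name

urls = []
for rec, shp in zip(sf.records(), sf.shapes()):
    xmin, ymin, xmax, ymax = shp.bbox
    cx, cy = (xmin + xmax) / 2.0, (ymin + ymax) / 2.0
    if math.hypot(cx - CENTRE[0], cy - CENTRE[1]) <= RADIUS:
        urls.append(rec[url_col])

print(len(urls), "tiles selected")       # feed `urls` into ImportLAZ's filePath input
```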

There’s perhaps a more efficient way, but I’m still figuring this out myself. I hope this helps!
