3D photogrammetry: Drone photos -> Memento -> Rhino (Private project)

This is one of the most amazing things I have done lately:
We needed a 3D model of our house, including the outdoor areas, to plan further work. An architect friend brought his Phantom 4 drone, flew it around the house (on an automated flight path) and took about 120 photos from 40 meters altitude, on a cloudy day to avoid hard shadows. These were uploaded to Memento, and a few hours later the model was downloaded, converted to OBJ and brought into Rhino at 1:1 scale in mm with amazing accuracy. (I am not connected to Memento in any way.)

As you can see it captured a lot of details.
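As a side note, if a scan ever comes in in meters rather than millimeters, the OBJ can be rescaled before import by multiplying the vertex lines. A minimal sketch in plain Python (not Rhino- or Memento-specific; the factor of 1000 assumes a meters-to-mm conversion):

```python
def scale_obj(obj_text, factor=1000.0):
    """Scale the vertex coordinates of a Wavefront OBJ, e.g. meters -> mm."""
    out = []
    for line in obj_text.splitlines():
        if line.startswith("v "):  # geometric vertex line: "v x y z"
            parts = line.split()
            coords = [float(c) * factor for c in parts[1:4]]
            out.append("v " + " ".join(f"{c:.6f}" for c in coords))
        else:
            out.append(line)  # faces, normals and UVs pass through unchanged
    return "\n".join(out)
```

In Rhino itself a simple Scale by 1000 does the same job, of course.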



Shall I just add that extra building to the map and get it over with? :wink:

Thank you for sharing, Jørgen. I was not aware of this technology.
It looks very nice!!


Photogrammetry has made huge advances in the last five years. I use PhotoScan photogrammetry software to go from photos of boats and monuments to .obj models, and then import those models into Rhino for refinement, obtaining lines, creating drawings, etc. I use the standard version of PhotoScan, which is $179 for a stand-alone perpetual license. http://www.agisoft.com/

Yeah, thanks, that saves me the trouble of getting the papers right. Phew! :smiley:
It’s just a garage, and we are trying to find the best placement relative to the apartment we are building in the basement. We had to dig around the house to fix the drainage, so the terrain doesn’t match up with the map. As you can see, I extracted the height curves from the new terrain as well. (0.5 meter equidistance)
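In case anyone wonders what a 0.5 m equidistance means in practice: the curves sit at every multiple of 0.5 m within the terrain's elevation range. A hypothetical sketch (the function name and the elevations are invented for illustration):

```python
import math

def contour_levels(z_min, z_max, equidistance=0.5):
    """Elevations for height curves: every multiple of the equidistance
    that falls inside the terrain's elevation range."""
    z = math.ceil(z_min / equidistance) * equidistance  # first level at/above z_min
    levels = []
    while z <= z_max + 1e-9:
        levels.append(round(z, 6))
        z += equidistance
    return levels

print(contour_levels(101.2, 103.1))  # → [101.5, 102.0, 102.5, 103.0]
```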


Nice to know, thanks.
Test out Memento; it’s cloud-based and in beta, so it’s free for the time being.
It would be nice to know how it compares to PhotoScan.

What’s the accuracy on this thing? Max dim deviation?

Well, I haven’t checked; I’ll do a check tomorrow. But the 3D model I already made of the house fitted right in there, with only a few cm of deviation in the longest direction. (In the scan the length of the house measured between 11.98 and 12.02 m.)
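The arithmetic behind that deviation check, assuming a tape-measured reference length of 12.00 m (that reference value is my assumption, just for illustration):

```python
measured = 12.00          # assumed tape-measured house length in meters
scanned_low, scanned_high = 11.98, 12.02  # lengths read off the scan

# worst-case absolute deviation, and the same as a fraction of the length
worst = max(abs(scanned_low - measured), abs(scanned_high - measured))
print(f"max deviation: {worst * 100:.0f} cm "
      f"({worst / measured * 100:.2f} % of the length)")
# → max deviation: 2 cm (0.17 % of the length)
```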

I’d use a measuring tape or a laser distance meter to take some overall dimensions, then measure the same spots in the model to determine the deviation.

We ended up with our own Phantom 4 last week and have just started testing it out. Here’s the first test on a timber house between some trees. It’s being extended a bit, and the ceiling angles were difficult to measure, so a scan would be a good start. (The mesh is strongly reduced, but you can still see a lot of detail.)

I made a display mode with custom lighting and an untextured, matte white material to highlight the mesh structure properly.

The first scan had imagery from further away and came in at 1:1, within a medium tolerance in overall scale, but the detail scan came in too big. So it seems it uses the GPS positions to help with overall scaling, and having images that are far apart therefore helps cancel out the poor GPS tolerance. I’ll dig more into this, but it looks like it is important to have some real-life reference points to scale the 3D scan properly. The internal proportions seem very good, though. More testing will be done.
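Deriving a uniform scale correction from a handful of reference points is straightforward: average the ratio between each on-site distance and the same distance in the scan. A sketch under the assumption of a purely uniform scale error (the distances below are invented):

```python
def scale_factor(pairs):
    """Estimate a uniform scale correction from reference distances.
    Each pair is (distance measured on site, same distance in the scan);
    returns the average real/scanned ratio to apply to the model."""
    ratios = [real / scanned for real, scanned in pairs]
    return sum(ratios) / len(ratios)

# hypothetical tape measurements vs. the same distances in the model
pairs = [(12.00, 12.55), (8.40, 8.79)]
f = scale_factor(pairs)
print(f"scale model by {f:.4f}")  # → scale model by 0.9559
```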


I just saw this thread…


I’ve had some fun with VisualSFM.

I did some stuff around my yard and my friend’s motorcycle, which came out well, but when I tried a product jug, the lighting might have been wrong, or it needed some features to grab onto. I remember a commercial product that included a roll of target stickers that you would put on a car fender.

Meshlab is helpful for creating mesh surfaces from the colored point data.

Then you can import it into Rhino : )

The creepiest thing is: it will show you where you were when you took the pictures of the grassy knoll.
It’s CUDA-enabled.

There is a tutorial video. https://www.youtube.com/watch?v=V4iBb_j6k_g