ObjectCaptureFromPhotos for RhinoWIP (Mac-only)

The PackageManager now contains a Mac-only package, ObjectCaptureFromPhotos, available to anyone running macOS Monterey (or later) on an Apple Silicon Mac. This command uses photogrammetry to process a set of photographs into a mesh with an associated material:

Try It

  1. Verify you are on macOS Monterey (or later) and running on an Apple Silicon Mac.
  2. Update to the latest RhinoWIP and run the PackageManager command. Search for ObjectCaptureFromPhotos and Install it.
  3. Download this set of sample photos: Shoe-Photos.zip and unzip them.
  4. Start a new modeling window and run the ObjectCaptureFromPhotos command.
  5. Navigate to the folder with the photos you downloaded in step 3 and click OK.
  6. There are a number of command options*, but just accept the defaults and press Done.
  7. Processing begins. With this sample and default options, it should take about 90 seconds.
  8. When processing is done, you should see a new mesh in the modeling window.
  9. Switch the Perspective view to Rendered mode to see the associated material.
  10. Please check out Capturing Photographs for RealityKit Object Capture for tips on how to take photos that will result in higher-quality meshes and materials (acceptable formats are JPEG, HEIC, and PNG). Also test out the sample photo sets provided by Apple.

This is an early draft, bugs and all. As always, feedback is appreciated.


*Options

  • Detail Level:
    • Preview: coarse meshes and low-quality materials, but fastest processing
    • Reduced: better quality than preview, slower processing
    • Medium: denser meshes and detailed materials, slow processing
    • Full: best quality meshes and materials, slowest processing
  • Sample ordering:
    • Unordered: the photos were not taken in a spatially sequential pattern; slower processing
    • Sequential: the photos were taken in a spatially sequential pattern (like a turntable); this can result in faster processing
  • Feature sensitivity:
    • Normal: For objects with typical shapes, edges or textures
    • High: For objects that do not contain many discernible structures, edges or textures
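For reference, these options line up with types in Apple's RealityKit Object Capture API. A minimal Swift sketch of a session configured this way follows; the folder paths and the specific choices of `.sequential` and `.medium` are placeholder assumptions for illustration, not Rhino's actual defaults:

```swift
import Foundation
import RealityKit  // Object Capture requires macOS 12+ on Apple Silicon

// Placeholder paths; point these at your own photo set and output file.
let photosFolder = URL(fileURLWithPath: "/path/to/Shoe-Photos", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

var configuration = PhotogrammetrySession.Configuration()
// Sample ordering: .unordered, or .sequential for turntable-style shots
configuration.sampleOrdering = .sequential
// Feature sensitivity: .normal for typical objects, .high for featureless ones
configuration.featureSensitivity = .normal

let session = try PhotogrammetrySession(input: photosFolder,
                                        configuration: configuration)

// Detail level is set per request: .preview, .reduced, .medium, or .full
try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])
```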

Cool! Windows in the near future?

Unfortunately not…this will be a Rhino for Mac only feature for the foreseeable future.

Why? Because Windows has many implementations of photogrammetry already?

There is a new core operating system feature provided by Mac that Dan is using.

I assume it's related to this?

Yes.
-wim

ObjectCaptureFromPhotos uses these Apple RealityKit Object Capture APIs.
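For those curious, consuming a PhotogrammetrySession's progress and results looks roughly like this in Apple's public API. This is a sketch against the documented `session.outputs` async sequence only; Rhino's actual integration may differ:

```swift
import RealityKit

// Assumes `session` is a PhotogrammetrySession with a .modelFile request
// already submitted via session.process(requests:).
func monitor(_ session: PhotogrammetrySession) {
    Task {
        do {
            for try await output in session.outputs {
                switch output {
                case .requestProgress(_, let fraction):
                    print("Progress: \(Int(fraction * 100))%")
                case .requestComplete(_, let result):
                    if case .modelFile(let url) = result {
                        print("Model written to \(url.path)")
                    }
                case .requestError(_, let error):
                    print("Request failed: \(error)")
                default:
                    break  // input validation, downsampling notices, etc.
                }
            }
        } catch {
            print("Output stream failed: \(error)")
        }
    }
}
```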

Awesome!
Will this be Apple Silicon only?
AFAIK it should support Macs with graphics cards with 4 GB…

Cheers

This is Silicon only.

Got it. I am curious why you decided to make it Apple Silicon only, if it would in theory work on Intel? My naive guess is that supporting Intel would not require any additional development work on your side?

Cheers

Hi Rudi-

It was very slow to process the images on Intel when compared to Apple Silicon. Going forward, supporting it would be challenging.

omg, i am not often emotional when it comes to new features but Xmas came most definitely early this year @dan you are hereby receiving the Santa Claus Special Award :trumpet::drum::volcano:

…now where are those ma(x) minis with the new chips finally…

Tried this tonight using the Rhino shoe.

Preview: 90 sec, 25k vertices. Not the best texture mapping and a “soft”, low-detail mesh, but usable for preview.
Reduced: 4 min, 24,999 vertices. Better texture mapping, and meshing detail is improved.
Medium: 4 min, 50k vertices. Texture mapping appears similar to the Reduced setting; meshing detail is a step up again, with corners and edges looking crisper.
Full: 5 min. The status bar completes, pauses to think for about 30 seconds until the timeout, then I got this error: “ObjectCaptureFromPhotos error occurred. Failed to create: ‘longFileName’/baked_mesh.obj”

Also, this command does not show up in the Repeat Last Command list on right mouse click.

On import, the size comes in as one unit of the current measurement system: it might be 1.2 mm, 1.2 in, or 1.2 ft, depending on your workspace.

This was all just quick playing to see what was up. Gonna play with the Nike files. Very nice potential here. Thanks for tossing a bone to the Mac folks!

Hi Ukktor-

Thank you for testing this out!

Yep. :grimacing: That’s this bug:

RH-66252 ObjectCaptureFromPhotos: progress should include file import step

I implemented that part rather poorly and I bet that will prevent people from testing out really large captures until I fix it. I’ll prioritize that one.

That was intentional…but perhaps misguided on my part. I presume you’d want it to be repeatable? Logged here:

RH-66337 ObjectCaptureFromPhotos: Remove DoNotRepeat status

Ah yes, that :slight_smile: We might be able to be smarter about the units…I’m not sure. If the source images are HEIC from a modern iPhone, we could infer some real-world measurement (the stereo cameras allow for this). Otherwise, I’m not sure what people would expect. Perhaps we need some sort of calibration?
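One possible calibration scheme (purely a sketch of the idea, nothing that exists in the command today): have the user measure one known dimension on the real object and scale the imported mesh by the ratio. The function name and the 280 mm shoe length below are hypothetical:

```swift
// Hypothetical helper: compute a uniform scale factor from one
// user-measured reference dimension on the real object.
// E.g. the real shoe is 280 mm long, but the imported mesh measures 1.2 units.
func calibrationScale(realWorldLength: Double, meshLength: Double) -> Double {
    precondition(meshLength > 0, "mesh length must be positive")
    return realWorldLength / meshLength
}

let scale = calibrationScale(realWorldLength: 280.0, meshLength: 1.2)
// Apply `scale` uniformly to the mesh (e.g. with Rhino's Scale command).
```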

Thanks for the response!

re: RH-66337: If by “repeatable” you mean access from the right-click command history, then yes. Of course I want to repeat it easily; I ran it about a dozen times in a row last night. :grinning:

Re: Units: Yeah, I kinda feel like something is needed here, but I'm not sure what. Many 3D printing software packages can give an “Object is very small/large. Do you want to convert it?” size warning on import, usually just meaning you are in metric mm or imperial inches. In Rhino it is a bit trickier, as you could be in feet, inches, mm, meters, cm, etc. I used several different-sized templates. Of course, for the shoe, “Small Objects - Feet” is the only correct choice. :drum:
— I did think I saw that the TrueDepth cameras can maybe give something close to actual measurements, since they are used for Face ID. I wonder if the LiDAR in the newest phones can be tapped. It would be cool to have a “smart opening” if that info was passed along by Apple.

Lastly, the Nike shoe files from Apple seemed to display very nicely. They were of course slower to process due to double the photos, taking about twice as long, but not bad (again, I got the error on the Full setting). These are very good scanning examples made by someone who took care and planning with their scan. There is an art to creating a quality scan.

For others to reference for speed: I am using a 2020 M1 Mac mini with 16 GB.

100% this :point_up_2: …there are flaws in the shoe photos I posted above that result in flaws in the mesh (I left these alone because I wanted to show the wabi-sabi state of this work).

It’s an easy fix. I’ll fix it.

Yes, this may be a good way to go. I want to sit with this for a bit as there’s clearly more work to be done in the dialog and with the bugs that are there.

Thanks again for testing it. I just got my new MacBookPro M1 and I’m going to test ObjectCaptureFromPhotos with it soon…still transferring my account via TimeMachine right now.

Hi.

Just tested the shoe and it works great.

Can't seem to get it to work with any of my own photo sets.

Can you use images straight from an iPhone or do you need to post-process somehow?

Thx

Hi -

I don’t have a machine running Apple Silicon so I can’t test this but, yes, you can use images straight from an iPhone or any other camera.

It might be helpful if you could provide more details about the “not working” part.
-wim