Sunday, October 2, 2022

How to build an iPhone-Mac Photogrammetry Tag-Team System...

The "RealityKit Object Capture" API Apple has added to macOS is very, very cool. It allows Apple Silicon Macs to take a series of photos and convert them into three-dimensional models (.usdz files). 

Here are a few examples: Gum. Leatherman. Cup. They're best viewed on an iPhone – it can either drop them into an augmented reality scene or let you fiddle with the model by itself.

But Apple hasn't added the API to iOS yet, which is a problem. You can capture the object on your phone, but the processing has to happen on a Mac. 

I've figured out a way to make both of them work together, so you can make 3-D models when you're out and about. It's not a seamless solution, and it's going to require you to do some command-line work on your own, but once it's working, all you have to do is take a movie on your phone, then share it. A few minutes later, a finished model pops up in a folder on your phone. 

I can't package everything up so it works with a click or two, but I can tell you the steps. 

Again, this only works on Apple Silicon Macs – the M1 and M2 Macs. It will not work on an Intel-based Mac.

Here's what's going on in the background:

  • You take a movie on your iPhone.
  • Share it via a "Share Sheet" shortcut that tosses the movie file into an iCloud folder. 
  • iCloud syncs the movie file to your Mac.
  • Your Mac is watching that folder and kicks off an AppleScript to tear the movie into individual frames.
  • The script then hands the photos to Apple's command-line photogrammetry app (the one that makes the model).
  • The photogrammetry app builds the model and saves it to an iCloud folder, which then gets synced back to your iPhone.

What you need to get set up on your Mac is Apple's photogrammetry app and a very cool tool called "ffmpeg," an open-source program that can take a movie and break it into individual photos.
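The frame-extraction step ends up being a single ffmpeg command along these lines – the file name, frame rate, and output folder here are just examples, not necessarily what the script uses:

    # Pull roughly two frames per second out of the movie as high-quality JPEGs.
    mkdir -p frames
    ffmpeg -i input.mov -vf fps=2 -qscale:v 2 frames/frame_%04d.jpg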

I have them both installed on my Mac, but it was a little complicated to get them there, and I don't know how to package them up so they install themselves on every Mac out there. So you're going to have to do this part yourself.

You're going to need to install a few things on your Mac before you get to ffmpeg. 
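The most common route is Homebrew, the macOS package manager, which in turn wants Apple's command-line developer tools first. If you go that way – and I'm assuming here that you don't already have some other way of installing ffmpeg – the whole thing is a few commands in Terminal:

    # Apple's command-line developer tools (compilers, git, and friends).
    xcode-select --install

    # Homebrew, the de facto macOS package manager.
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

    # And finally ffmpeg itself.
    brew install ffmpeg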

If you get errors along the way on any of these installation steps, just Google the errors. Others have hit these issues before and have been kind enough to explain them.

Once you have ffmpeg all set, you next need the photogrammetry command-line app. The AppleScript checks to see if you have this program on hand, and copies it to your Mac if you do not. 

(Here's the Apple Developer session where they have you build your own: https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app. This requires Xcode, which I have, but barely even know how to play with.) 
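Once it's built, the sample app (Apple calls it HelloPhotogrammetry) is just a command-line tool: you point it at a folder of photos and tell it where to write the .usdz file. Assuming you keep the sample's name, an invocation looks something like this – the paths are made up:

    # Turn a folder of frames into a 3-D model.
    # It also takes optional flags (detail level and so on); run it with --help to see them.
    ./HelloPhotogrammetry ~/Pictures/frames ~/Desktop/model.usdz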

Now, here are the two pieces you'll need – the Shortcut for your iPhone and the script for your Mac.

Here's an iCloud link to the Shortcut:

https://www.icloud.com/shortcuts/0498225bb30d40cd9876e587d0045c68

It's very short – it just lets you select the movie you've shot and saves it to an iCloud folder (iCloud Documents:TagTeam:Input).

And here's the script itself, packaged as a zip:

https://www.dropbox.com/s/5xw8oru8cqfhz7m/Mac-iPhone%20Tag-Team%20Photogrammetry.zip?dl=0

Leave it in your Downloads folder and run it from there; it will create the right folders, move itself into the right folder, and set itself up as a Folder Action that watches the "Input" folder you drop movies into. 
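If you ever want to check that movies are actually landing where the script is watching, it helps to know that iCloud Drive lives under ~/Library/Mobile Documents on the Mac. Assuming the TagTeam folder ends up at the top level of iCloud Drive, you can peek at the watched folder like this:

    # List whatever has synced into the watched Input folder.
    # (Note the escaped space in "Mobile Documents".)
    ls ~/Library/Mobile\ Documents/com~apple~CloudDocs/TagTeam/Input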



One last note: the script expects to find ffmpeg at /usr/local/bin/ffmpeg.
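One wrinkle: on an Apple Silicon Mac, Homebrew puts its programs in /opt/homebrew/bin rather than /usr/local/bin. If ffmpeg needs to be at /usr/local/bin/ffmpeg and isn't there, a symlink takes care of it (this assumes you installed ffmpeg with Homebrew):

    # Make the Homebrew-installed ffmpeg visible at /usr/local/bin/ffmpeg.
    sudo mkdir -p /usr/local/bin
    sudo ln -s /opt/homebrew/bin/ffmpeg /usr/local/bin/ffmpeg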