One of the more exciting augmented reality announcements Apple made at its WWDC keynote on Monday came in the form of Object Capture, a new 3D scanning feature coming to macOS Monterey.
That cry you just heard in the distance came from the chorus of companies that have published 3D scanning apps built on the LiDAR sensors in recent iPhones and iPads, or that have built their business models around 3D capture services.
Meanwhile, Unity sits in the catbird seat. Among the early access partners working with Apple on Object Capture, Unity will support the feature for AR development through the iOS version of its MARS AR Companion app.
Released as an open beta earlier this year, the AR Companion app lets developers capture real-world environments and objects and bring them into scenes for AR experiences.
While the production app, along with Object Capture integration, won’t be out until this fall, Unity has provided a sneak preview of how the feature will work.
Object Capture in the AR Companion app starts with an interactive interface that forms a virtual shell around the object. As users take photos from different angles, the app places green pins in the corresponding polygons that make up the shell. To anyone who has captured a Photo Sphere with Google Camera, this process will look very familiar. If the app detects a poor-quality photo, a red pin appears to indicate that the sector needs to be re-captured.
The minimum coverage needed to proceed to model rendering is 70%. Once that threshold is reached, users can transfer the photos to Unity for processing via a new local wireless file transfer feature or another file-sharing method. Rendering is a two-step process: a sample model, where parameters can be adjusted, followed by a full-quality model.
Unity developers don’t need to use the AR Companion app for this process, however, as Object Capture also supports photos taken with traditional cameras.
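On the macOS side, the underlying engine Apple announced is exposed through RealityKit's PhotogrammetrySession API, which accepts exactly that kind of folder of ordinary photos. A minimal sketch of the two-step flow described above, first requesting a preview-detail sample model and then a full-quality one, might look like this (the input and output paths are placeholders):

```swift
import RealityKit
import Foundation

// Folder of photos taken with any camera -- no LiDAR required.
let photosFolder = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)

var configuration = PhotogrammetrySession.Configuration()
configuration.featureSensitivity = .normal

let session = try PhotogrammetrySession(input: photosFolder,
                                        configuration: configuration)

// Observe progress and completion on the session's async output stream.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestError(_, let error):
            print("Request failed: \(error)")
        case .processingComplete:
            print("All requests finished.")
        default:
            break
        }
    }
}

// Step 1: a quick sample model; step 2: the full-quality model.
try session.process(requests: [
    .modelFile(url: URL(fileURLWithPath: "/tmp/sample.usdz"), detail: .preview),
    .modelFile(url: URL(fileURLWithPath: "/tmp/final.usdz"), detail: .full)
])
```

Unity's AR Companion integration presumably wraps a pipeline like this behind its own interface; the code above is only a sketch of the raw Apple API that any Mac app, Unity included, can call.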
According to a company spokesperson, Unity has been working with Object Capture for just six weeks, making this progress all the more impressive. Meanwhile, other companies focused on 3D scanning are left playing catch-up.
The introduction of LiDAR on high-end iPhones and iPads revitalized the 3D scanning app segment, with Occipital's Canvas, Polycam, and 3D Scanner App as standout options. What Apple gave with LiDAR, it has now taken away with Object Capture.
Object Capture is even more problematic for companies that offer 3D scanning services. For example, Jaunt pivoted from VR to 3D capture before being acquired by Verizon. Perhaps it's no coincidence that Jaunt co-founder Arthur van Hoff joined Apple after that pivot.
This is not the first time Apple has introduced a feature that renders entire apps or businesses obsolete. When it comes to AR development tools, however, Unity still has an edge over ARKit and Reality Composer. Object Capture makes Unity's development environment, particularly its flexibility to build AR experiences for different operating systems, all the more important.