While Apple introduced new AR features for iOS 15 and Object Capture for RealityKit 2 at the WWDC 2021 keynote, updates to ARKit were curiously absent from the official presentation.
That doesn't mean there's nothing new in Apple's mobile AR toolkit, though, and the next update will help AR experiences built with ARKit evolve toward the sci-fi dream of the metaverse, otherwise known as the AR cloud.
The headline feature of ARKit 5 is Location Anchors, an extension of the persistent content functionality introduced in ARKit 2, which allows apps to place AR experiences at precise real-world locations, from famous landmarks to friendly neighborhoods, based on latitude, longitude, and altitude coordinates.
However, Location Anchors will be limited to London, New York, and other select US cities at launch. You'll also need an iPhone XS, iPhone XS Max, iPhone XR, or a newer device to experience Location Anchors.
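In code, Location Anchors surface through ARKit's geotracking APIs. As a rough sketch (the coordinates below are placeholders, not real content), an app checks that geotracking is supported and available where the user is standing, then drops an anchor at a latitude, longitude, and altitude:

```swift
import ARKit
import CoreLocation

// Sketch: run a geotracking session and anchor content at a
// real-world coordinate.
func startLocationAnchors(in session: ARSession) {
    // Geotracking requires a supported device (A12 chip or later).
    guard ARGeoTrackingConfiguration.isSupported else { return }

    // Availability also depends on whether Apple has mapped the
    // user's current area -- hence the launch-city limitation.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else { return }
        session.run(ARGeoTrackingConfiguration())

        // Place an anchor by latitude, longitude, and altitude
        // (in meters). Placeholder coordinates for illustration.
        let coordinate = CLLocationCoordinate2D(latitude: 40.7484,
                                                longitude: -73.9857)
        let anchor = ARGeoAnchor(coordinate: coordinate, altitude: 10.0)
        session.add(anchor: anchor)
    }
}
```

Because the session resolves anchors against Apple's mapping data rather than a user's saved world map, any app pointing at the same coordinates sees content in the same physical spot.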
Another new feature arriving in ARKit 5 is App Clip Codes, which allow developers to anchor AR content from full apps or App Clips to a printed or digital scannable code.
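In practice, detection runs through a world-tracking session with App Clip Code tracking switched on; recognized codes then arrive in the session delegate as anchors carrying a decoded URL. A minimal sketch, assuming iOS 14.3 or later:

```swift
import ARKit

// Sketch: enable App Clip Code detection in a world-tracking session.
func enableAppClipCodeTracking(in session: ARSession) {
    // Not every device supports App Clip Code tracking.
    guard ARWorldTrackingConfiguration.supportsAppClipCodeTracking else {
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.appClipCodeTrackingEnabled = true
    session.run(configuration)
}

// In an ARSessionDelegate, detected codes show up as ARAppClipCodeAnchor;
// once decoding finishes, the anchor exposes the embedded URL.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let codeAnchor as ARAppClipCodeAnchor in anchors
        where codeAnchor.urlDecodingState == .decoded {
        print("App Clip Code URL:",
              codeAnchor.url?.absoluteString ?? "unknown")
    }
}
```

The anchor itself doubles as a position in the scene, so AR content can be pinned right on top of the printed code.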
In addition, ARKit 5 will include Motion Tracking improvements and add Face Tracking support to the Ultra Wide camera on the fifth-generation iPad Pro, as well as to the front-facing camera of any device with an A12 Bionic chip or later, including the iPhone SE. Devices with the TrueDepth front camera can also track up to three faces at once.
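On the face-tracking side, the multi-face limit is exposed directly in the configuration. A minimal sketch that asks the device for its maximum and tracks that many faces at once:

```swift
import ARKit

// Sketch: track as many faces simultaneously as the device allows.
func startFaceTracking(in session: ARSession) {
    // Face tracking is unavailable on some hardware.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // supportedNumberOfTrackedFaces reports the device's ceiling --
    // up to three on TrueDepth devices with an A12 chip or later.
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```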
Apple will release an on-demand session on June 10 exploring the features of ARKit 5.
But Location Anchors are the key here, as they signal Apple's move toward the shared and persistent AR experiences that are the hallmark of the AR cloud.
Google has started tackling the concept through Cloud Anchors for ARCore, though those act more like a save state for user-generated content. Microsoft offers a similar solution in Azure Spatial Anchors for iOS, Android, and HoloLens.
A closer parallel to Apple's Location Anchors is Snapchat's Landmarker technology, available to creators through templates in Lens Studio, which uses location plus visual positioning to anchor AR content to buildings and landmarks.
More recently, Niantic has begun accepting applications for the private beta of its Lightship platform, which will eventually use a Visual Positioning System to enable developers to anchor content to real landmarks. Part of this effort involves crowdsourced 3D mapping, accomplished through in-game tasks in apps like Pokémon GO.
Facebook and Epic Games are also developing their own flavors of the Metaverse, along with startups like Ubiquity6.
All this to say that Apple isn’t necessarily late for the AR cloud, but rather fashionably late. The segment is very much a forward-looking pursuit, and the companies already at the party are just getting started. But now that Apple is here, the fun can really begin.