Apple has long supported mobile augmented reality on iOS via ARKit, but with iOS 15 it borrows a few pages from Google’s playbook and brings AR features straight into the operating system.
During the keynote presentation at WWDC 2021, Craig Federighi, Apple’s senior vice president of software engineering, unveiled Live Text, a new camera mode for iOS 15 that delivers much of the same functionality that Google Lens offers on Android smartphones and in Google Photos.
Live Text allows iPhone users to select and interact with text in their captured photos. In addition to copying text, users can run a search based on selected text or call a phone number recognized in the image. The same functionality will be available through the Photos app.
Additionally, in iOS 15 Apple is introducing its own version of the AR walking navigation mode we’ve seen in Google Maps Live View.
Like Live View, Apple Maps will overlay AR navigation directions on the camera view as iPhone users walk from point A to point B. The AR mode will launch in a limited number of cities later this year, with coverage expanding over time.
While Apple is catching up with Google on AR walking navigation, Google is on track to add other useful features to Live View, such as indoor navigation and landmark overlays.
Apple co-founder Steve Jobs once quoted Picasso as saying, “Good artists copy; great artists steal.” Google Lens and Live View are easily two of the more useful AR features Google has introduced in recent years, so it’s on-brand for Apple to want the same functionality in iOS. Moreover, building these features into iOS itself means iPhone users won’t have to turn to third-party apps for them.