
Apple follows Google Lens & Maps Live View with iOS 15



Google’s biggest consumer-facing augmented reality offerings today are Lens and Live View in Maps. At WWDC 2021, Apple announced its answers to both features, coming with iOS 15 this fall.

Apple’s Google Lens competitor lives primarily in the Camera and Photos apps. When the camera is pointed at text, or you are viewing an existing image that contains words, an indicator appears in the bottom-right corner to start the analysis.

This is in contrast to the Lens viewfinder being accessible from the Google Search app, the Assistant, and the home screen (both as an app icon and in the Pixel Launcher’s search box), as well as from Google Camera and third-party camera clients. For photos you have already taken, Lens lives in Google Photos and is also available in Google Images. It makes sense for Google to surface visual lookup across its search tools, but the approach can feel a little overwhelming.

At a high level, the iOS 15 features fall under the “Intelligence” umbrella. However, Apple strongly emphasizes “Live Text” over visual search:

Let’s say I just finished a meeting with the team, and I want to capture my notes from the board. I can now just point the camera at the board, and an indicator will appear at the bottom right. When I press it, the text jumps straight out. I can use my usual gestures for text selection. Just drag and drop. Now I can switch to Mail and paste them and then send this to Tim.

Apple’s default behavior is copying text, which makes sense: optical character recognition (OCR) is a great time saver compared to manually transcribing the text in an image.
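Developers can already approximate this flow with the Vision framework’s built-in text recognizer. The snippet below is a minimal sketch of on-device OCR, not Apple’s Live Text pipeline, and the recognizeText helper is just an illustrative name.

```swift
import UIKit
import Vision

// Minimal on-device OCR sketch using Vision's text recognizer.
// This approximates what Live Text does; it is not Apple's actual pipeline.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        // Each observation is one detected line; take its best candidate string.
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

From there, joining the recognized lines and placing them on UIPasteboard.general would reproduce the “copy text from the camera” behavior described above.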

Meanwhile, “Look Up” and “Translate” are secondary and require an explicit tap rather than having all information passively displayed at once. Apple’s visual search covers “recognized objects and scenes,” including pets and breeds, flower types, art, books, nature, and landmarks.
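There is no public Visual Look Up API in iOS 15, but Vision’s general-purpose image classifier gives a rough sense of the kind of on-device recognition involved. The sketch below simply lists the top labels for a photo; it is an approximation, not Apple’s feature, and the confidence threshold is arbitrary.

```swift
import UIKit
import Vision

// Rough stand-in for object/scene recognition using Vision's
// general classifier. This is not the Visual Look Up feature itself.
func classifyScene(in image: UIImage,
                   completion: @escaping ([(label: String, confidence: Float)]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNClassifyImageRequest { request, _ in
        let observations = request.results as? [VNClassificationObservation] ?? []
        // Keep a handful of reasonably confident labels (e.g. "dog", "flower").
        let top = observations
            .filter { $0.confidence > 0.3 }
            .prefix(5)
            .map { (label: $0.identifier, confidence: $0.confidence) }
        completion(top)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```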

Visual search is a useful capability to have, but it’s more “fun” than something you want to use every day, and Apple’s prioritization reflects how a lot of people use Google Lens today. That said, visual search is the future. While it may be too late for Apple to create its own web search engine, today’s announcement reflects how it would very much like to own AR search rather than be dependent on Google when the AR glasses form factor arrives.

Another feature that will excel on wearables is AR navigation. Apple showed the ability to hold up an iPhone and scan the buildings in the area to “generate a very accurate position.” Apple Maps then shows detailed directions, such as big arrows and street names.

Like Live View, Apple is focusing on walking directions first. Google Maps has already moved past that: at I/O 2021, it announced plans to let you launch Live View outside of navigation to scan your surroundings and see details about nearby places.
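Apple hasn’t detailed how Maps localizes the phone, but ARKit’s geo-tracking configuration is the closest developer-facing analog to that building-scanning step. The sketch below, which just checks availability and starts a session, is an assumption about the general approach rather than what Maps itself runs.

```swift
import ARKit

// Sketch: ARKit geo-tracking refines the device's position by matching
// camera imagery against localization data for the surrounding area.
// Apple Maps may use a different, private pipeline.
func startARWalkingSession(on session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else {
        print("Geo tracking is not supported on this device")
        return
    }

    ARGeoTrackingConfiguration.checkAvailability { available, error in
        DispatchQueue.main.async {
            guard available else {
                print("Geo tracking unavailable here: \(error?.localizedDescription ?? "unknown reason")")
                return
            }
            // Once running, ARGeoAnchor instances can pin direction arrows
            // and street labels to real-world coordinates.
            session.run(ARGeoTrackingConfiguration())
        }
    }
}
```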

Apple’s other AR-related announcement today is Object Capture, which uses pictures of an object to create a 3D model on macOS.
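On the developer side, this is exposed through the new Object Capture API (PhotogrammetrySession) in RealityKit on macOS Monterey. A minimal sketch, assuming a folder of photos shot around the object and placeholder paths, looks roughly like this:

```swift
import Foundation
import RealityKit

// Minimal Object Capture sketch: turn a folder of photos of an object
// into a USDZ model. Paths are placeholders.
let imagesFolder = URL(fileURLWithPath: "/tmp/ObjectPhotos", isDirectory: true)
let outputModel = URL(fileURLWithPath: "/tmp/Model.usdz")

let session = try PhotogrammetrySession(
    input: imagesFolder,
    configuration: PhotogrammetrySession.Configuration()
)

// Watch the session's async output stream for progress and completion.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Model written to \(outputModel.path)")
            exit(0)
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
            exit(1)
        default:
            break
        }
    }
}

// Request a medium-detail .usdz; other detail levels trade speed for fidelity.
try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])
RunLoop.main.run()  // keep the command-line tool alive while processing runs
```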

Once iOS 15 launches this fall, these AR tools will be common to the two largest mobile platforms. The features are useful on phones, but they will be even more useful on glasses. It’s clear that both Apple and Google are honing the accuracy of these services before focusing on the next form factor. It will take a while to get there, but the foundation is actively being laid, and everyone can preview this future soon.




