Live Text

How to use Live Text to recognize text in photos on macOS 12

Apple introduced Live Text in iOS 15, and while it works best on the iPhone, you can also use it on macOS 12 Monterey. Live Text lets you extract text from photos and can translate text captured with the camera on an iPhone or iPad. It's similar to Google Lens, but it's a system-wide feature on iOS 15 and iPadOS 15. It can recognize text on signboards, menus, books, and more. If you're an Apple user, you may find yourself using Live Text more often than you expect, since it can be quite handy.
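Live Text itself is a built-in system feature with nothing to install or configure, but if you're curious how the same kind of on-device text recognition can be driven programmatically on macOS, Apple's Vision framework exposes it through VNRecognizeTextRequest. The short Swift sketch below is only an illustration of that framework, not how Live Text is implemented, and the image path is a placeholder.

```swift
import AppKit
import Vision

// Load a photo from disk; the path below is just a placeholder.
guard let image = NSImage(contentsOfFile: "/Users/you/Pictures/sign.jpg"),
      let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
    fatalError("Could not load the image")
}

// Ask Vision to recognize text and print the best candidate for each line it finds.
let request = VNRecognizeTextRequest { request, _ in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        if let best = observation.topCandidates(1).first {
            print(best.string)
        }
    }
}
request.recognitionLevel = .accurate

// Run the request on the loaded image.
let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
try? handler.perform([request])
```

Run as a command-line Swift script, this prints whatever text Vision finds in the image, much like the text Live Text lets you select when you open a photo on Monterey.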

How to extract text from photos with Live Text on iPhone and iPad in iOS 15

Live Text is a new feature in iOS 15, similar to Google Lens. You can use it to recognize and extract text from photos, through the camera, within apps, and more. Unlike Google Lens, which requires the Google Search app, Apple's Live Text is available anywhere across iOS and iPadOS. It may not be quite as accurate as Google's offering, but it works well and is available in an instant.