
Real-time translation, dining features rolling out now to Google Lens

Using Google Lens to identify a bunch of bananas as seen on the camera of the OnePlus 7 Pro.

At Google I/O 2019, the company announced several new features coming to Google Lens, its augmented reality search and information tool. Today, Google revealed that two of those features have started rolling out.

The biggest and most important of these new features is the ability to translate printed text in real time. At I/O, Google showed how powerful and simple this process is: just hold your phone up to a sign or other printed text, and Google Lens will auto-detect what language the words are written in and then translate them. What’s more, the translated text is superimposed over the original so it’s easy to keep everything in context.

Google says this will work for over 100 languages starting today. Check out how it works in the GIF below:

A GIF of Google Lens translating foreign text in real time.
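Google hasn't published how Lens implements this pipeline, but the flow described above (recognize printed text, auto-detect its language, translate it) maps onto Google's public ML Kit APIs for Android. The Kotlin sketch below is purely illustrative and only assumes the ML Kit text recognition, language identification, and on-device translation libraries; the translateFrame function and its onResult callback are hypothetical names, not anything from Lens itself.

```kotlin
// Illustrative sketch only: this is NOT Google Lens's actual implementation.
// It wires together Google's public ML Kit libraries to show how a similar
// detect-then-translate flow could look in an Android app.

import android.graphics.Bitmap
import com.google.mlkit.nl.languageid.LanguageIdentification
import com.google.mlkit.nl.translate.TranslateLanguage
import com.google.mlkit.nl.translate.Translation
import com.google.mlkit.nl.translate.TranslatorOptions
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

/**
 * Recognizes printed text in a camera frame, guesses its language, and
 * translates it to English. A Lens-style overlay would additionally use the
 * bounding boxes ML Kit returns for each text block to draw the translation
 * over the original words.
 */
fun translateFrame(frame: Bitmap, onResult: (original: String, translated: String) -> Unit) {
    val image = InputImage.fromBitmap(frame, /* rotationDegrees = */ 0)
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

    recognizer.process(image).addOnSuccessListener { visionText ->
        val sourceText = visionText.text
        if (sourceText.isBlank()) return@addOnSuccessListener

        // Auto-detect the source language, mirroring the "no need to pick a
        // language" behavior described in the article.
        LanguageIdentification.getClient()
            .identifyLanguage(sourceText)
            .addOnSuccessListener { languageTag ->
                // fromLanguageTag returns null for unsupported or undetermined ("und") tags.
                val source = TranslateLanguage.fromLanguageTag(languageTag)
                    ?: return@addOnSuccessListener

                val translator = Translation.getClient(
                    TranslatorOptions.Builder()
                        .setSourceLanguage(source)
                        .setTargetLanguage(TranslateLanguage.ENGLISH)
                        .build()
                )

                // The translation model downloads on first use, then runs on-device.
                translator.downloadModelIfNeeded()
                    .onSuccessTask { translator.translate(sourceText) }
                    .addOnSuccessListener { translated -> onResult(sourceText, translated) }
                    .addOnCompleteListener { translator.close() }
            }
    }
}
```

Again, this is just a sketch of the general technique; whatever Google Lens does under the hood to hit 100-plus languages in real time is its own, unpublished implementation.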

The other major new feature — which was also shown off at Google I/O 2019 — is the ability to see photos of a particular menu item at a restaurant simply by scanning the menu with Google Lens. Once you’ve got the menu in Lens, you can tap on the description of a dish and Google will pull photos from Google Maps to show you what that dish looks like.


From there, you can also easily find out what people say about that dish.

These new features join other tools within Lens such as calculating bill totals at restaurants, shopping for clothes just by photographing them, copying and pasting text on signs and documents, and giving you information about landmarks and areas of interest.

Google Lens is available through Google Assistant and Google Photos on Android devices. Many OEMs also integrate Lens into a phone’s native camera app.

NEXT: Smart Google Lens upgrade lets you analyze any picture on your phone



from Android Authority http://bit.ly/2XevnJu
