
The Google Assistant will be able to read articles out loud in 42 languages

Google is “previewing” a new feature here at CES for the Google Assistant on Android phones that turns it into a supercharged screen reader. When you say “Hey Google, read this,” it will find the main text on whatever webpage or article you’re looking at and read it out loud to you.

Screen readers aren’t new on phones, but Google says it has improved the Assistant’s ability to parse sentences and therefore speak them with more natural, human-sounding cadences.

The flashiest feature, however, is that you can also ask the Assistant to read the text out loud in a different language, with up to 42 languages supported. You can hear how the voice tries to sound more natural by parsing sentence patterns in the promo video Google made for the feature.

The Google Assistant has always been able to parse what’s on your screen if you give it permission to do so. The feature was launched in 2015 — before the Assistant even existed — and called “Now on Tap.” Primarily, it was designed to let you take Google-based actions like searching based on what’s on your screen. It launched with much fanfare, but since then the feature hasn’t gotten prime placement in Google’s UI; it’s been reduced to a suggested action button when you bring up the Google Assistant.

This new screen reading feature may change that, but probably not for a while. As this is just a preview, we don’t know when it will be released. Google says it’s also looking into ways to auto-scroll webpages as the Assistant reads, as well as automatically highlight text.

While that feature is just a preview, Google already uses the Assistant to translate on the fly in more real-world contexts. Google calls it “interpreter mode”: it sets a smart display to always be ready to translate by default, so it can be used at places like hotel desks. The feature originally shipped for smart displays and is now available on phones as well.

This year, interpreter mode is coming to more of Google’s partners, including JFK Airport Terminal 4, some airport lounges, banks, and several new hotels. Google is also working with Mercy Corps to provide interpreter mode speakers for its charity work.

The preview and the expanded interpreter mode announcements go alongside other new Google Assistant features announced today, including better integrations with smart homes, notes, and slightly more transparency on privacy.

This Article was first published on theverge.com

