As promised earlier in the week, Google announced a major update to Google Translate for Android and iOS on Wednesday. The new features include Word Lens integration for automatic translation of written text on street signs or menus and other printed items. The new Google Translate app also adds improved real-time conversation translation.
Google says the new update will “transform your mobile device into an even more powerful translation tool.”
Why this matters: Whenever one company horns in on an area a rival is famous for, competition quickly escalates. Google’s Translate improvements appear to be a response to Skype Translator beta, the automatic real-time translation tool being built into Skype. But Google’s translation tool currently functions as a stand-alone app, and while some translation features are built into the Chrome browser, Translate remains separate from Google’s suite of apps. It will be interesting to see if Translate’s new stand-alone features are a preview of a Skype Translator-like feature for Hangouts.
Word Lens integration
Google acquired Quest Visual, maker of Word Lens, in May, and is only now integrating those features into Translate. Just as with Word Lens, using the new feature is simple: tap the camera button in Google Translate, point your phone at text in a foreign language, and a translation appears on your screen in a matter of seconds. There’s no need to snap a photo and drag your finger across the text to highlight it, as you had to do in previous versions of Translate.
Beefed-up conversation mode
Google has been playing around with a conversation mode for Translate since at least 2010, and the company says it officially added real-time conversation mode to Android in 2013. Conversation mode is handy if you need to speak with a non-English speaker, but until now the feature has been painfully slow to use, not to mention somewhat hidden.
We haven’t yet had a chance to try the new Translate app, but Google says the new version radically speeds up conversations.
Here’s how the old version worked on Android.
Let’s say you wanted to have a conversation with a French speaker. First, you had to set Translate to convert English to French. Then you had to speak or write a word in English to be translated. When the translated word appeared, you had to press the vertical menu button in the blue translation section.
From that menu you’d pick Start conversation, and then the two speakers had to alternate. You’d say something in English and wait for the translation, then the French speaker would respond and wait for the translation, and so on. If either speaker wanted to interrupt the alternating flow, they had to press their translation button and speak again.
In the new version, Google says Translate automatically recognizes which of the two languages is being spoken, doing away with the button presses required to interrupt a conversation in previous versions.
What Translate cannot do, however, is automatically detect a spoken language the way it detects written text. You first have to set up the app in conversation mode between the two languages; only then will Translate be able to detect which of them is being spoken.
Until we can test the app ourselves, we can’t say whether the new mode actually speeds up conversations. Nevertheless, the new features should be a step forward for Google’s translation tool in functionality, if not in natural flow.
The new Google Translate update is rolling out to Android over the next few days. At this writing, the update was not yet available for iOS in the App Store.
This story, "Google beefs up Translate app with Word Lens integration, better conversation mode" was originally published by PCWorld.