When you are in such a situation, you are always worried about either running into problems or missing out on things that interest you (locations, food, etc.) because you cannot speak or read the local language. Now Google has launched a smart iPhone app, Google Translate, that supports translation between 57 languages.
The app lets you translate by typing words or by speaking into your iPhone's microphone. While Google Translate does not work without an active data connection, you can store your most frequently used translations and phrases and use them offline.

When Google Translate generates a translation, it looks for patterns in hundreds of millions of documents to help decide on the best translation for you. By detecting patterns in documents that have already been translated by human translators, Google Translate makes intelligent guesses about the most appropriate output. This process of seeking patterns in large amounts of text is called statistical machine translation (SMT). However, since SMT translations are generated by machines, not all of them are perfect.
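To give a rough sense of that statistical intuition, here is a deliberately tiny sketch: given a parallel corpus of human-translated phrase pairs, pick the target phrase that has been observed most often for a given source phrase. The corpus, phrases, and function name below are invented for illustration; real SMT systems use far more elaborate machinery (word alignment, language models, decoding).

```python
from collections import Counter, defaultdict

# Toy "parallel corpus" of (source, target) phrase pairs, as if extracted
# from documents already translated by humans. Entirely made up for illustration.
parallel_corpus = [
    ("buenos dias", "good morning"),
    ("buenos dias", "good morning"),
    ("buenos dias", "good day"),
    ("donde esta el bano", "where is the bathroom"),
]

# Count how often each target phrase was observed for a given source phrase.
translation_counts = defaultdict(Counter)
for source, target in parallel_corpus:
    translation_counts[source][target] += 1

def most_likely_translation(source_phrase):
    """Return the target phrase seen most often for this source phrase."""
    candidates = translation_counts.get(source_phrase)
    if not candidates:
        return None  # no statistics available for this phrase
    return candidates.most_common(1)[0][0]

print(most_likely_translation("buenos dias"))  # -> "good morning"
```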
While it is a great utility for travelers, it is still a machine mediating between you and the people around you who do not speak or understand your language, and the capability to translate natural, free-flowing conversation in real time still does not exist. As a user of the application, you are never sure of the other person's interpretation of the translation either, because of subtle cultural differences between languages, the voice and tone of the speaker, and the lack of gestural cues.
A company called Quest Visual has tried to resolve this small yet important aspect of communication. Quest Visual has built a magical 'Word Lens' application for the iPhone with amazing augmented reality capabilities.
The application lets users point the iPhone's camera at live views and translates words from Spanish to English (or vice versa) in real time, replacing the originals in the same size, color, orientation and perspective. So it no longer matters whether it is a newspaper article, website content, a hotel menu, or signage: all you need is Word Lens. The other exciting part is that the application does not need an active data connection to do the translations, and can therefore save travelers hundreds of dollars in roaming data charges.
The application pushes handheld computing capabilities by running optical character recognition (OCR) algorithms on the device. The program passes the captured image through a filter to remove shadows, identifies the sharp edges around the text, and discards everything that is not sharp enough. It then converts the image to black and white to further enhance text readability, and performs a dictionary look-up on the recognized text string to find the closest match.
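The exact pipeline inside Word Lens is proprietary, but a rough approximation of the steps described above (grayscale conversion, thresholding to black and white, OCR, then a closest-match dictionary look-up) might look like the sketch below. It assumes the OpenCV and pytesseract packages plus the standard-library difflib; the dictionary contents and image file name are invented for illustration.

```python
import difflib

import cv2
import pytesseract

# Tiny stand-in dictionary mapping Spanish words to English. Invented for illustration.
es_to_en = {"salida": "exit", "entrada": "entrance", "cerrado": "closed"}

def recognize_and_translate(image_path):
    # Load the captured frame and convert it to grayscale.
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Adaptive thresholding approximates the "convert to black and white"
    # step and suppresses uneven lighting and shadows.
    binary = cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY, 31, 10
    )

    # Run OCR on the cleaned-up image.
    raw_text = pytesseract.image_to_string(binary, lang="spa").strip().lower()

    # Dictionary look-up: accept the closest known word, since OCR output is noisy.
    matches = difflib.get_close_matches(raw_text, es_to_en.keys(), n=1, cutoff=0.6)
    return es_to_en[matches[0]] if matches else None

print(recognize_and_translate("sign.jpg"))  # e.g. "exit" for a "SALIDA" sign
```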
Finally, it re-renders the captured image, erasing the original text and drawing the translated string on top in its place. Although it may sound straightforward, it is not really that simple by handheld computing standards.
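As a crude illustration of that final step, the snippet below paints over a detected text region and draws the translated string in its place using OpenCV. The bounding-box format, background color, and font choice are placeholder assumptions; the real app matches the original size, color, orientation and perspective, which is considerably harder.

```python
import cv2

def overlay_translation(frame, box, translated_text):
    """Paint over the original text region and draw the translation on top.

    `box` is an (x, y, width, height) bounding box for the detected text.
    """
    x, y, w, h = box

    # Crude background fill: cover the original text with a solid white rectangle.
    # A real implementation would sample the surrounding background color instead.
    cv2.rectangle(frame, (x, y), (x + w, y + h), color=(255, 255, 255), thickness=-1)

    # Draw the translated text roughly where the original text was.
    cv2.putText(
        frame,
        translated_text,
        (x, y + h),                 # baseline near the bottom of the box
        cv2.FONT_HERSHEY_SIMPLEX,
        fontScale=h / 30.0,         # rough scale so the text fills the box height
        color=(0, 0, 0),
        thickness=2,
    )
    return frame
```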
With companies such as Quest Visual creating such interesting and innovative products to make the overall travel experience more convenient, the future looks like a mesh of augmented reality apps such as Google Goggles and Word Lens combined with utility apps like Google Translate, eventually with the sophistication of two-way voice-to-voice translation.