Automatic translator allows conversation in 2 different languages

CNET reader Will Powell, a programmer with a background in Adobe Flex and AS3, thought about how handy it would be if we could see subtitles when someone speaks in a foreign language. Based on this idea, he decided to build some glasses that make you feel like you’re in an arthouse movie.

Powell was able to hack together a working automatic translation system using 3D spectacles, microphones, a few cables, and two Raspberry Pi mini-computers. He based his glasses on Google’s high-concept Glass project. The translation system Powell created allows you to have a full conversation with someone who speaks another language, while each of you keeps talking in your own mother tongue.

Powell says the glasses he uses are transparent, so the subtitles appear overlaid on your normal field of vision, like a pilot’s head-up display. He also uses a Microsoft API that can translate 37 languages, and the Raspberry Pis, which run the latest version of Debian Linux, power the subtitle interface and a TV display.

The result is very impressive. It’s a little slow for a complicated real-life conversation, but considering this is a home-baked project, it is impressively fast. Powell says that the Vuzix 1200 Star glasses are connected to the S-Video connector on the first Raspberry Pi, and a Jawbone Bluetooth microphone connects to a device such as a smartphone or tablet. This makes for a clean, noise-cancelled audio feed.

Although Raspberry Pis are tiny, it’s still slightly impractical to carry around two of them, along with an S-Video cable and a couple of microphones. But still, the project is a magnificent bit of garden shed tinkering.

When you say something into the Bluetooth microphone, it streams what has been said across the network. This is then recognized and passed through Microsoft’s translation API, with a caching layer to improve performance for regularly used statements. Because each sentence has to pass through this API service, there is quite a bit of delay in the subtitles.
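The caching idea is straightforward: before paying the network round-trip to the translation service, check whether the same sentence has been translated before. Here is a minimal Python sketch of that pattern; the `fake_remote` function and the `TranslationCache` class are illustrative stand-ins, not Powell’s actual code or Microsoft’s API.

```python
class TranslationCache:
    """Wraps a remote translation call with a local lookup table,
    so repeated sentences skip the slow network round-trip."""

    def __init__(self, translate_fn):
        self._translate = translate_fn   # the slow remote call
        self._cache = {}                 # (text, target_lang) -> translation
        self.remote_calls = 0            # for observing cache behaviour

    def translate(self, text, target_lang):
        key = (text, target_lang)
        if key not in self._cache:       # cache miss: hit the network once
            self._cache[key] = self._translate(text, target_lang)
            self.remote_calls += 1
        return self._cache[key]          # cache hit: instant answer


def fake_remote(text, lang):
    # Stand-in for the real API request; real code would make an HTTP call.
    return f"[{lang}] {text}"


cache = TranslationCache(fake_remote)
cache.translate("hello", "es")   # first time: goes to the remote service
cache.translate("hello", "es")   # second time: served from the local cache
```

Only the first occurrence of a sentence incurs the API delay; greetings and other common phrases come back immediately afterwards.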

Once translated, the server passes back the text and translations, which are picked up by the Raspberry Pi driving the TV and glasses displays. If both people in the conversation wear their own Raspberry Pi, glasses, and Jawbone microphone, both can have the same experience.

So this is a really fantastic project, and Will Powell deserves a big bravo. Although there is room for improvement and some changes, it is a great idea, and I’m sure it will be picked up and used later on to make commercialized automatic translators.