
Google goes bilingual, Facebook fleshes out translation and TensorFlow is dope

And Microsoft is assisting fish farmers in Japan

Roundup Hello, here's a quick roundup of what's been happening in the world of AI. Google has a new framework to help researchers develop reinforcement learning algorithms, and Google Assistant is now bilingual. Also watch how Microsoft's Azure Machine Learning Studio is helping Japanese fish farmers.

Unsupervised machine translation at Facebook: Researchers at Facebook have been honing a technique that may help the social media platform learn to translate between rare pairs of languages in the future.

Translation between popular pairs such as English and French or German and Spanish works better because there’s plenty of parallel data to train on, but there’s far less for pairs like Vietnamese and Welsh or Maori and Tamil.

For these rare language pairs, developers can’t really rely on supervised machine learning: there simply aren’t enough one-to-one mappings, where the same sentence is written in both languages, for models to learn from. Instead, the researchers have been exploring unsupervised machine learning methods.

The system is fed text in each language and maps every word to a point in a vector space; these representations are known as word embeddings. The idea is that the distance between two vectors reflects how closely related the words are, so the embedding for ‘dog’ should sit closer to ‘animal’ than to ‘skyscraper’. These patterns turn out to be pretty similar across languages, too.
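For a feel of what “distance between vectors” means in practice, here’s a tiny, made-up sketch; the three-dimensional vectors below are invented for illustration, since real embeddings are learned from large text corpora and run to hundreds of dimensions:

```python
import numpy as np

# Hand-made toy "embeddings" purely for illustration; real systems learn
# vectors with hundreds of dimensions from large amounts of monolingual text.
embeddings = {
    "dog":        np.array([0.9, 0.8, 0.1]),
    "animal":     np.array([0.8, 0.9, 0.2]),
    "skyscraper": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    """Closer to 1.0 means the two word vectors point in more similar directions."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["dog"], embeddings["animal"]))      # ~0.99
print(cosine_similarity(embeddings["dog"], embeddings["skyscraper"]))  # ~0.30
```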

“Because of those similarities, we proposed having the system learn a rotation of the word embeddings in one language to match the word embeddings in the other language, using a combination of various new and old techniques, such as adversarial training. With that information, we can infer a fairly accurate bilingual dictionary without access to any translation and essentially perform word-by-word translation,” Facebook said in a blog post.
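Facebook’s system learns that rotation adversarially, with no dictionary at all. As a rough, simplified illustration of the underlying idea, the NumPy sketch below instead assumes a small seed dictionary of word pairs and solves the classic orthogonal Procrustes problem to recover the mapping between two synthetic embedding spaces:

```python
import numpy as np

# Synthetic stand-ins for embeddings: Y holds target-language vectors and X the
# corresponding source-language vectors, one row per seed-dictionary pair.
rng = np.random.default_rng(0)
dim, pairs = 50, 200
Y = rng.normal(size=(pairs, dim))
orthogonal_map, _ = np.linalg.qr(rng.normal(size=(dim, dim)))  # random orthogonal matrix
X = Y @ orthogonal_map.T    # pretend the source space is a rotated copy of the target

# Orthogonal Procrustes: the best orthogonal W mapping X onto Y is V @ U.T,
# where U, S, V^T is the singular value decomposition of X^T Y.
u, _, vt = np.linalg.svd(X.T @ Y)
W = vt.T @ u.T

# Applying W to a source vector should land it near its target counterpart,
# which is what lets you read off a bilingual dictionary by nearest neighbour.
print(np.allclose(X @ W.T, Y))   # True on this synthetic data
```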

Researchers used this to translate between English and Russian, English and Romanian, and the even rarer case of English and Urdu. It’s pretty clever, but don’t get too excited: the results aren’t that great yet. Facebook’s model did beat other unsupervised techniques, but it wasn’t always better than traditional systems trained with supervised machine learning.

The research (here’s the arXiv paper) will be presented at the EMNLP (Empirical Methods in Natural Language Processing) conference in Belgium in October.

Here’s what AI agents need: Dopamine: There’s a new TensorFlow-based framework for developers interested in reinforcement learning to explore. It’s called Dopamine and is built by folks over at Google.

Dopamine has been designed to run algorithms quickly so researchers can compare benchmark results. Google has also provided the full training data for four different agents across the 60 games in the Arcade Learning Environment, a common platform for testing reinforcement learning algorithms.

“Our hope is that our framework’s flexibility and ease-of-use will empower researchers to try out new ideas, both incremental and radical. We are already actively using it for our research and finding it is giving us the flexibility to iterate quickly over many ideas,” it said in a blog post.

The code is open source, so you can play with it here.
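None of the code below is Dopamine’s own API, but if you want a quick reminder of what a reinforcement-learning agent actually does before diving into the framework, here’s a self-contained, toy tabular Q-learning loop on a made-up five-state corridor:

```python
import random

# Toy five-state corridor: start in state 0, actions are move left (0) or
# right (1); reaching state 4 pays a reward of 1 and ends the episode.
N_STATES, ACTIONS, GOAL = 5, (0, 1), 4
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate

def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
    return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

for episode in range(500):
    state, done = 0, False
    while not done:
        # Epsilon-greedy action selection.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt, reward, done = step(state, action)
        # One-step Q-learning update towards the bootstrapped target.
        target = reward + (0.0 if done else gamma * max(q[(nxt, a)] for a in ACTIONS))
        q[(state, action)] += alpha * (target - q[(state, action)])
        state = nxt

# After training, the greedy policy should be "move right" (1) in every state.
print([max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(GOAL)])
```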

Google Assistant is bilingual: Hurrah, your Google Home can now understand and reply to you in two languages (if they’re supported).

It’s good news for users who speak a mixture of English, Spanish, French, German, Italian or Japanese: you can now pick two of them for your Google Home. To pull this off, Google’s developers had to build a language-identification layer on top of two speech recognition systems, one for each chosen language.

The two recognizers run in parallel so Google Assistant can switch between them seamlessly: every command is fed through both systems at the same time, and the goal is to identify the spoken language as early as possible so no processing power is wasted on the wrong one.

“If the system becomes certain of the language being spoken before the user finishes a query, then it will stop running the user’s speech through the losing recognizer and discard the losing hypothesis, thus lowering the processing cost and reducing any potential latency,” Google explained in a blog post.

After Google Assistant has identified the language, it carries on as normal, working out what’s being said so it can carry out the command, such as playing a particular song or reading out the weather forecast.
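Google hasn’t published that pipeline, but the mechanism in its post is easy to caricature. The asyncio sketch below is purely illustrative: a fake streaming recognizer per chosen language, polled round-robin as a stand-in for genuinely parallel execution, with the losing recognizer dropped as soon as one language becomes confident enough:

```python
import asyncio

# Purely illustrative: `fake_recognizer` stands in for a streaming speech
# recognizer for one language, yielding (confidence, partial transcript)
# as audio chunks arrive. The per-chunk confidences are made up so that
# English "wins" in this toy run.
async def fake_recognizer(audio_chunks, per_chunk_confidence):
    confidence, words = 0.0, []
    for chunk in audio_chunks:
        await asyncio.sleep(0.01)          # pretend audio is streaming in
        words.append(chunk)
        confidence = min(1.0, confidence + per_chunk_confidence)
        yield confidence, " ".join(words)

async def transcribe_bilingual(audio_chunks, threshold=0.8):
    # One recognizer per configured language; round-robin polling here is a
    # simplification of running both genuinely in parallel.
    active = {
        "en-US": fake_recognizer(audio_chunks, 0.3),
        "de-DE": fake_recognizer(audio_chunks, 0.1),
    }
    best = {lang: (0.0, "") for lang in active}
    while active:
        for lang, stream in list(active.items()):
            try:
                best[lang] = await anext(stream)
            except StopAsyncIteration:
                del active[lang]
                continue
            confidence, text = best[lang]
            # Once one language is confident enough, stop feeding the losing
            # recognizer and discard its hypothesis.
            if confidence >= threshold:
                return lang, text
    lang, (_, text) = max(best.items(), key=lambda kv: kv[1][0])
    return lang, text

print(asyncio.run(transcribe_bilingual(["turn", "on", "the", "lights"])))
```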

Help me sort these fish!: Microsoft researchers in Japan have teamed up with Toyota Tsusho Corporation, a member of the larger Toyota conglomerate, to help Japanese fish farmers sort fish.

Fish is a large part of the Japanese diet, and, unfortunately, high demand has led to overfishing. Species are declining, and the researchers at Kindai University’s Aquaculture Research Institute have turned to fish farms to help conservation efforts and ensure supplies.

They raise a range of species, from Pacific bluefin tuna to red sea bream. Eggs are collected and hatched in tanks, where the young fish are raised until they are roughly the size of a finger. These ‘fingerlings’ are then shipped off to bigger fish farms across the country, where they grow to full size.

It’s a complex process: the fingerlings are sucked from the water into tubes and laid out on conveyor belts for inspection, where employees check whether the fish have reached the right size and pluck out any that are deformed. If the water is pumped too quickly, too many fingerlings arrive for the workers to cope with; if it’s pumped too slowly, production time suffers.

Adjusting the water flow is a tedious task that could be done autonomously with machine learning, and that’s where Microsoft and Toyota Tsusho come in. They have built a sorting system using Microsoft’s Azure Machine Learning Studio and Azure IoT Hub that counts the fish on the conveyor belt to make sure a steady number is flowing in from the pumps.

You can watch how it works.

YouTube video

If too many fish are coming through, the system automatically adjusts the pumps to slow the water down; if there aren’t enough, it speeds things up a little.
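Microsoft and Toyota Tsusho haven’t published that code, but the feedback loop is simple to sketch. Everything in the snippet below is faked: in the real system the per-frame fish count would come from a vision model hosted in Azure Machine Learning Studio, and the pump commands would go out over Azure IoT Hub.

```python
import random

TARGET_LOW, TARGET_HIGH = 20, 30   # acceptable number of fingerlings per conveyor frame
pump_speed = 50.0                  # arbitrary percentage of maximum water flow

def count_fingerlings_in_frame(speed):
    # Stand-in for the image model: more water flow means more fish per frame.
    return max(0, int(random.gauss(speed * 0.5, 3)))

for frame in range(20):
    count = count_fingerlings_in_frame(pump_speed)
    if count > TARGET_HIGH:
        pump_speed = max(10.0, pump_speed - 5.0)   # too many fish: slow the water down
    elif count < TARGET_LOW:
        pump_speed = min(100.0, pump_speed + 5.0)  # too few: speed things up a little
    print(f"frame {frame:2d}: {count:2d} fish, pump at {pump_speed:.0f}%")
```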

Looks like sushi is safe for the future thanks to AI. ®
