Google has announced the development of a multimodal neural network called MUM (Multitask Unified Model). It is designed to improve organic search results and other company products. The model is trained across 75 languages and can combine information from all of them, which should make it easier for users to find the results they need, according to the developers' blog post.
The creators clarified that MUM is built on the Transformer deep neural network architecture and, unlike the BERT project, is 1,000 times more powerful. Thanks to its multimodality, MUM can form a broader understanding of the world around it: it can interpret not only text and images but also audio and video files.
For example, for the query "hiking Mount Fuji", MUM will provide information about the most comfortable routes. In addition, the search engine may understand that, in the context of hiking, the user might need special equipment, and will suggest it. Thanks to its deeper knowledge of the world, the neural network can also take the weather into account: since the rainy season on the mountain begins in autumn, the AI can suggest buying a waterproof jacket. Users can also take a photo of an item and ask the search engine whether it is suitable for their purpose.
The launch date for MUM has not yet been announced. According to the developers, they have only just begun testing the neural network, but they nevertheless plan to start integrating it in the coming months.