Google users will soon love their MUM. That is, if they join the many people who literally translate the term as "mom." As far as Google is concerned, MUM stands for Multitask Unified Model.
As Google explained at the I/O conference last May, MUM makes it easier for Google to respond to complex queries. Using Google's example, let's say you've hiked Mount Adams, plan to hike Mount Fuji next year, and want to know what you need to do differently to prepare.
As Google pointed out in its announcement, you would have to do several searches: "you will have to research the altitude of each mountain, the average temperature in autumn, the difficulty of the hiking trails, the equipment to use, etc. After a certain amount of research, you will eventually get the answer you need." But if you went to a hiking expert and asked, "What do I need to do differently to prepare?" you would get your answer.
MUM will understand that you are comparing two mountains and will therefore know which information, such as elevation, is relevant. Here is another example. Say you are using Google Lens and see a photo of a shirt. You ask Google to find the same pattern that was used on the shirt, but on a pair of socks. Typing out exactly what you're looking for could be a tough challenge.
Google Will Soon Use MUM To Make Searching Easier And More Intuitive
You could type "Victorian floral white socks" into Google Search. But that query doesn't capture the design you saw on the shirt. Google explains, "By combining images and text in a single query, we make it easier to visually search and phrase your questions in a more natural way."
Another example: you need to repair a part of your bike whose name you don't know. Using Google Lens, you point the camera at the part and type "how to fix". You immediately get results, including videos and other resources, showing or instructing you on how to fix your bike. Today, Google announced in a blog post that new MUM-related features will launch in the coming months.
The search giant explains it has found that a complex search like this requires an average of eight separate searches to get all the answers needed. MUM, however, both understands and generates language. It is trained to handle multiple tasks at the same time and works across 75 different languages. Being multimodal, it can understand information presented as text and images.
MUM technology will also be integrated into Google Search to make querying more "natural and intuitive". Let's say you are planning to redecorate your home and need more information on acrylic paint. If you google "acrylic painting", Search will determine which topics related to that theme are of most interest to people.
MUM Will Help Users Find “More Webpages, Videos, Images, And Ideas”
There are over 350 different topics related to acrylic painting, and Google claims that MUM will put users on the right track, even giving them research ideas they might not have considered, such as "how to paint acrylic paints with household items". This surfaces content that could be extremely useful to someone redecorating their home.
Google is also making it easier for users to zoom in and out on a topic, helping them "narrow and broaden searches". And when it comes to videos, Google search results will use MUM to surface videos that contain content related to your search, even if that content or topic is not explicitly mentioned in the video. Search can do this through its advanced understanding of the information contained in the video.
According to the company, "Through all of these MUM experiences, we are eager to help people discover more webpages, videos, images, and ideas that they might not have encountered or searched for otherwise." But there is more. Google is also helping users decide whether the information they receive is credible, and will make shopping easier for merchants large and small.