Exciting news from Google this week: the search giant revealed new technology that will dramatically change the way Google understands and responds to complex search queries.
The technology is called Multitask Unified Model (MUM), and was announced this week at Google’s I/O conference.
Like BERT, which Google rolled out to Search in 2019, MUM is built on a transformer architecture, but Google says it is 1,000 times more powerful. MUM is also capable of multitasking, connecting pieces of information for users in ways no other search engine can.
What sets MUM apart from BERT is that it is trained across 75 languages and can perform numerous tasks simultaneously. It can also understand information across formats, including text, images, and video.
During the I/O conference, Google SVP Prabhakar Raghavan used the following query example to demonstrate how MUM works:
“I’ve hiked Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?” Leveraging MUM, Google could provide the user with highly relevant results, including articles about the equipment needed to hike Mt. Fuji, along with the differences and similarities between the two mountains.
This would enable users to conduct searches previously thought far too complex for search engines. What this will mean for SEO is not yet known, as MUM is still in the pilot phase and no launch date has been announced.