MUM: Multitask Unified Model

Today, along with discussing Google's MUM ranking signal, we'll also take a brief look at the next version of Google Search. The Multitask Unified Model (MUM) is an AI model that will change Google Search in the coming years, and with it, the way people search on the internet.

  • What is MUM?
  • What makes MUM so unique?
  • Why is MUM different from BERT?
  • How is the MUM AI model different from other AI models like GPT and ChatGPT?

I’ll answer all of these questions in this guide.

If you haven't read the previously discussed Google ranking signals, you should read about them first.

Here is the list of already discussed Google ranking signals/factors in alphabetical order:

  1. BERT Explained: Bidirectional Encoder Representations from Transformers
  2. Crisis Information Systems – CIS
  3. Deduplication Systems
  4. Exact Match Domain System
  5. Freshness Systems
  6. Helpful Content System
  7. Link Analysis Systems and PageRank
  8. Local News Systems

Google invented a neural network architecture called the Transformer. It was an extraordinary design because, given a task, the Transformer digests the entire input at once and then acts on it.

Before the Transformer, AI research relied on the Recurrent Neural Network (RNN) architecture, which processes text word by word.

With the help of Transformers, it has become easier and faster for Google to train its systems on datasets of billions of words.

When Google realized that Transformer technology could speed up AI development, it open-sourced the architecture, so now anyone who wants to can use it.
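To make the contrast with word-by-word RNNs concrete, here is a minimal sketch of the self-attention step at the heart of the Transformer. This is my own illustration in Python with NumPy, not Google's code: it shows how every word's representation is updated using all the other words in the input at once.

```python
# A toy scaled dot-product self-attention step (illustration only).
# Each word attends to every other word simultaneously, instead of
# reading the sequence one word at a time like an RNN.
import numpy as np

def self_attention(x):
    """x: array of shape (seq_len, d) -- one embedding per word.
    Returns the same shape, where each row is a context-aware
    mixture of ALL rows of x."""
    d = x.shape[-1]
    # For simplicity, queries/keys/values are x itself; a real layer
    # would apply learned projection matrices first.
    scores = x @ x.T / np.sqrt(d)                      # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax per row
    return weights @ x

# Three toy "word embeddings" processed simultaneously:
seq = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out = self_attention(seq)
print(out.shape)  # (3, 2): every word now carries whole-sentence context
```

Because the whole sequence is processed in one matrix operation rather than a loop over words, training parallelizes well, which is why Transformers made training on billions of words practical.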

With the help of the Transformer architecture, a company named OpenAI created the GPT AI model, which has multiple versions of its own: GPT-1, GPT-2, GPT-3, and GPT-3.5.

The ChatGPT AI model is built on OpenAI's GPT-3.5. ChatGPT is a sibling model of the InstructGPT AI model, which takes instructions through a prompt and responds with a detailed answer.

Google has developed multiple models based on the Transformer architecture: BERT and MUM on one end, and LaMDA on the other.

OpenAI's GPT-1, GPT-2, and GPT-3 are text-generation AI models, and ChatGPT is a chatbot AI model that has revolutionized the way people search the internet.

Google’s BERT and MUM are translation AI models, and LaMDA is a chatbot AI model.

All of these AI models are created for their own unique purposes. Later in this article, I'll discuss the differences between BERT and MUM, and between GPT and MUM.

Google uses AI and machine learning (ML) at multiple levels in Google Search. It also uses AI in Google Ads, and Google Analytics 4 (GA4) relies heavily on ML and AI.

The "OK Google" function on Android phones also uses an AI model, and Google's crawling, indexing, and ranking processes get some help from AI models as well.

Now, back to the original topic of today's discussion: MUM, the Multitask Unified Model.

In May 2021, Google introduced MUM, which will significantly affect the search results system in the coming years. For that reason, we'll try to understand this AI model in detail.

MUM: Multitask Unified Model Features

MUM has some specific functions and features that are essential to understand.

  1. MUM has been trained to understand more than 75 languages. Whenever MUM is trained on a new topic, it can use all 75+ languages simultaneously, studying the content in every language and learning from all of it. The data MUM works with is not limited to any one language or culture, so it won't give the best search results only to English-speaking countries. Because MUM uses all its languages simultaneously, it can give the best results to everyone covered by its 75+ languages.
  2. The way MUM understands languages. Usually, when a person or a computer has to learn a language, they first look for content that exists in both languages. For example, if you want to learn Turkish, you'll find a book that gives the meaning of Turkish words in your native language or in English, because that makes each Turkish word easy to understand. But MUM can understand even words that have no common reference point across languages. With this ability, MUM can understand and show results even for less-spoken, less-known languages that don't have much literature online and aren't commonly spoken around the world.
  3. Along with understanding different languages, MUM can also generate language.
  4. MUM is a multimodal model. Multimodal means that MUM is not limited to text; it can also extract data and information from images, videos, PDFs, and screenshots. Google has introduced a Google Lens feature where you can search using an image and text together.

Purpose of Multitask Unified Model

The primary function of the Multitask Unified Model is translation, but this ability can be put to use in multiple ways:

  1. Sentiment Analysis: MUM can analyze the sentiment of any image, video, screenshot, or text and determine whether it is negative or positive, angry or happy, abusive or frightening. It can check sentiment automatically across billions of pages.
  2. Summarization: MUM can easily create a summary, gist, or TL;DR of any long text or video. This function has launched in Google Docs, where you can summarize documents, and Google Ads can create a skippable ad from a long video by trimming and stitching together multiple chunks of it.
  3. Question Answering: Because MUM understands content in 75+ languages across text, videos, and images, it can also answer questions. If a question asked in English has a perfect answer in another language, MUM can grab that non-English answer, translate it into English, and show it to the user.

None of these tasks was possible in the past, but now they are possible, albeit in a limited way. Why limited? I'll get to that later in this discussion.

When Google announced MUM, it claimed that the Multitask Unified Model was 1,000 times more powerful than BERT.

As I have already discussed BERT, I won't get into those details again; you can read about BERT's impact on SEO separately. Now, let's discuss the difference between MUM and BERT.

Difference Between MUM and BERT
BERT understands the purpose or meaning of a word or phrase from the words that come before and after it.

MUM starts where BERT's understanding ends. While BERT is limited to text, MUM understands a user's search in any form (text, image, video, etc.) and in any language.

For example, if you ask BERT what to pack in your backpack for a summer trip to Hawaii, BERT will understand your intent and suggest an article.

But MUM will work like a friend and suggest each item (not just an article) you should bring for the trip, because the details you need may not all be in one place; they may be spread across multiple articles, blogs, videos, infographics, and product links.

The Multitask Unified Model then gathers that data and, using its artificial intelligence, suggests the exact items and product links you may need.

According to Google, answering such a complicated query used to take people an average of eight searches, but with MUM's help it can be done in a single search. (Speed matters!)

Difference Between MUM and GPT
OpenAI developed GPT to predict the next word: as you write, the GPT model suggests the next word you are likely to need.

Just as MUM goes beyond BERT, it is not limited to one-word prediction; it can fill in the blanks anywhere in a sentence. If you have written a partial sentence that needs correcting, this AI model can suggest a rewritten sentence with meaningful corrections.

So, GPT's purpose is to suggest the next word based on the words before it.

But MUM's purpose is to fill in the gaps in content: given the full context of a sentence or paragraph, it can predict missing words using both the text before and the text after the gap.
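This contrast between next-word prediction (left context only, the GPT setup) and fill-in-the-blank prediction (left and right context, the masked-language-model setup) can be sketched with a toy bigram counter. The tiny corpus below stands in for what real models learn from billions of words; none of this is the actual GPT or MUM code.

```python
# Toy contrast: predicting the NEXT word from left context only
# (GPT-style) vs. filling a blank using BOTH sides (masked-style).
from collections import Counter

corpus = "the cat sat on the mat the dog slept on the rug".split()
pairs = list(zip(corpus, corpus[1:]))

def next_word(prev):
    """GPT-style: most likely word AFTER `prev`, left context only."""
    counts = Counter(nxt for p, nxt in pairs if p == prev)
    return counts.most_common(1)[0][0] if counts else None

def fill_blank(prev, nxt):
    """Masked-style: predict the blank in `prev ___ nxt` from both sides."""
    counts = Counter(
        corpus[i]
        for i in range(1, len(corpus) - 1)
        if corpus[i - 1] == prev and corpus[i + 1] == nxt
    )
    return counts.most_common(1)[0][0] if counts else None

print(next_word("the"))          # ambiguous: "the" is followed by 4 words
print(fill_blank("the", "sat"))  # 'cat': the right-hand word pins it down
```

With only left context, "the ___" could continue four different ways; adding the right-hand word narrows it to a single answer, which is the extra information a fill-in-the-blank model gets to use.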

Where Is Google Using MUM?

Although MUM is capable of much more, Google is still using it in a limited way. Currently, it is used in three ways:

  1. For showing COVID-related news.
  2. For showing featured snippets from websites.
  3. For finding the answer to queries with the help of Google Lens.

Google is gradually introducing MUM into Search because there are still some accuracy issues with its results. And Google knows that even a small mistake can have a significant impact when you handle more than 3.9 billion searches per day (source: Internet Live Stats, 2023).


You should learn about the changes MUM will introduce in search results and other Google products, and update your website content so that, in the future, MUM doesn't undo your years of effort. By that I mean: when you have all types of information (images, videos, infographics, PDFs, etc.) on a topic on the same page or on interconnected sub-pages, MUM will favor that and keep showing your content in searches for the relevant topic.

For reference: Here is the list of → All Most Important Google Ranking Signals for 2023.
