The most important thing in the world to Google (aside from hauling in $76bn of net income last year) is understanding language: how we type, and why we type the way we do. That means understanding not just hundreds of languages around the world, but regional variations of words within the same language, and colloquialisms.
Ask people around the UK what they call a bread roll and you’ll get a cacophony of answers. Is an Australian searching ‘thong’ on Google UK looking for underwear or flip-flops? And that’s all before Google deals with misspellings and typos.
With 15% of the billions of searches Google processes every day being completely new, it needed a way not only to understand what we’re all typing, but to make sure it’s returning the best results – and even to predict what we’re going to want next.
In 2018 Google introduced a neural network-based technique for natural language processing (NLP) called Bidirectional Encoder Representations from Transformers. That was quite a mouthful even for hardcore techies, so Google shortened it to BERT.
BERT was all about understanding how words relate to one another, both within a search and on their own: which words are most important, which are stop words, and which are likely to move the search forward. By using machine learning, Google was able to understand how we talk and type with more sophistication than ever before.
As with all things Google, evolution was inevitable. In 2021 Google announced the Multitask Unified Model, or MUM, which it says is 1,000 times more powerful than BERT. This time, Google didn’t just want to understand how we talk; it wanted to be able to respond. MUM is designed to generate expert information, using machine learning not just to answer a simple question, but to offer expert advice.
By assimilating data across 75 different languages, it can access a wider range of information than any one person could. This is led by AI, with Google’s team of human experts checking in to make sure MUM isn’t picking up machine learning bias or generating too large a carbon footprint.
When Google announces new updates to how it processes content or what it’s looking for, it’s all built on the back of this NLP technology.
Google announced its desire to see ‘Helpful Content’ in August 2022, with a follow-up update in December. In December 2022 it also added an extra ‘E’ to E.A.T: Google not only wants to see Expertise, Authority and Trust in your content, but Experience too.
And yet, after all this, it’s still too common to see content farms promising content that can ‘beat’ Google’s algorithm. If websites looking for shortcuts spent half the energy they put into generic blog posts (looking at you, ‘ten ways to lose belly weight’ and ‘how to make millions on the stock market’!) on producing real content, written by real human beings who actually know what they’re talking about, the internet would be a better place.
How to produce SEO-friendly content Google actually likes:
- Throw the phrase ‘SEO content’ out the window
- Stop writing to a keyword target or a keyword density
- Only use a content writer who actually knows what they’re talking about – do you sell pet food? Great! Use a writer who is a pet expert!
- Remember your E.E.A.T:
  - You’ve got an Expert content writer
  - Do they have Experience to prove they’re an expert?
  - If you’re an Authority, other people must show this to Google – is your content being read by humans? Is it being shared on a wider social media network? (Content only shared by the same four members of your marketing team doesn’t count.) Is it being cited (or even linked to) from other pieces of high-quality content?
  - Is your content Trusted? Do you have lots of nice juicy reviews? Are people saying nice things about you in those reviews? (If you’re not sure, we can help!) How about some awards? Or some unsolicited write-ups in industry-leading publications?
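Keyword density, the metric the checklist above tells you to stop chasing, is nothing mysterious: it’s just the share of words in a piece that match a target keyword. A minimal Python sketch (the `keyword_density` function and the sample sentence are illustrative, not any real SEO tool’s API):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A deliberately keyword-stuffed sample: 3 of its 12 words are "pet".
sample = "Pet food for pets: the best pet food your pet will love."
print(f"{keyword_density(sample, 'pet'):.0%}")  # prints "25%"
```

The point isn’t to hit some magic number like this – it’s that a sentence stuffed to a target density reads badly to the humans (and the language models) Google is optimising for.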
