Google announced a Core Algo Update. Google’s Danny Sullivan stated that Google was not targeting low quality websites. He explicitly confirmed that there is nothing to fix in sites that lose rankings. This announcement is important because it squashes the usual Phantom Update speculation. Google is not targeting “low quality” signals.
What is a Broad Core Algorithm Update?
Google does not generally announce updates to its core algorithm because core algorithm updates happen almost every day, possibly twice a day (read below for more details). This update is different. It is not one of the usual daily updates. It is the kind of update that happens only several times per year. Google calls this a Broad Core Algorithm Update.
Broad Core Algorithm Update Clues
Google offered few clues. But here is what we know:
- The update was focused on providing better search results
- There is nothing wrong with sites that lost rankings
- There is no way to “fix” sites that lost rankings
- The improvements are focused on the content but it is not a “quality” issue
Here is the text of Google’s announcement:
“Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year….”
This is likely similar to previous core algorithm updates, only on a broader scale. Google has been updating the core algorithm on a daily basis since at least 2012 (see citation about 665 updates per year below).
Why Google May Have Been More Transparent
The SEO industry consistently assumes that Google’s core algorithm updates focus on “targeting” “low quality” web pages. This is unfortunate because it ignores the possibility that Google’s core algorithm updates might simply be improvements to providing the best answer to a search query.
Thus, if Google updates the core algorithm to improve its ability to answer search queries, the SEO industry would have been incorrect to assume that the sites that lost rankings did something wrong.
Until this announcement, Google has declined to clarify what kinds of improvements they were making. So the SEO industry speculated that Google was “targeting” low quality web pages. Over and over, year after year. As if Google only knew how to create algorithms that targeted low quality web pages.
Google’s Algorithm Does More than Target Low Quality
There is a big difference between an algorithm that “targets” low quality web pages versus an algorithm that identifies the best answer to a search query. The SEO industry behaves as if Google’s algorithm is obsessed with targeting low quality, to the exclusion of improvements to answering search queries.
Google Consistently Dismissed Phantom Update Speculation
For the past few years, the SEO industry has been operating as if Google’s algorithm was exclusively targeting low quality web pages. Google not only denied it, Google’s spokespersons actively discouraged such speculation.
Google’s response to the Phantom Update chatter was consistently that there was no significant update, only the daily changes to their core algorithm. How often does Google update their core algorithm? Quite possibly twice a day, every day.
If that is true, then an SEO can pick any day of the week to declare an update and they would be correct. But they would be incorrect to state that the update was far-ranging, and even likelier to be wrong to assume that the core algorithm update was targeting low quality. As we noted above, Google’s core algorithm does more than “target” low quality.
Google Possibly Updates Core Algorithm Twice a Day Every Day
A few years ago Google published information about its algorithm that implied Google rolled out at least two changes to their core algorithm every single day. Here is what Google revealed:
“…our most experienced search engineers carefully review the data from all the different experiments and decide if the change is approved to launch. It sounds like a lot, but the process is well refined, so an engineer can go from idea to live on Google for a percentage of users in 24 hours. Based on all of this experimentation, evaluation and analysis, we launched 665 improvements to search in 2012.”
Google launched 665 improvements to search in 2012 alone. When you consider that amount, one can understand why Google spokespersons keep saying there was no update. It’s because there was no update.
“Reports calling this a single “update” or calling it “Fred” don’t reflect what we actually said: there were several minor changes that happened as they routinely do in any particular week.”
Gary Illyes was reluctant to give a daily core algorithm update a name. When you understand that such updates happen nearly every day, and that naming one means singling it out from roughly sixty in any given month, you can begin to understand how ludicrous the idea really is. When pressed about it, Gary Illyes virtually threw his hands up and suggested naming it after his pet fish, Fred.
Google Squashes Usual Phantom Update Speculation
That Google’s Danny Sullivan went out of his way to specifically announce there is nothing to fix can be seen as a pre-emptive move to stop the baseless speculation that Google was “targeting” low quality websites.
Takeaway 1: What is Google Improving?
I keep up to date on information retrieval research. I read the most recent research papers and patents. Research today is focused on 22 areas. None of that research focuses on targeting low quality web pages.
Here are the areas that relate most to SEO:
1. Understanding user intent
2. Understanding content
Those are the areas that are likely a part of Google’s Broad Core Algorithm Update. If Google improves how well it understands content, how do you optimize for that?
Here’s an example of recent research: Learning for Efficient Supervised Query Expansion via Two-stage Feature Selection. You can’t buy a thesaurus and sprinkle your content with synonyms to beat this kind of algorithm.
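To see why synonym-sprinkling fails against this kind of research, here is a toy sketch (not Google’s actual algorithm, and all feature names and weights are invented for illustration) of the general idea behind supervised query expansion: candidate expansion terms are scored with learned feature weights, so a term is only added when the evidence supports it, not merely because a thesaurus lists it as a synonym.

```python
# Toy illustration of supervised query expansion.
# NOT Google's algorithm: the features and weights below are
# hypothetical, chosen only to show why a thesaurus synonym
# with weak supporting evidence gets filtered out.

# Hypothetical learned weights for features of a candidate term.
FEATURE_WEIGHTS = {"co_occurrence": 0.6, "click_signal": 0.3, "thesaurus_synonym": 0.1}

def score_candidate(features):
    """Combine a candidate term's feature values with the learned weights."""
    return sum(FEATURE_WEIGHTS[name] * value for name, value in features.items())

def expand_query(query_terms, candidates, threshold=0.5):
    """Keep only candidate expansion terms whose score clears the threshold."""
    kept = [term for term, feats in candidates.items() if score_candidate(feats) >= threshold]
    return query_terms + kept

candidates = {
    # A dictionary synonym with weak behavioral evidence scores low (0.25)...
    "automobile": {"co_occurrence": 0.2, "click_signal": 0.1, "thesaurus_synonym": 1.0},
    # ...while a term with strong usage evidence scores high (0.78) and is kept.
    "used car":   {"co_occurrence": 0.9, "click_signal": 0.8, "thesaurus_synonym": 0.0},
}
print(expand_query(["cheap", "car"], candidates))  # ['cheap', 'car', 'used car']
```

The point of the sketch is that the thesaurus membership feature carries very little weight on its own; terms earn their way into the expanded query through other evidence, which is why stuffing a page with synonyms does not game it.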
That may be why Google’s Danny Sullivan said this isn’t something to fix. This Broad Core Algorithm Update may have something to do with improving how well Google understands user intent and content.
Takeaway 2: This is Likely Not About Low Quality
You will notice that I do not mention research about finding low quality web pages. That’s because it’s not something that Google’s researchers focus on.
One has to reach back many years to find research papers and patents that are related to finding low quality content. The majority of information retrieval research (relevant to SEO) is focused on user intent and understanding content itself.
If those two points are what search engines are focusing on, then it follows that the core algorithm updates also focus on those two ranking areas and not on “targeting low quality” web pages.
If Google were as obsessed with low quality web pages as SEOs are, there would be more research and patents filed on those topics. But there aren’t. That’s the fact.
Here’s a link to Google’s research papers; knock yourself out looking for those “low quality web page” algorithms. See for yourself.
Takeaway 3: Seriously, What Can Be Done to Regain Rankings?
Although Google stated there is no fix, SEO is proactive. Google recommended waiting for your content to rise relative to other pages. But that’s predicated on your content being “great” and, more importantly, on your content being greater than your competition’s.
In my opinion, a way out means creating content that is not focused on keywords, but on solving problems. Focusing on the problems that site visitors want to solve may be productive. Google’s search results are about solving user queries. That means that your content must also focus on solving the problems that users have when they make any particular search query. There is more to be said on this topic as it relates to user experience.
Google says to make your content great. What makes content great? It’s not that it’s relevant to a user’s keywords. What makes content great is revealed in the positive response of the site visitor to your content. How convenient it is, how easy it is to find an answer, how easy it is to compare products: all of those user experience features put together are what makes content great.