In a Google Webmaster Hangout Google’s John Mueller was asked why content published on an established site tended to rank higher. The publisher asked why articles on this site consistently received “top Google rankings.”
There is no simple way to answer this question. Google’s John Mueller offered a nuanced explanation of why Google trusted some sites enough to consistently rank them at the top.
Google’s John Mueller discussed the effort necessary to make an authoritative site that Google trusts enough to rank consistently at the top.
The publisher asked whether the success was due to a lack of competition or, “is it somehow even though each individual site is a sub site of the main site, any blogging gets you ranked because” of the website itself.
John Mueller responded that it’s not directly related to the domain.
“It’s more a matter of your putting out new content… that’s relevant for people who are searching at the moment and that we’re ranking them based more on that.
That’s something that we often see from various sites like, should I be blogging, should I write… ten articles a day or five articles a day?
…from our point of view it’s not a matter of going out and blogging and creating so many articles a day… but more a matter of… you have some really fresh content here, some of this content is really relevant for some searchers at the moment so we’ll show that.
…it’s not that blogging itself is something that makes the site rank higher or makes the content rank higher. It’s just you happen to have some new content here that happens to be relevant so we’ll show that.”
There’s an approach to content that seems to focus on quantity and quality but leaves out the part about relevance. A common mistake I see in site audits is chatty and conversational content, like you might hear at the water cooler.
In certain situations, content that is focused on relevance to a person’s situation, goals, or aspirations is more appropriate. I believe that’s what John Mueller was getting at when he encouraged the publisher to create content that is relevant to searchers at the moment they are searching.
I think it’s worth pointing out that he didn’t say to be relevant to the keywords. He encouraged the publisher to create content that is relevant to the searcher.
John Mueller went on to focus on the blogging part of the question, whether blogging was the secret behind the site’s top ranking.
But that answer might not have been what the questioner was hoping for. She appeared to be focused on whether the domain itself, perhaps some kind of authority, was powering the rankings.
Thus, the publisher asked again in an attempt to get John Mueller to focus on whether or not the domain itself was powering the rankings.
“…so it’s completely independent of the domain that I’m blogging on? There’s a lot going on on that website… that has no effect? If I… start my own dot com and was blogging, it would have the same effect?”
John Mueller responded,
“Pretty much… there are always some kind of supplemental effects with regard to us being able to find the content quickly, us being able to understand that this website is generally creating high quality content. So there is some amount of… additional information that we collect for the website on a whole.”
This is interesting because it expands on his previous statement that you just can’t create content and expect it to rank. Here he adds that there is a process whereby Google gains an understanding that the site is a good resource to rank. He alludes to “additional information” that Google collects in order to make the determination that a site is creating high quality content.
What might he be referring to? Google’s algorithm has so many moving parts to it that it could be any number of things.
Just as an example of the complexity involved, there’s a patent filed in 2012 called, “Classifying Sites as Low Quality Sites” that discusses a number of factors that Google could use to create a “link quality score” that could be used to classify an entire site as low quality.
The patent classifies inbound links to a site as Vital, Good, and Bad.
According to the patent, Google could then use this link rating system to lower a website’s chance of ranking:
“The system decreases ranking scores of candidate search results identifying sites classified as low quality sites.”
The above is an example of a patent that may or may not be in use at Google. The point is that there are many signals by which a site can be ranked, from links to the content itself. The reference to “additional information” could refer to any number of things, including the plethora of ranking factors themselves.
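To make the patent’s mechanism concrete, here is a minimal, hypothetical sketch of how a link quality score could be used to demote a site’s results. The link labels (Vital, Good, Bad) come from the patent itself; the weights, threshold, and demotion factor are invented for illustration and are not Google’s actual values.

```python
# Hypothetical sketch of the "Classifying Sites as Low Quality Sites" patent.
# The Vital/Good/Bad labels come from the patent; every number here is an
# assumption for illustration, not a disclosed value.

def link_quality_score(links):
    """links: list of labels in {"vital", "good", "bad"}."""
    weights = {"vital": 1.0, "good": 0.5, "bad": 0.0}  # assumed weights
    if not links:
        return 0.0
    return sum(weights[label] for label in links) / len(links)

def adjusted_ranking_score(base_score, links, low_quality_threshold=0.3,
                           demotion_factor=0.5):
    """Demote a candidate result if its site's link profile scores below
    the threshold -- the patent's "decreases ranking scores" step."""
    if link_quality_score(links) < low_quality_threshold:
        return base_score * demotion_factor
    return base_score

print(adjusted_ranking_score(10.0, ["vital", "good", "good"]))  # unchanged
print(adjusted_ranking_score(10.0, ["bad", "bad", "good"]))     # demoted
```

A site with mostly Vital and Good inbound links keeps its base score, while one dominated by Bad links is classified low quality and demoted across the board.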
Google’s John Mueller goes on to say,
“So it’s not that you could just create random URLs on the web and put your blog post up there and we would find them magically and rank them number one.
It kind of does require some amount of structure within that so that we can understand that over time actually this is pretty good content and we should check it regularly to make sure that we don’t miss any of the updates.”
At this point the publisher tried a third time to get Google’s Mueller to say that there is something about the domain that is helping posts published on that domain to rank better.
“Okay, so there is something to the domain itself and that it’s got your attention.”
Google’s John Mueller declined to say that was the case. He offered that it’s not possible for him to comment on a site he hasn’t looked at.
He then suggested that it was her option to choose to build her own site but that it would take time for the site to get established. He positioned it as a choice between taking the time to establish something of her own for the long run or taking the easy route and using the established website to rank her articles on.
It’s not enough to just create content.
Content must be relevant to a user at the moment they are searching.
BrightLocal has launched a new free tool called “Local RankFlux,” designed to alert marketers to changes in local search rankings across multiple industries.
Exclusively focused on the Google local algorithm, it offers tracking for 26 verticals. The ranking fluctuations of individual industries can then be compared to the overall sample.
Tracking over 14,000 keywords. Local RankFlux tracks roughly 560 keywords per industry vertical in 20 cities, according to BrightLocal’s blog post. It “plots the ranking position of each business in the top 20 search results and compares that ranking to the previous day’s position to determine the daily change.”
Changes in higher SERP positions (e.g., 1 – 2) are weighted more heavily and are treated as more significant than changes in lower rankings (e.g., 19 – 20) in its scoring. “Local RankFlux then multiplies the change in position between today’s and yesterday’s rankings by the weighting to create a total daily fluctuation. This total is then converted into an average based on the number of keywords that returned meaningful results, and a score produced for All Industries and for each individual industry.”
Scores above 6 suggest an update. BrightLocal explains that scores between 0 – 3 indicate nothing meaningful has happened – given that there are regular, even daily fluctuations going on. Scores of more than 3 but less than 6 indicate a minor change in the algorithm, according to BrightLocal, while scores of 6 to 10 suggest a local algorithm update. The spike in the chart below (industry average of 6.1) on August 8 suggests a meaningful change in the algorithm.
Local RankFlux score: legal category vs industry average
In early August, Google rolled out a core algorithm update. But the last time there was a significant local impact was in August of last year (and possibly in June 2019, after another core update). In August 2018, SterlingSky’s Joy Hawkins detailed the ways in which her small business customers were impacted by that 2018 core algorithm update.
Why we should care. This free tool will be a useful way for local SEOs to reality check against broader industry benchmarks, to confirm whether there was indeed a local algorithm update. Informally, a number of local SEOs praised the tool based on early exposure.
Take a look and provide feedback on whether it aligns with your observations and experiences. And be sure not to miss SMX East’s full-day track on local SEO and location-based marketing for brands.
About The Author
Greg Sterling is a Contributing Editor at Search Engine Land. He writes about the connections between digital and offline commerce. He previously held leadership roles at LSA, The Kelsey Group and TechTV. Follow him on Twitter or find him on LinkedIn.
In the latest instalment of the #AskGoogleWebmasters video series, Google’s John Mueller answers a common question about JSON-LD structured data.
Here is the question that was submitted:
“Is it possible to insert JSON structured data at the bottom of the <body> instead of the <head>? It seems to work fine for many websites.”
In response, Mueller says “yes.” JSON-LD structured data can absolutely be inserted in either the head or body of the page. Just as the person who submitted the question assumed – it will work fine either way.
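To illustrate, here is a small sketch that builds a JSON-LD script tag and places it in either location. The Article markup and page skeleton are hypothetical examples, not taken from any real site; the point is simply that the same tag is valid in both positions.

```python
import json

# Hypothetical Article markup -- the values are placeholders for illustration.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# JSON-LD is embedded in a script tag with this MIME type.
script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)

# The same tag works in the <head>...
page_head = f"<html><head>{script_tag}</head><body>...</body></html>"
# ...or at the bottom of the <body>, as the question asked.
page_body = f"<html><head>...</head><body>...{script_tag}</body></html>"

print(script_tag)
```

Because JSON-LD lives in its own script block rather than being woven into the visible markup (as Microdata and RDFa are), moving it between the head and body changes nothing about how the structured data itself is read.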
What’s the Difference Between JSON-LD and Other Structured Data Types?
Before answering the question, Mueller gave a brief explanation of each type of structured data and how they’re different from each other.
There are two other types of structured data in addition to JSON-LD. Here is how they differ.
Microdata: An open-community HTML specification used to nest structured data within HTML content.
RDFa: An HTML5 extension that supports linked data through additional attributes added to existing HTML tags on the page.
Although all of these types of structured data are acceptable to use, Mueller has gone on record saying Google prefers the use of JSON-LD.
ConsumerAffairs provides buying guides for everything from mattresses to home warranties. But they also direct consumers on purchasing hearing aids, dentures, diabetic supplies, and even LASIK surgery. Many have questioned the legitimacy of ConsumerAffairs’ buying guides, largely because top-rated brands often have financial relationships with the organization.

ConsumerAffairs’ health content was hit in the post-Medic world, but now it seems they’ve found a way to circumvent the algorithm update by hosting slightly modified versions of their buying guides on local news websites around the country. Google “hearing aids in Phoenix” and you’ll discover just how well this strategy is working. Local ABC affiliate station ABC15 hosts all of ConsumerAffairs’ buying guides, including those in the health category, on their new “reviews” subdomain.

So far, I’ve counted almost 100 of these ConsumerAffairs content mirrors. Despite cracking down on low-authority medical advice and subdomain leasing, Google seems to be missing this huge hack on its ranking algorithm.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.
About The Author
Abram Bailey, AuD is a Doctor of Audiology and the founder of HearingTracker.com, the leading independent resource for informed hearing aid consumers.