
June 2019 Broad Core Algo Update: It’s More than E-A-T



Two high-authority websites have lost traffic from the June 2019 Core Algorithm Update. Their losses challenge the conventional thinking that expertise, authoritativeness and trustworthiness (E-A-T) are the path to recovering from an update.

High “Authority” Websites Lost Rankings

Authoritative websites are said to have lost rankings in the June 2019 Core Algorithm Update. This exposes a weakness in the theory that shortfalls in E-A-T (Expertise, Authoritativeness and Trustworthiness) are directly responsible for ranking declines.

These sites had no problems with expertise, authority or trustworthiness.

Screenshot of a discussion in Google’s Webmaster Help Forums: a Daily Mail employee reached out for help diagnosing why the Daily Mail was suffering traffic losses due to the June 2019 Core Algorithm Update.

A UK news site, The Daily Mail, suffered major traffic declines due to Google’s June 2019 algorithm update.

Now a bitcoin news site, CCN.com, is reporting that it is shutting down because of the Google update.

“Google’s June 2019 Core Update rolled out on June 3rd 2019 and CCN’s traffic from Google searches dropped more than 71% on mobile overnight.

Our daily revenue is down by more than 90%.”

In a blog post explaining why they are shutting down, CCN noted that another bitcoin news site, CoinDesk, was also losing traffic:

“Why would simple fixes be the cause of the immense Google-listing drop, when other similar sites are experiencing the same blowback? The leader in blockchain news, CoinDesk, has experienced a 34.6% drop according to Sistrix.com:”

CoinDesk is a leader in its space, so it is remarkable that it, too, lost traffic due to Google’s June 2019 Broad Core Algorithm Update.

Is Trustworthiness a Problem for the Daily Mail?

Screenshot from MediaBiasFactCheck.com showing a poor rating for the Daily Mail website: MediaBiasFactCheck.com rates the Daily Mail as a questionable source of news.

MediaBiasFactCheck.com states that the Daily Mail is an unreliable news source because of various clickbait articles. Here is what MediaBiasFactCheck.com stated:

“…the Daily Mail tends to publish stories utilizing sensationalized headlines with emotionally loaded wordings such as “Woman, 63, ‘becomes PREGNANT in the mouth’ with baby squid after eating calamari”, which is obviously a fake news story.”

Sounds pretty outrageous, right?

Wrong.

The article is based on a real incident. The authoritative source of the information was the United States government’s National Institutes of Health. That was not an “obviously fake news story,” as MediaBiasFactCheck reported. Had MediaBiasFactCheck actually fact-checked the Daily Mail article (as I did), they would have found this link to NIH.GOV that relates the true story of baby squid attaching themselves to the inside of a woman’s mouth. It’s not fake news. It’s real.

Who Is MediaBiasFactCheck.com?

MediaBiasFactCheck.com is an independent, for-profit organization.

According to their FAQ:

Media Bias Fact Check, LLC is a Limited Liability Company owned solely by Dave Van Zandt. He also makes all final editing and publishing decisions.

Dave Van Zandt obtained a Communications Degree before pursuing a higher degree in the sciences. Dave currently works full time in the health care industry. Dave has spent more than 20 years as an armchair researcher on media bias and its role in political influence.

I will leave it up to you to decide whether MediaBiasFactCheck is a trustworthy source of information.

What Does Pulitzer Prize Winner PolitiFact Say?

PolitiFact is a trustworthy non-profit organization. Here is their web page about the Daily Mail, which links to a single citation of one clickbait article about Brexit. That’s all PolitiFact has to say about the Daily Mail’s trustworthiness.

Screenshot of PolitiFact’s page about the Daily Mail: the Pulitzer Prize-winning PolitiFact cites only one article as clickbait.

PolitiFact does not seem to share MediaBiasFactCheck.com’s opinion of the Daily Mail. I will leave it up to you to decide who to believe on the matter of trustworthiness. But in my opinion, as a regular reader of the Daily Mail, the site is reasonably trustworthy.

Google Webmaster Help Forum Fails to Help

In my opinion, a failing of Google’s Webmaster Help Forum is that its volunteers offer the same rote advice. When that advice does not fit the situation, the forum sometimes turns against the person asking for help, nitpicking perceived failures without ever actually diagnosing why the site may have lost rankings.

According to CCN, Google’s Webmaster Help Forum failed to offer useful advice:

“We have tried to find out why our stories are no longer visible on Google by asking for guidance in Google’s Webmasters Forum. While we appreciate the help of the experts from the Google Forum, their theories for why Google has decided to basically “shut down” CCN does not appear to be entirely accurate.”

Among the less than useful advice was this response:

“The website has no information about the valid organization of the publisher.

The website has information about some organization with the name CCN, however, this brand does not have unambiguity and a fragment of Google Knowledge Graph in the SERP.

This contradicts to the following recommendations of Expertise, Authoritativeness, Trustworthiness – EAT of Google:
● Who (what individual, company, business, foundation, etc.) is responsible for the website.
● Who (what individual, company, business, foundation, etc.) created the content on the page.”

The person offering help also advised CCN to register the news site with Google My Business.

Like Talking to a Chatbot

Someone from The Daily Mail news site posted asking for help. Nearly the exact same advice was given to the Daily Mail as was given to CCN.com.

The answers focused on download speed, mobile friendliness and, again, authorship:

“The information about the author such as Martin Robinson is contrary to the following Google recommendations for publishers…”

As for the site not being mobile friendly, that’s incorrect. As you can see below, the site is mobile friendly.

 

Screenshot of Google's Mobile Friendly Testing Tool

There are some page-loading errors, but those come from scripts blocked by third-party ad servers, something common across the web. The rest are warnings about things like deprecated scripting.
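You can also run this check programmatically. Here is a minimal sketch against Google’s Mobile-Friendly Test API; the API key and the tested URL are placeholders, and note that blocked resources (such as those third-party ad scripts) show up in the response as warnings rather than as mobile-usability failures.

```python
# Minimal sketch: query Google's Mobile-Friendly Test API for a URL.
# Assumption: an API key with the Search Console API enabled.
import requests

API_KEY = "YOUR_GOOGLE_API_KEY"  # placeholder
ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
            "urlTestingTools/mobileFriendlyTest:run")

def mobile_friendly_report(url: str) -> None:
    response = requests.post(
        ENDPOINT,
        params={"key": API_KEY},
        json={"url": url},
        timeout=60,
    )
    response.raise_for_status()
    result = response.json()

    # Verdict is MOBILE_FRIENDLY or NOT_MOBILE_FRIENDLY.
    print("Verdict:", result.get("mobileFriendliness"))

    # Resources the test could not load (e.g., scripts blocked by
    # third-party ad servers) appear here as warnings.
    for issue in result.get("resourceIssues", []):
        print("Blocked resource:", issue["blockedResource"]["url"])

mobile_friendly_report("https://www.dailymail.co.uk/")
```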

Is that enough to knock rankings down by 50%? What do you think?

The Webmaster Help Forum might as well be a chatbot because all the responses are essentially pre-scripted. The advice is mostly ripped from the pages of the Quality Raters Guidelines.

Google’s Algorithms are Not Summed Up by Quality Raters Guidelines

Google’s Algorithm updates cannot be summed up by what’s in the Quality Raters Guidelines. So why do SEOs depend on it to solve Google update problems?

Well…

Danny Sullivan tweeted that the Quality Raters Guidelines can be used as a reference guide for creating quality content.

He said to focus on making quality content.

Here is what Danny Sullivan tweeted:

“We tell lots of things to do. Improve site speed. Consider secure. Etc. But that’s not what this update was about. It’s broad. And respectfully, I think telling people there’s no particular thing to “fix” is indeed helpful. It means, hopefully, they think more broadly…”

Followed with:

“Want to do better with a broad change? Have great content. Yeah, the same boring answer. But if you want a better idea of what we consider great content, read our raters guidelines. That’s like almost 200 pages of things to consider: “

Screenshot of a tweet by Google’s Danny Sullivan, who recommended reading Google’s Quality Raters Guidelines for tips on creating quality web pages.

 

An SEO responded to Danny by correctly pointing out that the Quality Raters Guidelines is for content creation, not for diagnosing why a site is no longer ranking in the search results:

“The guide is GREAT for creation guidelines, not diagnostics. Especially if you just dropped off the map.”

Quality Raters Guidelines are Not a Diagnostic Tool

The Quality Raters Guidelines are helpful. But Google’s algorithms do more than check whether a page passes a “quality” test.

Google’s mission is “to organize the world’s information and make it universally accessible and useful.” Web page quality is one part of that. Relevance and usefulness to the user making a search query is a major part of the algorithm.

So if a site has lost rankings, content quality may be an issue, but the higher concern is figuring out why the site is no longer considered relevant to users. Google’s algorithm focuses on returning the most relevant content, regardless of coding errors or whether the article’s author has their contact information listed somewhere.

The Raters Guidelines are helpful. But the SEO community is clearly on the wrong path by relying so heavily on Google’s Quality Raters Guidelines for answers to algorithm-related ranking problems. There is more nuance to ranking in Google than what’s in the Quality Raters Guidelines.

  1. Quality Raters Guidelines is not an SEO cheat sheet
  2. Google’s algorithms do more than obsess over E-A-T

Quality Raters Guidelines are Not a Diagnostic Cheat Sheet

The Quality Raters Guidelines is mostly a document about web page quality. Page quality is highly important.

But Google does not primarily rank pages because they are high quality. Google ranks pages because they are relevant and useful to users.

There is a tendency to seek answers in the Quality Raters Guidelines for update related ranking problems. This is a mistake. A broad core algorithm update encompasses a wide range of improvements designed to help Google understand search queries, understand and rank web pages, and to be useful to users.

Page quality is just one ranking factor among many.

Broad Core Algorithm Updates are Not Solely Focused on E-A-T

Some SEOs continue to recommend that publishers hurt by an algorithm update add more information to their About page and more author information to their articles in order to increase their E-A-T scores.

Goodness… Do people truly believe that hacking Google is as easy as improving author credentials?

Apparently so. As ridiculous as this may sound, that’s what some in Google’s Webmaster Help forum offered as a solution to The Daily Mail, a well known news organization.

The focus on E-A-T to solve Google update problems is a mistake because it ignores the fact that Google’s algorithm is larger than just expertise, authoritativeness and trust. Those are just three factors out of over 200 factors.

Wide Scope of the Algorithm Update

It’s called a Broad Core Algorithm Update. The word “broad” is defined as having a wide scope, covering a large number of topics.

Focusing on E-A-T as the root cause of update problems is a huge mistake.

Nothing to Fix

Assuming that E-A-T is the solution to update problems ignores Google’s advice that there is nothing to fix.

What “nothing to fix” means is that there is nothing wrong with your site.

When an SEO recommends E-A-T to solve an update related ranking problem, they are saying that the reason the site doesn’t rank is because there is something broken that needs fixing.

But Google says there is nothing to fix.

One is right. One is wrong.

Google’s guidance that there is nothing broken on your site to fix is a huge clue. So why ignore it?

What Does Nothing to Fix Mean?

  • Nothing to fix means don’t expect that fixing “quality issues” will solve your Google Update problems.
  • Nothing to fix means that there is nothing wrong with your expertise, authoritativeness or trust.
  • “Nothing to fix” means that Google is doing more than “targeting” low quality signals.

Nothing to fix can mean that:

  • Google is improving natural language processing tasks
  • Google is improving how it ranks links
  • Google is improving how it understands search queries
  • Google is improving how it understands a section of a web page within the context of the larger page.
  • Google has improved the speed at which it identifies low quality links and ignores them.

As you can see, there are so many areas Google can improve in an algorithm that the list could run to thousands of individual changes.

If the list of things that Google could improve is so long, why in the world does the search industry focus on the same four things: quality, expertise, authoritativeness, and trust?

As can be seen by the plight of major sites like CCN and the Daily Mail, the idea that Google’s Broad Core Updates could be reduced to four baby-food level ranking factors is not helpful.

When trying to diagnose a ranking loss, it may be more helpful to expand the set of factors you look at. Start with the search results pages themselves: how can you diagnose a ranking problem without looking at the search results?
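As a concrete starting point, here is a hedged sketch (one approach, not a definitive method) that uses the Search Console Search Analytics API to compare a site’s average positions for the same queries before and after the June 3, 2019 rollout. The service-account file and property URL are placeholders.

```python
# Sketch: compare pre- and post-update average positions per query via the
# Search Console Search Analytics API (webmasters v3).
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder: account with Search Console access
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)
SITE = "https://www.example.com/"  # placeholder property

def avg_positions(start_date: str, end_date: str) -> dict:
    """Return {query: average position} for the date window."""
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query"],
        "rowLimit": 1000,
    }
    result = service.searchanalytics().query(siteUrl=SITE, body=body).execute()
    return {row["keys"][0]: row["position"] for row in result.get("rows", [])}

before = avg_positions("2019-05-06", "2019-06-02")  # four weeks pre-update
after = avg_positions("2019-06-03", "2019-06-30")   # post-update window

# A positive delta means the average position got worse after the update.
drops = sorted(
    ((q, after[q] - before[q]) for q in before if q in after),
    key=lambda item: item[1],
    reverse=True,
)
for query, delta in drops[:20]:
    print(f"{query}: average position worsened by {delta:.1f}")
```

The queries that slipped furthest tell you which search results pages to go inspect by hand.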

Read the cryptocurrency news site’s announcement: CCN is Shutting Down after Google’s June 2019 Core Update




Google Search Console image search reporting bug June 5-7



Google posted a notice that between June 5 and June 7 it was unable to capture data about image search traffic. This is just a reporting bug and did not impact actual search traffic, but the Search Console performance report may show drops in image search traffic in that date range.

The notice. It read, “June 5-7: Some image search statistics were not captured during this period due to an internal issue. Because of this, you may see a drop in your image search statistics during this period. The change did not affect user Search results, only the data reporting.”

How do I see this? If you log in to Google Search Console, click into your performance report and then click on the “search type” filter. You can then select Image from the filters.

Here is a screenshot of this filter:

How To Filter By Image Traffic in Google Search Console
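The same image-search view can be pulled through the Search Analytics API by setting the search type to image. A minimal sketch, assuming a service account with read access to the property (the credentials file and property URL are placeholders):

```python
# Sketch: fetch daily image-search clicks/impressions via the Search
# Analytics API (webmasters v3).
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)

body = {
    "startDate": "2019-06-01",
    "endDate": "2019-06-10",
    "dimensions": ["date"],
    "searchType": "image",  # API equivalent of the UI's "search type" filter
}
result = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=body  # placeholder property
).execute()

# Per Google's notice, expect the June 5-7 rows to dip: the data was never
# captured, so only the report is affected, not real traffic.
for row in result.get("rows", []):
    print(row["keys"][0], "clicks:", row["clicks"],
          "impressions:", row["impressions"])
```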

Why we should care. If your site gets a lot of Google Image search traffic, you may notice a dip in your traffic reporting within Google Search Console. You will likely not see a similar dip in your other analytics tools. That said, Google said this is only a reporting glitch within Google Search Console and did not impact actual traffic to your website.


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.


Facebook Changes Reach of Comments in News Feed



Facebook announced a change to its algorithms that will affect the reach of comments on a post. Comments with positive quality signals will be ranked more highly. Low-quality comment practices may result in less reach.

Comment Ranking in News Feeds

Facebook noted that it ranks not only posts in news feeds but comments as well.

Comments with positive quality signals will be seen by more people. Comments with low-quality signals will have their news feed reach reduced.

Facebook Comment-Quality Signals

Facebook noted that their updated comment algorithm has four features:

  1. Integrity signals
  2. User indicated preferences
  3. User interaction signals
  4. Moderation signals

Integrity Signals

Integrity signals are a measure of authenticity. Comments that violate community standards or qualify as engagement bait carry negative signals. Comments that violate community standards are said to be removed.

Engagement Bait

Facebook engagement bait is a practice that takes four forms:

1. React Baiting

Encouraging users to react to your post

2. Follow and Share Baiting

This is described as telling visitors to like, share or subscribe.

3. Comment Baiting

Encouraging users to comment with a letter or number is given as an example.

4. Monetization Baiting

This is described as asking for “stars” in exchange for something else, which could include something trivial like “doing push ups.”

User Indicated Preferences

This is a reference to user polls that Facebook conducts in order to understand what users say they wish to see in comments.

User Interaction Signals

These are signals related to whether users interact with a post.

Moderation Signals

This is a reference to how users hide or delete comments made on their posts.

Here is how Facebook describes it:

“People can moderate the comments on their post by hiding, deleting, or engaging with comments.

Ranking is on by default for Pages and people with a lot of followers, but Pages and people with a lot of followers can choose to turn off comment ranking.

People who don’t have as many followers will not have comment ranking turned on automatically since there are less comments overall, but any person can decide to enable comment ranking by going to their settings. (See more details here.) “
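Facebook has not published its actual formula, so purely as an illustration, here is a toy Python sketch of how those four signal families could combine into a single comment score. Every field name, weight, and threshold below is invented for the sketch.

```python
# Toy illustration only: NOT Facebook's actual algorithm. All fields and
# weights are invented to show how the four signal families might combine.
from dataclasses import dataclass

@dataclass
class CommentSignals:
    violates_standards: bool  # integrity: community-standards violation
    engagement_bait: bool     # integrity: react/follow/comment/monetization bait
    poll_preference: float    # user-indicated preference from surveys, 0..1
    interactions: int         # user interaction signals (likes, replies)
    hidden_by_poster: bool    # moderation signal: poster hid or deleted it

def toy_comment_score(s: CommentSignals) -> float:
    if s.violates_standards:
        return 0.0  # the announcement says such comments are removed outright
    score = 0.5 * s.poll_preference + 0.1 * min(s.interactions, 50) / 50
    if s.engagement_bait:
        score -= 0.4  # bait reduces reach rather than removing the comment
    if s.hidden_by_poster:
        score -= 0.3
    return max(score, 0.0)

# A well-received, non-baity comment outscores a baity one.
print(toy_comment_score(CommentSignals(False, False, 0.8, 25, False)))  # 0.45
print(toy_comment_score(CommentSignals(False, True, 0.8, 25, False)))   # 0.05
```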

Facebook Targeting Low Quality Comments

One of the stated goals of this update is to hide low-quality comments in people’s Facebook feeds and to promote high-quality comments, such as those from people you might know.

This is how Facebook described it:

“To improve relevance and quality, we’ll start showing comments on public posts more prominently when:

  • The comments have interactions from the Page or person who originally posted; or
  • The comments or reactions are from friends of the person who posted.”

Read Facebook’s announcement here: Making Public Comments More Meaningful

Images by Shutterstock, Modified by Author

 




Build your PPC campaigns with this mini campaign builder script for Google Ads



Need to quickly build a campaign or add keywords to an existing one? This script will do the work for you!

All you need to do is input a few keywords and headlines in a spreadsheet and BAM! You’ve got yourself the beginnings of a great campaign.

I’m a firm believer in Single Keyword per Ad Group (SKAG) structure: it increases ad/keyword relevance, which improves Quality Score, lowers CPCs, and gets you a higher ad rank and a better CTR.

Sadly, building out SKAG structures is a pretty time-consuming endeavor. You can’t implement millions of keywords and ads without PPC tech powering your builds.

But if a client just needs a couple of new keywords after updating their site with new content, this script is a quick and easy solution.

And that’s exactly what I love about PPC. There’s a special place in my heart for simple scripts anyone can use to achieve tasks that are otherwise repetitive or near-impossible.

What does the script do?

This tool will save a lot of time with small-scale builds where you know exactly which keywords and ad copy you need, for example when you’re adding a few keywords to an existing campaign.

You input your campaign name, keywords, headlines, descriptions, paths and final URL, and it will output three tabs for you: one with keyword combinations, one with negatives, and one with ads to upload to Google Ads Editor.

It creates one exact match and one broad match modifier campaign, and adds the keywords as exact negatives in the broad campaign to make sure that search terms matching exactly are funneled through the exact keywords.
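To make those mechanics concrete, here is a minimal Python sketch of the same build logic. This is not the article’s actual spreadsheet script (linked below); the Google Ads Editor column layout and the campaign-name delimiter are assumptions to check against your own setup.

```python
# Sketch of the build logic: an exact-match and a BMM campaign from one seed
# keyword list, with exact negatives added to the broad campaign. Column
# names approximate Google Ads Editor's import format; verify before use.
import csv

def build_rows(campaign, keywords, delimiter=" - "):
    exact_campaign = campaign + delimiter + "Exact"
    broad_campaign = campaign + delimiter + "BMM"
    keyword_rows, negative_rows = [], []
    for kw in keywords:
        ad_group = kw.title()  # SKAG: one ad group per keyword
        keyword_rows.append([exact_campaign, ad_group, f"[{kw}]", "Exact"])
        bmm = " ".join("+" + word for word in kw.split())
        keyword_rows.append([broad_campaign, ad_group, bmm, "Broad"])
        # Exact negative in the broad campaign funnels exact-matching
        # searches through the exact campaign instead.
        negative_rows.append([broad_campaign, f"[{kw}]", "Exact"])
    return keyword_rows, negative_rows

keyword_rows, negative_rows = build_rows("Widgets",
                                         ["blue widgets", "buy widgets"])
with open("keywords.csv", "w", newline="") as f:
    csv.writer(f).writerows(
        [["Campaign", "Ad Group", "Keyword", "Match Type"]] + keyword_rows)
with open("negatives.csv", "w", newline="") as f:
    csv.writer(f).writerows(
        [["Campaign", "Keyword", "Match Type"]] + negative_rows)
```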

I’m sure you’re dying to give it a whirl, so let’s get cracking!

How do you use it?

Make a copy of this spreadsheet (note: you’ll need to authorize the script to run). You’ll find all the instructions there as a future reminder.

Once you’ve got the spreadsheet ready, input the following:

  • The campaign name
  • The campaign name delimiter to distinguish between broad and exact campaigns
  • Headline 1 (if this cell is not specified, then it will be the same as the keyword)
  • Headline 2
  • Optionally, headline 3
  • Description 1
  • Optionally, description 2
  • Optionally, path 1 and path 2
  • The final URL
  • The keywords (you can keep going outside of the box with these!)

You’ll see a handy character counter which will go red if you exceed the character limit. Bear in mind that this tool will assume that you’re using it correctly and so you’ll need to make sure that you’re staying within the limit!

You can also optionally create a second ad variant by choosing the part of your text you want to vary (e.g., headline 2 or description 2) and inputting the copy. Otherwise, just select “None” from the dropdown menu.

Once you’re done, click the gigantic “Go!” button and wait for the magic to happen.

It will generate three tabs labelled “Keywords,” “Negatives” and “Ads.” If you want to run the script again with different keywords, make sure you save these tabs elsewhere or rename them to prevent the script from overwriting them.

Finally, you can paste these tabs into Editor and update all the relevant settings and adjustments. Job done!

DOWNLOAD: You’ll need to authorize the script to run after you make a copy of this spreadsheet.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Daniel Gilbert is the CEO at Brainlabs, the best paid media agency in the world (self-declared). He has started and invested in a number of big data and technology startups since leaving Google in 2010.
