SEO

June 2019 Broad Core Algo Update: It’s More than E-A-T



Two high-authority websites have lost traffic from the June 2019 Core Algorithm Update. Their losses challenge the conventional thinking that expertise, authoritativeness and trustworthiness (E-A-T) are the path to recovering from an update.

High “Authority” Websites Lost Rankings

Authoritative websites are said to have lost rankings in the June 2019 Core Algorithm Update. This exposes a weakness in the theory that factors such as E-A-T (Expertise, Authoritativeness and Trustworthiness) are directly associated with ranking declines.

These sites had no problems with expertise, authority or trustworthiness.

Screenshot of a discussion in Google’s Webmaster Help Forums posted by a Daily Mail employee.

A Daily Mail employee reached out to Google’s Webmaster Help Forums for help in diagnosing why the Daily Mail was suffering traffic losses due to the June 2019 Core Algorithm Update.

A UK news site, The Daily Mail, suffered major traffic declines due to Google’s June 2019 algorithm update.

Now, a bitcoin news site, CCN.com is reporting that they are shutting down because of the Google Update.

“Google’s June 2019 Core Update rolled out on June 3rd, 2019 and CCN’s traffic from Google searches dropped more than 71% on mobile overnight.

Our daily revenue is down by more than 90%.”

In a blog post explaining why they are shutting down, CCN noted that another bitcoin news site, CoinDesk, was also losing traffic:

“Why would simple fixes be the cause of the immense Google-listing drop, when other similar sites are experiencing the same blowback? The leader in blockchain news, CoinDesk, has experienced a 34,6% drop according to Sistrix.com:”

CoinDesk is a leader in their space. So it is remarkable that they, too, lost traffic due to Google’s June 2019 Broad Core Algorithm Update.

Is Trustworthiness a Problem for the Daily Mail?

Screenshot from MediaBiasFactCheck.com showing a poor rating for the Daily Mail website.

MediaBiasFactCheck.com states that the Daily Mail is a questionable source of news.

MediaBiasFactCheck.com states that the Daily Mail is an unreliable news source because of various clickbait articles. Here is what MediaBiasFactCheck.com stated:

“…the Daily Mail tends to publish stories utilizing sensationalized headlines with emotionally loaded wordings such as “Woman, 63, ‘becomes PREGNANT in the mouth’ with baby squid after eating calamari”, which is obviously a fake news story.”

Sounds pretty outrageous, right?

Wrong.

The article is based on a real incident. The authoritative source of the information was the United States government’s National Institutes of Health (NIH). That was not an “obviously fake news story” as MediaBiasFactCheck reported. Had MediaBiasFactCheck actually fact-checked the Daily Mail article (like I did), they would have found this link to NIH.GOV that relates the true story of baby squid attaching themselves to a woman’s mouth. It’s not fake news. It’s real.

Who are MediaBiasFactCheck.com?

MediaBiasFactCheck.com is an independent, for-profit organization.

According to their FAQ:

Media Bias Fact Check, LLC is a Limited Liability Company owned solely by Dave Van Zandt. He also makes all final editing and publishing decisions.

Dave Van Zandt obtained a Communications Degree before pursuing a higher degree in the sciences. Dave currently works full time in the health care industry. Dave has spent more than 20 years as an arm chair researcher on media bias and its role in political influence.

I will leave it up to you to decide whether MediaBiasFactCheck is a trustworthy source of information.

What Does Pulitzer Prize Winner Politifact Say?

Politifact is a trustworthy non-profit organization. Here is their web page about the Daily Mail that links to a page with one citation of one clickbait article about Brexit. That’s all that Politifact has to say about the Daily Mail’s trustworthiness.

Screenshot of Politifact’s page about the Daily Mail.

Pulitzer Prize-winning Politifact cites only one article as being clickbait.

Politifact does not seem to share MediaBiasFactCheck.com’s opinion of the Daily Mail. I will leave it up to you to decide who to believe on the matter of trustworthiness. But based on my own reading of the Daily Mail, it seems to me reasonably trustworthy.

Google Webmaster Help Forum Fails to Help

In my opinion, a failing of Google’s Webmaster Help Forum is that it offers the same rote advice. When the advice does not fit the situation, the forum sometimes turns against the person asking for help, nitpicking perceived failures while never actually diagnosing why a site may have lost rankings.

According to CCN, Google’s Webmaster Help Forum failed to offer useful advice:

“We have tried to find out why our stories are no longer visible on Google by asking for guidance in Google’s Webmasters Forum. While we appreciate the help of the experts from the Google Forum, their theories for why Google has decided to basically “shut down” CCN does not appear to be entirely accurate.”

Among the less than useful advice was this response:

“The website has no information about the valid organization of the publisher.

The website has information about some organization with the name CCN, however, this brand does not have unambiguity and a fragment of Google Knowledge Graph in the SERP.

This contradicts to the following recommendations of Expertise, Authoritativeness, Trustworthiness – EAT of Google:
● Who (what individual, company, business, foundation, etc.) is responsible for the website.
● Who (what individual, company, business, foundation, etc.) created the content on the page.”

The person offering help also advised CCN to register the news site with Google My Business.

Like Talking to a Chatbot

Someone from The Daily Mail news site posted asking for help. Nearly the exact same advice was given to the Daily Mail as was given to CCN.com.

The answers focused on download speed, mobile friendliness and, again, authorship:

“The information about the author such as Martin Robinson is contrary to the following Google recommendations for publishers…”

As for the site not being mobile friendly, that’s incorrect. As you can see below, the site is mobile friendly.

 

Screenshot of Google's Mobile Friendly Testing Tool

There are some page loading errors but those are scripts that are blocked by third party ad servers, something common across the Internet. The rest are warnings about things like deprecated scripting.
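If you want to separate that kind of noise from real problems programmatically, a rough filter over a mobile-friendliness result can help. The response shape below is modeled on Google’s Mobile-Friendly Test API, but the sample data, function name and ad-server patterns are all illustrative assumptions, not anything from the Daily Mail’s actual report:

```python
# Sketch: separate actionable blocked resources from third-party ad
# scripts in a Mobile-Friendly Test style result. The response shape
# is modeled on Google's Mobile-Friendly Test API; the sample data,
# function name and ad-server patterns are illustrative assumptions.

AD_SERVER_HINTS = ("doubleclick", "adsystem", "adserver")  # assumed patterns

def actionable_issues(result: dict) -> list:
    """Return blocked-resource URLs that do not look like ad-server scripts."""
    issues = []
    for issue in result.get("resourceIssues", []):
        url = issue.get("blockedResource", {}).get("url", "")
        if url and not any(hint in url for hint in AD_SERVER_HINTS):
            issues.append(url)
    return issues

# Illustrative result: one blocked ad script (expected noise) and one
# blocked first-party stylesheet (worth investigating).
sample = {
    "mobileFriendliness": "MOBILE_FRIENDLY",
    "resourceIssues": [
        {"blockedResource": {"url": "https://securepubads.doubleclick.net/tag.js"}},
        {"blockedResource": {"url": "https://example-news.site/css/main.css"}},
    ],
}

print(actionable_issues(sample))  # only the first-party resource remains
```

Run against a real test result, a filter like this makes it easier to see whether the remaining errors are things a publisher can actually fix.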

Is that enough to cut rankings by 50%? What do you think?

The Webmaster Help Forum might as well be a chatbot because all the responses are essentially pre-scripted. The advice is mostly ripped from the pages of the Quality Raters Guidelines.

Google’s Algorithms are Not Summed Up by Quality Raters Guidelines

Google’s Algorithm updates cannot be summed up by what’s in the Quality Raters Guidelines. So why do SEOs depend on it to solve Google update problems?

Well…

Danny Sullivan tweeted that the Quality Raters Guidelines can be used as a reference guide for creating quality content.

He said to focus on making quality content.

Here is what Danny Sullivan tweeted:

“We tell lots of things to do. Improve site speed. Consider secure. Etc. But that’s not what this update was about. It’s broad. And respectfully, I think telling people there’s no particular thing to “fix” is indeed helpful. It means, hopefully, they think more broadly…”

Followed with:

“Want to do better with a broad change? Have great content. Yeah, the same boring answer. But if you want a better idea of what we consider great content, read our raters guidelines. That’s like almost 200 pages of things to consider: “

Screenshot of a tweet by Google’s Danny Sullivan.

Google’s Danny Sullivan recommended reading Google’s Quality Raters Guidelines for tips on creating quality web pages.

 

An SEO responded to Danny by correctly pointing out that the Quality Raters Guidelines is for content creation, not for diagnosing why a site is no longer ranking in the search results:

“The guide is GREAT for creation guidelines, not diagnostics. Especially if you just dropped off the map.”

Quality Raters Guidelines are Not a Diagnostic Tool

The Quality Raters Guidelines are helpful. But Google’s algorithms do more than check whether a page passes a “quality” test.

Google’s mission is “to organize the world’s information and make it universally accessible and useful.” Web page quality is one part of that. Relevance and usefulness to a user making a search query is a major part of the algorithm.

So if a site has lost rankings, while content quality may be an issue, of higher concern is figuring out why the site is no longer relevant to a user. Google’s algorithm focuses on returning the most relevant content, regardless of coding errors or whether the article author has their contact information listed somewhere.

The Raters Guidelines are helpful. But the SEO community is clearly on the wrong path by relying so heavily on Google’s Quality Raters Guide for answers to algorithm related ranking problems. There is more nuance to ranking in Google than what’s in the Quality Raters Guidelines.

  1. The Quality Raters Guidelines are not an SEO cheat sheet
  2. Google’s algorithms do more than obsess over E-A-T

Quality Raters Guidelines are Not a Diagnostic Cheat Sheet

The Quality Raters Guidelines is mostly a document about web page quality. Page quality is highly important.

But Google does not primarily rank pages because they are high quality. Google ranks pages because they are relevant and useful to users.

There is a tendency to seek answers in the Quality Raters Guidelines for update related ranking problems. This is a mistake. A broad core algorithm update encompasses a wide range of improvements designed to help Google understand search queries, understand and rank web pages, and to be useful to users.

Page quality is just one ranking factor among many.

Broad Core Algorithm Updates are Not Solely Focused on E-A-T

Some SEOs continue to recommend that publishers hurt by an algorithm update add more information to their About pages and more author information to their articles in order to increase their E-A-T scores.

Goodness… Do people truly believe that hacking Google is as easy as improving author credentials?

Apparently so. As ridiculous as this may sound, that’s what some in Google’s Webmaster Help forum offered as a solution to The Daily Mail, a well known news organization.

The focus on E-A-T to solve Google update problems is a mistake because it ignores the fact that Google’s algorithm is larger than just expertise, authoritativeness and trust. Those are just three factors out of over 200 factors.

Wide Scope to Algorithm Update

It’s called a Broad Core Algorithm Update. The word “broad” is defined as having a wide scope, covering a large number of topics.

Focusing on E-A-T as the root cause of update problems is a huge mistake.

Nothing to Fix

Assuming that E-A-T is the solution to update problems ignores Google’s advice that there is nothing to fix.

What “nothing to fix” means is that there is nothing wrong with your site.

When an SEO recommends E-A-T to solve an update related ranking problem, they are saying that the reason the site doesn’t rank is because there is something broken that needs fixing.

But Google says there is nothing to fix.

One is right. One is wrong.

Google’s guidance that there is nothing broken on your site to fix is a huge clue. So why ignore it?

What Does Nothing to Fix Mean?

  • Nothing to fix means don’t expect that fixing “quality issues” will solve your Google Update problems.
  • Nothing to fix means that there is nothing wrong with your expertise, authoritativeness or trust.
  • “Nothing to fix” means that Google is doing more than “targeting” low quality signals.

Nothing to fix can mean that:

  • Google is improving natural language processing tasks
  • Google is improving how it ranks links
  • Google is improving how it understands search queries
  • Google is improving how it understands a section of a web page within the context of the larger page.
  • Google has improved the speed at which it identifies low quality links and ignores them.

As you can see, there are so many areas Google can improve in an algorithm that the list could run to thousands of improvements.

If the list of things that Google could improve is so long, why in the world does the search industry focus on the same four things: Quality, Expertise, Authoritativeness, and Trust?

As can be seen by the plight of major sites like CCN and the Daily Mail, the idea that Google’s Broad Core Updates could be reduced to four baby-food level ranking factors is not helpful.

When trying to diagnose a ranking problem, it may be more helpful to expand the set of factors looked at. Start with the search results pages themselves. How can you diagnose a ranking problem without looking at the search results?

Read cryptocurrency news site announcement: CCN is Shutting Down after Google’s June 2019 Core Update




How to tackle rising Facebook CPAs



SAN JOSE – With more advertisers and bigger budgets crowding onto Facebook and Instagram, acquisition costs are climbing. Advertisers can make their social ad dollars go further by re-thinking campaign fundamentals.

“You need to make sure you’re scaling your available inventory for click-through rates, mirroring your audience, and being dynamic,” 3Q Digital’s Senior Strategy Development manager Madeline Fitzgerald said in sharing tips for lowering CPAs across Facebook at SMX West Thursday.

Deconstructing Facebook CPCs

Audience size: bigger is usually better. CPCs on Facebook are affected by audience size, account structure, and click-through rates (CTR). The narrower and smaller your target audience, the more competitive your bid will need to be, Fitzgerald explained. The competition in the auction will ultimately impact the CPC outcome.

“If you’re noticing that your CPCs are really high, one of the first things you should do is check your audience sizes. If you’re seeing that [it’s] getting too specific, see if there are any other interests, behaviors, demographics that we can add.” Doing so, she explained, will help to broaden the target pool and give the Facebook algorithm more options to show your ads.

If you’ve reached a ceiling, broad targeting might be the next step. “If you already have a mature account, don’t go straight to this if you’re still early on in your testing phases. But if you’re trying to get to that next level, broad targeting is a great way to do so,” Fitzgerald explained.

Account structure and segmentation. Account structure and the way we segment our ad sets can also determine the available ad inventory. Ads can run across a range of Facebook properties – from News Feed and Messenger to Stories and Instagram feeds. When we add segmentations like placements or geographies, the audience pool becomes restricted and advertisers might miss out on more efficient inventory.

“The algorithms are smarter than we are,” she reasoned. “Let the robots have it on factors like devices and placements. A couple of years ago, we laughed at everyone who did that. But we’re actually seeing a 13% lower CPA with some of our clients who [no longer segment those].”

Segmentation can be valuable when focusing on the funnel stage – i.e. audience personas, creative, and destination pages. But Fitzgerald recommends skipping demographics, geographies, devices, and placements — any of the factors you can’t edit after you set them up.
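The arithmetic behind consolidating ad sets is easy to sketch. All the numbers below are hypothetical, and applying the 13% improvement Fitzgerald cited as a flat multiplier is an illustrative assumption, not a guarantee:

```python
# Sketch: blended CPA of over-segmented ad sets vs. one consolidated
# set. All figures are hypothetical, for illustration only.

def cpa(spend: float, conversions: int) -> float:
    """Cost per acquisition: total spend divided by conversions."""
    return spend / conversions

# Segmented by placement: three small audiences, same budget each.
segmented = [
    {"name": "mobile-feed",    "spend": 500.0, "conversions": 10},
    {"name": "desktop-feed",   "spend": 500.0, "conversions": 8},
    {"name": "instagram-feed", "spend": 500.0, "conversions": 7},
]

total_spend = sum(s["spend"] for s in segmented)
total_conversions = sum(s["conversions"] for s in segmented)
blended = cpa(total_spend, total_conversions)
print(f"segmented blended CPA: {blended:.2f}")  # 60.00

# Consolidated: same total budget, but delivery is free to shift spend
# toward cheaper inventory. The 13% figure echoes the improvement cited
# in the talk; using it as a flat discount is an illustrative assumption.
consolidated = blended * (1 - 0.13)
print(f"consolidated CPA (assumed -13%): {consolidated:.2f}")  # 52.20
```

The point of the sketch is simply that CPA is a ratio of spend to conversions, so anything that lets the delivery system find cheaper conversions within the same budget moves the blended number down.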

Campaign budget optimization. Soon, ad set budgets will be going away, in favor of campaign budget optimization (CBO), which uses machine learning to automatically serve ads to the target audience based on predictive analysis.

“I think the biggest way to figure out how to work this into our strategy is to think about the language Facebook is using to tell us about how the algorithm operates. Facebook tells us that CBO looks at the available opportunities – which is a combination of audience size and the audience’s propensity to actually convert into billable opportunities.”

Facebook’s algorithm prioritizes volume over potential for conversion, which is why CBO works, she explained. Marketers can group together audiences with similar potential reach or size, and the budget optimization tool will see more conversion potential for the larger audiences within the budget.

Conversions are in the creative

Mirror your audience. “As advertisers, it’s our job to help users see themselves and their goals – what they want to accomplish – in our creative. We need to make sure we’re making it very obvious for them,” said Fitzgerald.

Compelling ad creative should clearly visualize the value proposition of what’s being promoted. And it’s not just about getting more users in the door, it’s about getting the right users in the door because they were drawn to your creative.

Engage audiences with video. Facebook has been pushing advertisers to use animation and video for some time now, but Fitzgerald argues advertisers still aren’t doing enough with it.

“A lot of advertisers take existing creative and put a slow zoom on it, or pull a three-minute explainer video and think that counts as an ad. But that’s not really what we’re being called to as advertisers here,” she said. “It’s our job to figure out how to leverage movement in a more disruptive way, and think about new original ways to talk to people.”

Highlight clear value in the copy. Effective copy isn’t about being brand heavy. It’s about making users comfortable with clicking on an ad. Fitzgerald explained that advertisers can build that trust and comfort by keeping ad copy directly tied to the value of what they’re selling.

“We want to make sure users don’t need to go through any guesswork to figure out what’s going to happen next,” Fitzgerald said. “People don’t want to have to read through your whole website to understand why they should engage with your brand.”

This story first appeared on Marketing Land. For more on digital marketing, click here.

Original URL: https://marketingland.com/how-to-tackle-rising-facebook-cpas-276352


About The Author

Taylor Peterson is Third Door Media’s Deputy Editor, managing industry-leading coverage that informs and inspires marketers. Based in New York, Taylor brings marketing expertise grounded in creative production and agency advertising for global brands. Taylor’s editorial focus blends digital marketing and creative strategy with topics like campaign management, emerging formats, and display advertising.




New local SERP live in Europe



In April 2019, Google was experimenting with a new local SERP that highlighted alternative directory sources for the same query. At the time, we saw an example in the wild for Germany. Now, an updated version of the SERP featuring branded directory buttons appears to be live in the UK, Belgium, Spain, Greece, and France – if not already throughout Europe.

A more prominent directory box. Below is an example screenshot from a UK search, showing directory links above the map and local pack.

SERP showing results for ‘asbestos removal Halifax UK’

This change in the SERP grows out of Google’s continuing effort to comply with the European Commission’s antitrust decision in shopping search. It’s also an attempt by the company to preempt a separate antitrust action in local search.

Yelp previously criticized these types of screens as a return to Google’s “rival links” remedy, which was originally proposed in 2013 and ultimately rejected by the European Commission.

UK SERP showing a local carousel above the map

How are the directories selected? One obvious and immediate question is how the displayed directories are chosen. This isn’t an ad unit, in contrast to the solution implemented in shopping search. In the latter context, comparison shopping engines and Google Shopping bid against one another for placement in PLAs. However, there’s no comparable “sponsored” or “ad” label in the directory box or carousel above.

We must assume that Google is algorithmically choosing the directories to display. In the UK example above, clicking on the directory box links takes users to a category page in the case of Yell but a business profile page in the case of Cylex. Other searches (e.g., “dentists, London”) show a carousel with multiple, alternative directories.

In some cases, the directories appear on the first page of the organic results, below the map. In other cases, they do not.

Why we care. It remains to be seen whether this approach is acceptable to the European Commission. Part of that will depend on whether the buttons drive meaningful traffic to these publishers. If so, it could revive the fortunes of at least some of them (think “barnacle SEO”), which have continued to see declining traffic as Google My Business and zero-click search grab more user focus and engagement.


About The Author

Greg Sterling is a Contributing Editor to Search Engine Land, a member of the programming team for SMX events and the VP, Market Insights at Uberall.




E-commerce category pages outperform product detail pages in SERPs



E-commerce category pages represent a larger opportunity for ranking and driving organic search traffic than product detail pages, according to research unveiled at SMX West 2020 on Thursday. 

Across nearly 30 top U.S. e-commerce sites ranking for more than 25 billion keywords, category pages outperformed product detail pages, driving more keyword rankings and estimated traffic, as well as showing higher potential to capture additional traffic with optimization.

The data – culled by JumpFly and seoClarity from Google’s rankings in the U.S. – highlight the outsized role that category pages play in upper-funnel marketing efforts to drive brand awareness and interest.

Specifically, e-commerce category pages – which include parent category, subcategory and product grid pages with faceted navigation – ranked for 19% more keywords on average than product detail pages did. The additional keywords they ranked for drove an estimated 413% more traffic, based on the keywords’ search demand and the pages’ ranking positions. With optimization, those ranking category pages also showed the potential to drive 32% more traffic.

Even though category pages drove strong traffic, there’s a significant amount of room to improve ranking performance. On average, each captured an estimated 9% of the share of voice in its search results page. That means that the other ranking pages captured an estimated 91% of the clicks. Product detail pages, by contrast, captured just 2% of the share of voice.
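The estimation method described above, traffic from keyword demand and ranking position, and share of voice as a page’s slice of the available clicks, can be sketched roughly as follows. The CTR-by-position curve and the keyword data are made-up illustrations; the actual model used in the research was not published:

```python
# Sketch: estimate a page's traffic and share of voice from keyword
# rankings. The CTR-by-position curve and keyword data are illustrative
# assumptions, not the model used in the JumpFly/seoClarity research.

CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_clicks(search_volume: int, position: int) -> float:
    """Monthly clicks ~= keyword demand x assumed CTR at that position."""
    return search_volume * CTR_BY_POSITION.get(position, 0.02)

# Hypothetical keyword set for a single category page.
rankings = [
    {"keyword": "maxi dresses",       "volume": 40000, "position": 3},
    {"keyword": "black maxi dresses", "volume": 9000,  "position": 1},
    {"keyword": "long summer dress",  "volume": 15000, "position": 5},
]

page_clicks = sum(estimated_clicks(r["volume"], r["position"]) for r in rankings)

# Crude share-of-voice proxy: this page's estimated clicks vs. the
# clicks a position-1 ranking would capture on every keyword.
max_clicks = sum(r["volume"] * CTR_BY_POSITION[1] for r in rankings)

print(f"estimated clicks/month: {page_clicks:.0f}")
print(f"share of voice (proxy): {page_clicks / max_clicks:.1%}")
```

This is the same shape of calculation that lets the research translate “9% share of voice” into “91% of the clicks went to other pages.”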

E-commerce sector trends

The strong-category-page trend was most apparent across sectors that naturally target more generic head and torso keywords. For example, fashion, home goods, home improvement and department store sites – the kinds of sites that sell cordless hammer drills, table lamps and cowboy boots – drove stronger performance with category pages.

Interestingly, the results varied for one sector tested: electronics. One likely reason that product detail pages perform more strongly in this sector could be that electronics keyword themes tend to contain more concrete product attributes than those in other e-commerce sectors. For example, common TV searches include specifics like the size, display technology, resolution, brand and whether it’s “smart” or not. Product names for electronics also tend to contain some of those attributes to differentiate the many similar products available. Therefore, the relevance between a detailed search query and the details in the product name is higher than it would be for other sectors.
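A crude token-overlap score can illustrate that relevance gap. This is a toy model, not how Google computes relevance, and the product names are invented:

```python
# Toy sketch: token overlap between a query and a product name, to show
# why attribute-rich electronics names match detailed queries better.
# This is not how Google computes relevance; the data is invented.

def overlap_score(query: str, product_name: str) -> float:
    """Fraction of query tokens that also appear in the product name."""
    query_tokens = set(query.lower().split())
    name_tokens = set(product_name.lower().split())
    return len(query_tokens & name_tokens) / len(query_tokens)

query = "55 inch 4k smart tv"

rich_name = "Acme 55 inch 4K UHD Smart TV with HDR"  # attribute-rich
vague_name = "Ball Ring"                              # short and vague

print(overlap_score(query, rich_name))   # every query token matches
print(overlap_score(query, vague_name))  # no overlap at all
```

The electronics product name absorbs nearly all of a detailed query, while a vague name like “Ball Ring” gives the query nothing to match against.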

Regardless of sector, however, the direct-to-consumer space drove the strongest category-page results, with category pages ranking for 356% more keywords than product detail pages. These brand manufacturers selling their own products on their sites – like Apple, IKEA, The Gap and Nike – drove an estimated 202% more traffic with category pages, and had the potential to drive 233% more traffic.

Marketplaces and auctions

No e-commerce story is complete without a look at marketplaces and auctions. Unfortunately, there wasn’t a strong consensus among the sites in either group.

Behemoth Amazon bucks the trend with product detail pages ranking for an incredible 21,847% more keywords: 34 million keywords compared to the meager 155,000 keywords that its category pages ranked for. Amazon’s product detail pages also drove an estimated 57.5 times more traffic, and had the potential to drive 275.7 times more traffic. 

This makes a certain amount of sense based on Amazon’s strength in media and electronics sales. Both sectors are more focused on the types of keywords that product detail pages would naturally win – book and movie titles, and product attributes. In fact, one of Amazon’s best practices for product detail pages involves placing as many product attributes as possible into its 50- to 250-character product names. 

Conversely, the product names, and consequently the title tags that are typically based on them, tend to be very short and vague on most e-commerce sites. One luxury jewelry site, for example, has more than 10 products named simply “Ball Ring.”

Walmart’s smaller marketplace system acted more like Amazon with product detail pages that ranked more strongly. Though technically classified as a marketplace since its Target+ expansion to include third-party sellers last year, Target’s much smaller network acted more like a department store with stronger category pages. 

On the auction side, eBay acted more like a department store with slightly stronger category pages, while Etsy drove more rankings with its product detail pages.

Why it matters

This research suggests that category page optimization is a valuable area to prioritize to boost your organic search rankings and traffic.

Category pages form the backbone of an e-commerce site as the clickable representation of the site’s taxonomy. Every category page naturally targets a series of keyword themes that form a path through the funnel. The head keyword sits at the mouth of the funnel, while the related, more detailed themes step lower to form the torso and long tail that move toward the tip of the funnel. Traditionally, the product keywords sit at the very tip of the funnel, converting the customer to a sale.

For example, an e-commerce site that sells clothing could have the following click path through a series of five category pages: women’s clothing > dresses > maxi dresses > black maxi dresses > XL black maxi dresses. Each of those five pages targets a unique keyword theme with a place in the sales funnel. Optimizing category pages enables you to capture those searching customers as they explore their purchase options.
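That click path maps naturally onto a tree structure, with each category page targeting one keyword theme. A minimal sketch, using hypothetical data:

```python
# Sketch: an e-commerce taxonomy as a tree, where each node is a
# category page targeting one keyword theme. Data is hypothetical.

from dataclasses import dataclass, field

@dataclass
class CategoryPage:
    keyword_theme: str                    # the theme this page targets
    children: list = field(default_factory=list)

    def funnel_paths(self, trail=()):
        """Yield each click path from the head term down to the long tail."""
        trail = trail + (self.keyword_theme,)
        if not self.children:
            yield " > ".join(trail)
        for child in self.children:
            yield from child.funnel_paths(trail)

taxonomy = CategoryPage("women's clothing", [
    CategoryPage("dresses", [
        CategoryPage("maxi dresses", [
            CategoryPage("black maxi dresses", [
                CategoryPage("XL black maxi dresses"),
            ]),
        ]),
    ]),
])

for path in taxonomy.funnel_paths():
    print(path)
```

Walking the tree reproduces the five-page click path from the example, which is exactly the set of keyword themes a category-page optimization effort would target.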


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Jill Kocher Brown is a 14-year SEO consultant, author, speaker, and editor. She loves data-driven decisions, scalable SEO strategies, e-commerce and technical SEO. A veteran of five agencies and in-house twice, Jill can be found these days at digital marketing agency JumpFly, Inc., where she’s pioneering the SEO practice.


