In 2012, I had a large client that was ranked in the top position for most of the terms that mattered to them.
And then, overnight, their ranking and traffic dropped like a rock.
Have you ever experienced this? If so, you know exactly how it feels.
The initial shock is followed by a nervous feeling in the pit of your stomach as you contemplate how it will affect your business or how you’ll explain it to your client.
After that is the frenzied analysis to figure out what caused the drop and hopefully fix it before it causes too much damage.
In this case, we didn’t need Sherlock Holmes to figure out what happened. My client had insisted that we do whatever it took to get them to rank quickly. The timeframe they wanted was not months, but weeks, in a fairly competitive niche.
If you’ve been involved in SEO for a while, you probably already know what’s about to go down here.
The only way to rank in the timeline they wanted was to use tactics that violated Google’s Webmaster Guidelines. While this was risky, it often worked. At least before Google released their Penguin update in April 2012.
Despite my clearly explaining the risks, my client demanded that we do whatever it took. So that’s what we did.
When their ranking predictably tanked, we weren’t surprised. Neither was my client. And they weren’t upset because we explained the rules and the consequences of breaking those rules.
They weren’t happy that they had to start over, but the profits they reaped leading up to that more than made up for it. It was a classic risk vs. reward scenario.
Unfortunately, most problems aren’t this easy to diagnose. Especially when you haven’t knowingly violated any of Google’s ever-growing list of rules.
In this article, you’ll learn some ways you can diagnose and analyze a drop in ranking and traffic.
1. Check Google Search Console
This is the first place you should look. Google Search Console provides a wealth of information on a wide variety of issues.
Google Search Console will send email notifications for many serious issues, including manual actions, crawl errors, and schema problems, to name just a few. And you can analyze a staggering amount of data to identify other less obvious but just as severe issues.
Overview is a good section to start with because it lets you see the big picture, giving you some direction to get more granular.
If you want to analyze specific pages, URL Inspection is a great tool because it allows you to look at any page through the beady little eyes of the Google bot.
This can come in especially handy when problems on a page don’t cause an obvious issue on the front end but do cause issues for Google’s bots.
Coverage is great for identifying issues that can affect which pages are included in Google’s index, like server errors or pages that have been submitted but contain the noindex metatag.
Generally, if you receive a manual penalty, you’ll know exactly why. It’s most often the result of a violation of Google’s webmaster guidelines, such as buying links or creating spammy low-quality content.
However, it can sometimes be as simple as an innocent mistake in configuring schema markup. You can check the Manual Actions section for this information. GSC will also send you an email notification of any manual penalties.
The Security section will identify any issues with viruses or malware on your website that Google knows about.
It’s important to point out that just because there is no notice here, it doesn’t mean that there is no issue – it just means that Google isn’t aware of it yet.
2. Check for Noindex & Nofollow Meta Tags & Robots.txt Errors
This issue is most common when moving a new website from a development environment to a live environment, but it can happen anytime.
It’s not uncommon for someone to click the wrong setting in WordPress or a plugin, causing one or more pages to be deindexed.
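When that happens, the symptom is usually a single tag in the page’s head. If you find something like this on a page that should be ranking, you’ve likely found the cause:

```html
<!-- Tells search engines not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```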
Review the Search Engine Visibility setting in WordPress at the bottom of the Reading section. You’ll need to make sure it’s left unchecked.
Review the index setting for any SEO-related plugins you have installed. (Yoast, in this example.) This is found in the installation wizard.
Review page-level settings related to indexation. (Yoast again, in this example.) This is typically found below the editing area of each page.
You can prioritize the pages to review by starting with the ones that have lost ranking and traffic, but it’s important to review all pages to help ensure the problem doesn’t become worse.
It’s equally important to check your robots.txt file to make sure it hasn’t been edited in a way that blocks search engine bots.
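For reference, a minimal, properly configured robots.txt that permits crawling might look like this (the sitemap URL is a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

If you instead see a line like `Disallow: /` under `User-agent: *`, every compliant bot is being blocked from the entire site.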
3. Determine Whether Your Website Has Been Hacked

When most people think of hacking, they likely imagine nefarious characters in search of juicy data they can use for identity theft.
As a result, you might think you’re safe from hacking attempts because you don’t store that type of data. Unfortunately, that isn’t the case.
Hackers are opportunists playing a numbers game, so your website is simply another vector from which they can exploit other vulnerable people – your visitors.
By hacking your website, they may be able to spread their malware and viruses to even more computers, where they might find the type of data they’re looking for.
But the impact of hacking doesn’t end there.
We all know the importance of inbound links from other websites, and rather than doing the hard work to earn those links, some people will hack into and embed their links on other websites.
Typically, they will take additional measures to hide these links by placing them in old posts or even by using CSS to disguise them.
Worse still, a hacker may single out your website for destruction, deleting your content or filling it with shady outbound links, garbage content, and even viruses and malware.
This can cause search engines to completely remove a website from their index.
Taking appropriate steps to secure your website is the first and most powerful action you can take.
Most hackers are looking for easy targets, so if you force them to work harder, they will usually just move on to the next target. You should also ensure that you have automated systems in place to screen for viruses and malware.
Most hosting companies offer this, and it is often included at no charge with professional-grade web hosting. Even then, it’s important to scan your website from time to time to review any outbound links.
Screaming Frog makes it simple to do this, outputting the results as a CSV file that you can quickly browse to identify anything that looks out of place.
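If you’d rather automate that review, here’s a rough sketch in Python. It assumes you’ve exported outbound links to a CSV with a “Destination” column (the column name and the trusted-domain list are assumptions; check your export’s header row):

```python
import csv
from urllib.parse import urlparse

def flag_suspicious_links(rows, trusted_domains):
    """Return link rows whose destination domain isn't in trusted_domains.

    `rows` is an iterable of dicts with a "Destination" key, e.g. from
    csv.DictReader over an outbound-links export (the column name is an
    assumption; check your export's header row).
    """
    suspicious = []
    for row in rows:
        domain = urlparse(row["Destination"]).netloc.lower()
        # Treat www.example.com and example.com as the same site.
        if domain.startswith("www."):
            domain = domain[4:]
        if domain not in trusted_domains:
            suspicious.append(row)
    return suspicious

# Typical use with an export file:
#   with open("outlinks.csv", newline="") as f:
#       for row in flag_suspicious_links(csv.DictReader(f), {"mysite.com"}):
#           print(row["Destination"])
```

Anything it flags still needs a human look, but it narrows thousands of rows down to the handful worth inspecting.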
If your ranking drop was related to being hacked, it should be obvious because even if you don’t identify it yourself, you will typically receive an email notification from Google Search Console.
The first step is to immediately secure your website and clean up the damage. Once you are completely sure that everything has been resolved, you can submit a reconsideration request through Google Search Console.
4. Analyze Inbound Links
This factor is pretty straightforward. If inbound links caused a drop in your ranking and traffic, it will generally come down to one of three issues.
A manual action caused by link building tactics that violate Google’s Webmaster Guidelines.
A devaluation or loss of links.
Or an increase in links to one or more competitors’ websites.
A manual action will result in a notification from Google Search Console. If this is your problem, it’s a simple matter of removing or disavowing the links and then submitting a reconsideration request.
In most cases, doing so won’t immediately improve your ranking because the links had artificially boosted your ranking before your website was penalized.
You will still need to build new, quality links that meet Google’s Webmaster Guidelines before you can expect to see any improvement.
A devaluation simply means that Google now assigns less value to those particular links. This could be a broad algorithmic devaluation, as we see with footer links, or they could be devalued because of the actions of the website owners.
For example, a website known to buy and/or sell links could be penalized, making the links from that site less valuable, or even worthless.
An increase in links to one or more competitors’ websites makes them look more authoritative than your website in the eyes of Google.
There’s really only one way to solve this, and that is to build more links to your website. The key is to ensure the links you build meet Google’s Webmaster Guidelines; otherwise, you risk eventually being penalized and starting over.
5. Analyze Content
Google’s algorithms are constantly changing. I remember a time when you could churn out a bunch of low-quality, 300-word pages and dominate the search results.
Today, that generally won’t even get you on the first page for moderately competitive topics, where we typically see 1,000+ word pages holding the top positions.
But it goes much deeper than that.
You’ll need to evaluate what the competitors who now outrank you are doing differently with their content.
Word count is only one factor, and on its own, doesn’t mean much. In fact, rather than focusing on word count, you should determine whether your content is comprehensive.
In other words, does it more thoroughly answer all of the common questions someone may have on the topic compared to the content on competitors’ websites?
Is yours well-written, original, and useful?
Don’t answer this based on feelings – use one of the reading difficulty tests so that you’re working from quantifiable data.
Yoast’s SEO plugin scores this automatically as you write and edit right within WordPress. SEMrush offers a really cool plugin that does the same within Google Docs, but there are a number of other free tools available online.
Is it structured for easy reading, with subheadings, lists, and images?
People don’t generally read content online, but instead, they scan it. Breaking it up into manageable chunks makes it easier for visitors to scan, making them more likely to stick around long enough to find the info they’re looking for.
This is something that takes a bit of grunt work to properly analyze. Tools like SEMrush are incredibly powerful and can provide a lot of insight on many of these factors, but there are some factors that still require a human touch.
You need to consider the user intent. Are you making it easier for them to quickly find what they need? That should be your ultimate goal.
Yoast has released version 12.1 of its WordPress plugin; the update adds your custom favicon to the mobile snippet preview, matches Google’s font sizes on desktop search results and introduces new schema filters.
Why we should care
An accurate preview of your mobile and desktop listings enables you to get a better idea of what your customers see before they click through, which may help you optimize your snippets and encourage them to click on your results.
The new filters introduced in this update can also be used to control your schema output and provide searchers with pertinent information about your brand.
More on the announcement
Yoast 12.1 also adds the following filters for more granular control over schema output:
wpseo_schema_organization_social_profiles filters an entity’s social profiles. You can use it to customize social profiles within the Organization schema object.
wpseo_schema_company_name and wpseo_schema_company_logo_id filter your company’s name and logo from the theme options if they haven’t been designated in Yoast SEO’s settings.
wpseo_enable_structured_data_blocks disables Yoast’s structured data block editor blocks.
For more on Yoast’s structured data implementation updates, check out our coverage on Yoast SEO 11.0 (general schema implementation), 11.1 (images and video structured data), 11.2 (custom schema), 11.3 (personal image and avatar structured data), 11.4 (FAQ structured data), 11.5 (mobile snippet preview) and 11.6 (updated How-to structured data block).
About The Author
George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.
Google announced an update to Reviews Rich Results. The goal is to improve the Reviews Rich Results for users, to “address” abusive implementations, and to impose limits on where rich results trigger. Additionally, the “name” property becomes required.
Reviews Rich Results
The reviews rich results are explained in Google’s Review Snippet developer page. Google takes your schema structured data related to reviews and shows stars in the search results.
The rich snippets developer page states:
“Review snippets may appear in rich results or Google Knowledge Panels.”
It’s the guidelines on their appearance in the rich results that are affected.
Limits Imposed on When Rich Results Reviews are Shown
Google announced that the display of rich results reviews will be limited. This means that any reviews outside of those limits will no longer show review snippets.
Review snippets will now only be shown for an allowed list of schema types, which Google enumerates in the announcement.
Self-serving Reviews Not Allowed
Self-serving reviews are reviews of oneself. Google will no longer display self-serving reviews in rich results.
This is how Google explained it:
“We call reviews “self-serving” when a review about entity A is placed on the website of entity A – either directly in their markup or via an embedded 3rd party widget. “
“name” Property is Now Required
Perhaps the biggest change to Reviews Rich Results is that the “name” property is now mandatory.
Publishers who rely on schema structured data plugins, including Reviews WordPress Plugins, should check if their plugin is currently including the “name” property.
If the name property is not included by your plugin, look for an update that adds it. If no such update exists yet, you may wish to contact your plugin maker to find out when it’s coming, because the “name” property is now required.
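As a sketch, review markup that satisfies the requirement might look like the following JSON-LD. The product and reviewer names are placeholders; the key addition is the “name” property on the item being reviewed:

```json
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Example Widget",
  "review": {
    "@type": "Review",
    "reviewRating": {
      "@type": "Rating",
      "ratingValue": "4",
      "bestRating": "5"
    },
    "author": {
      "@type": "Person",
      "name": "Jane Doe"
    }
  }
}
```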
Will Rich Results Disappear if “name” Property Missing?
Google did not say if failure to have the “name” property in the structured data will result in a loss of the Reviews Rich Result. They only said it’s required.
“With this update, the name property is now required, so you’ll want to make sure that you specify the name of the item that’s being reviewed.”
This is an important update for publishers who use reviews structured data. Make sure your structured data is properly updated in order to continue to show rich results for your structured data.
Google’s news Tuesday that it is treating the nofollow attribute as a “hint” for ranking rather than a directive to ignore a link, and the introduction of rel="sponsored" and rel="ugc", raised reactions and questions from SEOs about next steps and the impact of the change to a nearly 15-year-old link attribute.
Choices for choice’s sake?
It’s useful if you want a choice to be more granular. You didn’t have that before. Now you do. If you want it.
As Google Search Liaison Danny Sullivan stated in a tweet Tuesday, the announcement expands the options for site owners and SEOs to specify the nature of a link beyond the singular nofollow attribute. The additional sponsored and ugc attributes are aimed at giving Google more granular signals about the nature of link content.
As a point of clarification, Google’s Gary Illyes tweeted that nofollow in meta robots will also be treated as a “hint,” but there are no ugc or sponsored robot meta tags. He also stated that he’ll be updating the official documentation to explicitly reflect this.
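For reference, the meta robots form he’s referring to is the page-level tag, which hints that no link on the page should be followed:

```html
<meta name="robots" content="nofollow">
```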
There is no real benefit for the sites that implement these new attributes instead of nofollow, other than organizational classification if it’s helpful. That has some viewing it through a lens of skepticism.
I want to believe this. It’s just that I don’t recall Google ever coming out with anything that did not have a direct benefit, or apparent hopeful benefit for Google’s own internal goals.
Drawing the focus back to the key change, that nofollow is now a ranking “hint” rather than a directive, Sullivan tweeted, “As Gary says, that’s very helpful to our systems that impact *lots* of people. The new attributes are a minor aspect.”
That was in reference to Illyes’ earlier tweet that the treatment of nofollow could have a “massive impact on the end user.”
This has the potential to have a massive impact on the end user. While SEOs implement the nofollow, the outcomes of it trickle down to ranking, which directly affects end users.
It can be hard to reconcile hearing that the change could mean significant improvements in search results for users while also being told that most sites won’t see any ranking effect from the new nofollow treatment.
According to the announcement, these changes have already taken effect (save for nofollow being used as a crawling and indexing “hint,” which goes into effect in March 2020). “In most cases, the move to a hint model won’t change the nature of how we treat such links,” Sullivan and Illyes wrote in the announcement. “We’ll generally treat them as we did with nofollow before and not consider them for ranking purposes.”
Who benefits from the new attributes?
Implementing the more granular sponsored and ugc attributes is optional, and Google clearly stated there is no need for SEOs to go back and update any existing nofollows. So will site owners adopt the new attributes if they don’t have to?
But if no one is clear on the incentive to implement….they won’t.
As Sullivan has stated, the purpose of the new attributes is to provide options that help Google classify these kinds of links more clearly. The nuances Google looks at between the nofollow, sponsored and ugc attributes won’t have an impact on your own site, and the new attributes are voluntary to implement. “If you do want to help us understand the web better, implement them. If you don’t want to, don’t,” tweeted Illyes.
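In practice, the new attributes are applied exactly the way nofollow always has been; the URLs below are placeholders:

```html
<!-- Paid placements and sponsored content -->
<a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>

<!-- User-generated content, such as comments and forum posts -->
<a href="https://example.com/commenter-site" rel="ugc">Commenter's site</a>

<!-- The classic catch-all; values can also be combined -->
<a href="https://example.com/other-site" rel="nofollow ugc">User link</a>
```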
Making the new attributes voluntary means you don’t have to bang down IT’s door, but it could also mean the change request may fall to the bottom of the priority list for a lot of companies and never get implemented. As consultant Kristine Schachinger expressed in the tweet below, even the slightest SEO change can be hard to get implemented.
Do you really think we are going to get dev teams to start doing this, or content teams? We can hardly get them to write a title or description? What happens if we just ignore this? Seriously asking since no one is going to do this.
Google seems very clearly fine with that. At this stage, the actual work involved should be minimal. If your dev teams can’t implement a code change to incorporate ugc or sponsored attributes for several more sprints, or quarters (and you’ve been implementing nofollow when appropriate), you don’t have to fret.
For WordPress sites, Yoast SEO plugin founder and Chief Product Officer Joost de Valk said Tuesday that support will be coming in the next release.
“It’s quite easy,” said de Valk. If other vendors follow suit, it could speed up adoption of the new attributes.
An opportunity for manipulation?
Now that nofollow is a “hint,” some are also concerned about spammers that might want to test out whether their tactics have a new lease on life.
I’m sure this is well tested and has prob been live for like a year now without anyone noticing. It’s one of those things you prob should of not announced though – it’s going to create a plague of comment spam for blog owners now because ‘hey, nofollow links might work’.
Google says this shouldn’t spur spammers because most links will still be ignored just as before, whether they use the nofollow, ugc or sponsored attributes. Further, given that one of the stated reasons Google made the change to consider nofollow a “hint” is to be able to better understand link schemes, this spam tactic could be more risky than before.
This change should not have you overhauling your nofollow strategy. If you publish sponsored content or host forums or comments on your site, consider implementing the new attributes when you are able to make a code change. If you can’t or just don’t want to, there’s no harm in that either.
“On the surface, this only benefits Google,” Chris Silver Smith, president of Argent Media, commented via Facebook. “But, if you read between the lines, ‘hints’ mean a passing of PageRank or equivalent values. They’re already using Nofollowed links in some cases. They just want it easier to choose between links to use now in more cases.”