In 2012, I had a large client that was ranked in the top position for most of the terms that mattered to them.
And then, overnight, their ranking and traffic dropped like a rock.
Have you ever experienced this? If so, you know exactly how it feels.
The initial shock is followed by a nervous feeling in the pit of your stomach as you contemplate how it will affect your business, or how you’ll explain it to your client.
After that is the frenzied analysis to figure out what caused the drop and hopefully fix it before it causes too much damage.
In this case, we didn’t need Sherlock Holmes to figure out what happened. My client had insisted that we do whatever it took to get them to rank quickly. The timeframe they wanted was not months, but weeks, in a fairly competitive niche.
If you’ve been involved in SEO for a while, you probably already know what’s about to go down here.
The only way to rank in the timeline they wanted was to use tactics that violated Google’s Webmaster Guidelines. While this was risky, it often worked, at least until Google released its Penguin update in April 2012.
Despite my clearly explaining the risks, my client demanded that we do whatever it took. So that’s what we did.
When their ranking predictably tanked, we weren’t surprised. Neither was my client. And they weren’t upset because we explained the rules and the consequences of breaking those rules.
They weren’t happy that they had to start over, but the profits they reaped leading up to that more than made up for it. It was a classic risk vs. reward scenario.
Unfortunately, most problems aren’t this easy to diagnose. Especially when you haven’t knowingly violated any of Google’s ever-growing list of rules.
In this article, you’ll learn some ways you can diagnose and analyze a drop in ranking and traffic.
1. Check Google Search Console
This is the first place you should look. Google Search Console provides a wealth of information on a wide variety of issues.
Google Search Console will send email notifications for many serious issues, including manual actions, crawl errors, and schema problems, to name just a few. And you can analyze a staggering amount of data to identify other less obvious but just as severe issues.
Overview is a good section to start with because it lets you see the big picture, giving you some direction to get more granular.
If you want to analyze specific pages, URL Inspection is a great tool because it allows you to look at any page through the beady little eyes of Googlebot.
This can come in especially handy when problems on a page don’t cause an obvious issue on the front end but do cause issues for Google’s bots.
Coverage is great for identifying issues that can affect which pages are included in Google’s index, like server errors or pages that have been submitted but contain the noindex metatag.
Generally, if you receive a manual penalty, you’ll know exactly why. It’s most often the result of a violation of Google’s webmaster guidelines, such as buying links or creating spammy low-quality content.
However, it can sometimes be as simple as an innocent mistake in configuring schema markup. You can check the Manual Actions section for this information. GSC will also send you an email notification of any manual penalties.
The Security section will identify any issues with viruses or malware on your website that Google knows about.
It’s important to point out that just because there is no notice here, it doesn’t mean that there is no issue – it just means that Google isn’t aware of it yet.
2. Check for Noindex & Nofollow Meta Tags & Robots.txt Errors
This issue is most common when moving a new website from a development environment to a live environment, but it can happen anytime.
It’s not uncommon for someone to click the wrong setting in WordPress or a plugin, causing one or more pages to be deindexed.
Review the Search Engine Visibility setting in WordPress at the bottom of the Reading section. You’ll need to make sure it’s left unchecked.
Review the index setting for any SEO-related plugins you have installed. (Yoast, in this example.) This is found in the installation wizard.
Review page-level settings related to indexation. (Yoast again, in this example.) This is typically found below the editing area of each page.
You can prioritize the pages to review by starting with the ones that have lost ranking and traffic, but it’s important to review all pages to help ensure the problem doesn’t become worse.
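If you have many pages to check, a short script can speed up the review. Here’s a minimal sketch (the function name and sample markup are illustrative, not from any particular tool) that flags a page as deindexed if either its meta robots tag or its X-Robots-Tag HTTP response header contains a noindex directive:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def is_noindexed(html, response_headers=None):
    """Return True if the page blocks indexing via a meta robots tag
    or an X-Robots-Tag HTTP header."""
    parser = RobotsMetaParser()
    parser.feed(html)
    if any("noindex" in d for d in parser.directives):
        return True
    header = (response_headers or {}).get("X-Robots-Tag", "")
    return "noindex" in header.lower()

# Example: a page that accidentally shipped with noindex
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

You’d feed it the HTML and response headers for each URL you crawl. Note that it only inspects the raw HTML, so it won’t catch directives injected by JavaScript after the page loads.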
It’s equally important to also check your robots.txt file to make sure it hasn’t been edited in a way that blocks search engine bots. A properly configured robots.txt file might look like this:
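```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```

This is an illustrative WordPress-style example with placeholder URLs, not a universal template. The critical thing to check is that no one has added a line like `Disallow: /` under `User-agent: *`, which tells all compliant crawlers to stay away from the entire site.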
3. Determine Whether Your Website Has Been Hacked
When most people think of hacking, they likely imagine nefarious characters in search of juicy data they can use for identity theft.
As a result, you might think you’re safe from hacking attempts because you don’t store that type of data. Unfortunately, that isn’t the case.
Hackers are opportunists playing a numbers game, so your website is simply another vector from which they can exploit other vulnerable people – your visitors.
By hacking your website, they may be able to spread their malware and/or viruses to even more computers, where they might find the type of data they’re looking for.
But the impact of hacking doesn’t end there.
We all know the importance of inbound links from other websites, and rather than doing the hard work to earn those links, some people will hack into other websites and embed their links there.
Typically, they will take additional measures to hide these links by placing them in old posts or even by using CSS to disguise them.
Even worse, a hacker may single your website out to be destroyed by deleting your content, filling it with shady outbound links and garbage content, or even injecting viruses and malware.
This can cause search engines to completely remove a website from their index.
Taking appropriate steps to secure your website is the first and most powerful action you can take.
Most hackers are looking for easy targets, so if you force them to work harder, they will usually just move on to the next target. You should also ensure that you have automated systems in place to screen for viruses and malware.
Most hosting companies offer this, and it is often included at no charge with professional-grade web hosting. Even then, it’s important to scan your website from time to time to review any outbound links.
Screaming Frog makes it simple to do this, outputting the results as a CSV file that you can quickly browse to identify anything that looks out of place.
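As an illustrative sketch (the column names Source and Destination are assumptions; match them to whatever headers your crawl export actually uses), a few lines of Python can flag outbound links pointing anywhere you don’t recognize:

```python
import csv
import io
from urllib.parse import urlparse

# Domains you expect to link out to; anything else gets flagged.
EXPECTED_DOMAINS = {"example.com", "twitter.com"}

def flag_unexpected_links(csv_text, source_col="Source", dest_col="Destination"):
    """Return (source, destination) pairs whose destination domain
    is not in EXPECTED_DOMAINS. Column names are assumptions; adjust
    them to your crawler's actual export headers."""
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        domain = urlparse(row[dest_col]).netloc.lower().removeprefix("www.")
        if domain and domain not in EXPECTED_DOMAINS:
            flagged.append((row[source_col], row[dest_col]))
    return flagged

sample = """Source,Destination
https://mysite.com/post,https://example.com/ref
https://mysite.com/old-post,http://spammy-pills.biz/buy
"""
print(flag_unexpected_links(sample))
# [('https://mysite.com/old-post', 'http://spammy-pills.biz/buy')]
```

A spammy pharma domain buried in an old post, as in the sample above, is exactly the kind of thing this surfaces that a quick visual skim can miss.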
If your ranking drop was related to being hacked, it should be obvious because even if you don’t identify it yourself, you will typically receive an email notification from Google Search Console.
The first step is to immediately secure your website and clean up the damage. Once you are completely sure that everything has been resolved, you can submit a reconsideration request through Google Search Console.
4. Analyze Inbound Links
This factor is pretty straightforward. If inbound links caused a drop in your ranking and traffic, it will generally come down to one of three issues.
A manual action caused by link building tactics that violate Google’s Webmaster Guidelines.
A devaluation or loss of links.
Or an increase in links to one or more competitors’ websites.
A manual action will result in a notification from Google Search Console. If this is your problem, it’s a simple matter of removing or disavowing the links and then submitting a reconsideration request.
In most cases, doing so won’t immediately improve your ranking because the links had artificially boosted your ranking before your website was penalized.
You will still need to build new, quality links that meet Google’s Webmaster Guidelines before you can expect to see any improvement.
A devaluation simply means that Google now assigns less value to those particular links. This could be a broad algorithmic devaluation, as we see with footer links, or they could be devalued because of the actions of the website owners.
For example, a website known to buy and/or sell links could be penalized, making the links from that site less valuable, or even worthless.
An increase in links to one or more competitors’ websites makes them look more authoritative than your website in the eyes of Google.
There’s really only one way to solve this, and that is to build more links to your website. The key is to ensure the links you build meet Google’s Webmaster Guidelines; otherwise, you risk eventually being penalized and starting over.
5. Analyze Content
Google’s algorithms are constantly changing. I remember a time when you could churn out a bunch of low-quality, 300-word pages and dominate the search results.
Today, that generally won’t even get you on the first page for moderately competitive topics, where we typically see 1,000+ word pages holding the top positions.
But it goes much deeper than that.
You’ll need to evaluate what the competitors who now outrank you are doing differently with their content.
Word count is only one factor, and on its own, doesn’t mean much. In fact, rather than focusing on word count, you should determine whether your content is comprehensive.
In other words, does it more thoroughly answer all of the common questions someone may have on the topic compared to the content on competitors’ websites?
Is yours well-written, original, and useful?
Don’t answer this based on feelings – use one of the reading difficulty tests so that you’re working from quantifiable data.
Yoast’s SEO plugin scores this automatically as you write and edit right within WordPress. SEMrush offers a really cool plugin that does the same within Google Docs, but there are a number of other free tools available online.
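If you’d rather compute a score yourself, the classic Flesch Reading Ease test is simple arithmetic based on average sentence length and average syllables per word. Here’s a rough sketch with a naive syllable counter (real tools use pronunciation dictionaries, so expect their numbers to differ slightly):

```python
import re

def count_syllables(word):
    """Very rough heuristic: count groups of consecutive vowels,
    with a small adjustment for a silent trailing 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    """Flesch Reading Ease: higher scores mean easier reading.
    Formula: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)"""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(len(words), 1)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

score = flesch_reading_ease("The cat sat on the mat. It was happy.")
print(round(score, 1))  # 108.3
```

Short sentences and short words push the score up; dense, polysyllabic writing pushes it down, which is exactly the signal you want when comparing your page against the competitors outranking you.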
Is it structured for easy reading, with subheadings, lists, and images?
People don’t generally read content online, but instead, they scan it. Breaking it up into manageable chunks makes it easier for visitors to scan, making them more likely to stick around long enough to find the info they’re looking for.
This is something that takes a bit of grunt work to properly analyze. Tools like SEMrush are incredibly powerful and can provide a lot of insight on many of these factors, but there are some factors that still require a human touch.
You need to consider the user intent. Are you making it easier for them to quickly find what they need? That should be your ultimate goal.
On Oct. 31, Google announced the launch of its Site Kit WordPress plugin that “enables you to set up and configure key Google services, get insights on how people find and use your site, learn how to improve, and easily monetize your content.”
This plugin allows you to easily connect the following Google Services in a dashboard format within your WordPress backend:
It brings the convenience of accessing your site’s performance data while logged into the backend of the site. This is great for webmasters, developers and agencies who are often an admin for their own site or a client’s WordPress site. However, it does not offer the robust and dynamic capabilities of a Google Data Studio report or dashboard to sort data so it may not be ideal for a digital marketing manager or CMO.
With that said, it wouldn’t hurt to implement this plugin, as it’s actually a nifty tool that can help you stay on top of your site’s performance metrics. It’s also another way to give Google more access to your site, which can have some indirect organic benefits.
Here is what the Google Site Kit plugin looks like within the WordPress plugin directory.
Installing and setting up Google Site Kit
To utilize the plugin, simply click install and activate as you would any other WordPress plugin. You will then be prompted to complete the setup.
Click on the “Start Setup” button.
You will be prompted to give access to your site’s Google Search Console profile, which means you need to sign in to the Gmail account that has access to your site’s Search Console profile.
Once logged in, you need to grant permissions for Google to access the data in your Search Console profile.
Once you’ve granted all the respective permissions, you will get a completion notification and can then click on “Go to my Dashboard.”
Once you’re in the Dashboard, you will see options to connect other services such as Analytics, AdSense, and PageSpeed Insights. You can now choose to connect these services if you like. If you go to the plugin’s settings, you will see additional connection options for Optimize and Tag Manager.
Here is what the dashboard looks like with Search Console, Analytics, and PageSpeed Insights enabled. You can see a clear breakdown of the respective metrics.
The plugin allows you to dive into each report, with navigation options on the left to drill down into Search Console and Analytics.
There is also an admin bar feature to see individual page stats.
In summary, this is a great plugin by Google but keep in mind it’s just version 1.0. I’m excited to see what features and integrations the later versions will have!
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.
About The Author
Tony Edward is a director of SEO at Tinuiti and an adjunct instructor of search marketing at NYU. Tony has been in the online marketing industry for over 10 years. His background stems from affiliate marketing and he has experience in paid search, social media and video marketing. Tony is also the founder of the Thinking Crypto YouTube channel.
Bing announced new link penalties. These penalties are focused on taking down private blog networks (PBNs), subdomain leasing, and manipulative cross-site linking.
Inorganic Site Structure
An inorganic site structure is a linking pattern that uses internal site-level link signals (with subdomains) or cross-site linking patterns (with external domains) in order to manipulate search engine rankings.
While these spam techniques already existed, Bing introduced the concept of calling them “inorganic site structure” in order to describe them.
Bing noted that sites legitimately create subdomains to keep different parts of the site separate, such as support.example.com. These are treated as belonging to the main domain, passing site-level signals to the subdomains.
Bing also said sites like WordPress create standalone sites under subdomains, in which case no site level signals are passed to the subdomains.
Examples of Inorganic Site Structure
An inorganic site structure is when a company leases a subdomain in order to take advantage of site-level signals to rank better.
Private blog networks were also included as a form of inorganic site structure.
Bing also introduced the idea of domain boundaries. The idea is that there are boundaries to a domain. Sometimes, as in the case of legitimate subdomains (ex. support.example.com), those boundaries extend out to the subdomain. In other cases like WordPress.com subdomains the boundaries do not extend to the subdomains.
Private Blog Networks (PBNs)
Bing called out PBNs as a form of spam that abuses website boundaries.
“While not all link networks misrepresent website boundaries, there are many cases where a single website is artificially split across many different domains, all cross-linking to one another, for the obvious purpose of rank boosting. This is particularly true of PBNs (private blog networks).”
Subdomain Leasing Penalties
Bing explained why they consider subdomain leasing a spammy activity:
“…we heard concerns from the SEO community around the growing practice of hosting third-party content or letting a third party operate a designated subdomain or subfolder, generally in exchange for compensation.
…the practice equates to buying ranking signals, which is not much different from buying links.”
At the time of this article, I still see a news site subdomain ranking in Bing (and Google). This page belongs to another company. All the links are redirected affiliate type links with parameters meant for tracking the referrals.
According to Archive.org the subdomain page was credited to an anonymous news staffer. Sometime in the summer the author was switched to someone with a name who is labeled as an expert, although the content is still the same.
So if Bing is already handing out penalties, that means Bing (and Google, which also ranks this page) still have some catching up to do.
Bing mentioned sites that are essentially one site broken up into multiple interlinking sites. Curiously, Bing said that these kinds of sites are already in violation of other link spam rules, but that additional penalties will apply.
Here’s the kind of link structure that Bing used as an example:
All these sites interlink with one another and have related content, and according to Bing, they are essentially the same site. This kind of linking practice goes back many years. Such sites are traditionally known as interlinked websites, and they are generally topically related to each other.
Bing used the above example to illustrate interlinked sites that are really just one site.
That link structure resembles the structure of interlinked websites that belong to the same company. If you’re planning a new web venture, it’s generally a better idea to create one comprehensive site than to create a multitude of sites, each focused on just a small part of the niche.
Curiously, in reference to the above illustration, Bing said that kind of link structure was already in violation of link guidelines and that more penalties would be piled on top of those:
“Fig. 3 – All these domains are effectively the same website. This kind of behavior is already in violation of our link policy.
Going forward, it will also be in violation of our “inorganic site structure” policy and may receive additional penalties.”
It’s good news to hear Bing is improving. Competition between search engines encourages innovation, and as Bing improves, search traffic may become more diversified as more people switch to Bing and other engines like DuckDuckGo.
Read Bing’s announcement: Some Thoughts on Website Boundaries
Google has released version 1.0 of its Site Kit plugin for WordPress, which means it’s officially out of beta after six months.
In the time since the developer preview of Site Kit was released, Google says it drastically simplified the setup, fixed bugs, and polished the main user flows.
Site Kit allows WordPress users to access data from Google products right from their site’s dashboard. The plugin aggregates data from Google Search Console, Google Analytics, PageSpeed Insights, and AdSense.
With Site Kit there’s no additional code editing required, which makes it easy to set up products like Google Analytics for those without any developer experience.
Anyone can install Site Kit, but Google emphasizes that it’s especially useful for professionals who work on sites for clients. The reasons why include:
Clients and other teams can easily access data from Google products by logging into the WordPress dashboard.
Clients will see performance stats and improvement recommendations directly from Google.
Site Kit allows you to set roles and permissions and make sure only relevant people can see the data.
To get the most out of Site Kit, Google recommends reviewing the main dashboard on at least a weekly basis. You can also check the stats of individual pages by navigating to the page and clicking on Site Kit in the admin bar.
With this data, Google recommends comparing the top performing pages and seeing how people found them. This can help you discover trends, such as which topics get the most engagement on Twitter, which get the most engagement on Facebook, and so on.
To get started with Site Kit, simply install it from your WordPress dashboard.