How to Analyze the Cause of a Ranking Crash



In 2012, I had a large client that was ranked in the top position for most of the terms that mattered to them.

And then, overnight, their ranking and traffic dropped like a rock.

Have you ever experienced this? If so, you know exactly how it feels.

The initial shock is followed by a nervous feeling in the pit of your stomach as you contemplate how it will affect your business or how you’ll explain it to your client.

After that is the frenzied analysis to figure out what caused the drop and hopefully fix it before it causes too much damage.

In this case, we didn’t need Sherlock Holmes to figure out what happened. My client had insisted that we do whatever it took to get them to rank quickly. The timeframe they wanted was not months, but weeks, in a fairly competitive niche.

If you’ve been involved in SEO for a while, you probably already know what’s about to go down here.

The only way to rank in the timeline they wanted was to use tactics that violated Google’s Webmaster Guidelines. While this was risky, it often worked, at least until Google released the Penguin update in April 2012.

Despite my clearly explaining the risks, my client demanded that we do whatever it took. So that’s what we did.

When their ranking predictably tanked, we weren’t surprised. Neither was my client. And they weren’t upset because we explained the rules and the consequences of breaking those rules.

They weren’t happy that they had to start over, but the profits they reaped leading up to that more than made up for it. It was a classic risk vs. reward scenario.

Unfortunately, most problems aren’t this easy to diagnose, especially when you haven’t knowingly violated any of Google’s ever-growing list of rules.

In this article, you’ll learn some ways you can diagnose and analyze a drop in ranking and traffic.

1. Check Google Search Console

This is the first place you should look. Google Search Console provides a wealth of information on a wide variety of issues.

Google Search Console will send email notifications for many serious issues, including manual actions, crawl errors, and schema problems, to name just a few. And you can analyze a staggering amount of data to identify other less obvious but just as severe issues.

Overview is a good section to start with because it lets you see the big picture, giving you some direction to get more granular.

Overview

If you want to analyze specific pages, URL Inspection is a great tool because it allows you to look at any page through the beady little eyes of the Google bot.

This can come in especially handy when problems on a page don’t cause an obvious issue on the front end but do cause issues for Google’s bots.

URL Inspection

Coverage is great for identifying issues that can affect which pages are included in Google’s index, like server errors or pages that have been submitted but contain the noindex metatag.

Coverage

Generally, if you receive a manual penalty, you’ll know exactly why. It’s most often the result of a violation of Google’s Webmaster Guidelines, such as buying links or creating spammy, low-quality content.

However, it can sometimes be as simple as an innocent mistake in configuring schema markup. You can check the Manual Actions section for this information. GSC will also send you an email notification of any manual penalties.

Manual Actions

The Security section will identify any issues with viruses or malware on your website that Google knows about.

It’s important to point out that just because there is no notice here, it doesn’t mean that there is no issue – it just means that Google isn’t aware of it yet.

Security

2. Check for Noindex & Nofollow Meta Tags & Robots.txt Errors

This issue is most common when moving a new website from a development environment to a live environment, but it can happen anytime.

It’s not uncommon for someone to click the wrong setting in WordPress or a plugin, causing one or more pages to be deindexed.

Review the Search Engine Visibility setting in WordPress at the bottom of the Reading section. You’ll need to make sure it’s left unchecked.

WordPress search setting

Review the index setting for any SEO-related plugins you have installed. (Yoast, in this example.) This is found in the installation wizard.

Yoast installation wizard

Review page-level settings related to indexation. (Yoast again, in this example.) This is typically found below the editing area of each page.

You can prioritize the pages to review by starting with the ones that have lost ranking and traffic, but it’s important to review all pages to help ensure the problem doesn’t become worse.

Yoast page settings

It’s equally important to check your robots.txt file to make sure it hasn’t been edited in a way that blocks search engine bots. A properly configured robots.txt file might look like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /wp-admin/

On the other hand, an improperly configured robots.txt file might look like this:

User-agent: *
Disallow: /

Google offers a handy tool in Google Search Console to check your robots.txt file for errors.
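If you would rather check pages in bulk, a short script can flag the obvious index blockers. Here is a minimal sketch in Python (it assumes the requests package is installed, and the URL is a placeholder to replace with your own):

# Minimal sketch: check whether a URL is blocked by robots.txt,
# an X-Robots-Tag header, or (crudely) a noindex meta tag.
import requests
from urllib.parse import urljoin, urlparse
from urllib.robotparser import RobotFileParser

def check_indexability(url, user_agent="Googlebot"):
    root = "{0.scheme}://{0.netloc}".format(urlparse(url))

    # 1. Is the URL disallowed by robots.txt?
    robots = RobotFileParser(urljoin(root, "/robots.txt"))
    robots.read()
    if not robots.can_fetch(user_agent, url):
        return "Blocked by robots.txt"

    # 2. Does the page carry a noindex directive?
    response = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        return "Blocked by X-Robots-Tag header"
    if "noindex" in response.text.lower():
        return "Possible noindex meta tag - inspect the page source"
    return "No obvious index blocks found"

print(check_indexability("https://www.example.com/"))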

3. Determine If Your Website Has Been Hacked

When most people think of hacking, they likely imagine nefarious characters in search of juicy data they can use for identity theft.

As a result, you might think you’re safe from hacking attempts because you don’t store that type of data. Unfortunately, that isn’t the case.

Hackers are opportunists playing a numbers game, so your website is simply another vector from which they can exploit other vulnerable people – your visitors.

By hacking your website, they may be able to spread their malware and viruses to even more computers, where they might find the type of data they’re looking for.

But the impact of hacking doesn’t end there.

We all know the importance of inbound links from other websites, and rather than doing the hard work to earn those links, some people will hack into other websites and embed their links there.

Typically, they will take additional measures to hide these links by placing them in old posts or even by using CSS to disguise them.

Worse still, a hacker may single your website out for destruction, deleting your content or filling it with shady outbound links, garbage content, and even viruses and malware.

This can cause search engines to completely remove a website from their index.

Taking appropriate steps to secure your website is the first and most powerful action you can take.

Most hackers are looking for easy targets, so if you force them to work harder, they will usually just move on to the next target. You should also ensure that you have automated systems in place to screen for viruses and malware.

Most hosting companies offer this, and it is often included at no charge with professional-grade web hosting. Even then, it’s important to scan your website from time to time to review any outbound links.

Screaming Frog makes it simple to do this, outputting the results as a CSV file that you can quickly browse to identify anything that looks out of place.
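You can also script a quick first pass of your own. The sketch below (assuming the requests and beautifulsoup4 packages, with a placeholder URL) flags outbound links hidden with inline CSS; it won’t catch links hidden via external stylesheets, so treat it as a complement to a full crawl rather than a replacement:

# Sketch: flag outbound links that may be hidden with inline CSS.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def find_suspicious_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site = urlparse(page_url).netloc

    for link in soup.find_all("a", href=True):
        href = link["href"]
        if urlparse(href).netloc in ("", site):
            continue  # internal link, skip
        style = (link.get("style") or "").replace(" ", "").lower()
        if any(rule in style for rule in ("display:none", "visibility:hidden", "font-size:0")):
            print("Hidden outbound link?", href)

find_suspicious_links("https://www.example.com/some-old-post/")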

If your ranking drop was related to being hacked, it should be obvious because even if you don’t identify it yourself, you will typically receive an email notification from Google Search Console.

The first step is to immediately secure your website and clean up the damage. Once you are completely sure that everything has been resolved, you can submit a reconsideration request through Google Search Console.

4. Analyze Inbound Links

This factor is pretty straightforward. If inbound links caused a drop in your ranking and traffic, it will generally come down to one of three issues.

It’s either:

  • A manual action caused by link building tactics that violate Google’s Webmaster Guidelines.
  • A devaluation or loss of links.
  • An increase in links to one or more competitors’ websites.

A manual action will result in a notification from Google Search Console. If this is your problem, it’s a simple matter of removing or disavowing the links and then submitting a reconsideration request.

In most cases, doing so won’t immediately improve your ranking because the links had artificially boosted your ranking before your website was penalized.

You will still need to build new, quality links that meet Google’s Webmaster Guidelines before you can expect to see any improvement.

A devaluation simply means that Google now assigns less value to those particular links. This could be a broad algorithmic devaluation, as we see with footer links, or they could be devalued because of the actions of the website owners.

For example, a website known to buy and/or sell links could be penalized, making the links from that site less valuable, or even worthless.

An increase in links to one or more competitors’ websites makes them look more authoritative than your website in the eyes of Google.

There’s really only one way to solve this, and that is to build more links to your website. The key is to ensure the links you build meet Google’s Webmaster Guidelines; otherwise, you risk eventually being penalized and starting over.

5. Analyze Content

Google’s algorithms are constantly changing. I remember a time when you could churn out a bunch of low-quality, 300-word pages and dominate the search results.

Today, that generally won’t even get you on the first page for moderately competitive topics, where we typically see 1,000+ word pages holding the top positions.

But it goes much deeper than that.

You’ll need to evaluate what the competitors who now outrank you are doing differently with their content.

Word count is only one factor, and on its own, doesn’t mean much. In fact, rather than focusing on word count, you should determine whether your content is comprehensive.

In other words, does it more thoroughly answer all of the common questions someone may have on the topic compared to the content on competitors’ websites?

Is yours well-written, original, and useful?

Don’t answer this based on feelings – use one of the reading difficulty tests so that you’re working from quantifiable data.

Yoast’s SEO plugin scores this automatically as you write and edit right within WordPress. SEMrush offers a really cool plugin that does the same within Google Docs, but there are a number of other free tools available online.

Is it structured for easy reading, with subheadings, lists, and images?

People don’t generally read content online, but instead, they scan it. Breaking it up into manageable chunks makes it easier for visitors to scan, making them more likely to stick around long enough to find the info they’re looking for.

This is something that takes a bit of grunt work to properly analyze. Tools like SEMrush are incredibly powerful and can provide a lot of insight on many of these factors, but there are some factors that still require a human touch.

You need to consider the user intent. Are you making it easier for them to quickly find what they need? That should be your ultimate goal.



Image Credits

All screenshots taken by author, March 2019






Why & How to Tackle Technical SEO Before Link Building



When you plan a link building campaign, you may not fully reap the benefits of your SEO efforts if you ignore technical SEO.

The best results happen when you consider every aspect of your website’s SEO.

In fact, there are situations when you must tackle technical SEO before ever thinking about getting links.

If your website is weak in technical SEO areas, or extremely confusing for search engines, it won’t perform as well regardless of the quality and quantity of backlinks you have.

Your top goals with technical SEO are to make sure that your site:

  • Is easily crawled by search engines.
  • Has top cross-platform compatibility.
  • Loads quickly on both desktop and mobile.
  • Uses WordPress plugins efficiently.
  • Has no misconfigured Google Analytics code.

These five points illustrate why it’s important to tackle technical SEO before link building.

If your site is unable to be crawled or is otherwise deficient in technical SEO best practices, you may suffer from poor site performance.

The following sections discuss why and how you should tackle technical SEO before starting a link building campaign.

Make Sure Your Site is Easily Crawled by Search Engines

Your HTTPS Secure Implementation

If you have recently made the jump to an HTTPS secure implementation, you may not have had the chance to audit or otherwise identify issues with your secure certificate installation.

A surface-level audit at the outset can help you identify any major issues affecting your transition to HTTPS.

Major issues can arise when the purchase of the SSL certificate did not take into account what the site would be doing later.

One thing to keep in mind is that you must take great care in purchasing your certificate and making sure it covers all the subdomains you want it to.

If you don’t, you may end up with some issues as a result, such as not being able to redirect URLs.

If you don’t get a full wildcard certificate, and you have URL parameters on a subdomain – using absolute URLs – that your certificate doesn’t cover, you won’t be able to redirect those URLs to https://.

This is why it pays to be mindful of the options you choose during the purchase of your SSL certificate because it can negatively affect your site later.
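One quick way to verify what a certificate actually covers is to read its subject alternative names directly. Here is a minimal sketch using only Python’s standard library (the hostname is a placeholder):

# Sketch: print the hostnames an SSL certificate covers, so you can
# confirm the subdomains you care about are included.
import socket
import ssl

def cert_hostnames(host, port=443):
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # subjectAltName is a tuple of ("DNS", "hostname") pairs
    return [value for key, value in cert.get("subjectAltName", ()) if key == "DNS"]

print(cert_hostnames("www.example.com"))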

No Errant Redirects or Too Many Redirects Bogging Down Site Performance

It’s easy to create an HTTPS secure implementation with errant redirects.

For this reason, an eagle-eye view of the site’s current redirects will be helpful in correcting this issue.

It can also be easy to create conflicting redirects if you don’t keep watch on the redirects you are creating.

In addition, it’s easy to let redirects run out of control, leading to tens of redirects per URL, which in turn bogs down site performance.

The easiest way to fix this issue moving forward: make sure that your redirects are all created in a 1:1 ratio.

You should not have 10-15 or more redirect URLs per URL on your site.

If you do, something is seriously wrong.

Example of correct redirects
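To spot-check this yourself, you can follow a URL’s redirect history programmatically. Here is a small sketch using Python’s requests package (the URL is a placeholder); anything much longer than one hop deserves a closer look:

# Sketch: print every hop in a URL's redirect chain.
import requests

def redirect_chain(url):
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds each intermediate redirect, in order
    return [r.url for r in response.history] + [response.url]

for hop in redirect_chain("http://example.com/old-page/"):
    print(hop)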

Content on HTTPS & HTTP URLs Should Not Load at the Same Time

The correct implementation is for one version to redirect to the other, not for both to load.

If you have both of them loading at the same time, something is wrong with the secure version of your site.

Type your site’s URLs into your browser and test https:// and http:// separately.

If both URLs load, you are displaying two versions of your content, and duplicate URLs can lead to duplicate content issues.
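A quick way to test this at scale is to request the http:// version with redirects disabled and see what comes back. A rough sketch, again using the requests package with a placeholder hostname:

# Sketch: confirm that http:// redirects instead of serving content.
import requests

def check_protocol_duplication(host, path="/"):
    response = requests.get("http://" + host + path,
                            allow_redirects=False, timeout=10)
    if response.status_code in (301, 302, 307, 308):
        print("OK: http redirects to", response.headers.get("Location"))
    elif response.status_code == 200:
        print("Problem: http serves content instead of redirecting")
    else:
        print("Unexpected status:", response.status_code)

check_protocol_duplication("www.example.com")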

To make sure that you do not run into this issue again, you will want to do one of the following, depending on your site’s platform:

  • Create a full redirect pattern in HTACCESS (on Apache / CPanel servers)
  • Use a redirect plugin in WordPress to force the redirects from http://

The goal is to display exactly one version of your content to users and search engines.

How to Create Redirects in .htaccess on Apache / cPanel Servers

You can perform global redirects at the server level in .htaccess on Apache / CPanel servers.

InMotion Hosting has a great tutorial on how to force this redirect on your own web host. For our purposes, we’ll focus on the following examples.

To force all web traffic to use HTTPS, use the following code.

Make sure to add this code above any existing code that has a similar prefix (RewriteEngine On, RewriteCond, etc.).

RewriteEngine On
RewriteCond %{HTTPS} !on
RewriteCond %{REQUEST_URI} !^/[0-9]+\..+\.cpaneldcv$
RewriteCond %{REQUEST_URI} !^/\.well-known/pki-validation/[A-F0-9]{32}\.txt(?:\ Comodo\ DCV)?$
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

If you want to redirect only a specified domain, you will want to use the following lines of code in your htaccess file:

RewriteEngine On
RewriteCond %{REQUEST_URI} !^/[0-9]+\..+\.cpaneldcv$
RewriteCond %{REQUEST_URI} !^/\.well-known/pki-validation/[A-F0-9]{32}\.txt(?:\ Comodo\ DCV)?$
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

Don’t forget to change the URLs in the above examples to match your own domain name.

There are other solutions in that tutorial which may work for your site.

WARNING: if you do not have confidence in your abilities to make the correct changes at the server level on your server, please make sure to have your server company/IT person perform these fixes for you.

You can screw up something major with these types of redirects if you do not know exactly what you are doing.

Secure Site redirect plugin

Use a Plugin If You Are Operating a WordPress Site

The easiest way to fix these redirect issues, especially if you operate a WordPress site, is to just use a plugin.

There are many plugins that can force http:// to https:// redirects and make this process as painless as possible.

A caution about plugins: don’t just add another one if you’re already using too many.

You may want to investigate if your server can use similar redirect rules mentioned above (such as if you are using an NGINX-based server).

It must be stated here: plugin weight can affect site speed negatively, so don’t always assume that the latest plugin will help you.

All Links On-site Should Be Changed From HTTP:// To HTTPS://

Even if you perform the redirects above, you should perform this step.

This is especially true if you are using absolute URLs, as opposed to relative URLs, where the former always displays the hypertext transfer protocol that you’re using.

If you are using relative URLs, this is less important, and you probably don’t need to pay much attention to it.

Why do you need to change links on-site when you are using absolute URLs?

Because Google can and will crawl all of those links and this can result in duplicate content issues.

It seems like a waste of time, but it’s really not. You are making sure the end result is that Google sees exactly the site you want them to see.

One version.

One set of URLs.

One set of content.

No confusion.

Examples of links that should be changed from http:// to https://

No 404s From HTTP:// To HTTPS:// Transitions

A sudden spike in 404s can make your site almost impossible to crawl, especially if broken links between http:// and https:// pages exist.

Difficulty crawling a site is one of the most common issues that can result from a spike in 404s.

A spike in 404s also wastes crawl budget, with Google spending time on dead URLs instead of finding the pages it should.

Why this impacts site performance, and why it matters:

While John Mueller of Google mentions that crawl budget doesn’t matter except for extremely large sites:

“Google’s John Mueller said on Twitter that he believes that crawl budget optimization is overrated in his mind. He said for most sites, it doesn’t make a difference and that it only can help really massive sites.

John wrote “IMO crawl-budget is over-rated.” “Most sites never need to worry about this. It’s an interesting topic, and if you’re crawling the web or running a multi-billion-URL site, it’s important, but for the average site owner less so,” he added.”

A great article by Yauhen Khutarniuk, Head of SEO at SEO PowerSuite, puts this perfectly:

“Quite logically, you should be concerned with crawl budget because you want Google to discover as many of your site’s important pages as possible. You also want it to find new content on your site quickly. The bigger your crawl budget (and the smarter your management of it), the faster this will happen.”

It’s important to optimize for crawl budget so that Google finds new content on your site quickly and discovers as many of your site’s high-priority pages as possible.

How to Fix Any 404s You May Have

Primarily, you want to redirect any 404s from the old URL to the new, existing URL.

Check out Benj Arriola’s Search Engine Journal article for more information on 404s vs. soft 404s, and how to fix them.

One of the easier ways, especially if you have a WordPress site, would be to crawl the site with Screaming Frog and perform a bulk upload of your 301 redirect rules using the Redirection WordPress plugin.

Otherwise, you may have to create redirect rules in .htaccess.
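If you take the bulk-upload route, the glue step is simply turning your crawl export into a source/target CSV. Here is a rough sketch of that step in Python; the file names and the "Address" column header are assumptions, so match them to your own export:

# Sketch: convert a crawl export of 404 URLs into a source,target CSV
# that a redirect tool can bulk-import.
import csv

def build_redirect_csv(crawl_export="404s.csv", out_file="redirects.csv"):
    with open(crawl_export, newline="") as src, open(out_file, "w", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.DictReader(src):
            old_url = row["Address"]
            # Decide the best destination for each dead URL here; pointing
            # everything at the homepage just creates soft 404s.
            new_url = old_url.replace("http://", "https://", 1)
            writer.writerow([old_url, new_url])

build_redirect_csv()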

Your URL Structure Should Not Be Overly Complex

The structure of your URLs is an important consideration when getting your site ready for technical SEO.

You must pay attention to things like randomly generated dynamic parameters being indexed, URLs that are not easy to understand, and other factors that can cause issues with your technical SEO implementation.

These are all important factors because they can lead to indexation issues that will hurt your site’s performance.

More Human-Readable URLs

When you create URLs, you likely think about where the content is going, and then you create URLs automatically.

This can hurt you, however.

The reason is that automatically generated URLs can follow a few different formats.

For example:

  • /content/date/time/keyword
  • /content/date/time/string-of-numbers
  • /content/category/date/time/
  • /content/category/date/time/parameters/

None of these formats are very human-readable, are they?

This matters because properly communicating the content behind the URL is a large part of serving user intent.

It’s even more important today for accessibility reasons.

The more readable your URLs are, the better:

  • Search engines can use them to determine how people are engaging with those URLs.
  • If someone sees your URL in the search results, they may be more apt to click on it because they can see how well it matches what they’re searching for. In short: match that user search intent, and you’ve got another customer.

This is why considering this part of URL structure is so important when you are auditing a site.

Many existing sites may be using outdated or confusing URL structures, leading to poor user engagement.

Identifying which URLs can be more human readable can create better user engagement across your site.

Duplicate URLs

One important technical SEO consideration that should be ironed out before any link building is duplicate content.

When it comes to duplicate content issues, these are the main causes:

  • Content that is significantly duplicated across sections of the website.
  • Scraped content from other websites.
  • Duplicate URLs where only one piece of content exists.

This hurts because it confuses search engines when more than one URL represents the same piece of content.

Search engines will rarely show the same piece of content twice, and not paying attention to duplicate URLs dilutes their ability to decide which version to index and serve.

Avoid Using Dynamic Parameters

While dynamic parameters are not, in and of themselves, a problem from an SEO perspective, they can become a significant problem if you cannot manage their creation and be consistent in their use.

Jes Scholz has an amazing article on Search Engine Journal covering the basics of dynamic parameters and URL handling and how it can affect SEO. If you are not familiar with dynamic parameters, I suggest reading her article ASAP before proceeding with the rest of this section.

Scholz explains that parameters are used for the following purposes:

  • Tracking
  • Reordering
  • Filtering
  • Identifying
  • Pagination
  • Searching
  • Translating

When you get to the point that your URL’s dynamic parameters are causing a problem, it usually comes down to basic mismanagement of the creation of these URLs.

In the case of tracking, the problem comes from using many different dynamic parameters when creating links that search engines crawl.

In the case of reordering, it comes from using those parameters to reorder lists and groups of items, which creates indexable duplicate pages that search engines then crawl.

You can inadvertently trigger excessive duplicate content issues if you don’t keep your dynamic parameters to a manageable level.

You should never need 50 URLs with UTM parameters to track the results of certain types of campaigns.

These dynamic URLs for a single piece of content can really add up over time if you aren’t carefully managing their creation, and they will dilute the quality of your content along with its ability to perform in search engine results.

It leads to keyword cannibalization and on a large enough scale can severely impact your ability to compete.
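The defensive habit is to normalize URLs before duplicates spread. As an illustration, here is a small Python sketch that strips common tracking parameters from a URL, the same cleanup a canonicalization rule should be doing for you:

# Sketch: normalize a URL by removing common tracking parameters.
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content"}

def canonicalize(url):
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("https://example.com/page?utm_source=mail&color=red"))
# -> https://example.com/page?color=red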

Shorter URLs Are Better Than Longer URLs

A long-held SEO best practice is that shorter URLs are better than longer URLs.

Google’s John Mueller has discussed this:

“What definitely plays a role here is when we have two URLs that have the same content, and we try to pick one to show in the search results, we will pick the short one. So that is specifically around canonicalization.

It doesn’t mean it is a ranking factor, but it means if we have two URLs and one is really short and sweet and this other one has this long parameter attached to it and we know they show exactly the same content we will try to pick the shorter one.

There are lots of exceptions there, different factors that come into play, but everything else being equal – you have a shorter one and a longer one, we will try to pick the shorter one.”

There is also empirical evidence showing that Google tends to rank shorter URLs for more terms, rather than long and highly specific ones.

If your site contains super long URLs everywhere, you may want to optimize them into shorter URLs that more closely reflect the article’s topic and user intent.

Examples of overly complex URLs

Make Sure Your Site Has Top Cross-Platform Compatibility & Fast Page Speed

Site glitches and other problems can arise when your site is not coded correctly.

These glitches can result from badly nested DIV tags that break the layout, bad syntax that makes call-to-action elements disappear, and careless implementation of on-page elements caused by poor site management.

Cross-platform compatibility can be affected along with page speed, resulting in greatly reduced performance and user engagement, long before link building ever becomes a consideration.

Nip some of these issues in the bud before they become major problems later.

Many of these technical SEO issues come down to poor site management and poor coding.

The more that you tackle these technical SEO issues at the beginning with more consistent development and website management best practices, the better off you’ll be later when your link building campaign takes off.

Poorly Coded Site Design

When you have a poorly coded site design, your user experience and engagement will suffer.

This is yet another element of technical SEO that can be easily overlooked.

A poorly coded site design can manifest in several ways, including:

  • Poor page speed.
  • Glitches in the design appearing on different platforms.
  • Forms not working where they should (impacting conversions).
  • Other calls to action not working on mobile devices (and desktop).
  • Tracking code that isn’t being accurately monitored (leading to poor choices in your SEO decision-making).

Any of these issues can spell disaster when your site can’t properly report data, capture leads, or engage with users to its fullest potential.

This is why these things should always be considered and tackled on-site before moving to link building.

If you don’t, you may wind up with weaknesses in your marketing campaigns that will be even harder to pin down, or worse – you may never find them.

All of these elements of a site design must be addressed and otherwise examined to make sure that they are not causing any major issues with your SEO.

Pages Are Slow to Load

In July 2018, Google rolled out page speed as a ranking factor in its mobile algorithm for all users.

Slow loading pages can affect everything, so it’s something you should pay attention to on an ongoing basis, not just for rankings but for all of your users as well.

What should you be on the lookout for when it comes to issues that impact page speed?

Slow Loading Images

If your site has many images approaching 1 MB (1 megabyte) in file size, you have a problem.

With average internet connection speeds now exceeding 27.22 Mbps download on mobile and 59.60 Mbps on fixed broadband, large images are less of an issue than they once were, but they can still be a problem.

You will still face slower loading pages when you have such large images on your site. If you use a tool like GTMetrix, you can see how fast your site loads these images.

Typical page speed analysis best practices say that you should take three snapshots of your site’s page speed.

Average out the three snapshots, and that’s your site’s average page speed.

For most sites, the recommendation is that images be at most 35-50 KB each, depending on resolution and pixel density (including whether you are accommodating the higher pixel densities of iPhones and other devices).

Also, use lossless compression in graphics applications like Adobe Photoshop in order to achieve the best quality possible while resizing images.
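If you have a large backlog of oversized images, batch processing helps. Below is a hedged sketch using the Pillow library; the folder name, the 1200-pixel ceiling, and the quality setting are assumptions to tune for your layout, and note that re-encoding JPEGs this way is lossy, so keep your originals:

# Sketch: batch-resize and recompress oversized JPEGs with Pillow.
from pathlib import Path
from PIL import Image

MAX_WIDTH = 1200  # adjust to the widest size your layout actually uses

for path in Path("images").glob("*.jpg"):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, round(img.height * ratio)))
    # optimize=True re-encodes the file more efficiently at this quality
    img.save(path, quality=80, optimize=True)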

Efficient Coding Best Practices

Some people believe that standard coding best practices say that you should be using W3C valid coding.

Google’s Webmaster Guidelines recommend using valid W3C code on your site.

Use valid HTML

But, John Mueller (and even Matt Cutts) have mentioned in the past that it’s not critical to focus on W3C-valid coding for ranking reasons.

Search Engine Journal’s Roger Montti discusses this conundrum in further detail in 6 Reasons Why Google Says Valid HTML Matters.

But that’s the key phrase there: focusing on it for ranking purposes.

You will find at the top of Google, for different queries, all sorts of websites that subscribe to different coding best practices, and not every site validates via the W3C.

Despite a lack of focus on that type of development best practice for ranking purposes, there are plenty of reasons why using W3C valid coding is a great idea, and why it can put you ahead of your competitors who are not doing it.

Before any further discussion takes place, it needs to be noted from a developer perspective:

  • W3C-standard validated code is not always good code.
  • Bad code is not always invalid code.
  • W3C validation should not be the be-all, end-all evaluation of a piece of coding work.
  • Validation services like the W3C validator should still be used for debugging.
  • Using the W3C validator will help you evaluate your work more easily and avoid major issues as your site becomes larger and more complex.

But in the end, which is better, and why?

Picking a coding standard, being consistent with your coding best practices, and sticking with them is generally better than not.

When you pick a coding standard and stick with it, you introduce less complexity and less of a chance that things can go wrong after the final site launch.

While some see W3C’s code validator as unnecessary, it does bring rhyme and reason to making sure that your code is valid.

For example, if your syntax is invalid in your header, or you don’t self-close tags properly, W3C’s code validator will reveal these mistakes.

If, during development, you transferred over an existing WordPress theme from, say, XHTML 1.0 to HTML5 for server compatibility reasons, you may notice thousands of errors.

It means that you have incompatibility problems with the DOCTYPE in the theme and the language that is actually being used.

This happens frequently when someone copies and pastes old code into a new site implementation without regard to any coding rules whatsoever.

This can be disastrous to cross-platform compatibility.

Also, this simple check can help you reveal exactly what’s working (or not working) under the hood right now code-wise.
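You can even automate the check. The W3C’s Nu HTML Checker exposes a public endpoint that returns its results as JSON; here is a minimal sketch (placeholder URL, and keep your request volume polite):

# Sketch: run a page through the W3C Nu HTML Checker and print errors.
import requests

def validate_html(page_url):
    response = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "html-validation-sketch"},
        timeout=30,
    )
    for message in response.json().get("messages", []):
        if message.get("type") == "error":
            print(message.get("lastLine"), message.get("message"))

validate_html("https://www.example.com/")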

Efficient coding best practices also mean avoiding mistakes like inadvertently putting multiple closing DIV tags where they shouldn’t go, being careless about how you code the layout, and so on.

All of these coding errors can be a huge detriment to the performance of your site, both from a user and search engine perspective.

Common Ways Too Many WordPress Plugins Can Harm Your Site

Using Too Many Plugins

Plugins can become major problems when their use is not kept in check.

Why is this? How can this be – aren’t plugins supposed to help?

In reality, if you don’t manage your plugins properly, you can run into major site performance issues down the line.

Here are some reasons why.

Extra HTTP Requests

Every file that loads on your site generates a request to the server, known as an HTTP request.

Every time someone requests your page, all of your page elements load (images, video, graphics, plugins, everything), and all of these elements require an HTTP request to be transferred.

The more HTTP requests you have, the more these extra plugins will slow down your site.

This can be mostly a matter of milliseconds, and for most websites does not cause a huge issue.

It can, however, be a major bottleneck if your site is a large one, and you have hundreds of plugins.

Keeping your plugin use in check is a great idea to make sure that your plugins are not creating a major bottleneck and slowing down your pages.
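For a rough first pass, you can estimate a page’s request count by counting the assets its HTML references. The sketch below assumes the requests and beautifulsoup4 packages; a browser waterfall (DevTools or GTmetrix) will give you the real number, including what plugins inject:

# Sketch: estimate how many HTTP requests a page triggers.
import requests
from bs4 import BeautifulSoup

def estimate_requests(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    scripts = soup.find_all("script", src=True)
    styles = soup.find_all("link", rel="stylesheet")
    images = soup.find_all("img", src=True)
    total = 1 + len(scripts) + len(styles) + len(images)  # +1 for the HTML itself
    print(url, "~", total, "requests:",
          len(scripts), "scripts,", len(styles), "stylesheets,", len(images), "images")

estimate_requests("https://www.example.com/")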

Increased Database Queries Due to Extra Plugins

WordPress uses a SQL database to process queries and maintain its infrastructure.

If your site is on WordPress, it’s important to know that every plugin you add will send out extra database queries.

These extra queries can add up, and cause bottleneck issues that will negatively affect your site’s page speed.

The more you load plugins up, the slower your site will get.

If you don’t manage the database queries well, you can run into serious issues with your website’s performance, and it will have nothing to do with how your images load.

It also depends on your host.

If you suffer from a large website with too many plugins and too little in the way of resources, now may be the time for an audit to see exactly what’s happening.

The Other Problem With Plugins: They Increase the Probability of Your Website Crashing

When the right plugins are used, you don’t have to worry (much) about keeping an eye on them.

You should, however, be mindful of when plugins are usually updated, and how they work with your WordPress implementation to make sure your website stays functional.

If you auto-update your plugins, you may have an issue one day where a plugin does not play nice with other plugins. This could cause your site to crash.

This is why it is so important to manage your WordPress plugins.

And make sure that you don’t exceed what your server is capable of.

This Is Why It’s Important to Tackle Technical SEO Before Link Building

Many technical SEO issues can rear their ugly heads and affect your site’s SERP performance long before link building enters the equation.

That’s why it’s important to tackle technical SEO before you start link building.

Start with a thorough technical SEO audit to reveal and fix any on-site issues.

It will help identify any weaknesses in your site, and these changes will all work together with link building to create an even better online presence for you and your users.

Any link building will be for naught if search engines (or your users) can’t accurately crawl, navigate, or otherwise use your site.

Summary

Timeframe: Month 1, 2, 3 and every quarter

Results Detected: 1-4 months after implementation

Tools needed:

  • Screaming Frog
  • DeepCrawl
  • Ahrefs (or Moz)
  • Google Search Console
  • Google Analytics

Link building benefits of technical SEO:

  • Technical SEO will help you get the maximum performance out of your links.
  • Technical SEO, like a clean site structure and an understanding of PageRank flow, is key for internal link placement.

Image Credits

Featured Image: Paulo Bobita
In-post images/screenshots taken by author, July 2019




Critical Vulnerability Strikes WordPress Ad Inserter



Ad Inserter, a popular ad management WordPress plugin, was discovered to contain a critical vulnerability. The vulnerability allows an authenticated user with permissions as low as a subscriber to execute code on an affected website. Users of the plugin are advised to update immediately.

Screenshot of the WordPress dashboard. In the top left-hand corner is a link that allows you to update your plugins.

Description of Ad Inserter Vulnerability

There are actually two vulnerabilities.

Authenticated Path Traversal Exploit

The first vulnerability is called an Authenticated Path Traversal Exploit. This exploit exists in Ad Inserter version 2.4.19 and under.

This type of exploit allows an attacker to access areas of a site by adding variables like ../ to the URL, letting them “traverse” to an area where they may be able to execute code or see private information.

According to the Common Weakness Enumeration (CWE) page about traversal exploits, hosted on a website maintained by the U.S. Department of Homeland Security, this is how a path traversal exploit works:

“The software uses external input to construct a pathname that is intended to identify a file or directory that is located underneath a restricted parent directory, but the software does not properly neutralize special elements within the pathname that can cause the pathname to resolve to a location that is outside of the restricted directory.”
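In other words, the fix is to resolve user input against a restricted base directory and reject anything that escapes it. Here is a hypothetical illustration of that idea in Python (the vulnerable plugin itself is PHP, and the directory path here is made up):

# Illustration: resolve the requested path and refuse anything
# that lands outside the allowed base directory.
from pathlib import Path

BASE = Path("/var/www/uploads").resolve()

def safe_read(user_supplied_name):
    target = (BASE / user_supplied_name).resolve()
    if BASE not in target.parents:
        raise ValueError("Path traversal attempt blocked")
    return target.read_text()

# A request for "../wp-config.php" resolves outside BASE and is rejected.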

The second vulnerability is labeled as critical. It was discovered on Friday, July 12, 2019 by the WordFence team and swiftly fixed by Ad Inserter the following day, Saturday, July 13.

Authenticated Remote Code Execution

The second vulnerability is called an Authenticated Remote Code Execution (RCE). This allows any user registered with the site, with permissions as low as a subscriber, to execute arbitrary code on a WordPress installation.

The RCE exploit affects Ad Inserter version 2.4.21 and under.

According to the WordFence website:

“On Friday, July 12th, our Threat Intelligence team discovered a vulnerability present in Ad Inserter, a WordPress plugin installed on over 200,000 websites. The weakness allowed authenticated users (Subscribers and above) to execute arbitrary PHP code on websites using the plugin.

We privately disclosed the issue to the plugin’s developer, who released a patch the very next day.

This is considered a critical security issue…”

Ad Inserter Plugin Reacted Swiftly and Ethically

Almost all plugins and software may contain a vulnerability. What’s important is how quickly a developer responds to issues and how transparent the developers are about it.

Screenshot of the Ad Inserter changelog showing that they responded ethically and transparently.

The Ad Inserter team deserves praise for how quickly they responded and for their transparency about the updates. Ad Inserter alerted their users to the vulnerability through the changelog that is visible on every user’s update page. This is important because it alerts users to the urgency of the update.

The Ad Inserter team acted swiftly and ethically. That’s the best that can be expected from any WordPress developer.

Update Ad Inserter

All users of the Ad Inserter WordPress plugin are urged to log in to their WordPress installation and update their Ad Inserter plugin.

Read the WordFence announcement here.




15 Must-Have Features for Ecommerce Sites



All websites are unique in their own way.

We’ve seen trends in ecommerce website design like animation, 360-degree product previews, advanced filtering, and dynamic product search.

However, trends don’t guarantee conversion rates or robust user experiences. There are certain elements that every ecommerce site should have to stay relevant and competitive.

Here is a list of 15 must-have features that attract online shoppers.

1. User-Friendly

This comes from the old K.I.S.S. adage about keeping it simple.

Simplicity should be a goal in good design and you don’t have to sacrifice elegance to achieve it.

As a matter of fact, studies show that 76% of consumers say the most important characteristic of a website is ease of use.

The objective is to help shoppers get to what they want faster and without running into unnecessary complexity that can clog up the path to purchase.

Online sellers have minutes, if not seconds, to make a sale. Focus on the user experience by providing shopping categories, filters, and comparison capabilities.

To make your ecommerce site more user-friendly, consider the following:

  • Improve search functionality with an autocomplete option, like Under Armour does.
  • Create effective, visual navigation, like Stanley does.
  • Pre-populate forms, like Target does.

Ecommerce sites should be a competitive advantage instead of a troublesome experience.

2. Mobile-Friendly Website

Mobile shopping accounts for 50% of online transactions.

With a responsive website, content intuitively adapts to whatever device is accessing it to provide the most user-friendly experience.

Walmart increased mobile sales by 98% after optimizing its mobile site.


And Intelligentsia created a better purchase experience on mobile that led to a 16% increase in mobile transactions.

It doesn’t matter if you have millions of product pages or five.

Creating a mobile-friendly experience for your shoppers will improve sales and potentially your rankings.

3. High-Resolution Photos & Video

Gone are the days of posting one photo with a few bullet points and a price tag.

Shoppers want to see multiple angles and people using the product in different environments. They want to be able to zoom in and get a feel for the product.

Technical considerations for images are crucial. Images that don’t load or take too long to load will see a consumer drop-off rate of 39%, according to Adobe.

And, we’re not talking about just professional photos. Vanity Planet found that adding Instagram photos to their product pages increased checkouts by 24%.


Images sell, not text.

Ecommerce websites should display multiple photos per product. The photos need to be high-resolution and optimized for page load.

4. User-Generated Reviews

Shoppers read reviews. About 95% of them, in fact.

And 57% of consumers will only use a business if it has 4 or more stars.

You might think that having negative reviews is a sale killer. The opposite is actually true.

Having negative reviews can often be positive. It’s shown that products without negative reviews are seen as censored and, in turn, shoppers will assume the positive reviews are fake.

Depending on website functionality, ecommerce sites can use plugins from the most popular review platforms including Yelp, Foursquare, and Facebook.

Forward-thinking ecommerce sites, like Paiwen paddleboards, are using reviews as user-generated content to drive social proof and build raving fans.


5. Special Offers

Most ecommerce sites are using special offers in their standard marketing practices via email, social, text, etc.

Next-level ecommerce sites take advantage of the prime real estate in the header section to promote special offers.

Example: Kendra Scott promotes special offers in its site header.

When shoppers realize they’re getting a special deal, it motivates them to buy more and spend more time searching the site.

If ecommerce sites are using ongoing promotions, providing a unique webpage that lists the offers will not only drive more sales but also improve SEO.

Consider the shopper searching [ZIP code + toyota special offers] and the value of that organic search result.


6. Wish Lists

Oh, how I love wish lists. One for fashion, one for books to read, one for holiday gift ideas.

Shop, save, and share!

Ecommerce sites that aren’t using wish lists are leaving revenue on the virtual table.

What’s better than having customers bookmark items they want and will most likely buy in the future? That’s just gold in the pocket. And a remarketing campaign dream.

Just take a note from Amazon.


Retailers are creating these individualized options to drive engagement.

In a Forrester webinar, Brendan Witcher shares how these can drive open rates as high as 80% and 5x-10x higher return to purchase.

It’s also an opportunity to share a brand with new buyers. When shoppers share their wish lists with family and friends, it sends free traffic with built-in social proof to a website.

7. Find-in-Store

Not all ecommerce sites have brick-and-mortar stores. However, those that do must have a find-in-store feature.

Sometimes you don’t want to wait for an item to ship. You want instant gratification.

Shoppers are going online just to research and then complete the purchase in person. This is especially true for millennials, as they tend to research online before purchasing in a store.

This makes it incredibly helpful to see which local stores have products in stock.

8. Related Items

Simply seeing the phrase, “you might like this” causes a serotonin release signaling curiosity and excitement.

A ‘Related Items’ feature on an ecommerce site creates the desirable stickiness effect that so many marketers strive to achieve.

It happens like this:

You saved the AeroPress coffee maker to your shopping cart.

A section emerges that says, “you might like this.”

Would you like some freshly roasted, fair trade whole coffee beans?

Maybe a digital scale to measure the exact amount of coffee grounds to achieve the perfect cup of coffee?

You get the picture.

When an online seller uses the related items feature to sell more, it’s actually signaling to the buyer that “they get me.”

Related items can also include similar product categories to comparison shop, “people who bought this item also searched for”, and so on.

9. Frequently Asked Questions (FAQ)

Complex purchases require confidence in the seller, and buying online demands more detailed information than buying face-to-face.

Fine Watch Bank (disclosure: they were a client) sells expertly crafted high-end watches.

The company’s buyers need confirmation of watch authenticity and detailed product information to feel comfortable when making a purchase.

An additional information section details frequently asked information, which establishes credibility and builds confidence with the buyer.


FAQ sections also provide a self-help area to address common customer problems. Buyers often ask about product support, returns, lost account passwords, etc.

10. Social Proof

Brands and online sellers that connect with their buyers on an emotional level create brand trust and advocacy.

Social connection is undeniable and the opportunity to bond with an audience is easier now than ever.

Linking social profiles and user-generated content with an ecommerce site is a must-have feature that will give online sellers brand character.

Shoppers are inundated with a constant flow of “buy now” and “get this”. It’s those brands that engage with customers socially that create customers for life. They are offering products that help others and that people care about.

Social media allows ecommerce brands to show authenticity and associate an emotional connection with their products.

  • Look at Nike’s Pinterest account to see how incorporating products with motivational images makes a huge impact.
  • See how Ikea uses user-generated content to share design ideas and inspiration for how to use its products.
  • Check out how Lowe’s produces how-to videos that teach people how to use its products.

11. Security Features

Online transactions are an integral part of our lives.

As such, ecommerce websites can be a lucrative target for cybercriminals. It’s crucial that online sellers protect customer information and take steps to ensure privacy.

It all starts with a secure ecommerce platform.

With their sophisticated security features, it’s no wonder that Magento and WordPress paired with WooCommerce are the top two platforms for online sellers.

Other must-have ecommerce security features to consider are:

  • SSL certificate: Establishes secure connectivity between a user and the website. Look for HTTPS and a green lock in the address bar before trusting an online store with your information. Select an SSL certificate vendor with name recognition. The enterprise ecommerce giants almost always use Symantec.
  • Two-factor authentication: Adds an extra layer of security by requiring username/password and a system-generated code sent via email or text.
  • Use a firewall: Provides a gateway between two networks, permitting authorized traffic and blocking malicious traffic.
  • Privacy policy link in footer: Addresses the website’s privacy policies and promises customer data is not shared with third parties.

12. Advanced Payment Options

In the world of Apple Pay and PayPal, advanced payment options are a must-have feature for an ecommerce site.

Just consider the ease of use associated with Amazon one-click shopping. Shoppers with registered accounts can literally buy with the click of a button.

There are many popular online payment options. The key is understanding who the buyer is and implementing the most effective solutions.

If your website is limiting payment options, make sure to explain why.

For example, an ecommerce site may not offer Bitcoin as a payment option after determining its value isn’t reliable.

If buyers are technology-savvy individuals who tend toward this payment method, it’s important to be transparent as to why it isn’t an option. This transparency establishes that the brand understands its buyers and builds trust.

13. Detailed Shipping Information

Alarmingly, unexpected shipping costs are the number one reason for shopping cart abandonment.

It’s critical that ecommerce websites include shipping information early in the checkout process, including a ZIP code calculator showing cost.

It’s also helpful to include delivery time and options for faster delivery. Sites should list countries that are outside normal shipping zones.

Interestingly, shoppers spend 30% more per order when free shipping is included. Be sure to present codes for free shipping on all pages of the site through the header navigation, if applicable.

Example: Victoria’s Secret promotes free shipping in its site header.

14. Multi-Touchpoint Contact Page

When it comes to running an ecommerce site where most business takes place online, nothing builds trust more than an overly obvious and well-presented “contact us” feature. This is especially true when selling high-end merchandise or technical products.

Never make it hard for a buyer to get in touch or request support.

Consider the contact page a feature of the website. Include multiple ways to reach out, including phone, email, and an online form.

Some companies with brick-and-mortar stores offer online appointment setting via their contact page. Be sure to add customer service social links and Google maps for locations.

15. Return Policy

Return policies are an essential feature of any ecommerce website.

Your return policy should be clearly visible and well-written or illustrated.

This is another trust-building feature of online selling. It reassures buyers that if they are unhappy or just need a different size, the brand is there for them.

Start Creating Fantastic Shopping Experiences

These must-have ecommerce site features could be grouped into categories like Duh, DOH!, and Ahhh. Marketers and web designers might say “duh” when it comes to incorporating ease of use, high-resolution photos, security features, and contact information, as these are best practices and trademarks of good design.

There are others who would say “DOH!” when considering elements like special offers, user-generated reviews, and social sharing. They know they should be using these features but just haven’t gotten around to implementing them. It’s never too late!

Finally, the “ahhh” moment when you get excited about planning for robust related items, shareable wish lists, and find-in-store features.

No matter what category you’re in, ecommerce sites have an incredible opportunity to use tried-and-true features blended with innovative new elements to create fantastic shopping experiences.


Image Credits

Featured Image: Paulo Bobita
All screenshots taken by author


