
How & Why You Must Improve or Remove Your Old Content



Could old content be dragging down the overall “authority” of your website? We think so.

You have an important decision to make: should you improve your old content or remove it?

Making the right decisions during this process can bring great rewards, in terms of traffic, organic search visibility (rankings, featured snippets, etc.), links, conversions, and engagement.

On March 27, I presented an SEJ ThinkTank webinar to share the process we at Search Engine Journal have been using to improve and remove old content for the last 20 months.

Here’s a recap of the presentation.

How & Why You Must Improve or Remove Your Old Content

Google’s mission since its inception has been to “[o]rganize the world’s information and make it universally accessible and useful.”

On Google’s end, nothing has changed.

But what has changed is this little thing called content marketing. Around the time of the original Google Panda update, a lot of businesses and brands finally bought into the idea that content is king.

They started creating all sorts of content – some of it was great, but most of it was average, far below average, or just outright terrible.

Today, lots of content is being published, but most of it isn’t very useful. A lot of it is redundant.

In 2016, the web was made up of about 130 trillion individual pages. But the Google Search index contains only hundreds of billions of webpages – a fraction of the total pages available on the web.

The search engine is filtering out a lot of content, and you don’t want yours to be part of what gets filtered out.

So this year, content marketers and creators have a new mission:

“Give Google only your best content and make it optimized, useful, and relevant.”

It’s 2019. Our mission can’t stay the same. It’s time we all start thinking about content in a new way.

Rethinking Your Content Marketing Approach

Google spokespeople have downplayed the danger of old content, suggesting that old content won’t hurt you. They have also warned that removing content is a dangerous SEO strategy.

But is it really?

Not based on our results.

For the last 20 months, we’ve been hacking and slashing our way through our archives, which has resulted in pageviews and organic traffic increasing by more than 60 percent year over year.

Just check out these numbers:

When I started as Executive Editor in July 2017, we had 910,000 pageviews.

In January of this year, we had a record month – 1.7 million pageviews. At that time, we had about 18,000 pages.

And we just topped that record again in March – more than 1.9 million pageviews. Today, we still have 18,000 pages indexed – we’re just getting more out of the same amount of content.

So how did we achieve this growth?

Here’s the process we used.

Step 1: Audit Your Content

The process all begins with auditing and evaluating your content.

There are basically three buckets of content:

  • Content that helps you.
  • Content that does absolutely nothing for you.
  • Content that can hurt you.

We need to figure out which bucket each piece of content fits into.

Since 2003, Search Engine Journal has been creating tons of content, and it reached a point where things got really messy and disorganized. We needed to get out of that chaos.

The first step in the process is to crawl your content.

Some options that you can use to crawl your content include:

  • Screaming Frog
  • DeepCrawl
  • OnCrawl
  • Sitebulb
  • Botify

Here are even more crawlers. Choose whichever crawler works for you.

After you get through the crawling process, you need to know about the following elements:

  • Title: Is it optimized? Does it include a reader benefit?
  • URL: Is it SEO friendly? Do you need to change it?
  • Author: Who wrote it? Is it an expert/authority in the field?
  • Publication date: Is it still fresh or out of date?
  • Number of reads: The more reads, the better. It’s a sign of good content that connected with your audience.
  • Word count: It isn’t necessarily a sign of low-quality content but it could potentially indicate quality issues.
  • Number of links: How many inbound and internal links do you have?
  • Trust Flow and Citation Flow: These are Majestic’s metrics for quality score and link equity.
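If you want to compute some of these fields yourself from crawled HTML, here’s a minimal sketch using only Python’s standard library. The sample page, the `audit_page` helper, and the internal-link check are all hypothetical; in practice you’d run this over your crawler’s export:

```python
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Collects basic audit fields: <title>, rough word count, internal link count."""

    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.title = ""
        self.internal_links = 0
        self.words = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href", "")
            # Treat links that stay on the same site as "internal."
            if href.startswith("/") or self.domain in href:
                self.internal_links += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        self.words += len(data.split())  # crude word count over all text nodes

def audit_page(html, domain):
    parser = AuditParser(domain)
    parser.feed(html)
    return {"title": parser.title.strip(),
            "words": parser.words,
            "internal_links": parser.internal_links}

sample = ("<html><head><title>10 SEO Tips</title></head>"
          "<body><p>Tip one here.</p><a href='/seo/guide'>guide</a></body></html>")
row = audit_page(sample, "example.com")
```

A dedicated crawler will give you richer data (links, dates, authors), but a sketch like this shows how each audit field maps to something concrete in the page.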

Step 2: Evaluate the Quality of Your Content

Next, you’ll need to analyze the quality of the content you have on the website.

I define quality content as content that is:

  • Accurate
  • Mobile-Friendly
  • Answers Questions
  • Informative
  • Original
  • Shareable
  • Solves Problems
  • Inspiring
  • Readable
  • Visual
  • Entertaining
  • Educational

And for me, low-quality content:

  • Has no target audience.
  • Has no goal / purpose.
  • Is not optimized.
  • Is unsuccessful.

On the other hand, Google defines quality content as:

  • Useful and informative.
  • More valuable and useful than other sites.
  • Credible.
  • High quality.
  • Engaging.

A term that also comes up often is E-A-T, which comes from Google’s search quality rating guidelines. Simply put, E-A-T means:

  • Expertise: Your unique skills, information, or knowledge.
  • Authority: Other people know about and recognize your skills or knowledge.
  • Trust: People believe what you think, say, or do and feel secure buying from or endorsing you.

Google considers content as low quality when it has the following elements:

  • Inadequate E-A-T.
  • Main content quality is low.
  • Unsatisfying amount of main content.
  • Exaggerated / shocking title.
  • Ads or supporting content distracts from main content.
  • Unsatisfying amount of info about website or content creator.
  • Mildly negative reputation of website or content creator.

What’s the best action to take when you find out that you have low-quality content? Should you remove or improve it?

This is what Google’s Gary Illyes had to say about removing content:

“It’s not guaranteed that you will see any positive effect from that… For those that don’t show up in the search results, those are not indexed, and if they are not indexed then typically they are not affecting your site.”

Google’s John Mueller has also opined on the topic in a Google Webmaster Hangout:

“Improving it means that the rankings can only go up, whereas by removing it, can cause loss of rankings instead of the gains that some people think content removals will do.”

Both of these Googlers go against the idea of content removals. But where exactly did that idea come from?

It was actually from Google’s Michael Wyszomierski back in 2011 who said:

“In addition, it’s important for webmasters to know that low quality content on part of a site can impact a site’s ranking as a whole… Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.”

These different recommendations present a conflict. Which tactic is right?

Content Inequality

In 2017, I ran the numbers and figured out that the top 3 percent of posts on SEJ drove as much traffic as the bottom 97 percent combined.

We’re talking about a few hundred posts driving as much as several thousand posts.
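You can check for this kind of inequality on your own site with a few lines of Python. The `traffic_share_of_top` helper and the toy pageview numbers below are illustrative; you’d feed in a pageview count per post from your own analytics export:

```python
def traffic_share_of_top(pageviews, top_fraction=0.03):
    """Share of total pageviews driven by the top `top_fraction` of posts."""
    ranked = sorted(pageviews, reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:top_n]) / sum(ranked)

# Toy data: one breakout post plus a long tail of 99 weak ones.
views = [10000] + [10] * 99
share = traffic_share_of_top(views)  # share held by the top 3 posts of 100
```

If the number that comes back is anywhere near what we saw at SEJ, the long tail of your archive is doing very little for you.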

Metrics to Help Define Quality

We have all these vague statements on content quality from Google, but how do we assign actual metrics?

Here are five metrics we used at SEJ that you can use as well:

  • Pageviews
  • Organic traffic
  • Links
  • Conversions
  • Engagement

Ultimately, there are a lot of variables that influence the success of your content so use your best judgment whenever you’re evaluating your content.

Step 3: Determine What to Do with Your Existing Content

The final step of this process is making data-driven decisions about whether you should improve (update, rewrite, or consolidate) or remove (deindex) old content from search engines.

There are five possibilities for your content:

Scenario 1: No Changes Needed

You won’t need to change anything in your content if:

  • All information is accurate, or has historic value.
  • It consistently gets good traffic and engagement.
  • It has attracted many quality links and social shares.
  • It ranks in Position 1-3.
  • It generates conversions.

If content is already working well for you, leave it alone. Focus on areas where you can actually make gains.

Scenario 2: Content Update / Refresh

Content that needs an update or a refresh:

  • Gets consistent traffic (or used to).
  • Has earned some valuable links / shares.
  • Ranks on Page 1 of Google.
  • Few / no conversions.
  • Below average engagement.

How to Do It

To do a content update / refresh, you’ll need to:

  • Update information so it is accurate.
  • Make it better than your SERP competitor(s).
  • Keep it on the same URL (whenever possible).

For example, for the past few years, SEJ’s Assistant Editor Anna Crowe has published a marketing calendar post that needs to be refreshed yearly.

In 2016, we published the first version, which got a respectable 22,000 reads, ranked in the top 2 of Google, and got ~2,000 pageviews per month, on average.

By the end of 2017, the content needed an update for 2018.

The results:

  • #1 ranking for [marketing calendar].
  • 64,000 reads
  • About 5,000 pageviews per month, a more than 2x improvement.

Scenario 3: Content Rewrite

Your content needs to be rewritten if the following apply:

  • Currently gets little or no traffic.
  • No longer attracts new links / shares.
  • Doesn’t rank on Page 1.
  • May not even be indexed.
  • No conversions.

How to Do It

Usually, content that needs to be rewritten covers a useful, relevant, or helpful topic; it’s just written poorly. To address that, make sure to:

  • Start content from scratch.
  • Update information so it is accurate.
  • 301 redirect old to new post on a new (optimized) URL.

From Google Shares How 301 Redirects Pass PageRank, we know:

  • Google can forward PageRank through 301 redirects.
  • Not all 301 redirects pass 100 percent PageRank.
  • A 301 redirect will pass 100 percent PageRank only if the new page closely matches the topic of the old page.
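How you implement the 301s depends on your server or CMS, but the bookkeeping is the same everywhere: a map from retired URLs to their replacements, consulted on every request. Here’s a hedged Python sketch of that idea (the slugs and the `resolve` helper are hypothetical, not SEJ’s actual setup):

```python
# Hypothetical map of retired URLs to their rewritten replacements.
REDIRECTS = {
    "/subdomains-or-subfolders": "/subdomains-vs-subfolders-seo",
    "/content-audit-tips": "/content-audit-ultimate-checklist",
}

def resolve(path):
    """Return (status, location) for a request path: 301 if retired, else 200."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

Keeping the map in one place also makes it easy to audit later – for example, to spot redirect chains where an old URL points at another retired URL.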

For example, SEJ had a post on a popular SEO topic – subfolders vs. subdomains. However, our version, which was written in 2008, had fallen off Page 1 of Google and was only getting about 50 pageviews per month.

So Jenny Halasz, who writes SEJ’s Ask an SEO column, rewrote it, and we published it with the title Subdomains vs. Subfolders: Which Is Better for SEO & Why?

The results were ridiculously awesome:

  • #1 ranking
  • 30,000 reads
  • About 4,000 pageviews per month, an 80x improvement.

Scenario 4: Content Consolidation

Here are the reasons why you might want to consider consolidating your content:

  • You have multiple articles on one topic.
  • One piece gets some traffic; others get little or none.
  • They do not attract any new links or shares.
  • The article is not ranking on Page 1 or…
    • The wrong page ranks.
    • Two pages are competing on the same SERP.

Below is an example from the presentation (see the “Content Consolidation” slide).

Combining content when needed gets the approval of Google’s John Mueller. He said of the topic:

“Probably. I think that’s something that generally… we see if you take two or three or four kind of weaker pages and merge them into one, even within the same site or externally, then that’s something where we can say that this is a stronger page.

We can see that… more parts of the site are referring to this one single piece of content so it’s probably more relevant than those individual small pieces that you had before.”

How to Do It

So how exactly do you go about this? Follow these steps:

  • Create one awesome piece of content.
  • Start from scratch, but you can reuse any useful existing content.
  • Make it better than your competitors.
  • 301 redirect to new (optimized) URL.

So that’s just what we did. SEJ Contributor Ashley Ward wrote How to Do a Content Audit: The Ultimate Checklist.

The results were great:

  • Ranking on Page 1
  • 8,000 reads
  • About 300 pageviews per month, a 10x improvement.

Scenario 5: Content Deletion / Deindexing

Your content needs to get the boot if:

  • It’s “thin content”.
  • It’s poorly written / off-topic / syndicated / stolen / plagiarized.
  • It has no historic significance.
  • It has a very low number of pageviews.
  • It gets little or no traffic, links, shares, conversions, or engagement.

Here’s one example of terrible content: the SEJ article “What the Ants Taught Me About Successful Internet Marketing.” It got few pageviews or shares, and is honestly just content nobody would ever want to read.

So how do you know whether to remove content from your site? It’s all in how you answer these four questions:

(See the “Is Removing Content Dangerous?” slide in the deck for the four questions.)

If you answer “yes” to any of these questions, then removing content could be dangerous.

But if you answer “no” to all these questions, then it’s time for that content to go.
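If you want to apply these criteria at scale, the removal decision can be expressed as a simple screening function. The sketch below is an illustration, not SEJ’s actual process, and every threshold in it is a made-up placeholder you’d tune to your own site:

```python
def should_remove(page):
    """Flag a page for deletion/deindexing only if it fails every 'keep' test.

    `page` is a dict of metrics; all thresholds here are illustrative.
    """
    keeps = [
        page.get("monthly_pageviews", 0) >= 50,  # still earns meaningful traffic
        page.get("inbound_links", 0) >= 1,       # has link equity worth keeping
        page.get("conversions", 0) >= 1,         # contributes to business goals
        page.get("historic_value", False),       # e.g., notable news coverage
    ]
    # "No" to every question means the content gets the boot.
    return not any(keeps)

thin = {"monthly_pageviews": 3, "inbound_links": 0,
        "conversions": 0, "historic_value": False}
```

Note the structure: a single “yes” on any question keeps the page, which mirrors the rule above that removal is only safe when every answer is “no.”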

Key Takeaways

To find out whether you need to improve or remove your old content, follow these steps:

  • Audit: Know what content you have.
  • Evaluate: Improve content if you can (rewrite, consolidate, update) or remove content if you can’t (deindex, delete from website).
  • Measure: Use metrics that matter to your company and use data to decide the fate of your content.

Q&A

Thanks everyone for all the great questions. I’ve tried to answer as many as possible here. Let’s dive in!

Q: If all your content is not high quality, should a new website be the opportunity to delete all of it to create high-quality content?

A: Unless your website is currently penalized, or has been penalized in the past, or if the domain name somehow ties you to a niche/market that you want to exit, I don’t see a reason why you can’t start from scratch at the same domain.

Creating great content is a hard process. But the rewards are there for those who can do it well over a long period of time. It will just take Google a bit of time to figure out that things have changed and reassess all the new content and its quality – and figure out what the new and improved you is all about now.

Q: How might all this translate to an ecommerce site? I only deal with ecommerce and our POCs will consistently fight us on content pruning

A: The best way to deal with pushback is to get OK with a small trial to make your case. Start by using your analytics data to tell your story.

Show that the content is currently doing nothing for your site, whether it’s traffic/sales/whatever. This should open the door to get the OK to try out content pruning and measure the results.

If you can then showcase a win, then you should be able to make the case for scaling up. Talk data and focus in on the metrics that they want to see grow, and how content pruning will help you get there. 

Q: Where do you 301 the thin pages you delete if there isn’t a logical destination / another page about the same topic?

A: You could 301 it to a category page. If that doesn’t work, I’d just deindex it. 

Q: Does this apply to sites with low page counts? SEJ has 18k+ pages and tons of authority. Would this still work on a site that has 150 pages or less?

A: I haven’t tried it on a site that small, but yeah I honestly doubt you’d see the same results. However, if you start producing and publishing more content, in a couple years you may have a bit more to work with. 

Q: If I delete a boatload of thin content and redirect to my homepage, are there any downsides to that? I could just delete, but some have a little bit of authority.

A: If the topics are relevant to your space/niche/industry, it should be OK.

But if you’re redirecting a lot of content on random topics to the homepage, I could see it being confusing to Google, especially if those keywords aren’t used anywhere on the homepage.

It’s the whole idea of doing an A-to-A or B-to-B redirect – matching like with like. But if you’re redirecting topics B, C, D, E, and F to A, and they’re all different, I could see you either getting no benefit from it, or possibly even taking a slight ding from it.

Q: If I deindex/delete a page, do I need to 301 it? Or is that not necessary since I really want it to disappear. 

A: If it has any value (e.g., link equity) and you could 301 it to a relevant page, you should. If not, then no. 

Q: As the algo gets better at ranking content for the most precise keywords (via neural matching/deep synonyms, etc.) would you say that it’s natural for some content to simply not rank as strongly for the same keyword over time OR rank for fewer total number of keywords over time?  Meaning.. this is an algorithmic change, not really a content issue?

A: Google is absolutely trying to get better about matching queries to user intent (with mixed results). So we must adjust our thinking and content as well.

If you have a piece of content that used to rank well for certain queries, you should check those SERPs now. See what Google is ranking now.

  • Has the intent changed (e.g., maybe once it was a transactional/bottom of funnel search, but now it’s an informational/top of funnel search)?
  • Or are they showing featured snippets, or videos, or news results, or any other special content result blocks.

If it is an algorithmic issue, it’s still a content issue you must deal with if you want to maintain your visibility.

Q: I just wonder: after you rewrite the content, will the target keyword still be the same?

A: Yes. 

Q: I do not show ANY ads on my pages. Is that good or bad?

A: The lack of ads is neither good nor bad, taken alone, in terms of content performance.

In terms of user experience, it’s obviously good, because pretty much everyone dislikes ads. Plenty of sites with advertising rank well, and plenty of sites without ads rank well.

Q: From a customer service perspective with educational based content, what are some other metrics you’d recommend for good “quality”? Conversion doesn’t really make sense. 

A: Google Analytics has some good content metrics. Time on Page. Bounce Rate. Session Duration. These are all indicators of content success.

I’m also a huge fan of the New vs. Returning report, because it will give you an overall feel of whether your content is making people come back, or if you’ve got a lot of one-and-dones. 

Q: To clarify, having the same primary keyword for multiple pages on the same website is a bad thing?

A: It can be. For example, let’s say you have one page that’s maybe three years old and has some outdated info. This page is outranking a really good and newer page. I’d consider that a problem.

Ultimately, though, this really depends on your own strategy and what you want users to do if they land on your page for a certain query.

But as Loren mentioned in the Q&A during the webinar, if you have a couple pages competing with each other low on Page 1 or beyond, sometimes sacrificing one of those 2 pages can help the other jump up even higher.

Q: How do you gauge content success when you are entering a new market area where you are not recognized? When should content success be assessed in this situation?

A: Whenever you’re entering a new market, it will take time to figure out exactly who your audience is and what content they want. With content, you definitely need to experiment.

A lot of these experiments will fail, but that’s perfectly OK, because you can learn as much from content that fails as content that succeeds. 

I’ve built content for a domain from scratch, and when I’ve done this my thinking is this: I’m creating content now that will help me later. Generally speaking, it should take about a year before you can properly assess your content performance.

But it could be as long as 18 months – or sometimes even longer, depending on how often you publish, how much competition you have, and how much demand there is for your content, among other variables. 

Q: When you rewrite, do you modify content on the same URL or create a new URL? Why do a redirect and not just use the same URL?

A: This can go either way. If you think the URL is SEO friendly, then keep it and modify and republish the content (with an updated date, if you include dates).

I found, with SEJ, that a lot of our URLs weren’t optimized, or durable/evergreen. So, for instance, we have a lot of posts with numbers in them (e.g., 10 Things You Need to Know About SEO). And the URL might be 10-things-you-need-to-know-about-seo.

But what if you want to rewrite it and make it 11 things? Or 15 things. Or what if I totally want to abandon the list format and do something else?

Then the URL doesn’t match the title, which I think is troublesome. It keeps you in a box. Or what if you want to use a better word than “things,” which is vague?

If you have an unoptimized URL, and it’s still getting traffic, it may feel dangerous to change it, especially if it’s an important page.

I’ve shared our results – the process has worked for us and helped us grow traffic. If you’re worried, you can always start small, and see if this process works for you. 

Q: Your presentation seems to focus mostly on posts. What about pages? Will they also improve if refreshed regularly? (i.e., Home, About Us, Services, Categories,) etc.

A: Yes. Pages that show signs of a significant decline can benefit from a refresh. Your content is only good until it stops helping you. Once you reach that point, you need to refresh it.

But I wouldn’t suggest refreshing content if it’s still performing well or doing whatever you want it to do. And always update any static pages to keep them up to date – e.g., if you no longer offer a service or someone on your about us page is no longer the person to contact, you want to update that immediately. 

Q: If consolidating multiple posts on the same topic, I’m concerned the consolidated piece will be too long. How do we go about navigating the length issue (or is this not an issue)?

A: When consolidating, you only want to salvage the best parts of anything you reuse. You don’t need to save or re-use everything.

So say you have 4 posts on the same topic – one is 400 words, one is 500 words, another is 1,000 words, and another is 750 words. That doesn’t mean you should now publish something in the neighborhood of 2,500 words. (Unless that’s what it takes to rank for the query you’re targeting.)

Look at the SERP and check the word counts for what’s ranking in the top 10. If it’s 1100 words, you want to do something in a similar range. 
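As a quick sanity check, you can turn those Page 1 word counts into a target range. The helper below is a hypothetical sketch (integer math, roughly a ±20 percent band around the average); the top-10 counts are made-up examples, not real SERP data:

```python
def target_word_range(competitor_word_counts):
    """Suggest a word-count range of roughly +/-20% around the Page 1 average."""
    avg = sum(competitor_word_counts) // len(competitor_word_counts)
    return avg - avg // 5, avg + avg // 5

# Made-up word counts for the top 10 results on a target query.
top10 = [900, 1100, 1200, 1000, 1300, 1050, 950, 1150, 1250, 1100]
low, high = target_word_range(top10)
```

The range is a guide, not a rule: as noted elsewhere in the Q&A, a 300-word post can outperform a 3,000-word one when it better matches intent.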

Q: Do you recommend using the robots.txt file to keep categories from being indexed? How do you get around Google wanting to “see” everything?

A: I don’t think Google wants to see everything anymore. It sees too much content as it is. You should let Google only see your best and most useful content.

For many queries today, category pages aren’t ranking as well as they used to. Mainly because, if a user clicks on a page, they typically are looking for information/resources, not to go hunting through links on a category page.

Ultimately it comes down to your strategy and how important category pages are for you.

Q: Many of the websites I work on are websites of print magazines, and much of the older and outdated content is archive material from the print edition. Would this have “historic value” or is it harming our sites?

A: When I talked about historic value, I’m basically talking more about news, such as coverage of important developments in your industry, trends or events. If your archived material is a genuinely good resource for your audience, then I don’t think that will hurt your site (it will likely be neutral at worst). 

With SEJ, it is the case that we have covered thousands of news stories about search and digital marketing over the years. I think those news posts have historic value.

One thing you can consider doing is finding ways to promote that older content – maybe via social media updates, a mention in a newsletter, or a “flashback” type of feature where you talk about something that happened on this day five years ago. 

Q: Can you elaborate further on the E-A-T concept?

A: I’d suggest reading Google Says You Need Expert Writers: Content Standards from the New Search Guidelines. 

Q: We have a lot of blog article throughout the years that basically do the same thing. Cover the same topic. Share the same info. Would we be better off deleting old posts, pointing URLs to the best version, or leave them all up?

A: If the old posts aren’t getting any traffic or doing anything to help your site (looking at the metrics that matter to you), I would definitely suggest 301 redirecting all the old posts to your newest and most awesome “hero” post on that topic. If that doesn’t exist, then create it. And use anything you can from the old posts. 

Q: Doesn’t having six articles ranking on Content Audit push your competitors to a later page?

A: No, because we weren’t ranking on Page 1 with any of these articles. After we consolidated, we finally got on Page 1. 

Q: Did you change the URL and redirect on the subdomains vs. subfolders article or keep the same URL?

A: Yes, we published the new article on a new URL and 301 redirected the old post to the new. 

Q: If you have 3-4 articles on page 1, including a couple in top 3, would you still consolidate?

A: Interesting problem to have. Consolidating is definitely a risk. If I had 4 articles on Page 1, I would consider testing out with the bottom 2 posts first, probably redirecting the lowest performing article (in terms of metrics) to the one ranking highest, and see what happens.

Maybe you’ll get that bump to Position 1. Sadly, there are no guarantees, whatever route you choose – but as a backup, you could always preserve the old content and undo the redirect if it fails to push any of your other 3 posts higher. 

Q: All of my pages have the same general layout, banner, menu, terms, etc., so apps tell me I have duplicate content, but the core content is different (different blog posts, etc.) – all pages just share common elements. Is that a problem?

A: That sounds like most webpages on most websites, where you have unique main content but similar layouts and elements. Doesn’t sound like a problem, unless you have the same content on multiple pages of your site; multiple URLs serving the same content; or if you have content that was stolen from other websites.

 Q: If we keep the URL the same but refresh the copy, do we need to also keep the metadata the same? 

A: If you can, make it better. Optimize it. Make sure the user benefit is clear. 

Q: When you say rewrite – how much of it is rewritten? 

A: However much needs to be rewritten. It could be as little as changing a few dates, or it could be a more substantial overhaul. Typically, when we do a rewrite, it’s 95-100 percent different.

Q: I work for a nonprofit. Should we remove dates from our video pages and our story pages? They are evergreen and probably don’t really need dates.

A: You can. Dates aren’t essential. 

Q: What are the main KPIs that you analyze when you evaluate a new written piece of content? How can you measure quality?

A: Typically I focus on the total number of pageviews, rankings, organic traffic, link equity (Trust Flow/Citation Flow), and social shares. These metrics tell you whether the audience responded to it.

Content is just the format you use to deliver an idea or concept. The better the idea/concept, the better the content will be. Data is the only way to measure quality, so choose the metrics that matter most to you.

Q: Very targeted and specific content on a blog that does not have lots of readers, is that quality content still? What is the minimum # of readers?

A: This will vary. Typically, I use the 80-20 rule as a starting point. Meaning, look at the bottom 20 percent of your content first. This is typically content that just didn’t work for whatever reason. So it won’t start magically working for you today or a year from now.
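The 80-20 starting point is easy to operationalize: sort your posts by pageviews and pull the bottom 20 percent for review. A minimal sketch (the post data below is invented for illustration):

```python
def bottom_fraction(posts, fraction=0.2):
    """Return the weakest `fraction` of posts by pageviews - first review candidates."""
    ranked = sorted(posts, key=lambda p: p["pageviews"])
    cutoff = max(1, int(len(ranked) * fraction))
    return ranked[:cutoff]

# Invented data: ten posts with steadily rising traffic.
posts = [{"url": f"/post-{i}", "pageviews": i * 100} for i in range(1, 11)]
review_first = bottom_fraction(posts)  # the two weakest posts
```

Each URL this surfaces still deserves a human look – some low-traffic pages earn their keep through links, conversions, or historic value.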

Q: How often should optimized content be checked? I assume it takes some time until the change has an impact. Is there a recommended timeframe for when to update content?

A: If it’s evergreen, check it either quarterly or yearly. Quarterly whenever you’ve made any changes and you want to assess if they’re working or not.

Yearly for everything. It’s also a good idea, when a major algorithm update is announced or highly suspected, to check whether any of your pages’ rankings or traffic dropped. You might notice some patterns you can address.

Q: Why would you change multiple variables on a content page – like your Paypal example – rather than push it out incrementally and review? Could the Paypal anomaly be related to a lack of topical authority…or Your Money or Your Life ?

A: We had successfully changed multiple variables before. Our goal is always to put out the best piece of content we can.

In all the examples I shared (aside from the content refresh) plus dozens of others, we had changed authors, URLs, and content. And it had worked for us. It could be a question of topical authority, but that wouldn’t explain HubSpot ranking #1. 

Q: If you have a topic that keeps updating year after year, would you not recommend putting the year (i.e. 2019) in the URL? If the date were in the URL, it can help with optimization for the search query, but every year, we’d have to 301, and I know we don’t want a lot of 301 redirects. What do you recommend?

A: Having the year in the URL is an intent signal. We actually found that putting the year in the URL was detrimental to rankings after a certain point because it would rank well, but then lose all its value once the new year comes around.

Whenever possible, I’d recommend an evergreen/durable URL so you can keep building authority for one URL year after year. You can more easily change the headline/title tag/meta info to include a year.

Q: I’ve heard that 750 words is a criterion for blogs. Is that the case, or is that more myth? And what word count would you suggest, as for some topics getting content to 1,500 words is really hard?

A: Figure out what query you want to rank for. See what the average length is of the posts on Page 1 for that query. There is no ideal or perfect word count that works in all instances. It varies by query.

I’ve seen posts with 300 words outperform posts with 3,000 words. Really, you should write for your audience. Entertainment content is more fluffy/shorter than say marketing content. 

Q: So when you updated a dated piece, you just rewrote the Title and updated the content?

A: Yes. Titles can always be improved. And we test our headlines as well. We try to always test out 3 headlines and pick the winner based on engagement (clicks). 
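Picking the winner among headline variants comes down to comparing click-through rates. Here’s a hedged sketch of that comparison; the variant names and numbers are invented, and a real test would also want enough impressions per variant for statistical confidence:

```python
def pick_headline(results):
    """Pick the headline variant with the highest click-through rate.

    `results` maps headline -> (impressions, clicks).
    """
    return max(results, key=lambda h: results[h][1] / results[h][0])

# Invented test data for three headline variants.
variants = {
    "How to Do a Content Audit": (1000, 30),
    "Content Audits: The Ultimate Checklist": (1000, 55),
    "Why Your Site Needs a Content Audit": (1000, 41),
}
winner = pick_headline(variants)
```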

Q: Were there other paid campaigns built around high performing pages to help them rank organically?  

A: No. We only do paid campaigns for our ebooks and sponsored content; they weren’t part of this case study.

Q: So, when you’ve got a couple of below-average pages and merge them into one better page with a better slug, what do you do with the old page URLs? Can you 301 several pages to the same new URL?

A: Exactly.

Q: Would you apply this to all areas of content? Such as a health care blog article on “what is an explanation of benefits,” where the information hasn’t changed?

A: If the info hasn’t changed, but you aren’t ranking, getting some fresh traffic and engagement to the topic could be all you need to help you gain visibility.

The example I used was subdomains vs. subfolders, which is a topic that’s as talked about and debated today as it was 10 years ago.

Remember: not everyone in your audience will read everything you publish every day. They may miss it.

So republishing content can give it a second chance and also open it up to new people who weren’t following you a year ago. And even if people read it a year or longer ago, it’s highly likely they will have forgotten they read it. 

Q: If your site has a listing section of businesses – these don’t generate that much traffic. Would you advise to dump this?

A: If the goal is solely to generate traffic, and it’s not doing that, then there’s a solid argument to dump it, yes.

[Video Recap] Content Cleansing: Why & How You Must Improve or Remove Old Content

Here’s the video recap of the webinar presentation and Q&A session.



Image Credits

All screenshots taken by author, March 2019

Join Us for Our Next Webinar!

Join our next webinar on Wednesday, April 10 at 2 p.m. EST as Julia McCoy, CEO of Express Writers, shares the most common bad SEO content tactics you should absolutely avoid doing and smart solutions to help you avoid them for good.

The Do Nots of SEO: 7 Terrible, No-Good SEO Tactics to Abandon Forever

 






How to make your website ADA-compliant and win at SEO
30-second summary:

  • The Americans with Disabilities Act (ADA), passed in 1990, now covers mobile apps and websites.
  • An ADA-compliant website helps more people than those covered by the ADA.
  • There are many SEO benefits, such as increased visibility in Google image search and featured snippets.
  • Co-founder of Ally Digital Media, Abhishek Shah, says, “Responsive websites help with ADA compliance and further improve your website’s overall search presence.”
  • The four best ways to make your website ADA-compliant, with a clear outline of the ADA and SEO benefits of each.

The Americans with Disabilities Act (ADA), passed in 1990, now covers mobile apps and websites. Specifically, Title III of the ADA takes an official stand on how websites should be made accessible to disabled users. However, when you look at what’s necessary to make a website ADA-compliant, you will see that the same work also helps improve your site’s SEO.

Some elements, such as title tags, heading structure, alt text, and responsive design, are things all websites should include. Ensuring these are done properly, and in an ADA-compliant way, will maximize your website’s effectiveness.

How ADA accessibility prioritization benefits everyone

Ensuring your website complies with the ADA helps you serve a larger audience and gives a boost to your search engine rankings. This is because most of the necessary components of making your website ADA compliant feed directly into SEO best practices.

After all, the whole point is to make your website easier to view, understand, and navigate. What business doesn’t want all that for their website?

Four ways an ADA-compliant website helps improve your SEO

Here are 4 ADA-compliant must-haves (in no particular order) that will help improve your SEO. This list is by no means comprehensive, but it is a good place to start.

1. Title tags help screen searches and readers

Title tags are very basic SEO. They let readers and search engines know what a page is about. A title tag doesn’t show up on your website itself; rather, it appears on the search engine results page and in the tab at the top of your web browser.

SEO benefits

Title tags, while basic SEO, are very important. This tag needs to match your user’s intent. For example, when someone googles “best phone,” the phrase “best phone” (or a variation like “best smartphone”) should appear in the title tag.

Writing a title that accurately reflects what the page is about is the best way to get found and clicked on. It’s why a title tag should be specific: “The best Android phones for 2020” is far better than “Why you will want to buy one of these phones.”

ADA benefits

For those who need screen readers to help them use a computer, a specific title tag such as the above example is much more user-friendly. So, it is vital the title tag accurately reflects the page content.

The accessibility guidelines say the title should be “The best Android phones for 2020” instead of “Why you will want to buy one of these phones.”
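In markup, that simply means a page-specific title element in the document head; a minimal sketch using the example title above:

```html
<head>
  <!-- Specific and intent-matching, so screen readers and SERPs announce it usefully -->
  <title>The best Android phones for 2020</title>
</head>
```

Screen readers announce the title as soon as the page loads, so this one element does double duty for accessibility and search.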

2. Descriptive alt text

Alt text is not the same thing as a caption. A caption is visible, usually beneath an image, whereas alt text is not visible on the front end of the site. Alt text is a written alternative to a page’s visual elements (.jpeg, .png, and .gif files): a description of an image that lives in the HTML behind the page.

SEO benefits

Alt text lets search engines know the subject matter of an image. It also helps search engines to better understand the page. Additionally, if you want images to show up in Google, then writing descriptive alt text is a must-have.

ADA benefits

For web users with visual impairment using screen readers, descriptive alt text is read aloud. This helps a visually impaired reader get a better sense of what’s going on, on any given page.

A useful descriptive alt text might be: “woman at café with laptop drinking coffee” 

A useless alt text would be: “SEO tips for freelancers | Get more clients with SEO | Writing your way to success with SEO”
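In markup, the contrast between those two examples looks like this:

```html
<!-- Descriptive: tells a screen-reader user what the image actually shows -->
<img src="cafe.jpg" alt="Woman at a café with a laptop, drinking coffee">

<!-- Keyword-stuffed: useless to screen readers and to search engines alike -->
<img src="cafe.jpg" alt="SEO tips for freelancers | Get more clients with SEO | Writing your way to success with SEO">
```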

3. Responsive design

Responsive design has been around in one form or another since 2012/2013. But it means more than your website adapting to whichever screen size it finds itself on.

It’s about where your logo sits, how easy your site is to navigate, how easy it is to read, and how quickly it loads.

SEO benefits

Websites that offer a good, functional user experience rank better in search results. User experience isn’t just one ranking factor but an umbrella term for quite a few. Google has said that mobile pages should load in under three seconds; pages that take longer are at a disadvantage in the rankings.

How easy content is to read (and how useful it is) is also an important ranking factor.

ADA benefits

Good responsive design puts the user first. It starts from the premise that a website needs to be easy to look at, easy to navigate, and easy to understand.

This is why you need legible text for the visually impaired. As well as quick load times for people with slow internet. And straightforward navigation to make it easy for people to get around your website.

4. Proper heading (and subheading) structure

Headings (which show up in the code as <h1> or <h2> or <h3> etc.) define your content’s hierarchy. These headings (and subheadings) work along similar lines to when you wrote essays in school.

Proper heading structure:

  • Goes in order: an h3 doesn’t come directly after an h1.
  • Describes the copy beneath it.
  • Follows a sequence: if your h2 is “4 ways…”, then the h3s beneath it are each of those points.
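The “goes in order” rule can be sketched as a small check over the heading levels extracted from a page in document order (the function is illustrative, not a standard API):

```javascript
// Sketch: verify that heading levels never skip on the way down,
// e.g. an h3 must not come directly after an h1.
// `levels` is the page's heading levels in document order, e.g. [1, 2, 3, 2].
function headingOrderIsValid(levels) {
  let prev = 0; // nothing seen yet, so the document must open with an h1
  for (const level of levels) {
    if (level > prev + 1) return false; // skipped a level in the outline
    prev = level;
  }
  return true;
}
```

Jumping back up (an h2 after an h3) is fine; only skipping levels on the way down breaks the outline that screen readers and crawlers rely on.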

SEO benefits

When your writing is clearly structured, it is easier to read and easier to follow. It’s also easier for Google to crawl your content and understand what is most important (starting with the h1, and so on).

Good heading structure can also help your content appear in featured snippets on the search engine results page (SERP).

ADA benefits

For users who have limited reading comprehension or cognitive impairments, clear and direct headings make it easier to read. Headings and subheadings let a reader know what’s worth reading and what’s worth skipping over.

And just as a reader skips headings, so can a screen reader, which only reinforces the need for a strong, clear heading structure.

An example of a website that has both good SEO and is ADA compliant is Enviro Safety Products. When you review this site you will see it ticks all the boxes, and provides the user a seamless, friendly experience.

[Image: the Enviro Safety Products homepage, an example of an ADA-compliant, SEO-friendly site. Source: Enviro Safety Products]

How making your website ADA compliant will help you win at SEO

By applying all the necessary ADA compliant elements to your website, you are helping the one in four Americans with a disability use your website. Additionally, you will also greatly enhance your website’s SEO.

If you would like to know more about how making your website ADA compliant will help you win at SEO, you can throw questions in the comments section below.

Abhishek Shah is the co-founder of Ally Digital Media, a leading voice in digital media and marketing. He advocates for evidence-based marketing strategies to fuel the businesses. He can be found on Twitter @abiishek.






How to drive digital innovation necessary during the pandemic

30-second summary:

  • COVID-19 has kept consumers in their homes, which has led to significant spikes in internet use and companies scrambling to digitize in order to meet customers where they are.
  • The ability to quickly develop digital capabilities will continue to be critical for meeting customer needs and ensuring organizations’ survival.
  • To remain competitive, companies must enhance the digital customer experiences they offer through upgraded social media, optimized conversion strategies, better marketing research, an effective internal website search, and fresh customer touchpoints.

Emerging digital technologies like artificial intelligence (AI) and cloud computing enticed leaders with their agility and efficiency. Many companies planned to make digitization a goal for the new decade.

In hindsight, they probably wish they hadn’t waited.

The novel coronavirus upended every aspect of our lives. As businesses and governments around the world try to combat the pandemic, millions of consumers sit inside their homes. And where do people go during a government-mandated lockdown? Online.

The unprecedented shift to remote work and online learning, combined with a dramatic increase in movie streaming, videoconferencing, and social media traffic, has led to significant spikes in internet use. In this same time frame, big tech companies — the businesses at the forefront of digital innovation — have flourished, as have brands that capitalized on the power of social media engagement.

The biggest trick to digitization right now is meeting customers where they are. For example, my company, Teknicks, is working with an online K-12 speech and occupational therapy provider. When schools began transitioning to remote learning, students’ needs changed, too. We helped the provider pivot its value proposition and messaging to accommodate school districts’ new realities. By focusing on teletherapy tools and reassuring parents, we’ve seen substantial growth and brand recognition during the pandemic.

Until we find a vaccine for the novel coronavirus, your customers will likely engage with you through online channels. The ability to develop digital capabilities quickly will continue to be critical for meeting customer needs and ensuring survival for your organization. With that in mind, here’s how you can enhance your digital customers’ experiences:

1. Upgrade your social media

It’s not hard to be good at social media marketing — it’s hard to be great. As you build your audience on websites like Facebook and Instagram, be sure to engage with followers consistently. Create a content calendar mapping out your posts and sharing strategies and stick to it. These platforms are also a great channel for customer service, allowing you to provide personalized support and become instantaneously useful (something that customer support tickets and chatbots never seem to be).

If you already have a sizable engaged audience, it’s time to work on your content strategy. Don’t build your content strategy around keywords. Instead, focus on your audiences’ needs. A truly effective content strategy will be customized for the platform you’re on and will account for the user behavior most characteristic of that platform. Naturally, you will use keywords and phrases that are optimized for discoverability while maintaining authenticity.

One key strategy is to conduct marketing research using a survey. This tactic goes well beyond traditional keyword research and generates content ideas directly from your targeted audience, not a keyword tool. Surveying your prospective customers allows them to tell you what type of content they want to consume, significantly increasing the likelihood of engagement. Often, this strategy is the key to successful marketing strategy. I’ll go into more detail below.

2. Focus on and prioritize conversion optimization

Ideally, your website looks good and loads quickly, but those qualities alone don’t make a website great. The user experience that your website offers is ultimately what determines whether customers bounce in droves or actually stick around. Attempting to boost your initial traffic will exponentially increase customer acquisition costs, so improving your conversion rates via website optimization is a more affordable (and profitable) solution.

We often see double-digit increases in conversion rates on our first test. We typically focus on the most trafficked pages to increase the likelihood of big, impactful wins. There is an entire science behind conversion optimization, but the core fundamentals have remained the same for years.

To make sure your website’s architecture is seamless and intuitive, develop a conversion rate optimization strategy that works for you. This will require you to ask visitors for feedback, experiment with different messaging options, and regularly review your analytics, among other things. The idea is to get to know your visitors well. It takes work, but it will pay off over time as the incremental conversion rate increases impact top-line revenue.

3. Conduct marketing research surveys

With the right insights, you can turn every engagement into a memorable and valuable experience for both you and your customers. The best way to get customer insights is to ask. Design a survey of up to 10 questions in a variety of formats along with some screening questions to make sure the feedback you get is actually useful.

When designing, consider your potential customers’ preferences and pain points. For example, if you know your audience is mostly on Instagram, asking “What do you like about social media?” won’t be as effective as “What makes Instagram posts better than Facebook posts?” Once the survey’s drafted, post it to your social channels and send it out to your mailing list. You want to understand which messages resonate with your audience before you spend a cent on marketing. Learning how to conduct marketing research is one of the most important marketing skills you can attain.

Asking individual customers how they feel about various messaging options can give you a goldmine of useful data to help inform the language and design choices you make. Not every customer will choose to participate in a survey, but some will. Show them you appreciate their input by offering a small discount or another incentive once the survey is completed. You’ll be surprised by how many responses you get and how beneficial the precursory information is.

4. Review your internal website search

As much as you’d love for every visitor to spend hours exploring every nook and cranny of your website, most will want to get on with their lives after they’ve found what they came for. To make the process faster, you should offer some sort of internal website search functionality. If you don’t already have one, add a search box to your navigation menu.

Not every website has one, and even the ones that do often offer very surface-level functionality. However, search bars are a valuable asset that can increase internal sessions and conversions. Internal website searchers are 216% likelier to convert, according to WebLinc.
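At its simplest, an internal search box is just a GET form pointing at whatever endpoint powers your site search (the /search path below is an assumption; substitute your own):

```html
<form action="/search" method="get" role="search">
  <label for="site-search">Search this site</label>
  <input type="search" id="site-search" name="q">
  <button type="submit">Search</button>
</form>
```

Using a GET form also means every search lands in your analytics as a query parameter, which is exactly the behavioral data discussed below.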

Evaluate the effectiveness of your internal search, taking notice of how it finds and organizes content after a search. Most native search functionality is very basic and just looks for the literal presence of the search term, but you may want to test out more advanced filters that help users find the information they are looking for more effectively.

I recommend looking at the search data monthly to see what users have been looking for. Be sure to review which searches yielded zero results and which brought up irrelevant content. Identify areas that can be improved, and understand the content gaps where additional content is needed to support the demand.

5. Identify new customer touchpoints

Innovation is all about using new technology to improve old processes. While your typical customer journey might depend on your industry and business, chances are good that you can find ways to enhance it with emerging technologies.

Evaluating whether an emerging technology is a fit for your business and whether you should invest in testing it out, starts with (drumroll …) a survey. As we discussed earlier, surveys can answer just about anything you want to know about your target audience. Go ahead and ask your audience if they own or use the emerging tech and validate its place in the customer journey.

Take the new home buying process, for example. David Weekley Homes, the largest privately-held home builder in the U.S., wanted to better understand whether voice-enabled devices can play a role in the customer journey. The company also wanted to propose a voice app idea to the audience and understand how they felt about the emerging technology concept. By conducting a survey, we uncovered that 81% of the respondents would consider the voice app idea to be somewhat to extremely valuable and 70% would possibly to definitely use the voice app if it existed.

The increasing usage of voice search and voice-enabled devices also offers an opportunity for consumer brands to make it easier than ever for customers to find their products. Tide, for example, has capitalized on marketing on Amazon’s Alexa Skills platform to remove a step from the purchasing process. Customers can use the company’s skill to order Tide products without having to pull up the Amazon app or go to the Tide website. In that way, new tech makes an old process (purchasing detergent) more frictionless than ever.

The COVID-19 pandemic has made digital innovation a business imperative. Regardless of your industry, you should look for ways to anticipate and meet customer needs. Your customers expect a seamless digital experience. If you can’t provide it, they won’t have to leave their homes to find someone else that can.

Nick Chasinov is the founder and CEO of Teknicks, a research-based internet marketing agency certified by Google in Analytics, Tag Manager, and a Google Premier AdWords partner.




Core Web Vitals, E-A-T, or AMP?


30-second summary:

  • The biggest Google update of the year is called the Page Experience update.
  • Core Web Vitals are part of that update, and they are definitely ranking factors to keep in mind, especially when optimizing images.
  • AMP is no longer the only way to get a “Top Stories” feature on mobile. Starting in 2021, any news webpage can become a “Top Story”.
  • Combining AMP’s privacy concerns and cost of operation might mean that AMP will disappear within a couple of years.
  • E-A-T is not a ranking factor right now, and we don’t know if it will become one in the future.

2020. What a year. History is happening around us, and Google? Well, Google keeps on revamping its search algorithms. Over the years, there have been many major algorithm updates, as Google worked to keep us on our toes. 2020 was no different: in one fell swoop, we got the news about the Page Experience update and AMP. All the while, the debate about whether or not you need E-A-T to rank rages on. Will Core Web Vitals change the search game in 2021?

Let’s go over each of these innovations and see which will change the way we do SEO, and which will fade into obscurity sooner rather than later.

1. Importance of core web vitals for SEO

Core Web Vitals were part of the Page Experience update and, by far, caused the biggest ruckus.

There’s a lot to learn about Core Web Vitals, but they boil down to the three biggest issues on our webpages:

  1. LCP — Largest Contentful Paint, which measures the loading speed of the largest single object on the page.
  2. FID — First Input Delay, which measures how long the page takes to react to the first user input (whether a click, a tap, or a key press).
  3. CLS — Cumulative Layout Shift, which measures how much the page’s content jumps around while it loads, mostly due to visual content.
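Each of the three metrics comes with published “good” / “needs improvement” / “poor” thresholds (2.5 s and 4 s for LCP, 100 ms and 300 ms for FID, 0.1 and 0.25 for CLS). A minimal sketch of bucketing a measured value against those thresholds:

```javascript
// Google's published Core Web Vitals thresholds: [good-up-to, poor-above]
const THRESHOLDS = {
  lcp: [2500, 4000], // milliseconds
  fid: [100, 300],   // milliseconds
  cls: [0.1, 0.25],  // unitless layout-shift score
};

// Classify one field measurement into the three standard buckets.
function rate(metric, value) {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return "good";
  if (value <= poor) return "needs improvement";
  return "poor";
}
```

In the field you would feed this from real user measurements (for instance, values reported by Google’s web-vitals browser library); the bucketing logic itself is the same one shown in Search Console’s Core Web Vitals report.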

How core web vitals influences rankings

Of course, some SEO experts think that the entire Page Experience update is nothing special, and could even: “[…] distract, […] from the core mission of communication and storytelling,”.

And, sure, most of the Page Experience update is simply an assembly of things we’ve known for a while: use HTTPS, be mobile-friendly, control your page speed, and so on.

But Core Web Vitals are a bit different and can influence SEO practice in unexpected ways. A key factor that’s already changing rankings is Cumulative Layout Shift.

As most SEO experts know, for a while an important part of image optimization was adding the decoding="async" attribute to <img> tags to avoid losing page speed while rendering the page.

Using decoding="async" could lead to some seriously janky pages if developers didn’t also specify the height and width of every image to be rendered. Some websites did, of course; Wikipedia, for example, predefines the space for images on most of its pages.

[Image: a Wikipedia article, which reserves space for its images ahead of time]

But as SEO experts we didn’t have to worry about pages being jumpy all too much, as that didn’t influence the rankings. Now with CLS being formally announced as a ranking factor, things will change for a whole slew of websites and SEO experts.

We’ll need to make sure that every webpage is coded with CLS in mind, with the needed space for every image defined ahead of time, to avoid the layout shifts.
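In practice, that means declaring intrinsic dimensions on every image so the browser can reserve the space before the file arrives; a minimal sketch (filenames are placeholders):

```html
<!-- width/height let the browser reserve a 640x360 box before the image loads,
     so decoding="async" no longer causes the layout to jump -->
<img src="chart.png" alt="Monthly traffic chart" width="640" height="360" decoding="async">
```

Modern browsers compute the aspect ratio from the width and height attributes even when CSS later scales the image, so the reserved box stays correct on responsive layouts.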

The verdict

Overall, of course, it’s too early to tell, and more work by SEOs around the web needs to be done here. However, it seems that if you aren’t used to focusing on technical SEO, Core Web Vitals becoming ranking signals might not influence your day-to-day work at all.

However, if you are conducting complicated technical SEO, then Core Web Vitals will definitely change the way you work in as-yet unexpected ways.

2. Importance of AMP for SEO

AMP’s relevance today is something of an open question. While it has always been great as a quick-and-easy way to increase page speed, privacy concerns have been voiced over and over again since the technology’s very inception.

But in 2020, significant changes are afoot: within the same Page Experience update, Google announced that AMP pages are finally no longer required to occupy the “Top Stories” SERP feature.

That’s a pretty huge step for anybody trying to accrue as many SERP features as they can, and, in particular, for news websites.

[Image: the “Top Stories” feature dominating a mobile SERP]

How AMP influences rankings

If we take John Mueller’s word for it, AMP is not a ranking factor. Seems plain and simple enough. But of course, things aren’t so simple, because AMP comes with pretty significant gains in page speed, and speed is an important ranking factor.

Thanks to AMP’s pre-rendering combined with some severe design limitations, AMP webpages often really do win in page speed, even if not in ranking as is.

The “Top Stories” SERP feature, however, was a huge benefit to using an AMP for any news agency with a website, and it’s easy to understand why. Just look at how much of the page is occupied by the “Top Stories” results.

Not only do “Top Stories” results automatically rank at the very top of the SERP, they also sport the logo of the website posting them, standing out even more from the boring old blue-link SERP.

This means that for a few years now news websites were essentially forced into using AMP to get into a “Top Stories” SERP feature on mobile since it absorbs a whole lot of clicks.

On the other hand, it takes quite a lot of resources to support AMP versions of the webpages, because you are basically maintaining a whole additional version of your website.

Added to which, a page that’s been properly optimized for speed might not need AMP for those speed gains, as well.

The verdict

While it’s tough to imagine that AMP will fade away completely within the next couple of years, AMP’s privacy issues combined with the cost of maintaining it might spell the end of it being a widely used practice.

Now, with the “Top Stories” becoming available to non-AMP pages, there’s virtually no reason to jeopardize the users’ security for speed gains you could get by proper optimization.

3. Importance of E-A-T for SEO

Expertise. Authority. Trust. All perfectly positive words and something we should all strive for in our professional lives. But what about search optimization?

Coming straight from Google’s Quality Rater Guidelines, E-A-T has been the talk of the town for a good moment now. Let’s dive in and see how they might change the way we optimize for search.

How E-A-T influences rankings

For most of us, they don’t really.

Sure, the Quality Rater Guidelines provide valuable insights into Google’s ranking process. However, E-A-T is one of the less important factors to focus on, partly because these are nebulous, abstract concepts, and partly because Google doesn’t exactly want us to.

As Google’s official representatives informed us, E-A-T is not in itself a ranking factor.

Fielding follow-up questions, Google’s John Mueller reiterated that point, and Ben Gomes, Google’s VP of search engineering, confirmed that quality raters don’t influence any page’s rankings directly.

However, in practice, we often see that the so-called YMYL websites already can’t rank without having some expertise and authority established. A very popular example is that it’s virtually impossible to rank a website providing medical advice without an actual doctor writing the articles.

The problem here is that expertise, authority, and trustworthiness are not easily interpreted by the search algorithms, which only understand code.

And, at the moment, there seems to be no surefire way for Google to transform these signals into rankings, except to read the feedback of their quality raters before each algorithm update.

The verdict

While using E-A-T to rank websites might sound like an inarguable benefit for the searcher, there are a couple of concerns that aren’t easily solved, namely:

  1. Who exactly will be determining the E-A-T signals, and according to which standard?
  2. The introduction of such factors creates a system where the smaller and newer websites are punished in rankings for not having the trustworthiness that they couldn’t realistically acquire.

Responding to both of these concerns requires time and effort on the search engine’s side.

As things stand right now, E-A-T is not something to keep in mind while doing day-to-day SEO operations.

Let’s imagine a fantastical scenario where a webmaster/SEO expert has some free time. Then they might want to work on E-A-T, to try and stay ahead of the curve.

On the other hand, there simply isn’t any proof that Google will actually use E-A-T. Or that, even if used, these signals will become major ranking factors. For this reason, E-A-T shouldn’t be your priority ahead of traditional SEO tasks like link building and technical optimization.

Additionally, consider this. The entire Quality Rater Guidelines is 168 pages long. However, a comprehensive explanation of what E-A-T is and why it might be calculated a certain way will take many more pages than that.

Conclusion

As of the time of this writing, the Core Web Vitals seems to be the most important ranking news to come out in 2020 in practical terms. However, search is an extremely volatile field: what worked two weeks ago may not work today, and what works today might not work for most of us.

The matters are further complicated because we’re fighting an uneven battle: it’s simply not in search engines’ best interest to give us a full and detailed picture of how ranking works, lest we abuse it.

This is why it’s crucial to keep our hand on the pulse of optimization news and changes occurring every single day. With constant efforts from our SEO community to work out the best way to top rankings, it’s possible for us to close that gap and know for sure which trends are paramount, and which we can allow ourselves to overlook.

Aleh Barysevich is Founder and CMO at SEO PowerSuite and Awario.




