
12 Completely Outdated SEO Practices You Should Avoid

SEO has gone through extensive evolutionary changes over the years, and continues to do so every day.

While most traditional marketing tactics still hold true in digital marketing today, changes to SEO have drastically reshaped the landscape.

Most, if not all, of these changes have helped improve the web – and search, in particular.

Yet, some people still cling to the “old ways” and try to use outdated SEO practices to improve their brand’s organic search visibility and performance.

Some of the tactics worked a few years ago, but now just aren’t as effective as they used to be.

Yet many novice marketers and/or small business owners are still using these “zombie” SEO techniques (tactics that should be dead, but aren’t for some godforsaken reason).

Not only are they ineffective, but many of the 12 outdated SEO practices below are potentially dangerous to the well-being of your brand, websites, and other digital properties.

1. Keyword Abuse

There are so many ways webmasters and “marketers” continue to misunderstand keywords’ role in general SEO initiatives, and how they should be used in the day-to-day strategy.

Let’s take a more granular look at specific types of keyword abuse and mismanagement, including irrelevant usage, writing for a specific keyword density, and keyword stuffing.

Irrelevant Keyword Targeting/Confusion

All too often, novice SEO practitioners try to fit their content and messaging within the confines of their keyword research (and not much else).

These “marketers” shape content and its metadata around keywords the content isn’t properly aligned with, ignoring the actual intent of the users searching for the high-volume keywords being targeted.

As a result, brands will likely lose readers’ attention before ever having the chance to communicate a real message to them.

If the keywords being targeted don’t align with the content on the page, that disconnect will hinder the content’s success even if it’s otherwise of good quality.

Don’t try to mislead users by directing them to content that is misrepresented by high-volume keywords just to increase visibility.

Google knows what this looks like, and it can truly be defined as an obsolete SEO practice (as well as a “black hat” technique, in many instances).

Keyword Density

Writing for a specific “keyword density,” like many keyword-focused marketing tactics, is just missing the mark.

Google no longer depends on keyword density (the ratio of a specific keyword’s usage to the overall page copy; for example, a keyword used 10 times in 500 words of copy has a density of 2%) to determine whether a webpage is an effective source for answering a search query.

Search is so much more advanced than simply crawling for keywords; search engines like Google use a multitude of signals to determine results.

While keywords remain important to the topics and ideas they represent, they are not the lifeline for ranking for high-value search queries.

The quality of content and how the messaging is delivered are the lifeline for that.

Keyword Stuffing

This is probably the oldest trick in the book.

SEO is about keywords, right?

So, loading up our webpages with keywords — especially the same high-value keyword we are aggressively targeting throughout the website — is going to help us show up higher in search, thus outranking our competition?

Absolutely not.

Search engines have long known what keyword stuffing is and what kinds of text combinations are unnatural. They flag these as attempts to manipulate search results and demote the content accordingly.

Yes, there may still be valuable content that uses simple keyword stuffing, either intentionally or unintentionally, that is not demoted because of its actual value to users.

Back in the day, webmasters trying to game the system would go as far as putting every variation of a high-value keyword in the website footer or, even more sketchily, making those keywords the same color as the site’s background, effectively hiding them from humans but not from search engine crawlers.

Webmasters have also tried this with links. (Don’t do anything like this.)
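To make the pattern easier to recognize, here is a rough illustration of what that hidden-text trick looked like (hypothetical markup, shown only so you can spot it, never so you can use it):

```html
<!-- Outdated "hidden text" keyword stuffing: white text on a white background.
     Search engines detect this and treat it as spam. Do not do this. -->
<div style="color: #ffffff; background-color: #ffffff;">
  cheap widgets best widgets buy widgets widget deals widgets online
  <a href="https://www.example.com/widgets/">best cheap widgets</a>
</div>
```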

Remember, you’re writing for humans, not search engines.

2. Writing for Robots

It’s important to understand that unnatural writing is, well, not natural.

And search engines know it.

The belief is that writing for the web means we should repeat a subject by its proper name every time it is mentioned, working in variations and singular/plural versions of the word so that “all bases are covered.”

When the page is crawled, crawlers see the keyword repeated in several different versions, supposedly leading the page to rank well for the keyword variations used (over and over … and over again).

This isn’t going to work anymore.

Search engines are advanced enough to understand repeated keywords, their variations, and the unfavorable experience of generally bad content.

Write for humans, not search engine crawlers or any other robot.

3. Article Marketing & Article Directories

Attempts to game the system don’t usually work out in the world of SEO.

But that doesn’t stop people from trying.

Especially when those tactics once offered noticeable improvements to a brand, its website, and/or its associated digital properties.

Sure, article directories worked. And they worked pretty darn well for a long time, too.

Commonly considered one of the earliest forms of digital marketing, article syndication was low-hanging fruit for those in the know. And it made sense, since the idea was similar to channels like TV and print that already used syndicated content regularly.

But Google eventually caught on, unleashing its game-changing Panda update in 2011.

Panda chewed up the search landscape, targeting content farms and directories, as well as other websites offering crap content (whether it was simply bad or false, horribly written, nonsensical, or stolen from someone else).

The idea behind article marketing doesn’t make sense in today’s world, where your high-quality content needs to be original and demonstrate expertise, authority, and trustworthiness.

4. Article Spinning

Typically done with software, article spinning is the black-hat tactic of trying to recreate quality content using different words, phrases, and organization.

Essentially, the end result was a garbled mess of an article that made the same points as the source material.

It’s no surprise this isn’t effective anymore.

While AI is getting better all the time at creating content, anything generated by a machine is still of a lower quality than what a human can produce – something original, helpful, and of substance.

5. Buying Links

This one is still biting webmasters many years later.

Like most SEO tactics, if it seems shady, you probably shouldn’t do it.

Buying links is no different.

Once upon a time, it was routine practice to quickly pay to get a high volume of links pointing at your site.

Now we know that backlink profiles need to be maintained and optimized just like the websites we oversee, and that too many backlinks from low-quality domains pointing to a website may be dangerous to its health.

Google can easily identify low-quality sites, and it will also identify when those sites are sending out an abundance of links they shouldn’t be.

Today if you want to legitimately help boost the authority and visibility of your website, you need to earn links, not pay someone to build them manually.

6. Anchor Text

Internal linking is a characteristic of any good site structure and user experience.

This is typically done with anchor text, the clickable text of a link (the HTML <a> element), which tells users what type of content they can expect if they click.

There are various types of anchor text (branded, naked, exact-match, website/brand name, page title and/or headline, etc.), but some have most certainly become more favorable than others, depending on the usage and situation.
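To make those distinctions concrete, here are a few common anchor text types in HTML (the URLs and brand name are hypothetical):

```html
<!-- Branded anchor: the brand name is the clickable text -->
<a href="https://www.example.com/">Example Co</a>

<!-- Naked anchor: the URL itself is the clickable text -->
<a href="https://www.example.com/">https://www.example.com/</a>

<!-- Exact-match anchor: the target keyword is the clickable text
     (the easiest type to over-optimize) -->
<a href="https://www.example.com/running-shoes/">best running shoes</a>

<!-- Page title / headline anchor -->
<a href="https://www.example.com/blog/running-shoe-guide/">The Complete Running Shoe Buying Guide</a>
```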

In the past, using exact-match and keyword-rich anchor text was a standard SEO best practice.

Since Penguin, Google has been better at identifying over-optimized content.

This goes back to the Golden Rule about producing well-constructed content that is user-friendly and natural.

If you’re optimizing for search engines and not humans, you’re likely going to fail.

7. Obsolete Keyword Research Tactics

Keywords have certainly gone through some drastic changes over the last five to 10 years.

Marketers used to have a plethora of keyword-level data at our fingertips, allowing us not only to see what worked well for our brands and what didn’t, but also to better understand idea targeting and user intent.

Much of this went by the wayside with the arrival of keyword “(not provided)”.

In the years following, tools popped up that tried to replicate keyword data. But to fully recreate it correctly is simply impossible.

And yet, even without that keyword data, marketers still need to do keyword research of their own to get an understanding of the industry, the competition, the geographic region, etc.

To do this, many marketers turn to Google’s free Keyword Planner. While the data in there has been subject to some scrutiny over the years, it’s a free Google-owned product that gives us data we previously couldn’t really come by, so many of us continue to use it (myself included).

But it’s important to remember what the data actually represents for keywords.

“Competition” in the Keyword Planner pertains solely to paid competition and traffic, so it is practically useless as the basis for an organic search strategy.

Some alternatives to this are the Moz Keyword Explorer tool and SEMrush’s Keyword Magic Tool, both of which are paid tools.

Google Trends is helpful for this type of competitive analysis, too, and it’s free.

8. Pages for All Keyword Variations

This was once a useful tactic to rank well for all the variations of high-value keywords targeted by your brand and its messaging.

Fortunately, algorithm updates like Hummingbird, RankBrain, and others have helped Google understand that variations of the same word are, in fact, all related to the same topic.

The best, most-useful content around these entities should be most visible due to the value it offers users on the topic, not just one variation of the word.

Aside from the fact that this will lead to brutal site self-cannibalization, it makes a website considerably harder to use and navigate since content will be so incredibly similar.

The negative user experience alone is reason enough not to do this. But the added fact that Google knows better than to overlook this practice makes it a no-brainer.

This tactic evolved and eventually helped lead to the inception of many content farms that were targeting traffic solely for their keyword value and visibility.

This was part of the “old way” of optimizing a website — for keywords and search engines, rather than for users and their intent.

9. Targeting Exact-Match Search Queries

The tactic of targeting exact-match search queries in hopes of ranking for those queries solely for the traffic numbers — and not because the search query or its answer actually pertained to the business optimizing for it — became a somewhat popular practice before the full deployment of the Google Knowledge Graph.

Marketers would strive to rank in the top spot for exact-match search queries to trigger a breakout box and an increased click-through rate for their sites.

10. Exact-Match Domains

Having high-value keywords in your URL makes sense. To some extent.

But when it becomes confusing or misleading (i.e., it results in a bad user experience), you have to draw the line.

One of the main best practices for domains is to keep them consistent with your brand.

Brand names should be short, concise, and somewhat meaningful.

Why wouldn’t you want the same from your domain?

Google used to value exact-match domains because, a long time ago, it made sense to use them as a ranking signal.

Behavioral data has since helped Google make changes like this (and many others) that are common-sense cleanup moves.

Run a good company and offer great products and/or services under the brand name, and Google will do the work of making your brand visible when it’s relevant to the people searching for it.

11. XML Sitemap Frequency

We should never try to manipulate search engine crawlers into crawling our website more often than others by making them believe new content has been published or substantial site changes have been made.

But since webmasters did exactly that in the past, the sitemap is now used quite differently than originally intended.

Previously, webmasters could assign each page of a website listed in the sitemap a priority number ranging from 0.0 to 1.0.

Since those values were rarely used correctly, crawlers no longer honor the priority or change-frequency ratings.

Instead, search engines just crawl the content they deem necessary to crawl.
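For reference, here is what those legacy hints look like in a sitemap entry (the URL and values are purely illustrative); Google has said it ignores the <priority> and <changefreq> values, while an accurate <lastmod> can still be used:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/sample-post/</loc>
    <lastmod>2019-08-01</lastmod>
    <!-- Legacy hints that crawlers no longer honor -->
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```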

Make sure you adhere to XML Sitemap best practices. Sitemaps are an incredibly important element for every website.

12. Bad Content

Face it. There was a time in our world when crappy content could still rank well.

Oh, how times have changed.

Stolen content, thin content, keyword-stuffed content, non-credible content — there was a time when all of this could get by search engine crawlers and regurgitated back to users as worthy results.

But no more.

We know what it takes to make quality content that is rewarded by search engines because they tell us what’s right and what’s wrong.

If you want to succeed at SEO today, you must do what’s right.




BrightLocal launches ‘Local RankFlux’ Google local algorithm tracking tool

BrightLocal has launched a new free tool called “Local RankFlux,” designed to alert marketers to changes in local search rankings across multiple industries.

Exclusively focused on the Google local algorithm, it offers tracking for 26 verticals. The ranking fluctuations of individual industries can then be compared to the overall sample.

Tracking over 14,000 keywords. Local RankFlux tracks roughly 560 keywords per industry vertical in 20 cities, according to BrightLocal’s blog post. It “plots the ranking position of each business in the top 20 search results and compares that ranking to the previous day’s position to determine the daily change.” 


Changes in higher SERP positions (e.g., 1 – 2) are weighted more heavily and are treated as more significant than changes in lower rankings (e.g., 19 – 20) in its scoring. “Local RankFlux then multiplies the change in position between today’s and yesterday’s rankings by the weighting to create a total daily fluctuation. This total is then converted into an average based on the number of keywords that returned meaningful results and a score produced for All Industries and for each individual industry.”

Scores above 6 suggest an update. BrightLocal explains that scores between 0 and 3 indicate nothing meaningful has happened, given that there are regular, even daily, fluctuations going on. Scores above 3 but below 6 indicate a minor change in the algorithm, according to BrightLocal, while scores of 6 to 10 suggest a local algorithm update. The spike in the chart below (industry average score of 6.1) on August 8 suggests a meaningful change in the algorithm.

Chart: Local RankFlux score, legal category vs. industry average (source: BrightLocal)

In early August, Google made a core algorithm update. But the last time there was a significant local impact was in August of last year (and possibly in June 2019, after another core update). In August 2018, SterlingSky’s Joy Hawkins detailed the ways in which her small business customers were impacted by that 2018 core algorithm update.

Why we should care. This free tool will be a useful way for local SEOs to reality-check their own observations against broader industry benchmarks and confirm whether there was indeed a local algorithm update. Informally, a number of local SEOs praised the tool based on early exposure.

Take a look and provide feedback on whether it aligns with your observations and experiences. And be sure not to miss SMX East’s full-day track on local SEO and location-based marketing for brands.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes about the connections between digital and offline commerce. He previously held leadership roles at LSA, The Kelsey Group and TechTV. Follow him on Twitter or find him on LinkedIn.



Google’s John Mueller on Where to Insert JSON-LD Structured Data

In the latest instalment of the #AskGoogleWebmasters video series, Google’s John Mueller answers a common question about JSON-LD structured data.

Here is the question that was submitted:

“Is it possible to insert JSON structured data at the bottom of the <body> instead of the <head>? It seems to work fine for many websites.”

In response, Mueller says “yes.” JSON-LD structured data can absolutely be inserted in either the head or body of the page. Just as the person who submitted the question assumed – it will work fine either way.

JSON-LD can also be inserted into pages using JavaScript, if that’s what happens to suit your pages better.
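As a quick illustration, here is a minimal JSON-LD snippet (the organization details are hypothetical); per Mueller’s answer, the same script tag works whether it sits in the <head> or the <body>:

```html
<!-- Works in either the <head> or the <body> of the page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com/"
}
</script>
```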

What’s the Difference Between JSON-LD and Other Structured Data Types?

Before answering the question, Mueller gave a brief explanation of each type of structured data and how they’re different from each other.

There are two other types of structured data in addition to JSON-LD. Here are the differences between them, with a brief markup comparison below.

  • JSON-LD: A JavaScript notation embedded in a script tag in the page head or body.
  • Microdata: An open-community HTML specification used to nest structured data within HTML content.
  • RDFa: An HTML5 extension that supports linked data through additional attributes added to existing HTML tags on the page.
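For a sense of how these differ in practice, here is the same organization name from the JSON-LD example above expressed as Microdata and as RDFa (purely illustrative markup):

```html
<!-- Microdata: structured data nested into existing HTML via itemscope/itemtype/itemprop -->
<div itemscope itemtype="https://schema.org/Organization">
  <span itemprop="name">Example Co</span>
</div>

<!-- RDFa: the same data expressed with vocab/typeof/property attributes -->
<div vocab="https://schema.org/" typeof="Organization">
  <span property="name">Example Co</span>
</div>
```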

Although all of these types of structured data are acceptable to use, Mueller has gone on record saying Google prefers the use of JSON-LD.





Subdomain leasing and the giant hole in Google’s Medic update

ConsumerAffairs provides buying guides for everything from mattresses to home warranties. But they also direct consumers on purchasing hearing aids, dentures, diabetic supplies, and even LASIK surgery. Many have questioned the legitimacy of ConsumerAffairs buying guides, largely because top-rated brands often have financial relationships with the organization.

ConsumerAffairs’ health content has been hit in the post-Medic world, but now it seems they’ve found a way to circumvent the algorithm update by hosting slightly modified versions of their buying guides on local news websites around the country.

Google “hearing aids in Phoenix” and you’ll discover just how well this strategy is working. Local ABC affiliate station ABC15 hosts all of ConsumerAffairs’ buying guides, including those in the health category, on its new “reviews” subdomain. So far, I’ve counted almost 100 of these ConsumerAffairs content mirrors.

Despite cracking down on low-authority medical advice and subdomain leasing, Google seems to be missing this huge hack on its ranking algorithm.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About The Author

Abram Bailey, AuD, is a Doctor of Audiology and the founder of HearingTracker.com, the leading independent resource for informed hearing aid consumers.

