An SEO Guide to URL Parameter Handling

While parameters are loved by developers and analytics aficionados, they are often an SEO nightmare. Endless combinations of parameters can create thousands of URL variations out of the same content.

The problem is we can’t simply wish parameters away. They play an important role in a website’s user experience. So we need to understand how to handle them in an SEO-friendly way.

To do so, this guide explores what URL parameters are, the SEO issues they create, how to assess the extent of your parameter problem, and the solutions available to handle them.

What Are URL Parameters?


Also known as query strings or URL variables, parameters are the portion of a URL that follows a question mark. They consist of a key and a value pair, separated by an equals sign. Multiple parameters can be added to a single page by using an ampersand.

The most common use cases for parameters are:

  • Tracking – For example ?utm_medium=social, ?sessionid=123 or ?affiliateid=abc
  • Reordering – For example ?sort=lowest-price, ?order=highest-rated or ?so=newest
  • Filtering – For example ?type=widget, ?colour=blue or ?price-range=20-50
  • Identifying – For example ?product=small-blue-widget, ?categoryid=124 or ?itemid=24AU
  • Paginating – For example ?page=2, ?p=2 or ?viewItems=10-30
  • Searching – For example ?query=users-query, ?q=users-query or ?search=drop-down-option
  • Translating – For example ?lang=fr or ?language=de
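In code, a query string is straightforward to pull apart. Here is a minimal sketch using Python's standard library (the URL and parameter names are illustrative):

```python
from urllib.parse import urlparse, parse_qs

url = "https://www.example.com/widgets?sort=lowest-price&colour=blue&page=2"

# Everything after the "?" is the query string; parse_qs splits it into
# key/value pairs on "&" and "=".
params = parse_qs(urlparse(url).query)
print(params)  # {'sort': ['lowest-price'], 'colour': ['blue'], 'page': ['2']}
```

Note that each value comes back as a list, because the same key can legally appear more than once in a single URL.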

SEO Issues with URL Parameters

1. Parameters Create Duplicate Content

Often, URL parameters make no significant change to the content of a page. A re-ordered version of the page is often not so different from the original. A page URL with tracking tags or a session ID is identical to the original.

For example, the following URLs would all return a collection of widgets.

  • Static URL: https://www.example.com/widgets
  • Tracking parameter: https://www.example.com/widgets?sessionID=32764
  • Reordering parameter: https://www.example.com/widgets?sort=newest
  • Identifying parameter: https://www.example.com?category=widgets
  • Searching parameter: https://www.example.com/products?search=widget

That’s quite a few URLs for what is effectively the same content – now imagine this over every category on your site. It can really add up.

The challenge is that search engines treat every parameter-based URL as a new page. So they see multiple variations of the same page, all serving duplicate content and all targeting the same keyword phrase or semantic topic.

While such duplication is unlikely to cause you to be completely filtered out of the search results, it does lead to keyword cannibalization and could downgrade Google’s view of your overall site quality as these additional URLs add no real value.

2. Parameters Waste Crawl Budget

Crawling redundant parameter pages drains crawl budget, reducing your site’s ability to index SEO relevant pages and increasing server load.

Google sums up this point perfectly.

“Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.”

3. Parameters Split Page Ranking Signals

If you have multiple permutations of the same page content, links and social shares may be coming in on various versions.

This dilutes your ranking signals. When you confuse a crawler, it becomes unsure which of the competing pages to index for the search query.

4. Parameters Make URLs Less Clickable


Let’s face it. Parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are less likely to be clicked.

This will impact page performance. Not only because CTR can influence rankings, but also because it’s less clickable on social media, in emails, when copy pasted into forums or anywhere else the full URL may be displayed.

While this may only have a fractional impact on a single page’s amplification, every tweet, like, share, email, link, and mention matters for the domain.

Poor URL readability could contribute to a decrease in brand engagement.

Assess the Extent of Your Parameter Problem

It’s important to know every parameter used on your website. But chances are your developers don’t keep an up to date list.

So how do you find all the parameters that need handling? Or understand how search engines crawl and index such pages? Or know the value they bring to users?

Follow these five steps:

  • Run a crawler: With a tool like Screaming Frog you can search for “?” in the URL.
  • Look in Google Search Console URL Parameters Tool: Google auto-adds the query strings it finds.
  • Review your log files: See if Googlebot is crawling parameter based URLs.
  • Search with site: and inurl: advanced operators: Know how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key combination query.
  • Look in the Google Analytics All Pages report: Search for “?” to see how each of the parameters you found is used by users. Be sure to check that URL query parameters have not been excluded in the view settings.
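Once you have a list of URLs from a crawl, log files, or analytics, a short script can reduce it to the set of parameter keys in use. A self-contained sketch (the sample URLs are hypothetical):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical sample of URLs exported from a crawl, log file, or analytics
crawled_urls = [
    "https://www.example.com/widgets?sessionID=32764",
    "https://www.example.com/widgets?sort=newest&page=2",
    "https://www.example.com/widgets",
]

def parameter_keys(urls):
    """Return the set of query-string keys used across a list of URLs."""
    keys = set()
    for url in urls:
        keys.update(parse_qs(urlparse(url).query))
    return keys

print(sorted(parameter_keys(crawled_urls)))  # ['page', 'sessionID', 'sort']
```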

Armed with this data, you can now decide how to best handle each of your website’s parameters.

SEO Solutions to Tame URL Parameters

You have six tools in your SEO arsenal to deal with URL parameters on a strategic level.

Limit Parameter-Based URLs

A simple review of how and why parameters are generated can provide an SEO quick win. You will often find ways to reduce the number of parameter URLs and so minimize the negative SEO impact. There are four common issues to begin your review.

1. Eliminate Unnecessary Parameters


Ask your developer for a list of every website parameter and its function. Chances are, you will discover parameters that no longer perform a valuable function.

For example, users can be better identified by cookies than sessionIDs. Yet the sessionID parameter may still exist on your website as it was used historically.

Or you may discover that a filter in your faceted navigation is rarely applied by your users.

Any parameters caused by technical debt should be immediately eliminated.

2. Prevent Empty Values

For example: www.example.com?key1=value1&key2=&key3=

URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank.

In the above example, key2 and key3 add no value both literally and figuratively.

3. Use Keys Only Once


Avoid applying multiple parameters with the same parameter name and a different value.

For multi-select options, it is better to combine the values after a single key.
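So ?colour=blue&colour=red becomes ?colour=blue,red. A sketch of that normalization in Python (the parameter names and comma separator are illustrative choices, not the only valid convention):

```python
from urllib.parse import urlencode, parse_qsl

def merge_repeated_keys(query):
    """Collapse repeated keys (?colour=blue&colour=red) into one
    comma-separated value (?colour=blue,red)."""
    merged = {}
    for key, value in parse_qsl(query):
        merged[key] = f"{merged[key]},{value}" if key in merged else value
    # safe="," keeps the comma from being percent-encoded
    return urlencode(merged, safe=",")

print(merge_repeated_keys("colour=blue&colour=red&type=widget"))
# colour=blue,red&type=widget
```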

4. Order URL Parameters


If the same URL parameters are rearranged, the pages are interpreted by search engines as equal. As such, parameter order doesn’t matter from a duplicate content perspective. But each of those combinations burns crawl budget and splits ranking signals.

Avoid these issues by asking your developer to write a script to always place parameters in a consistent order, regardless of how the user selected them.

In my opinion, you should start with any translating parameters, followed by identifying, then pagination, then layering on filtering and reordering or search parameters and finally tracking.
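Such an ordering script might look like the following sketch, which follows the priority suggested above (the priority map and parameter names are assumptions; substitute your own site's keys):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Assumed priority buckets: translating first, then identifying, pagination,
# filtering/reordering/search, and tracking last. Unknown keys sort last.
PRIORITY = {"lang": 0, "product": 1, "category": 1, "page": 2,
            "type": 3, "price-range": 3, "sort": 4, "q": 4}

def normalize(url):
    """Rewrite a URL so its parameters always appear in one canonical order."""
    parts = urlparse(url)
    pairs = parse_qsl(parts.query)
    # Sort by priority bucket, then alphabetically within a bucket
    pairs.sort(key=lambda kv: (PRIORITY.get(kv[0], 5), kv[0]))
    return urlunparse(parts._replace(query=urlencode(pairs)))

print(normalize("https://www.example.com/widgets?sort=newest&page=2&lang=fr"))
# https://www.example.com/widgets?lang=fr&page=2&sort=newest
```

However a user selects their filters, the resulting URL always comes out the same, so search engines only ever see one combination.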

Pros:

  • Allows more efficient use of crawl budget.
  • Reduces duplicate content issues.
  • Consolidates ranking signals to fewer pages.
  • Suitable for all parameter types.

Cons:

  • Moderate technical implementation time

Rel=”Canonical” Link Attribute


The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.

You can rel=canonical your parameter based URLs to your SEO-friendly URL for tracking, identifying or reordering parameters. But this tactic is not suitable when the parameter page content is not close enough to the canonical, such as pagination, searching, translating or some filtering parameters.
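For example, a reordered widgets page could point back to the clean category URL with a tag in its head (URLs illustrative):

```html
<!-- In the <head> of https://www.example.com/widgets?sort=newest -->
<link rel="canonical" href="https://www.example.com/widgets" />
```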

Pros:

  • Relatively easy technical implementation.
  • Very likely to safeguard against duplicate content issues.
  • Consolidates ranking signals to the canonical URL.

Cons:

  • Wastes crawl budget on parameter pages.
  • Not suitable for all parameter types.
  • Interpreted by search engines as a strong hint, not a directive.

Meta Robots Noindex Tag


Set a noindex directive for any parameter based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page.

URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, it will eventually lead Google to nofollow the page’s links.
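The tag itself goes in the page's head:

```html
<meta name="robots" content="noindex" />
```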

Pros:

  • Relatively easy technical implementation.
  • Very likely to safeguard against duplicate content issues.
  • Suitable for all parameter types you do not wish to be indexed.
  • Removes existing parameter-based URLs from the index.

Cons:

  • Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
  • Doesn’t consolidate ranking signals.
  • Interpreted by search engines as a strong hint, not a directive.

Robots.txt Disallow


The robots.txt file is what search engines look at first before crawling your site. If they see something is disallowed, they won’t even go there.

You can use this file to block crawler access to every parameter based URL (with Disallow: /*?*) or only to specific query strings you don’t want to be indexed.
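For example (the sessionID parameter is illustrative; the wildcard syntax shown is the pattern matching Google and Bing support):

```
User-agent: *
# Block every URL containing a query string
Disallow: /*?*

# Or, instead, block only a specific parameter wherever it appears:
# Disallow: /*?*sessionID=
# Disallow: /*&sessionID=
```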

Pros:

  • Simple technical implementation.
  • Allows more efficient use of crawl budget.
  • Avoids duplicate content issues.
  • Suitable for all parameter types you do not wish to be crawled.

Cons:

  • Doesn’t consolidate ranking signals.
  • Doesn’t remove existing URLs from the index.

URL Parameter Tool in Google Search Console


Configure Google’s URL parameter tool to tell crawlers the purpose of your parameters and how you would like them to be handled.

Google Search Console has a warning message that using the tool “could result in many pages disappearing from a search.”

This may sound ominous. But what’s more menacing is thousands of duplicate pages hurting your website’s ability to rank.

So it’s best to learn how to configure URL parameters in Google Search Console, rather than letting Googlebot decide.

The key is to ask yourself how the parameter impacts the page content.

  • Tracking parameters don’t change page content. Configure them as “representative URLs”.
  • Configure parameters that reorder page content as “sorts”. If the sort is optionally added by the user, set crawl to “No URLs”. If a sort parameter is applied by default, use “Only URLs with value”, entering the default value.
  • Configure parameters that filter the page down to a subset of content as “narrows”. If these filters are not SEO relevant, set crawl to “No URLs”. If they are SEO relevant, set to “Every URL”.
  • Configure parameters that show a certain piece or group of content as “specifies”. Ideally, this should be a static URL. If that’s not possible, you will likely want to set this to “Every URL”.
  • Configure parameters that display a translated version of the content as “translates”. Ideally, translation should be achieved via subfolders. If that’s not possible, you will likely want to set this to “Every URL”.
  • Configure parameters that display a component page of a longer sequence as “paginates”. If you have achieved efficient indexation with XML sitemaps, you can save crawl budget and set crawl to “No URLs”. If not, set to “Every URL” to help crawlers reach all of the items.

Google will automatically add parameters to the list under the default “Let Googlebot decide”. The challenge is, these can never be removed, even if the parameter no longer exists. So whenever possible, it’s best to proactively add parameters yourself, so that if a parameter ever stops being used, you can delete it from GSC.

For any parameter you set in Google Search Console to “No URL”, you should also consider adding it in Bing’s ignore URL parameters tool.

Pros:

  • No developer time needed.
  • Allows more efficient use of crawl budget.
  • Likely to safeguard against duplicate content issues.
  • Suitable for all parameter types.

Cons:

  • Doesn’t consolidate ranking signals.
  • Interpreted by Google as a helpful hint, not a directive.
  • Only works for Google and with lesser control for Bing.

Move From Dynamic to Static URLs

Many people think the optimal way to handle URL parameters is simply to avoid them in the first place. After all, subfolders surpass parameters in helping Google understand site structure, and static, keyword-based URLs have always been a cornerstone of on-page SEO.

To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.

For example, the URL:

www.example.com/view-product?id=482794

Would become:

www.example.com/widgets/blue
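On an Apache server, a rewrite rule along these lines could map the old dynamic URL to the new static one (a sketch only; the ID and target path come from the example above, and real rules would be generated from your product database rather than hard-coded one by one):

```apache
# .htaccess sketch using mod_rewrite
RewriteEngine On

# 301 redirect /view-product?id=482794 to /widgets/blue;
# the trailing "?" in the target discards the old query string
RewriteCond %{QUERY_STRING} ^id=482794$
RewriteRule ^view-product$ /widgets/blue? [R=301,L]
```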

This approach works well for descriptive keyword based parameters, such as those which identify categories, products, or filter for search engine relevant attributes. It is also effective for translated content.

But it becomes problematic for non-keyword relevant elements of faceted navigation, such as price. Having such a filter as a static, indexable URL offers no SEO value.

It’s also an issue for searching parameters, as every user-generated query would create a static page that vies for ranking against the canonical – or worse, presents crawlers with low-quality content pages whenever a user has searched for an item you don’t offer.

It’s somewhat odd when applied to pagination (although not uncommon due to WordPress), which would give a URL such as

www.example.com/widgets/blue/page2

Very odd for reordering, which would give a URL such as

www.example.com/widgets/blue/lowest-price

And it’s often not a viable option for tracking. Google Analytics will not acknowledge a static version of a UTM parameter.

More to the point, replacing dynamic parameters with static URLs for things like pagination, onsite search box results, or sorting does not address duplicate content, crawl budget, or internal link equity dilution.

And having all the combinations of filters from your faceted navigation as indexable URLs often results in thin content issues. Especially if you offer multi-select filters.

Many SEO pros argue it’s possible to provide the same user experience without impacting the URL. For example, by using POST rather than GET requests to modify the page content. Thus, preserving the user experience and avoiding the SEO problems.

But stripping out parameters in this manner would remove the possibility for your audience to bookmark or share a link to that specific page. And it’s obviously not feasible for tracking parameters and not optimal for pagination.

The crux of the matter is that for many websites, completely avoiding parameters is simply not possible if you want to provide the ideal user experience. Nor would it be best practice SEO.

So we are left with this. For parameters that you don’t want to be indexed in search results (paginating, reordering, tracking, etc.), implement them as query strings. For parameters that you do want to be indexed, use static URL paths.

Pros:

  • Shifts crawler focus from parameter based to static URLs which have a higher likelihood to rank.

Cons:

  • Significant investment of development time for URL rewrites and 301 redirects.
  • Doesn’t prevent duplicate content issues.
  • Doesn’t consolidate ranking signals.
  • Not suitable for all parameter types.
  • May lead to thin content issues.
  • Doesn’t always provide a linkable or bookmarkable URL.

Best Practice URL Parameter Handling for SEO

So which of these six SEO tactics should you implement?

The answer can’t be all of them.

Not only would that create unnecessary complexity, but the SEO solutions often actively conflict with one another.

For example, if you implement robots.txt disallow, Google would not be able to see any meta noindex tag. You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.

What becomes clear is there is no one perfect solution.

Even Google’s John Mueller can’t decide on an approach. In a Google Webmaster hangout, he initially recommended against disallowing parameters, but when questioned on this from a faceted navigation perspective, answered “it depends.”

There are occasions when crawling efficiency is more important than consolidating authority signals.

Ultimately, what’s right for your website will depend on your priorities.


Personally, I don’t use noindex or block access to parameter pages. If Google can’t crawl and understand all the URL variables, it can’t consolidate the ranking signals to the canonical page.

I take the following plan of attack for SEO-friendly parameter handling:

  • Do keyword research to understand what parameters should be search engine friendly, static URLs.
  • Implement correct pagination handling with rel=”next” and rel=”prev”.
  • For all remaining parameter based URLs, implement consistent ordering rules, which use keys only once and prevent empty values to limit the number of URLs.
  • Add a rel=canonical link attribute to suitable parameter pages to combine ranking ability.
  • Configure URL parameter handling in both Google and Bing as a failsafe to help search engines understand each parameter’s function.
  • Double check no parameter based URLs are being submitted in the XML sitemap.
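The pagination handling in the second step consists of link attributes in the head of each page in the series (URLs illustrative):

```html
<!-- On https://www.example.com/widgets?page=2 -->
<link rel="prev" href="https://www.example.com/widgets?page=1" />
<link rel="next" href="https://www.example.com/widgets?page=3" />
```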

No matter what parameter handling strategy you choose to implement, be sure to document the impact of your efforts on KPIs.

Image Credits

Featured Image: Created by author, January 2019
In-Post Images/Screenshots: Created/Taken by author, January 2019


5 Easy SEO Wins with Powerful Results



Search engine optimization, when done correctly, can take a lot of work. This is why so many people are so eager to take shortcuts.

Fortunately, there are some tasks that don’t require as much effort, compared to tasks like link building, yet still yield significant gains.

I’m a big fan of efficiency, so I love tactics that deliver a greater return on my investment of time and/or money.

In this article, I’m going to explain five of these tactics which are easy to execute successfully but can deliver powerful results.

These easy SEO wins will help you get more out of your efforts and sprint past your competitors. They will also help to leverage better results out of your other SEO efforts like link building and content development.

1. Prune Outdated/Low-Quality Content

You probably created all of the content on your website with the best of intentions, but still, it’s almost a certainty that some of it is garbage.

There are a variety of reasons for this, and it happens to the best of us. The solution in many cases is to prune this content. In fact, Danny Goodwin and Loren Baker recently hosted a webinar on exactly this topic.

Some people are hesitant to get rid of any content, no matter the reason. The thinking is generally that it can’t do any harm to leave it there. And Google has reinforced this thinking time and time again.

But the reality is that despite what Google’s representatives say, outdated and/or low-quality content can negatively impact your ranking and traffic.

It probably should impact your credit score too, but apparently, I don’t have the clout necessary to make that happen.

Identifying Content to Prune

Once you’ve worked up the courage to start pruning, the first step is to identify the content that should be deleted.

The easiest and most complete way to do this is to use software like Screaming Frog to crawl your website and generate a list of URLs. This helps to ensure you don’t miss anything.

Next, you’ll need to begin the tedious task of reviewing this list, URL by URL, to determine which content is outdated or low quality. This means you actually need to manually visit each page and review the content.

It may help to prioritize this list. Google Search Console gives you the ability to export a CSV file of the URLs Google has indexed for your website, which you can then sort by traffic.

From here, you’ll want to start evaluating the URLs with no traffic, working your way up.


It’s important to point out that a lot of this content you’re deleting can and should be redirected to a stronger, high-quality page.

But don’t fall into the misguided approach of redirecting them to your homepage. If there is a legitimately relevant page on your website, redirect it there, otherwise, just let it 404.

But what about the content that’s not a complete dumpster fire, and is still relevant?

2. Improve Quality Content

If you’ve been doing things right, a lot, if not most of your content should survive the executioner’s blade.

This content should be improved based on your visitors’ needs.

The advantage here is that this content already exists, the URL has a history in Google, and it may even have some inbound links. Because of these factors, it makes a lot more sense to improve that content rather than starting over from scratch.

Depending on circumstances, this might include:

  • Editing your content to improve readability, increase engagement, and to make it more comprehensive.
  • Adding relevant and useful media, including images, video, and PDFs.
  • Including original data, research, statistics, and case studies.

We’ll want to prioritize the content to improve based on quick and easy wins. This means we won’t be targeting topics we don’t already rank for, but we also won’t be focused on improving positions we already rank highly for.

So let’s go back to our Google Search Console export and sort the data based on URLs that rank anywhere from Position 5 to 30 in the search results.


We’ll then further sort this data by relevance and potential search volume. From here, we will compare these URLs to our competitors who outrank us to identify opportunities to improve.

Some of the things we’re looking for could include:

Word Count

Despite what you may have been told, size does matter.

While not a worthwhile metric on its own, it can help to determine how comprehensive several URLs are in comparison to each other.

Depth

Generally speaking, the top ranking pages across all industries tend to be more comprehensive than those that they outrank. This doesn’t mean that longer content will always win, but it can be a powerful factor.

Does your content effectively and completely answer not only the original query, but also any related questions that may come up as a result?

You need to think about not only the immediate topic, but everything related to the customer journey. This might include:

  • Related definitions
  • Frequently asked questions
  • A summary of relevant laws and regulations
  • Explanation of a process
  • Technical specifications
  • Statistical data
  • Case studies

Readability

How well-written is your content?

This is not something you want to evaluate by gut feel – you need an objective measurement.

  • Yoast gives you a readability score while editing content right in WordPress.
  • SEMrush enables you to test readability both within their platform and with a Chrome add-on that integrates with Google Docs.
  • There are countless other tools as standalone websites, apps, and addons/plugins, available.

Your immediate goal is to make your content easier to read than the content that’s outranking you, but that’s just a starting point.

If your competitors’ content reads like someone spilled a bowl of alphabet soup, don’t set out to simply be a little better than them. Your goal should be to blow them away.

Media

Are original and useful images included within your content? How about video and/or audio files?

Images can provide additional context that helps search engines understand what your content is about. So can video, provided that schema is properly used.

But both serve another more important role, and that is to improve the user experience.

Look for opportunities to use media to provide additional information that’s not included in the text.

Both images and video are great at making complex topics easier to understand, but video is particularly effective at keeping visitors on your website longer, which is always a good thing.

It’s always a wise idea to include a watermark on your images to prevent competitors from stealing them.

Sure, you could file a DMCA complaint after the fact, but it’s always easier to avoid the problem in the first place.

3. Update Internal Links

Internal links can be a powerful tactic in your SEO toolbox, but it’s important to review them from time to time.

Your internal links should point to any pages that you want to rank well, and they should be placed on any pages with content relevant to the link destination. Equally important, these links should be direct.


Redirect chains – internal links that pass through one or more redirects before reaching their final destination – are a pretty common problem on websites where content is frequently published, moved, or deleted. The solution is to use a tool like Screaming Frog or SEMrush to crawl your site and identify any redirect chains.
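The repointing logic itself is simple: follow each internal link through the redirects until it stops, then link directly to the final URL. A self-contained sketch (the paths and redirect map are hypothetical; in practice the map would come from your crawl data):

```python
# Redirects modeled as a simple old-URL -> new-URL map
REDIRECTS = {
    "/old-page": "/newer-page",
    "/newer-page": "/final-page",
}

def redirect_chain(url, redirects=REDIRECTS, limit=10):
    """Follow a URL through the redirect map and return the full chain.

    The limit guards against redirect loops.
    """
    chain = [url]
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        chain.append(url)
    return chain

print(redirect_chain("/old-page"))  # ['/old-page', '/newer-page', '/final-page']
```

Any internal link whose chain is longer than two entries should be updated to point at the last URL in the chain.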

As for managing these internal links, I’m a big fan of automating this task, and this is easy for WordPress websites.

There are several plugins available that enable you to specify certain words/phrases to automatically link to specific URLs.

This allows you to instantly create, edit, and delete links across your entire website, whether you have a few pages or a few million pages.

4. Improve Page Load Speed

The longer a webpage takes to load, the fewer leads and sales you’ll generate. To compound this problem, slower websites also tend not to rank as well compared to faster websites.

This makes page speed monumentally important.

Most websites are painfully slow, but the good news is that it’s relatively easy to improve.

While improving page speed requires a moderate level of technical expertise, I still consider this to be an easy win because the improvements you make will have an immediate and sitewide effect.

I’ll briefly share a few tactics here, but I encourage you to check out another article I wrote, explaining how to improve page speed, in great detail.

Dump the Budget Web Hosting

The cheaper web hosts tend to oversell their services, so your website is crammed onto a server with hundreds or even thousands of other websites.

Because these servers often lack the horsepower necessary, the websites they host often suffer in terms of page speed.

Reduce HTTP Calls

Every part of your website – each HTML, CSS, JavaScript, image, video, or other type of file – requires a separate HTTP request.

Fewer HTTP requests typically means a faster website.

So how do we get there?

The first step is to remove any unnecessary plugins. Then, you’ll merge multiple CSS and JavaScript files into a single CSS and JavaScript file.

You should also minimize the number of image files by using CSS to create the desired design effect and/or using sprites to merge multiple frequently used images.

Optimize Media Files

Images and videos on many websites tend to be larger than they need to be.

The first step is to make sure your media files are in the ideal format. For example, JPG is best for photographic images, while GIF or PNG are better for images with large areas of solid color.

Then, you’ll need to ensure your media files are properly sized. Images and video should be no larger than they will be displayed.

For example, if a particular image on your website will never be displayed at more than 800px wide, the image file should be 800px wide.

Finally, you’ll need to compress your media files. There are a number of free tools available online for compressing various file types. There are also WordPress plugins that can compress all of the images already on your website.

These three steps are a good start, but as I mentioned earlier, I highly encourage you to check out my previous article on improving page speed for more tactics and greater detail.

5. Implement Schema Markup

There is no definitive evidence that schema markup has any direct impact on ranking, however, it’s still critical to SEO.

That’s because it has the potential to increase your website’s visibility in the search results, which results in higher click-through rates.

Since most websites today still don’t use schema, this creates a tremendous opportunity for those that do. Take a look at this example and tell me which result caught your eye first?

[Screenshot: a schema-enhanced rich result standing out among plain listings in the SERPs]

Fortunately, implementing schema is relatively simple. There are three types, and they are used in different scenarios.

  • Standard schema microdata, which is marked up directly in HTML.
  • JSON-LD, which is marked up in JavaScript and is the most recommended format.
  • RDFa, which is used in a variety of different document types including XML, HTML 4, SVG, and many others.

In some cases, you’ll use JSON and add it to your website just like you would any other script. In some cases, you’ll add markup to specific elements on your website, and in others, you might add RDFa to a different document type.
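As an example, a JSON-LD block for an article might look like this (all field values are illustrative placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "5 Easy SEO Wins with Powerful Results",
  "author": {
    "@type": "Person",
    "name": "Author Name"
  },
  "datePublished": "2019-04-01"
}
</script>
```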

Roger Montti wrote a great, in-depth post on schema, so rather than reinventing the wheel here, I’ll just direct you to his article.

But schema goes a lot deeper than its current use, and I anticipate that it will play a much larger and more direct role in the search algorithm, especially as voice search begins to gain traction.

Montti explains in another article how Google is currently using speakable markup, which I believe will become a more prominent factor in search in the coming years.



Image Credits

Featured Image: Created by author, April 2019
All screenshots taken by author, April 2019







7 Expert Tips to Protect Your Online Reputation



Ready for some detailed advice to help protect your reputation online?

The advice you’ll read in this article works both for proactive reputation management and for those who already have negative online content or reviews about them.

Be advised, however, that some sites may be impossible or very hard to beat.

Major news sites (e.g., The Wall Street Journal, LA Times, New York Times) require significant work to manage.

If a major news site has posted negative content about you, then you really need to be (or hire) an expert in SEO.

The do-it-yourself option is just not feasible at that scale. That said, these tips will help some DIYers before moving on to an expensive reputation management firm.

Up until recently, popular complaint website RipoffReport was also hard to beat.

Recent changes to Google’s algorithm (which we think occurred in September 2018) have pushed some complaint sites lower in search ranking. Read my prior article for more details about this.

Tip 1: Provide Excellent Service

You want to make sure that you really are providing an excellent service to avoid getting negative reviews in the first place. Consider going above and beyond your current efforts.

For example, if you run a restaurant, maybe provide a free appetizer to surprise new or returning customers.

If you notice even the slightest sign that a customer is unhappy, try your best to resolve the issue ASAP. The next best thing is to offer a free service or refunds to make up for the issue.

You can’t make everyone happy. I have been doing SEO services for over 20 years and there have been many times when I’ve had unhappy clients.

I have always either offered free services or provided refunds to my unhappy clients, and this is how I have kept a near flawless record online.

They say the client is always right. I know that sometimes they are not, but consider whether arguing with your customer is worth your reputation.

The decision may come down to the dollar value of your services.

Maybe a negative review on Yelp or Google Maps would not affect your overall rating because you have many positive reviews.

But what would happen if you got a negative review on a complaint site like RipoffReport?

Often, these kinds of reviews rank high for the brand name and, within a few months, can do more damage than the amount in dispute with your client was ever worth.

I have offered full refunds to several clients over the years because the threat of a negative review on the right site can hurt.

My firsthand knowledge of the damage done to businesses has made me overly cautious.

One negative review can cost thousands of dollars in online reputation management (ORM) services to try and repair.

Tip 2: Ask for Reviews

Certain professions tend to attract more negative reviews than positive ones.

For example, dentists for some reason usually get a high number of negative reviews.

My guess is that no one goes to a dentist with a happy feeling. One usually goes to a dentist to fix a cavity or do a cleaning, which could result in the discovery of cavities and require more work.

Having to spend money you had not planned on spending is a pretty good reason for most people to get upset. Even the best dental insurance requires some kind of a copay, so dental procedures can be expensive.

Even if you aren’t a dentist, you’re more likely to get positive reviews if you ask for them.

If you avoid asking your best customers for reviews, you may end up with more negative reviews than you would have wanted.

Just make sure that you know your customers are happy before you ask for the review.

If you are seeing your customer in person, you may start by asking how they felt about your service right after you finish the job.

Alternatively, you may want to follow up after a few days.

Another tip is to have a different person follow up than whoever served the customer. If one of your staff did the work or made the sale, then either a manager or you should do the follow-up.

This way the customer is more likely to tell you about a negative experience, and you won’t feel as defensive about it since you were not the one involved.

Tip 3: Incentives for Reviews

Consider offering some kind of incentive for reviews, but be warned that this practice is against Yelp's policies. If you do this, never ask in writing; make the request verbally.

If someone reports you to Yelp for doing this, you may get a warning or a demotion in Yelp’s search results.

I have seen businesses print messages on the back of their business cards asking for Yelp reviews, with a discount for positive reviews.

A customer just needs to take a picture of this and send it to Yelp. Yelp will quickly follow up with a Consumer Alert on your account.

Tip 4: Offer Refunds to Unhappy Clients

If you have clients that are unhappy with your services, at first try to resolve or fix the issues, but if this is not possible, then offer a full or partial refund or some other incentive such as discount coupons or even retail gift cards.

Accept that you were wrong. Trying to resolve issues will always sit better with clients than trying to argue.

Refunds can either help avoid the negative review or lessen the damage and turn the negative review into a somewhat positive one.

I’ve had clients where even a partial refund has meant the difference between a 1-star and a 4-star review. Even a 5-star rating may be possible.

Tip 5: Review Generating Platforms

Many companies offer review-generation platforms. The basic concept is to collect your customers' email addresses and/or phone numbers.

After their visit, or every so often, you can send a survey email or text message to ask for feedback.

The message asks how they felt about your services, and if the answer comes back positive, you can then ask them to leave a review on the site of your choosing, such as Yelp or Google Maps.

If the answer comes back negative, you will see the message and can reach out to them to try and resolve the issue before they think about posting a negative review in the first place.
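The branching logic these platforms implement can be sketched in a few lines. This is an illustrative example only: the 1-5 score scale, the threshold, the message wording, and the review URL are all assumptions, and each vendor's product works differently.

```python
# Sketch of the review-funnel branching used by review-generation platforms.
# The score scale, threshold, and review URL are illustrative assumptions.

REVIEW_URL = "https://www.yelp.com/writeareview"  # placeholder destination


def route_feedback(score: int) -> str:
    """Return the follow-up message for a customer's survey score (1-5)."""
    if score >= 4:
        # Happy customer: steer them toward leaving a public review.
        return f"Glad you enjoyed it! Would you leave us a review? {REVIEW_URL}"
    # Unhappy customer: route the complaint to the business privately,
    # before they think about posting it publicly.
    return "Sorry we fell short. A manager will reach out to make it right."


print(route_feedback(5))
print(route_feedback(2))
```

The key design point is that the public review link is only ever shown to customers who have already reported a positive experience; everyone else is handled privately first.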

These services typically cost as little as $30 per month to run yourself, or up to hundreds of dollars for a full-service provider (ORM company). Some companies that offer this service include:

Tip 6: Consider Revising Your Business Model

I have a client with an ecommerce fashion store that dropships items from China, even though the business is based in the U.S.

The delivery time is usually 2 to 5 weeks, which is slow for most people. In addition, sometimes the Chinese sizes run smaller than US sizes.

So this business often gets many negative reviews and return/refund requests. They further upset customers by asking them to send items back at their own expense.

As you can see, this kind of business cannot avoid negative reviews unless they change their business model.

The main benefit of their service is that it’s affordable. In fact, they are extremely cost-effective compared to similar fashion items found at major department stores.

So, what can a business like this do?

My advice begins with an adjustment to their sales copy informing customers that items are delivered from China and that shipping may take 2-5 weeks.

This tactic reduces some of their sales, but it avoids so many unhappy customers and unnecessary refunds.

Most people would probably not mind waiting a little if that would save them some money.

The customers that don’t want to wait that long are usually the ones that would complain most because they probably needed the item to be there for an occasion.

Also, they can offer free or reduced shipping costs for returns.

If the item is pretty cheap, another option is to provide a full refund and have the customer simply keep the item. Amazon used this tactic effectively in its growth phase to encourage Prime users.

The good news is that, after I consulted with this business, they decided to change their model and keep inventory on hand to ship from the U.S.

They have been getting fewer negative reviews since making this change a couple of months ago.

So my point here is to take a look at your business model to see what adjustments you can make to avoid situations that lead to negative reviews.

Even if it is going to cost you some business or money, you would be better off in the long run.

Not only will you increase business from new customers (thanks to positive reviews), happy clients will return and refer others to your business.

Tip 7: Be Proactive, Not Reactive

There are a number of things you can do to create a positive online image.

Your goal should be to populate the top 20 of Google with positive content about your business, which in turn may help to keep negative content out.

I plan on writing another article soon to cover more specifics, but in general, here are a few recommendations:

  • Register your social media profiles on the top social media sites, and stay active on those platforms.
  • Active Twitter profiles often rank in the top 10 for their brand names, and Google may even show their latest tweets, taking up additional real estate.
  • YouTube videos will often rank well for brand names. You can create a professional video for less than $1,000, or an even lower budget video using your smartphone. Also, you can hire a freelancer on a site like Fiverr to do a slide show type video about your business.
  • Distribute press releases every few months. Try to use different networks for distribution to get maximum coverage.
  • If you don’t already have a blog, create one and post on a regular basis (once a week is what we recommend to our clients as a minimum).
  • Create mini sites or blogs on subdomain blog platforms, such as wordpress.com or tumblr.com. Make sure your brand name appears as part of the subdomain (e.g., yourbrand.wordpress.com).

Official AMP Plugin for WordPress Now Supports AMP Stories


A new version of the official AMP plugin for WordPress supports the creation of AMP Stories.

Google recently added a dedicated section to search results for showcasing AMP Stories.

So there’s no time like the present for learning how to create them.

While AMP Stories can be created with or without this plugin, its drag-and-drop functionality certainly makes things easier.


The AMP Project team explains how WordPress offers an ideal platform for creating AMP Stories:

“Building on top of WordPress, and specifically the new Gutenberg editor available in WordPress 5.0, allows the AMP Stories creation process to benefit from the rich media management architecture available in WordPress.

In Gutenberg everything is a block. This makes it easy to create rich post layouts, provide enhanced authoring tools (word count, color contrast, document outlines, etc.), and extend with custom blocks.”

Capabilities of the latest AMP plugin update include:

  • Creating and reordering AMP Story pages
  • Dragging and dropping blocks
  • Managing your content overall as part of WordPress
  • Creating new elements, such as text, videos, images
  • Changing the background color and opacity, and adding a gradient
  • Animating the text, rotating it, and selecting a Google font
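For context, the editor ultimately produces standalone AMP Story markup along these lines. This is a minimal illustrative skeleton, not the plugin's actual output: the titles, image paths, and publisher details are placeholders, and the required AMP runtime and amp-story extension script tags in the page head are omitted.

```html
<!-- Minimal AMP Story skeleton (illustrative only; the AMP runtime and
     amp-story extension <script> tags required in <head> are omitted). -->
<amp-story standalone
           title="Example Story"
           publisher="Example Publisher"
           publisher-logo-src="/logo.png"
           poster-portrait-src="/poster.jpg">
  <amp-story-page id="cover">
    <amp-story-grid-layer template="fill">
      <amp-img src="/cover.jpg" width="720" height="1280"
               layout="responsive"></amp-img>
    </amp-story-grid-layer>
    <amp-story-grid-layer template="vertical">
      <h1>Example Story</h1>
    </amp-story-grid-layer>
  </amp-story-page>
</amp-story>
```

Each `<amp-story-page>` corresponds to one page of the story, and grid layers stack content within it; the plugin's drag-and-drop blocks map onto these elements.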

This feature is currently available in an experimental alpha version of the AMP plugin, which is said to work best with Gutenberg.

With that said, download and install the plugin at your own discretion.






Copyright © 2019 Plolu.