
7 Expert Tips to Boost Your PPC Performance Today



Over the last decade, the number of account and campaign parameters to consider has shot up almost 20 times.

How are PPC specialists expected to know which actions to choose for the best results?

And what exactly do you need to do to continually increase performance while staying ultra-competitive in the marketplace?

On August 14, I moderated a sponsored SEJ webinar presented by Adzooma’s Puneet Vaghela and Sal Mohammed.

They shared seven essential PPC optimization strategies that are proven to boost ROI, save time, and reduce spend.

Here’s a recap of the webinar presentation.

From setup, budgeting, and account structure to the use of data, technology integrations, and audience settings, the session covered it all.

So much has changed in the paid search landscape in the past few years. Today, it has become a complex ecosystem of channels, devices, audiences, and settings to manage.

With all of this to consider, it’s essential to cut through the clutter of managing a PPC account and determine which actions to take.

Here are seven areas to consider when optimizing your paid search campaigns if you want to bring the greatest returns.

1. Account & Campaign Settings

No one has an infinite marketing budget, so it’s important to find efficiencies wherever possible in your account.

There are three different settings in your account and campaign that you can easily change to boost PPC performance.

Locations


It’s important to use location targeting in your PPC campaigns to drive efficiencies and identify geographic areas with a higher propensity to convert.

It’s one of the best ways to actually reduce wastage in ad spend. Make sure to target your audience in the areas they’re searching.

If you don’t use location settings, you’ll be wasting budget showing ads to people who have no interest in your business.

Location settings also allow you to see in which areas you have the most traction. Therefore, you should concentrate budgets in these areas to maximize the effectiveness of your PPC spend.

Make sure you select the country you want to target when you initially set up your campaigns, then drill down and create dedicated campaigns for your top-performing locations.
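
To find those top-performing areas from your own data, here’s a minimal sketch, assuming you’ve exported a location (geographic) performance report from Google Ads as a CSV. The file name and the column names (location, cost, conversions) are illustrative, so adjust them to match your export.

```python
import csv
from collections import defaultdict

def top_locations(report_path, min_conversions=10):
    """Rank locations by cost per conversion from an exported location report.

    Assumes a CSV with 'location', 'cost' and 'conversions' columns;
    adjust the column names to match your actual export.
    """
    totals = defaultdict(lambda: {"cost": 0.0, "conversions": 0.0})
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            loc = row["location"]
            totals[loc]["cost"] += float(row["cost"])
            totals[loc]["conversions"] += float(row["conversions"])

    ranked = []
    for loc, t in totals.items():
        if t["conversions"] >= min_conversions:  # ignore low-volume noise
            ranked.append((loc, t["cost"] / t["conversions"]))
    return sorted(ranked, key=lambda x: x[1])  # lowest cost per conversion first

for location, cpa in top_locations("location_report.csv")[:10]:
    print(f"{location}: {cpa:.2f} cost per conversion")
```

Locations with the lowest cost per conversion and enough volume are good candidates for their own campaigns and higher budgets.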

Devices


Ensuring you’re targeting the correct devices is also key to success.

Google has said that about 30–50% of searches on mobile have local intent.

If you’re a local business or a high street store, you should be increasing bids on mobile to reach people in the right place, at the right time.

People also interact with devices differently, so use the data within your search engine to see which devices are driving the strongest KPI performance and modify bids accordingly.

Don’t worry about bidding too high at first; the data you gather will help you identify your most profitable areas moving forward. The extra you spend in the beginning will pay off further down the line.
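
As a rough illustration of “modify bids accordingly,” here’s a minimal sketch that suggests device bid adjustments from an exported device report. The CSV column names (device, clicks, conversions) are assumptions, and the ±50% cap is arbitrary.

```python
import csv
from collections import defaultdict

def suggest_device_modifiers(report_path):
    """Suggest device bid adjustments from a device performance export.

    The adjustment is each device's conversion rate relative to the
    account-wide rate, capped at +/-50%.
    """
    stats = defaultdict(lambda: {"clicks": 0, "conversions": 0.0})
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            d = stats[row["device"]]
            d["clicks"] += int(row["clicks"])
            d["conversions"] += float(row["conversions"])

    total_clicks = sum(d["clicks"] for d in stats.values())
    total_conv = sum(d["conversions"] for d in stats.values())
    account_cr = total_conv / total_clicks if total_clicks else 0

    modifiers = {}
    for device, d in stats.items():
        cr = d["conversions"] / d["clicks"] if d["clicks"] else 0
        adjustment = (cr / account_cr - 1) if account_cr else 0
        modifiers[device] = max(-0.5, min(0.5, adjustment))
    return modifiers

for device, adj in suggest_device_modifiers("device_report.csv").items():
    print(f"{device}: {adj:+.0%} suggested bid adjustment")
```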

Ad Copy Rotation


This is something that a lot of people just leave to Google to do for them.

But many advertisers do like rotating ads evenly so they can optimize them themselves.

If the aim of your campaign is branding, then this works. You can use taglines from other media channels to support your messaging, then test and learn which ad copy performs best to make sure you’re using the right one.

However, if you’re running a direct response campaign, then you should be trying to maximize the number of clicks or conversions coming to your site.

In that case, it’s a good idea to let the system optimize the ads for you based on the best click-through rate or conversion rate.

2. Automated Bid Management

Bid Management in the Engine

You should be using bid management in Bing or Google Ads to make your ads work as hard as possible for you.

Firstly, analyze your data from the engine or your analytics platform to see how many searches people typically make before converting with you. You need this to know which bidding model works best.

You can then set up automated bidding in the engine based on a last-click conversion model if your conversion path is short.

If your conversion path is long, set it up based on a many-per-click conversion model so you can capture all the keywords required for someone to convert.
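
Here’s a minimal sketch of that first analysis step, assuming you’ve exported a path-length report (for example, from Google Analytics) as a CSV with path_length and conversions columns; the column names and the two-interaction threshold are illustrative only.

```python
import csv

def average_path_length(paths_csv):
    """Estimate the average number of interactions before a conversion.

    Assumes a CSV export of a path-length report with 'path_length' and
    'conversions' columns (illustrative names).
    """
    weighted_length, conversions = 0, 0
    with open(paths_csv, newline="") as f:
        for row in csv.DictReader(f):
            n = int(row["conversions"])
            weighted_length += int(row["path_length"]) * n
            conversions += n
    return weighted_length / conversions if conversions else 0

avg = average_path_length("path_lengths.csv")
model = "last-click" if avg <= 2 else "many-per-click (assist-aware)"
print(f"Average path length: {avg:.1f} -> consider a {model} conversion model")
```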

Bid Management Using Rules

You can also use automated rules to ensure your account is performing to the standards you expect using third-party bid management platforms, such as Adzooma.

When you’ve analyzed the data in your account, you’ll have identified how many impressions, clicks and conversions you need to drive profitability or hit your target KPI.

With this knowledge, you can set up automated rules to make changes to your account based on these criteria and help drive greater performance on your account while saving you time.
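
As an illustration of what such a rule might look like under the hood, here’s a minimal sketch that flags keywords breaching simple thresholds in an exported keyword report. The column names and threshold values are assumptions you’d replace with your own analysis, and the script only suggests actions rather than changing the live account.

```python
import csv

# Illustrative thresholds - set these from your own profitability analysis.
MIN_IMPRESSIONS = 1000
MIN_CTR = 0.01          # 1%
MAX_CPA = 50.0          # target cost per acquisition

def apply_rules(report_path):
    """Flag keywords that breach simple performance rules.

    Assumes a CSV keyword export with 'keyword', 'impressions', 'clicks',
    'cost' and 'conversions' columns (illustrative names).
    """
    actions = []
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            imps = int(row["impressions"])
            clicks = int(row["clicks"])
            cost = float(row["cost"])
            conv = float(row["conversions"])
            if imps < MIN_IMPRESSIONS:
                continue  # not enough data yet
            ctr = clicks / imps if imps else 0
            cpa = cost / conv if conv else float("inf")
            if ctr < MIN_CTR:
                actions.append((row["keyword"], "pause or rewrite ad copy (low CTR)"))
            elif cpa > MAX_CPA:
                actions.append((row["keyword"], "reduce bid (CPA above target)"))
    return actions

for keyword, action in apply_rules("keyword_report.csv"):
    print(f"{keyword}: {action}")
```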

Bid management is generally a good strategy, particularly if you’re new and you don’t know which bids you should be putting in and how to manage them.

It’s also a great option if you’re a large-scale advertiser inundated with a high volume of campaigns.

3. Data Integrations

Data integration is vital to any marketing team. There’s an easy way to integrate your analytics data with your search data in one platform.

Google Analytics, even if it’s the free version, is an important tool for marketers as it allows you to make more informed decisions on your PPC spend.

To link Google Ads and Google Analytics, you’ll need administrative access to the Google Ads account and edit permission on the Google Analytics account.

Once you have the two platforms linked, you’ll be able to see a number of metrics you couldn’t before, including:

  • How many of your clicks resulted in new visitors to your site.
  • How long people are spending on your site from PPC.
  • And, using goals in analytics, what actions people are actually taking on your site from PPC.

Using this data, you can see which keywords are working best for site engagement and optimize accordingly based on your KPIs.
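
For example, once the engagement metrics sit alongside your keyword data, a short script can rank keywords by on-site engagement. This sketch assumes you’ve exported a combined report with keyword, clicks, avg_session_duration, and bounce_rate columns (illustrative names).

```python
import csv

def engagement_by_keyword(linked_report, min_clicks=50):
    """Rank PPC keywords by on-site engagement once GA and Ads are linked.

    Keywords with too few clicks are skipped to avoid noisy conclusions.
    """
    rows = []
    with open(linked_report, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["clicks"]) < min_clicks:
                continue
            rows.append((
                row["keyword"],
                float(row["avg_session_duration"]),
                float(row["bounce_rate"]),
            ))
    # Longest sessions first, lowest bounce rate as a tiebreaker.
    return sorted(rows, key=lambda r: (-r[1], r[2]))

for kw, duration, bounce in engagement_by_keyword("ads_plus_ga.csv")[:10]:
    print(f"{kw}: {duration:.0f}s avg session, {bounce:.0%} bounce rate")
```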

If you’re running a branding campaign, you want people to spend more time on your site and visit more pages. If you’re running a direct response campaign, you want people to interact with specific goals on your site and, ideally, convert at the same time.

With an analytics integration, you can also start creating audiences based on people’s on-site behavior which is really important.

4. Audience Data

Paid search is based on keyword intent – targeting people based on what they are looking for at all times. It’s been like this since the start.

However, today’s climate is busy, with multiple channels, devices, locations, seasonality, increasing competition, and more data than ever before.

So how do you sort through the clutter to make sure you’re targeting the people most likely to convert with you and thereby maximize the utilization of your marketing budget?

Why You Should Be Using RLSA

Remarketing lists for search ads (RLSAs) were introduced by Google in 2013 and have grown to become one of the most important strategies available to marketers.

Identifying where people are interacting with your site and gauging who are most likely to convert with you is key to increasing efficiency in your ad spend and improving your conversion rate and cost per acquisition or cost per lead.

In Google Ads and Bing Ads, you can create audiences based on which URLs people have visited on your website and then retarget them when they search for other relevant terms to either:

  • Ensure your ad is appearing in front of them (particularly good for generic keyword efficiency).
  • Or show people different messaging to entice them.

You can also use other data, like demographic data, to make your ads even more targeted. However, this is just one aspect of remarketing with audiences.

How to Create More Enhanced Lists for RLSA

Moving further along, you should also be integrating any CRM data with your search platforms to create customer match lists, which are audience lists built from the email addresses in your database.

This allows you to target people you know have already interacted with you, and to create similar audience lists that reach people who resemble those existing customers. It should play a part in your CRM strategy.
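
As a rough sketch of the CRM side of this: Customer Match uploads are built from normalized, hashed email addresses. The snippet below, which assumes a CRM export with an email column, shows the general idea, but check Google’s current documentation for the exact normalization and upload format it expects.

```python
import csv
import hashlib

def normalize_and_hash(email):
    """Lowercase, trim and SHA-256 hash an email address.

    Customer Match expects hashed, normalized emails; check the current
    documentation for the exact normalization rules.
    """
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

def build_customer_match_file(crm_csv, out_csv):
    """Turn a CRM export with an 'email' column into a hashed upload file."""
    with open(crm_csv, newline="") as src, open(out_csv, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["hashed_email"])
        for row in csv.DictReader(src):
            if row.get("email"):
                writer.writerow([normalize_and_hash(row["email"])])

build_customer_match_file("crm_export.csv", "customer_match_upload.csv")
```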

Linking your analytics platform with Google Ads allows you to use other on-site metrics to create audience lists. Time on-site, bounce rate, goals, pages visited, etc. are all very important in creating audience lists.

This will let you retarget people not only based on which pages they visited on your site, but also how long they spent there. That gives you another engagement dimension to layer into your audience strategy.

Why is this important?

Audiences allow you to narrow down your targeting pool.

This means that rather than spending your budget guessing who might interact and convert with you, you can use this data to:

  • Make informed decisions on which groups of people have the highest propensity to convert.
  • Target them specifically – increasing the effectiveness of your media budget.

That’s really important because acquiring a new customer can cost five times as much as retaining an existing one.

If someone has visited your site and shown interest, you should be leveraging a strategy that re-engages them or finds more people like them.

Demographic targeting is also key. It enables you to reach a specific audience based on age, gender, parental status, household income, and multiple other variables.

5. Generic Keyword Efficiency

With generic keywords being so expensive, it’s important to use them properly.

Generics sit higher in the funnel and are used more for research purposes. Conversion rates on generic terms tend to be very low, far lower than on brand terms.

On a last-click conversion model, they’re an ineffective way to drive business goals.

How to Use Generics Properly

You can use generics as a retargeting mechanism, targeting people in your audience lists when they search for generic terms after visiting your site.

They will have already engaged with you and so they will be familiar with your brand. Thus, when they widen their search, keep your brand at the forefront of their minds and get them back to convert, either by:

  • Showing different messaging (e.g., putting an offer into the ad copy).
  • Or increasing your bids on your audiences so that you appear more prominently on generic terms at a time when people are more likely to convert.

This will allow you to concentrate your generic keyword budget on an audience with a higher propensity to engage, which will drive higher click-through rates (and hopefully conversion rates), reduce impression wastage, and let you use your budget more effectively.

Using scripts can make generics a lot easier to manage. If your generic keywords are only live during certain moments or triggers, it increases their value to your business and makes their use more efficient.

For example, if you sell ice cream, run your generic terms when it’s really hot to increase the likelihood of people purchasing your product, rather than wasting money showing your ad when it’s cold.


To run a strategy like this, you can either write a weather script in Google Ads using an API connection from a weather information source, or use a third-party platform that already has the API connection set up.

This will allow you to automate the process of activating ads for specific generic keywords based on the trigger you decide.

This strategy can be used with a multitude of triggers such as TV ads, programs, social posts, news articles, stock market fluctuations, pollution levels, sports, and even other events.

Basically, anything that you can get an API connection to, you can feed that back into Google Ads to trigger into a strategy like this.
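
To make the ice cream example concrete, here’s a minimal sketch of the decision logic. The weather endpoint, API key, response field, and temperature threshold are all placeholders for whichever provider and rules you actually use, and the final enable/pause step would go through the Google Ads API, a Google Ads script, or a third-party platform.

```python
import requests

# Placeholder endpoint and key - swap in whichever weather provider you use.
WEATHER_URL = "https://api.example-weather.com/current"
API_KEY = "YOUR_API_KEY"
HOT_THRESHOLD_C = 25  # arbitrary trigger temperature for this illustration

def should_enable_generics(city):
    """Return True if the current temperature justifies running generic keywords."""
    resp = requests.get(WEATHER_URL, params={"q": city, "key": API_KEY}, timeout=10)
    resp.raise_for_status()
    temperature = resp.json()["temp_c"]  # field name depends on your provider
    return temperature >= HOT_THRESHOLD_C

if should_enable_generics("London"):
    # The actual enable/pause call would go through the Google Ads API,
    # a Google Ads script, or a third-party platform.
    print("Hot enough - enable the generic 'ice cream' campaign")
else:
    print("Too cold - keep the generic campaign paused")
```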

6. Effective Account Structure

Your account structure forms the foundation of your entire account and how well it will perform.

A broad structure will lead to impression wastage.

A granular account structure may take longer to set up in the short run but will benefit you with more accurate data and bid management capabilities moving forward.

Ensure Your Campaigns Are Split by Products or Categories

Don’t lump random keywords together. Splitting by product or category will allow you to write more relevant ad copy based on the keywords in your ad groups and campaigns.

Some people like to use their website as a template for how to structure their account, and that’s a good idea.

However, if it’s a particularly large website, this can get quite difficult, so just make sure you’re splitting your products and categories into the ad groups they belong in.

If You’re Covering Your Main Brand Term, It Should Have Its Own Campaign

This will allow you to manage the daily budget for this keyword much more accurately than if it’s fighting for budget with other keywords.

The same applies to your highest-performing terms, even if they’re generics.

Create Single Keyword Ad Groups (SKAGs), Where Possible

For your top-performing keywords, keep them in their own ad groups to:

  • Make the ad copy as accurate as possible for testing and learning.
  • Give you the ability to manage their daily budgets and bids separately from all the other keywords in your account (see the sketch below).
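
Here’s the sketch mentioned above: a small script that proposes SKAGs from an exported keyword report. The column names (keyword, campaign, conversions) and the conversion threshold are illustrative, and it produces a proposal list rather than editing the account.

```python
import csv

def propose_skags(report_path, min_conversions=20):
    """Propose Single Keyword Ad Groups for top-performing keywords.

    Assumes a CSV keyword export with 'keyword', 'campaign' and
    'conversions' columns (illustrative names).
    """
    proposals = []
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["conversions"]) >= min_conversions:
                proposals.append({
                    "campaign": row["campaign"],
                    "ad_group": f"SKAG - {row['keyword']}",
                    "keyword": row["keyword"],
                })
    return proposals

for p in propose_skags("keyword_report.csv"):
    print(f"{p['campaign']} -> {p['ad_group']}")
```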

Split Your Campaigns by Match Types

Using the following match types is recommended:

  • Exact match for traffic generation.
  • Broad match modifier to identify new keywords to add to your account.

Why avoid other match types?

  • Using broad match can cause impression wastage and your budget can get depleted quickly.
  • Broad match modifier essentially covers everything phrase match does, plus it casts a wide net for harvesting new keywords.

This will allow you to manage your traffic drivers more effectively, allocate the correct budget levels to them, and then invest your remaining budget in broad match modifier terms to harvest new keywords.

By following these tips and building a strong foundation in your account, you’ll be able to identify your optimal bidding levels initially, and then allow the bidding algorithms within the engine or third-party tool you’re using to optimize activity from a more secure base.

Once you’re happy with your account structure, you can use numerous review tools to check how it’s performing and benchmark against that.

Adzooma’s free Google Ads Health Check tool, for example, runs 47 automated checks on your account to quickly see whether it is set up the correct way.

7. Attribution

When most people think about attribution, they think about a complex user journey and having to use a data science team to translate what the numbers mean into actionable marketing ideas.

But attribution doesn’t have to be time-consuming or something only data scientists can do.

Using Google Ads, you can use data-driven attribution to report on your performance and see which touchpoints along the user journey are leading to the conversions on your site.

You can also use it to inform your bidding rules – which keywords to bid on – not based on the last-click model, but based on the effectiveness of each keyword in the journey.

This means that rather than just pausing a keyword because it didn’t result in a conversion, you can now ensure that:

  • You’re visible on keywords that help in driving conversions throughout the user journey.
  • You’re optimized towards the ones which have the greatest impact at the beginning and in the middle of the journey.

Data-driven attribution is different from the other attribution models in that it uses your conversion data to calculate the actual contribution of each keyword across the conversion path.

Each data-driven model is specific to each advertiser.

There’s a caveat, however.

Data-driven attribution requires a certain amount of data to create a precise model of how your conversions should be attributed.

Because of this, not all advertisers will see an option for data-driven attribution in Google Ads.

As a general guideline, for this model to be available you must have at least 15,000 clicks on Google Search, and the conversion action must have at least 600 conversions, within 30 days.

If you don’t have this volume of data, you can use attribution modeling in Google Analytics to identify your keyword values through the funnel, analyze that manually, and then attribute it back to your activity.
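
If you do end up attributing credit manually, a simple rule-based model can illustrate the idea of moving beyond last click. The sketch below applies a standard position-based (40/20/40) split across keyword conversion paths; this is purely illustrative and is not Google’s data-driven model, which is computed from your own conversion data.

```python
from collections import defaultdict

def position_based_credit(paths):
    """Distribute conversion credit across keyword paths (40/20/40 rule).

    First and last interactions each get 40% of the credit; the remaining
    20% is shared evenly across the middle interactions.
    """
    credit = defaultdict(float)
    for path in paths:
        if len(path) == 1:
            credit[path[0]] += 1.0
        elif len(path) == 2:
            credit[path[0]] += 0.5
            credit[path[1]] += 0.5
        else:
            credit[path[0]] += 0.4
            credit[path[-1]] += 0.4
            middle_share = 0.2 / (len(path) - 2)
            for kw in path[1:-1]:
                credit[kw] += middle_share
    return dict(credit)

# Hypothetical conversion paths made up of the keywords searched before converting.
paths = [
    ["running shoes", "nike running shoes", "nike pegasus 36"],
    ["nike pegasus 36"],
]
print(position_based_credit(paths))
```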

7 Key Takeaways

  • Push some simple changes to your account that will make a big difference.
  • Automate the way you manage bids and improve performance.
  • Integrate data to enhance your bidding strategies.
  • Know why audience data is so important and how to use it.
  • Make generic keywords work harder for you.
  • Boost performance quickly with simple account structure changes.
  • Deploy data-driven attribution that drives performance.

[Video Recap] Improve Your PPC Performance Starting Today with These 7 Expert Actions

Watch the video recap of the webinar presentation and Q&A session.

Or check out the SlideShare below.


Image Credits

All screenshots taken by author, August 2019




TripAdvisor says it blocked or removed nearly 1.5 million fake reviews in 2018



The majority of consumers (80%–90%) routinely consult reviews before buying something, whether online or off. The powerful influence of reviews on purchase behavior has spawned a cottage industry of fake reviews, a problem that is growing on major sites such as Amazon, Google and Yelp, among other places.

Just over 2% of reviews submitted were fake. TripAdvisor is one of those other places, where reviews form the core of the company’s content and the principal reason consumers visit. How much of the review activity on TripAdvisor is fraudulent? In its inaugural TripAdvisor Transparency Report, the company says that 2.1% of all reviews submitted to the site in 2018 were fake. (A total of 4.7% of all review submissions were rejected or removed for violating TripAdvisor’s review guidelines, which extend beyond fraud.)

Source: TripAdvisor Review Transparency Report

73% blocked by machine detection. Given the volume of review submissions TripAdvisor receives – more than 66 million in 2018 – that translates into roughly 1.4 million fake reviews. TripAdvisor says that 73% of those fake reviews were blocked before being posted, while the remainder of fake reviews were later removed. The company also says that it has “stopped the activity of more than 75 websites that were caught trying to sell reviews” since 2015.

TripAdvisor defines “fake review” as one “written by someone who is trying to unfairly manipulate a business’ average rating or traveler ranking, such as a staff member or a business’ competitor. Reviews that give an account of a genuine customer’s experience, even if elements of that account are disputed by the business in question, are not categorized as fake.”

The company uses a mix of machine detection, human moderation and community flagging to catch fraudulent reviews. The bulk of inauthentic reviews (91%) are fake positive reviews, TripAdvisor says.

Most of the fake reviews that are submitted to TripAdvisor (91%) are "biased positive reviews."
Source: TripAdvisor Review Transparency Report

TripAdvisor says that the review fraud problem is global, with fake reviews originating in most countries. However, it said there was a higher percentage than average of fake reviews “originating from Russia.” By contrast, China is the source of many fake reviews on Amazon.

Punishing fake reviews. TripAdvisor has a number of penalties and punishments for review fraud. In the first instance of a business being caught posting or buying fake reviews, TripAdvisor imposes a temporary ranking penalty.

Upon multiple infractions, the company will impose a content ban that prevents the individual or individuals in question from posting additional reviews and content on the site. It also prevents the involved parties from creating new accounts to circumvent the ban.

In the most extreme cases, the company will apply a badge of shame (penalty badge) that warns consumers the business has repeatedly attempted to defraud them. This is effectively a kiss of death for the business. Yelp does something similar.

Why we should care. Consumer trust is eroding online. It’s incumbent upon major consumer destination sites to police their reviews aggressively and prevent unscrupulous merchants from deceiving consumers. Yelp has been widely criticized for its “review filter,” but credit the company for its long-standing efforts to protect the integrity of its content.

Google and Amazon, in particular, need to do much more to combat review spam and fraud. Hopefully, TripAdvisor’s effort and others like it will inspire them to do the same.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes about the connections between digital and offline commerce. He previously held leadership roles at LSA, The Kelsey Group and TechTV. Follow him on Twitter or find him on LinkedIn.


10 Key Checks for Assessing Crawl Hygiene



When optimizing our websites for crawlability, our main goal is to make sure that search engines are spending their time on our most important pages so that they are regularly crawled and any new content can be found.

Each time Googlebot visits your website, it has a limited window in which to crawl and discover as many pages and links on your site as possible. When that limit is hit, it will stop.

The time it takes for your pages to be revisited depends on a number of different factors that play into how Google prioritizes URLs for crawling, including:

  • PageRank.
  • XML sitemap inclusion.
  • Position within the site’s architecture.
  • How frequently the page changes.
  • And more.

The bottom line is: your site only gets Googlebot’s attention for a finite amount of time with each crawl, which could be infrequent. Make sure that time is spent wisely.

It can be hard to know where to start when analyzing how well-optimized your site is for search engine crawlers, especially when you work on a large site with a lot of URLs to analyze, or work in a large company with a lot of competing priorities and outstanding SEO fixes to prioritize.

That’s why I’ve put together this list of top-level checks for assessing crawl hygiene to give you a starting point for your analysis.

1. How Many Pages Are Being Indexed vs. How Many Indexable Pages Are There on the Site?

Why This Is Important

This shows you how many pages on your site are available for Google to index, how many of those pages Google was actually able to find, and how many it determined were important enough to be indexed.

Indexability pie chart in DeepCrawl; bar chart showing indexed pages in Google Search Console

2. How Many Pages Are Being Crawled Overall?

Why This Is Important

Comparing Googlebot’s crawl activity against the number of pages you have on your site can give you insights into how many pages Google either can’t access, or has determined aren’t enough of a priority to schedule to be crawled regularly.

Crawl stats line graph in Google Search Console; bar chart showing Googlebot crawling in Logz.io

3. How Many Pages Aren’t Indexable?

Why This Is Important

Spending time crawling non-indexable pages isn’t the best use of Google’s crawl budget. Check how many of these pages are being crawled, and whether or not any of them should be made available for indexing.

Bar chart showing non-indexable pages in DeepCrawl

4. How Many URLs Are Being Disallowed from Being Crawled?

Why This Is Important

This will show you how many pages you are preventing search engines from accessing on your site. It’s important to make sure that these pages aren’t important for indexing or for discovering further pages for crawling.

Bar chart showing pages blocked by the robots.txt in Google Search Console
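
A quick way to sanity-check this outside of Search Console is to test a sample of URLs against your robots.txt with Python’s standard library; the URLs below are placeholders.

```python
from urllib import robotparser

def disallowed_urls(robots_url, urls, user_agent="Googlebot"):
    """Return the subset of URLs blocked for a given user agent by robots.txt."""
    rp = robotparser.RobotFileParser()
    rp.set_url(robots_url)
    rp.read()
    return [u for u in urls if not rp.can_fetch(user_agent, u)]

urls_to_check = [
    "https://www.example.com/products/blue-widget",
    "https://www.example.com/search?q=widgets",
]
blocked = disallowed_urls("https://www.example.com/robots.txt", urls_to_check)
print(f"{len(blocked)} of {len(urls_to_check)} URLs are disallowed for Googlebot")
```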

5. How Many Low-Value Pages Are Being Indexed?

Why This Is Important

Looking at which pages Google has already indexed on your site gives an indication of the areas of the site that the crawler has been able to access.

For example, these might be pages that you haven’t included in your sitemaps as they are low-quality, but have been found and indexed anyway.

Bar chart showing pages indexed but not submitted in a sitemap in Google Search Console

6. How Many 4xx Error Pages Are Being Crawled?

Why This Is Important

It’s important to make sure that crawl budget isn’t being used up on error pages instead of pages that you want to have indexed.

Googlebot will periodically try to crawl 404 error pages to see whether the page is live again, so make sure you use 410 status codes correctly to show that pages are gone and don’t need to be recrawled.

A line graph showing broken pages in DeepCrawl
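
Here’s a minimal sketch for auditing a list of permanently removed URLs to see which still return 404 and could be switched to 410; the example URL is a placeholder.

```python
import requests

def find_soft_removed_pages(urls):
    """Report removed URLs still returning 404 that could be served as 410.

    A 410 tells Googlebot the page is gone for good, so it doesn't need to
    keep rechecking it the way it does with a 404.
    """
    still_404 = []
    for url in urls:
        resp = requests.head(url, allow_redirects=False, timeout=10)
        if resp.status_code == 404:
            still_404.append(url)
    return still_404

removed = ["https://www.example.com/discontinued-product"]
for url in find_soft_removed_pages(removed):
    print(f"Consider returning 410 instead of 404: {url}")
```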

7. How Many Internal Redirects Are Being Crawled?

Why This Is Important

Each request that Googlebot makes on a site uses up crawl budget, and this includes any additional requests within each of the steps in a redirect chain.

Help Google crawl more efficiently and conserve crawl budget by making sure only pages with 200 status codes are linked to within your site, and reduce the number of requests being made to pages that aren’t final destination URLs.

Redirect chain report in DeepCrawl
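
To see how many hops a given link actually takes, here’s a minimal sketch that follows redirects manually and reports the chain; the starting URL and the ten-hop cap are illustrative.

```python
from urllib.parse import urljoin

import requests

def redirect_chain(url, max_hops=10):
    """Follow a URL's redirects manually and return every hop in the chain."""
    chain = [url]
    while len(chain) <= max_hops:
        resp = requests.head(chain[-1], allow_redirects=False, timeout=10)
        if resp.status_code in (301, 302, 303, 307, 308) and "Location" in resp.headers:
            # Location may be relative, so resolve it against the current URL.
            chain.append(urljoin(chain[-1], resp.headers["Location"]))
        else:
            break
    return chain

chain = redirect_chain("https://www.example.com/old-page")
if len(chain) > 2:
    print(f"{len(chain) - 1} hops - link directly to the final URL: {chain[-1]}")
else:
    print("No redirect chain worth fixing here.")
```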

8. How Many Canonical Pages Are There vs. Canonicalized Pages?

Why This Is Important

The number of canonicalized pages on your site gives an indication of how much duplication there is on your site. While canonical tags consolidate link equity between sets of duplicate pages, they don’t save crawl budget.

Google will choose to index one page out of a set of canonicalized pages, but to be able to decide which is the primary page, it will first have to crawl all of them.

Pie chart showing canonical pages in DeepCrawl

9. How Many Paginated or Faceted Pages Are Being Crawled?

Why This Is Important

Google only needs to crawl pages that include otherwise undiscovered content or unlinked URLs.

Pagination and facets are usually a source of duplicate URLs and crawler traps, so make sure that these pages that don’t include any unique content or links aren’t being crawled unnecessarily.

As rel=next and rel=prev are no longer supported by Google, ensure your internal linking is optimized to reduce reliance on pagination for page discovery.

Pie chart showing pagination breakdown in DeepCrawl

10. Are There Mismatches in Page Discovery Across Crawl Sources?

Why This Is Important

If you’re seeing pages being accessed by users through your analytics data that aren’t being crawled by search engines within your log file data, it could be because these pages aren’t as discoverable for search engines as they are for users.

By integrating different data sources with your crawl data, you can spot gaps where pages can’t be easily found by search engines.

Google’s two main sources of URL discovery are external links and XML sitemaps, so if you’re having trouble getting Google to crawl your pages, make sure they are included in your sitemap if they’re not yet being linked to from any other sites that Google already knows about and crawls regularly.
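
At its simplest, that gap analysis is a set comparison. The sketch below assumes you’ve already pulled URL sets from your analytics export, your server log files, and your XML sitemaps (the example sets are placeholders).

```python
def discovery_gaps(analytics_urls, log_file_urls, sitemap_urls):
    """Find pages users reach that search engine bots never request.

    Each argument is a set of URLs pulled from the respective source.
    """
    not_crawled = analytics_urls - log_file_urls
    also_missing_from_sitemap = not_crawled - sitemap_urls
    return not_crawled, also_missing_from_sitemap

analytics = {"https://www.example.com/a", "https://www.example.com/b"}
logs = {"https://www.example.com/a"}
sitemap = {"https://www.example.com/a"}

not_crawled, add_to_sitemap = discovery_gaps(analytics, logs, sitemap)
print(f"Visited by users but not crawled: {not_crawled}")
print(f"Candidates to add to the sitemap: {add_to_sitemap}")
```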

Bar chart showing crawl source gaps in DeepCrawl

To Sum Up

By running through these 10 checks for the websites you manage, you should be able to get a better understanding of the crawlability and overall technical health of a site.

Once you identify areas of crawl waste, you can instruct Google to crawl less of those pages by using methods like disallowing them in robots.txt.

You can then start influencing it to crawl more of your important pages by optimizing your site’s architecture and internal linking to make them more prominent and discoverable.



Image Credits

All screenshots taken by author, September 2019




Google explains why syndicators may outrank original publishers



Last week we reported that Google has updated its algorithms to give original reporting preferred ranking in Google search. So when John Shehata, VP of Audience Growth at Condé Nast, a major publishing company, posted on Twitter that Yahoo is outranking the original source of the article, Google took notice.

The complaint. Shehata posted on Twitter, “Recently I see a lot of instances where Google Top Stories ranking syndicated content from Yahoo above or instead of original content. This is disturbing especially for publishers. Yahoo has no canonicals back to original content but sometimes they link back.”

He provided screenshots of this happening as evidence.

No canonical. John also mentioned that Yahoo, which is legally syndicating the content on behalf of Condé Nast, is not using a canonical tag to point back to the original source. Google’s recommendation for those allowing others to syndicate content is to have a clause requiring syndicators to use the canonical tag to point back to the source they are syndicating from. Using this canonical tag indicates to Google which article page is the original source.

The issue. Sometimes those who license content, the syndicators, post the content before or at the same time as the source they are syndicating it from. That makes it hard for Google or other search engines to know which is the original source. That is why Google wrote, “Publishers that allow others to republish content can help ensure that their original versions perform better in Google News by asking those republishing to block or make use of canonical. Google News also encourages those that republish material to consider proactively blocking such content or making use of the canonical, so that we can better identify the original content and credit it appropriately.”

Google’s response. Google Search Liason Danny Sullivan responded on Twitter: “If people deliberately chose to syndicate their content, it makes it difficult to identify the originating source. That’s why we recommend the use of canonical or blocking. The publishers syndicating can require this.”

This affects both web and News results, Sullivan said. In fact, the original reporting algorithm update has not yet rolled out to Google News; it is just for web search currently.

Solution. If you allow people to syndicate your content, you should require them to use the canonical tag or make them block Google from indexing that content. Otherwise, don’t always expect Google to be able to figure out where the article originated, especially when your syndication partners publish the story before or at the same time that you publish yours.
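
If you want to spot-check whether a syndication partner is honoring that requirement, here’s a minimal sketch that fetches a syndicated page and extracts its rel=canonical URL. The URLs are placeholders, and a proper HTML parser would be more robust than the simple regex used here for illustration.

```python
import re

import requests

def canonical_of(url):
    """Fetch a page and return its rel=canonical href, or None if absent."""
    html = requests.get(url, timeout=10).text
    for tag in re.findall(r"<link[^>]+>", html, flags=re.IGNORECASE):
        if re.search(r'rel=["\']canonical["\']', tag, flags=re.IGNORECASE):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, flags=re.IGNORECASE)
            return href.group(1) if href else None
    return None

# Placeholder URLs - swap in a real syndicated copy and its original.
syndicated_copy = "https://partner.example.com/syndicated-article"
original = "https://www.example.com/original-article"

canonical = canonical_of(syndicated_copy)
if canonical != original:
    print(f"Canonical on the syndicated copy is {canonical!r}, expected {original!r}")
else:
    print("Canonical correctly points at the original article.")
```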

Why we care. While the original reporting change is interesting in this case, it is somewhat unrelated. If the same article is published on two different sites at the same time, both sites can appear to the search engines as the original source. If these sites are syndicating your content legally, review or update your contracts to require syndicators to either use canonical tags or block their syndicated content from indexing altogether. If syndicators are stealing your content and outranking you, Google should be better at dealing with that algorithmically; otherwise, you can file a DMCA takedown request with Google.


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.
