
Here’s what happened when I followed Googlebot for 3 months

On internet forums and in content-related Facebook groups, discussions often break out about how Googlebot – which we shall tenderly call GB here – works: what it can and cannot see, what kinds of links it visits and how they influence SEO.

In this article, I will present the results of my three-month-long experiment.

Almost daily for the past three months, GB has been visiting me like a friend dropping by for a beer.

Sometimes it was alone:

[02/09/2018 18:29:49]: 66.249.76.136 /page1.html Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

[02/09/2018 19:45:23]: 66.249.76.136 /page5.html Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

[02/09/2018 21:01:10]: 66.249.76.140 /page3.html Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

[02/09/2018 21:01:11]: 66.249.64.72 /page2.html Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

[02/09/2018 23:32:45]: 66.249.64.72 /page6.html Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

Sometimes it brought its buddies along:

[16/09/2018 19:16:56]: 64.233.172.231 /page1.html Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko; Google Search Console) Chrome/41.0.2272.118 Safari/537.36

[16/09/2018 19:26:08]: 66.249.69.235 /image.jpg Googlebot-Image/1.0

[27/08/2018 23:37:54]: 66.249.76.156 /page2.html Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

And we had lots of fun playing different games:

Catch: I observed how GB loves to run through 301 redirects and crawl images, and how it runs away from canonicals.

Hide-and-seek: Googlebot kept hiding in hidden content (which, as its parents claim, it does not tolerate and avoids).

Survival: I prepared traps and waited for it to spring them.

Obstacles: I placed obstacles with various levels of difficulty to see how my little friend would deal with them.

As you can probably tell, I was not disappointed. We had tons of fun and we became good friends. I believe our friendship has a bright future.

But let’s get to the point!

I built a website with seemingly substantive content about an interstellar travel agency offering flights to yet-undiscovered planets in our galaxy and beyond.

The content appeared to have a lot of merit when in fact it was a load of nonsense.

The structure of the experimental website looked like this:

Experimental website structure

I provided unique content and made sure that every anchor/title/alt, as well as other attributes, was globally unique (fake words). To make things easier for the reader, in the description I will not use names like anchor cutroicano matestito, but will instead refer to them as anchor1, etc.

I suggest that you keep the above map open in a separate window as you read this article.

Part 1: First link counts

One of the things that I wanted to test in this SEO experiment was the First Link Counts Rule – whether it can be circumvented and how it influences optimization.

The First Link Counts Rule says that, on a given page, Googlebot sees only the first link to a subpage. If you have two links to the same subpage on one page, the second one will be ignored, according to this rule. Googlebot will ignore the anchor in the second and in every subsequent link while calculating the page’s rank.

It is a problem widely overlooked by many specialists, and one that is especially present in online shops, where navigation menus significantly distort the website’s structure.

In most stores, we have a static (visible in the page’s source) drop-down menu which gives, for example, four links to main categories and 25 hidden links to subcategories. While mapping the page’s structure, GB sees all of those links on every page that contains the menu, so all the pages appear equally important and their power (juice) is distributed evenly, which looks roughly like this:

The most common, but in my opinion incorrect, page structure.

The above example cannot be called a proper structure, because all the categories are linked from every page that carries the menu. Therefore, the home page and all the categories and subcategories have an equal number of incoming links, and the power of the entire website flows through them with equal force. Hence, the power of the home page (which is usually the source of most of the power, due to the number of incoming links) is divided across 24 categories and subcategories, so each one of them receives only about 4 percent of the homepage’s power.

How the structure should look:

If you need to quickly test your page’s structure and crawl it the way Google does, Screaming Frog is a helpful tool.

In this example, the power of the homepage is divided into four, and each of the categories receives 25 percent of the homepage’s power and passes part of it on to its subcategories. This solution also gives internal linking a better chance. For instance, when you write an article on the shop’s blog and want to link to one of the subcategories, GB will notice that link while crawling the website. In the first case it will not, because of the First Link Counts Rule: if the link to a subcategory is already in the website’s menu, the one in the article will be ignored.
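To make the arithmetic concrete, here is a tiny sketch of my own – purely illustrative, using a naive model in which a page’s power splits evenly among the links GB counts on it:

// Purely illustrative: a naive model in which a page's power is split
// evenly among the links Googlebot counts on that page.
const homepagePower = 1.0;

// Flat menu: the homepage links to 4 categories + 25 subcategories = 29 pages.
const flatShare = homepagePower / 29;
console.log(`flat menu: each page gets ~${(flatShare * 100).toFixed(1)}%`); // ~3.4%

// Hierarchical structure: the homepage links only to the 4 main categories.
const categoryShare = homepagePower / 4;
console.log(`hierarchy: each category gets ${(categoryShare * 100).toFixed(0)}%`); // 25%
// Each category then passes part of its 25% down to its own subcategories.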

I started this SEO experiment with the following actions:

  • First, on page1.html, I included a link to the subpage page2.html as a classic dofollow link with the anchor anchor1.
  • Next, in the text on the same page, I included slightly modified references to verify whether GB would be eager to crawl them.

To this end, I tested the following solutions:

  • To the website’s homepage, I assigned one external dofollow link for a phrase with a URL anchor (so any external linking of the homepage or the subpages for given phrases was out of the question) – this sped up the indexing of the site.
  • I waited for page2.html to start ranking for the phrase from the first dofollow link (anchor1) coming from page1.html. Neither this fake phrase nor any other that I tested could be found on the target page. I assumed that if the other links worked, then page2.html would also rank in the search results for the phrases from those links. It took around 45 days. And then I was able to draw the first important conclusion.

Even a page where the keyword appears neither in the content nor in the meta title, but which is linked to with the researched anchor, can easily rank higher in the search results than a page that contains the word but is not linked to with that keyword.

Moreover, the homepage (page1.html), which contained the researched phrase and was the strongest page on the website (linked from 78 percent of the subpages), still ranked lower for the researched phrase than the subpage (page2.html) that was linked to with it.

Below, I present four types of links I have tested, all of which come after the first dofollow link leading to page2.html.

Link to a website with an anchor

<a href="page2.html#testhash">anchor2</a>

The first of the additional links placed in the code after the dofollow link was a link with a hash anchor. I wanted to see whether GB would follow it and also index page2.html under the phrase anchor2, given that the link leads to the same page (page2.html) but the URL is changed to page2.html#testhash and uses the anchor anchor2.

Unfortunately, GB never wanted to remember that connection, and it did not direct any power to the subpage page2.html for that phrase. As a result, in the search results for the phrase anchor2, on the day of writing this article there is only the subpage page1.html, where the word can be found in the link’s anchor. When Googling the phrase testhash, our domain does not rank at all.

Link to a website with a parameter

page2.html?parameter=1

Initially, GB was interested in both the funny part of the URL after the question mark and the anchor of the link (anchor3).

Intrigued, GB tried to figure out what I meant. It seemed to think, “Is it a riddle?” To avoid indexing duplicate content under the other URLs, the canonical tag on page2.html pointed at itself. The logs registered eight crawls of this address in total, but the conclusions were rather sad:

  • After 2 weeks, the frequency of GB’s visits decreased significantly until it eventually left and never crawled that link again.
  • page2.html wasn’t indexed under the phrase anchor3, nor was the URL with the parameter indexed. According to Search Console, this link does not exist (it is not counted among incoming links), yet at the same time the phrase anchor3 is listed as an anchor phrase.

Link to a website from a redirection

I wanted to force GB to crawl my website more, so every couple of days GB entered the dofollow link with the anchor anchor4 on page1.html, leading to page3.html, which 301-redirects to page2.html. Unfortunately, as in the case of the page with a parameter, after 45 days page2.html was still not ranking in the search results for the anchor4 phrase that appeared in the redirecting link on page1.html.

However, in Google Search Console, in the Anchor Texts section, anchor4 is visible and indexed. This could indicate that, after a while, the redirection will begin to function as expected, so that page2.html will rank in the search results for anchor4 despite being the second link to the same target page within the same website.

Link to a page using canonical tag

On page1.html, I placed a reference to page5.html (a follow link) with the anchor anchor5. At the same time, page5.html contained unique content, and in its head there was a canonical tag pointing to page2.html:

<link rel="canonical" href="https://example.com/page2.html" />

This test gave the following results:

  1. The anchor5 phrase from the link pointing to page5.html (which canonicalizes to page2.html) was not transferred to the target page (just as in the other cases).
  2. page5.html was indexed despite the canonical tag.
  3. page5.html did not rank in the search results for anchor5.
  4. page5.html ranked for the phrases used in the page’s text, which indicates that GB totally ignored the canonical tag.

I would venture to claim that using rel=canonical to prevent the indexing of some content (e.g., while filtering) simply does not work.

Part 2: Crawl budget

While designing an SEO strategy, I wanted to make GB dance to my tune, not the other way around. To this end, I verified the SEO processes at the level of the server logs (access logs and error logs), which gave me a huge advantage. Thanks to that, I knew GB’s every movement and how it reacted to the changes I introduced within the SEO campaign (restructuring the website, turning the internal linking system upside down, changing the way information was displayed).
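As a rough illustration of the kind of log work this involved, here is a minimal Node.js sketch of my own (not the actual tooling used in the experiment), assuming an access.log file in the format shown earlier, that pulls out Googlebot requests and counts hits per URL:

// Minimal sketch (my own, not the script used in the experiment), assuming
// the log format shown earlier: "[dd/mm/yyyy hh:mm:ss]: IP /path user-agent".
// A production version would also verify that the IP really belongs to Google.
const fs = require('fs');

const lines = fs.readFileSync('access.log', 'utf8').split('\n');
const hitsPerUrl = {};

for (const line of lines) {
  if (!line.includes('Googlebot')) continue; // keep only Googlebot requests
  const parts = line.split(' ');             // ["[dd/mm/yyyy", "hh:mm:ss]:", "IP", "/path", ...]
  const url = parts[3];
  if (!url) continue;
  hitsPerUrl[url] = (hitsPerUrl[url] || 0) + 1;
}

// Print the most-crawled URLs first.
Object.entries(hitsPerUrl)
  .sort((a, b) => b[1] - a[1])
  .forEach(([url, hits]) => console.log(`${hits}\t${url}`));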

One of my tasks during the SEO campaign was to rebuild the website in a way that would make GB visit only those URLs that it would be able to index and that we wanted it to index. In a nutshell: Google’s index should contain only the pages that matter to us from an SEO point of view, and GB should crawl only the pages that we want indexed. This is not obvious to everyone – for example, when an online shop implements filtering by color, size and price by manipulating URL parameters, e.g.:

example.com/women/shoes/?color=red&size=40&price=200-250

It may turn out that a setup which allows GB to crawl such dynamic URLs makes it devote its time to scouring (and possibly indexing) them instead of crawling pages like:

example.com/women/shoes/

Such dynamically created URLs are not only useless but potentially harmful to SEO, because they can be mistaken for thin content, which will cause rankings to drop.

Within this experiment, I also wanted to check some methods of structuring a website without using rel="nofollow", without blocking GB in the robots.txt file, and without placing part of the HTML code in frames that are invisible to the bot (a blocked iframe).

I tested three kinds of JavaScript links.

JavaScript link with an onclick event

A simple link built with JavaScript:

<a href="javascript:void(0)" onclick="window.location.href='page4.html'">anchor6</a>

GB easily moved on to the subpage page4.html and indexed the entire page. The subpage does not rank in the search results for the anchor6 phrase, and this phrase cannot be found in the Anchor Texts section in Google Search Console. The conclusion is that the link did not transfer the juice.

To summarize:

  • A classic JavaScript link allows Google to crawl the website and index the pages it comes upon.
  • It does not transfer juice – it is neutral.

JavaScript link with an internal function

I decided to raise the game but, to my surprise, GB overcame the obstacle in less than 2 hours after the publication of the link.

<a href="javascript:void(0)" class="js-link" data-url="page9.html">anchor7</a>

To operate this link, I used an external function whose job was to read the URL from the data-url attribute and redirect – only the user, as I hoped – to the target page9.html. As in the earlier case, page9.html was fully indexed.
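For reference, the handler behind such a link looks roughly like this – a minimal sketch of the approach described, not the exact function used in the experiment, assuming the .js-link class and data-url attribute from the snippet above:

// Minimal sketch: read the target from the data-url attribute and redirect the user.
document.querySelectorAll('a.js-link').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();                  // stop the javascript:void(0) href
    window.location.href = link.dataset.url; // e.g. "page9.html"
  });
});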

What is interesting is that despite the lack of other incoming links, page9.html was the third most frequently visited page by GB on the entire website, right after page1.html and page2.html.

I had used this method before to structure websites. However, as we can see, it no longer works. In SEO, nothing lives forever, apart from the Yellow Pages.

JavaScript link with coding

Still, I would not give up, and I decided that there must be a way to effectively shut the door in GB’s face. So I constructed a simple function that encoded the data with base64, and the reference looked like this:

<a href="javascript:void(0)" class="js-link" data-url="cGFnZTEwLmh0bWw=">anchor8</a>
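The client-side handler for this variant differs only in that it decodes the attribute before redirecting – again a minimal sketch of the idea, not the exact code from the experiment:

// Minimal sketch: decode the base64-encoded data-url, then redirect the user.
// atob('cGFnZTEwLmh0bWw=') returns "page10.html" in the browser.
document.querySelectorAll('a.js-link').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    window.location.href = atob(link.dataset.url);
  });
});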

As a result, GB was unable to execute JavaScript that would both decode the content of the data-url attribute and redirect. And there it was! We have a way to structure a website without using rel="nofollow" and without letting bots crawl wherever they like! This way we do not waste our crawl budget, which is especially important for big websites, and GB finally dances to our tune. Whether the function was placed in the head section of the same page or in an external JS file, there was no trace of the bot in either the server logs or Search Console.

Part 3: Hidden content

In the final test, I wanted to check whether the content in, for example, hidden tabs would be considered and indexed by GB or whether Google rendered such a page and ignored the hidden text, as some specialists have been claiming.

I wanted to either confirm or dismiss this claim. To do that, I placed a wall of text of over 2,000 characters on page12.html, hid a block containing about 20 percent of that text (400 characters) with CSS, and added a "show more" button. Within the hidden text there was a link to page13.html with the anchor anchor9.
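The setup looked roughly like this – a simplified sketch with hypothetical element ids, not the exact markup from page12.html:

// Simplified sketch (hypothetical ids). Assumed markup:
//   <div id="more-text" class="hidden">...400 hidden characters...
//     <a href="page13.html">anchor9</a>
//   </div>
//   <button id="show-more">show more</button>
// with .hidden { display: none; } in the stylesheet.
document.getElementById('show-more').addEventListener('click', function () {
  document.getElementById('more-text').classList.remove('hidden'); // reveal the block
  this.remove();                                                   // drop the button
});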

There is no doubt that a bot can render a page. We can observe this in both Google Search Console and Google PageSpeed Insights. Nevertheless, my tests revealed that the block of text displayed after clicking the "show more" button was fully indexed. The phrases hidden in that text ranked in the search results, and GB followed the links hidden in it. Moreover, the anchors of the links from the hidden block of text were visible in Google Search Console in the Anchor Text section, and page13.html also began to rank in the search results for the keyword anchor9.

This is crucial for online shops, where content is often placed in hidden tabs. Now we are sure that GB sees the content in hidden tabs, indexes it, and transfers the juice from the links hidden there.

The most important conclusion that I draw from this experiment is that I have not found a direct way to bypass the First Link Counts Rule by using modified links (links with parameters, 301 redirects, canonicals, hash-anchor links). At the same time, it is possible to build a website’s structure using JavaScript links, which frees us from the restrictions of the First Link Counts Rule. Moreover, Googlebot can see and index content hidden in tabs, and it follows the links hidden there.


Opinions expressed in this article are those of the guest author and not necessarily those of Search Engine Land.


About The Author

“Do not accept ‘just’ high quality. Anyone can do that. If the sky is the limit, find a higher sky.” Max Cyrek is CEO of Cyrek Digital, a digital marketing consultant and SEO evangelist. Throughout his career, Max, together with his team of over 30, has worked with hundreds of companies helping them succeed. He has been working in digital marketing for nearly ten years and has specialized in technical SEO, managing successful marketing projects.






FTC smacks down anti-review ‘non-disparagement clauses’ in form contracts

There are numerous studies circulating that show how important reviews are to consumers’ purchase decisions. To protect the integrity of online reviews, Congress passed the Consumer Review Fairness Act (CRFA) in 2016, largely modeled on an earlier California law.

CRFA makes non-disparagement clauses illegal. The intention of CRFA was to “prohibit the use of certain clauses in form contracts that restrict the ability of a consumer to communicate regarding the goods or services offered in interstate commerce that were the subject of the contract, and for other purposes.”

These terms are typically called “non-disparagement” clauses and have been used periodically by professionals and corporations to pre-empt and prevent negative reviews. They often provide financial penalties or the right to sue for their violation. But they’re illegal.

Trying to get away with it anyway. Apparently quite a few businesses didn’t get the memo. Last week the FTC announced that it had settled administrative complaints with five firms using these illegal clauses in their customer contracts:

  • A Waldron HVAC
  • National Floors Direct
  • LVTR LLC
  • Shore to Please Vacations
  • Staffordshire Property Management

The FTC administrative complaints were originally announced in May and June. (The Yelp blog has some additional factual detail about the companies and circumstances.) It’s not clear if these contracts have just been in use for years (pre-dating the CRFA) or whether the companies got bad legal advice.

Must notify all their customers. Each of these firms must now notify all consumers who signed their agreements that the contractual provisions in question are not enforceable. There are other multi-year reporting and compliance requirements that the FTC orders impose as well.

In addition, Shore to Please Vacations apparently sued a vacation renter, who had written a negative review, in Florida civil court. It must now dismiss the private lawsuit for breach of contract.

Why we should care. Any marketer, brand or business owner contemplating any scheme to prevent or preempt negative reviews needs to stop thinking this way immediately. These efforts invariably backfire and cause more damage to the business’ reputation than anything contemplated by the non-disparagement clause.

Marketers need to follow review best practices and treat reviews and responding to them as just an ordinary part of doing business. It’s also important to remember that businesses that have some critical reviews ultimately have more credibility than those with only five star reviews.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He researches and writes about the connections between digital and offline commerce. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.


7 Expert Tips to Boost Your PPC Performance Today

Over the last decade, the number of account and campaign parameters to consider has increased almost 20-fold.

How are PPC specialists expected to know which actions to choose for the best results?

And what exactly do you need to do to continually increase performance while staying ultra-competitive in the marketplace?

On August 14, I moderated a sponsored SEJ webinar presented by Adzooma’s Puneet Vaghela and Sal Mohammed.

They shared seven essential PPC optimization strategies that are proven to boost ROI, save time, and reduce spend.

Here’s a recap of the webinar presentation.


So much has changed in the paid search landscape in the past few years. Today, it has become a complex and crowded ecosystem.

With all of this to consider, it’s essential to determine what actions to take through all the clutter of managing a PPC account.

Here are seven areas to consider when optimizing your paid search campaigns if you want to bring the greatest returns.

1. Account & Campaign Settings

No one has an infinite marketing budget, therefore it’s important to find efficiencies wherever possible in your account.

There are three different settings in your account and campaign that you can easily change to boost PPC performance.

Locations


It’s important to use location targeting in your PPC campaigns to drive efficiencies and identify geographic areas with a higher propensity to convert.

It’s one of the best ways to actually reduce wastage in ad spend. Make sure to target your audience in the areas they’re searching.

If you don’t use location settings, you’ll be wasting budget showing ads to people who have no interest in your business.

Location settings also allow you to see in which areas you have the most traction. Therefore, you should concentrate budgets in these areas to maximize the effectiveness of your PPC spend.

Make sure you select the country you want to target when you set up your campaigns initially and then drill down and create campaigns for specific locations for the top-performing areas.

Devices


Ensuring you’re targeting the correct devices is also key to success.

Google has said that about 30–50% of searches on mobile have local intent.

If you’re a business or a high street store, you should be increasing bids on mobile targeting to reach people in the right place, at the right time.

People also interact on devices differently so use the data within your search engine to see which devices are driving the strongest KPI performance and modify bids accordingly.

Don’t worry about bidding too high; the data you gather will help you identify your most profitable areas moving forward. The extra you spend in the beginning will pay off further down the line.

Ad Copy Rotation


This is something that a lot of people just leave to Google to do for them.

But a lot of advertisers do like rotating ads evenly so they can optimize them themselves.

If the aim of your campaign is branding, then this works well. You can use taglines from other media channels to support your messaging, test them, and learn which ad copy performs best.

However, if you’re running a direct response campaign, then you should be trying to maximize the number of clicks or conversions coming to your site.

It would be a good idea to allow the system to actually optimize the ads for you based on the best click-through rate or conversion rate.

2. Automated Bid Management

Bid Management in the Engine

You should be using bid management in Bing or Google Ads to make your ads work as hard as possible for you.

Firstly, you need to analyze your data from the engine or your analytics platform to see how many searches it takes for people to convert with you. You need this to know which bidding model works best.

You can then set up automated bidding in the engine based on a last-click conversion model if your conversion path is short.

If your conversion path is long, set it up based on a many-per-click conversion model so you can capture all the keywords required for someone to convert.

Bid Management Using Rules

You can also use automated rules to ensure your account is performing to the standards you expect using third-party bid management platforms, such as Adzooma.

When you’ve analyzed the data in your account, you’ll have identified how many impressions, clicks and conversions you need to drive profitability or hit your target KPI.

With this knowledge, you can set up automated rules to make changes to your account based on these criteria and help drive greater performance on your account while saving you time.

Bid management is generally a good strategy, particularly if you’re new and you don’t know which bids you should be putting in and how to manage them.

If you’re a large-scale advertiser inundated with different campaigns, it is another great tool you can use.

3. Data Integrations

Data integration is vital to any marketing team. There’s an easy way to integrate your analytics data with your search data in one platform.

Google Analytics, even if it’s the free version, is an important tool for marketers as it allows you to make more informed decisions on your PPC spend.

To link Google Ads and Google Analytics, you’ll need administrative access to the Google Ads account and edit permission for the Google Analytics account.

Once you actually have the two platforms linked, you’ll be able to see a number of metrics you couldn’t before, including:

  • How many of your clicks resulted in new visitors to your site.
  • How long people are spending on your site from PPC.
  • And, using goals in analytics, what actions people are actually taking on your site from PPC.

Using this data, you can see which keywords are working best for site engagement and optimize accordingly based on your KPIs.

If you’re running a branding campaign, you want more people to spend more time on your site and visit more pages. If you’re running a direct response campaign, you want more people to interact with specific goals on your site and, ideally, convert at the same time.

With an analytics integration, you can also start creating audiences based on people’s on-site behavior, which is really important.

4. Audience Data

Paid search is based on keyword intent – targeting people based on what they are looking for at all times. It’s been like this since the start.

However, today’s climate is very busy, with multiple channels, devices, locations, seasonality, increasing competition, and more data than ever before.

So how do you sort through the clutter to make sure you’re targeting the people most likely to convert with you and thereby maximize the utilization of your marketing budget?

Why You Should Be Using RLSA

Remarketing lists for search ads (RLSAs) were introduced by Google in 2013 and have grown to become one of the most important strategies available to marketers.

Identifying where people interact with your site and gauging who is most likely to convert with you is key to increasing the efficiency of your ad spend and improving your conversion rate and cost per acquisition or cost per lead.

In Google Ads and Bing Ads, you can create audiences based on which URLs people have visited on your website and then retarget them when they search for other relevant terms to either:

  • Ensure your ad is appearing in front of them (particularly good for generic keyword efficiency).
  • Or show people different messaging to entice them.

You can also use other data, like demographic data, to make your ads even more targeted. However, this is just one aspect of remarketing with audiences.

How to Create More Enhanced Lists for RLSA

Moving further along, you should also be integrating any CRM data with your search platforms to create Customer Match lists, which are audience lists based on the email addresses in your database.

This allows you to target people you know have already interacted with you. It also lets you create similar audience lists to reach people who resemble those who have already engaged with you, and it should play a part in your CRM strategy.

Linking your analytics platform with Google Ads allows you to use other on-site metrics to create audience lists. Time on-site, bounce rate, goals, pages visited, etc. are all very important in creating audience lists.

This will let you retarget people based not only on which pages they visited on your site, but also on how long they spent there. This gives you another engagement dimension to layer into your audience strategy.

Why is this important?

Audiences allow you to narrow down your targeting.

This means that rather than spending your budget guessing who might interact and convert with you, you can use this data to:

  • Make informed decisions on which groups of people have the highest propensity to convert.
  • Target them specifically – increasing the effectiveness of your media budget.

That’s really important, because acquiring a customer can cost five times as much as retaining one.

If someone has visited your site and shown interest, you should be leveraging a strategy that re-engages them or finds more people like them.

Demographic targeting is also key. It enables you to reach a specific audience based on age, gender, parental status, household income, and multiple other variables.

5. Generic Keyword Efficiency

With generic keywords being so expensive, it’s important to use them properly.

Generics sit higher in the funnel and are used more for research purposes. Conversion rates on generic terms tend to be very low, and far lower than on brand terms.

It’s an ineffective way to drive business goals based on a last-click conversion model.

How to Use Generics Properly

You can use generics as a retargeting mechanism – targeting people in your audience lists when they search for generic terms after visiting your site.

They will have already engaged with you, so they will be familiar with your brand. Thus, when they widen their search, keep your brand at the forefront of their minds and bring them back to convert, either through:

  • Different messaging (e.g., putting an offer into the ad copy).
  • Or by increasing your bids on your audiences so that you appear more prominently on generic terms at a time when people are more likely to convert.

This will allow you to concentrate your generic keyword budget on an audience with a higher propensity to engage, which will drive higher click-through rates (and hopefully conversion rates), reduce impression wastage, and allow you to use your budget more effectively.

Using scripts can make the use of generics a lot easier. If you ensure your generic keywords are only live during certain moments or triggers, it increases their value to your business and makes the use of them more efficient.

One example: if you sell ice cream, run generic terms when it’s really hot to increase the likelihood of people purchasing your product, rather than wasting money showing your ad when it’s cold.

To run a strategy like this, you can either write a weather script in Google Ads using an API connection to a weather information source, or use a third-party platform that already has the API connection set up.

This will allow you to automate the process of activating ads for specific generic keywords based on the trigger you decide.
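For a sense of what such a script can look like, here is a heavily simplified Google Ads script sketch. The weather API URL, the city, the temperature threshold and the campaign-name convention are all placeholders of mine, not a ready-made integration, and the selector syntax may need adapting to the current scripts API:

// Heavily simplified Google Ads script sketch: switch "hot weather" generic
// campaigns on or off based on the current temperature. The API URL, city,
// threshold and campaign naming below are placeholders, not real values.
function main() {
  var response = UrlFetchApp.fetch('https://api.example-weather.com/current?city=London');
  var tempCelsius = JSON.parse(response.getContentText()).temperature;

  var campaigns = AdsApp.campaigns()
    .withCondition("Name CONTAINS 'Hot Weather Generics'")
    .get();

  while (campaigns.hasNext()) {
    var campaign = campaigns.next();
    if (tempCelsius >= 25) {
      campaign.enable(); // it's hot: run the generic ice-cream ads
    } else {
      campaign.pause();  // it's cold: don't waste budget on generic terms
    }
  }
}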

This strategy can be used with a multitude of triggers such as TV ads, programs, social posts, news articles, stock market fluctuations, pollution levels, sports, and even other events.

Basically, anything that you can get an API connection to, you can feed that back into Google Ads to trigger into a strategy like this.

6. Effective Account Structure

Your account structure forms the foundation of your entire account and determines how well it will perform.

A broad structure will lead to impression wastage.

A granular account structure may take longer to set up in the short run but will benefit you with more accurate data and bid management capabilities moving forward.

Ensure Your Campaigns Are Split by Products or Categories

Don’t lump random keywords together. This will allow you to write more relevant ad copy based on the keywords in your ad groups and campaigns.

Some people like to use their websites as a touchpoint on how to structure their account and that’s a good idea.

However, if it’s a particularly large website, this can get quite difficult, so just make sure you split your products and categories into the ad groups they belong in.

If You’re Covering Your Main Brand Term, It Should Have Its Own Campaign

This will allow you to manage the daily budget for this keyword much more accurately than if it’s fighting for budget with other keywords.

The same applies to your highest-performing terms, even if they’re generics.

Create Single Keyword Ad Groups (SKAGs), Where Possible

For your top-performing keywords, keep them in their own ad groups to:

  • Make the ad copy as accurate as possible for testing and learning.
  • Give you the ability to manage their daily budgets and bids separately from all the other keywords in your account.

Split Your Campaigns by Match Types

Using the following match types is recommended:

  • Exact match for traffic generation.
  • Broad match modifier to identify new keywords to add to your account.

Why avoid other match types?

  • Using broad match can cause impression wastage and your budget can get depleted quickly.
  • Broad match modifier essentially covers everything phrase match does and casts a wide net for harvesting new keywords.

This will allow you to manage your traffic drivers more effectively and allocate the correct budget levels to them, and then use your remaining budget to invest in broad match modifier terms to harvest new keywords.

By following these tips to build a strong foundation for your account, you’ll be able to identify your optimal bidding levels initially, and you can then allow the bidding algorithms within the engine or the third-party tool you’re using to optimize activity from a more secure base.

Once you’re happy with your account structure, you can use numerous review tools to check how it’s performing and benchmark against that.

Adzooma’s free Google Ads Health Check tool, for example, can automatically check 47 areas of your account to see whether it is set up the correct way.

7. Attribution

When most people think about attribution, they think about a complex user journey and having to use a data science team to translate what the numbers mean into actionable marketing ideas.

But attribution doesn’t have to be time-consuming or something only data scientists can do.

Using Google Ads, you can use data-driven attribution to report on your performance and see which touchpoints along the user journey are leading to the conversions on your site.

You can also use it to inform your bidding rules – which keywords to bid on – not based on the last-click model, but based on the effectiveness of each keyword in the journey.

This means that rather than just pausing a keyword because it didn’t result in a conversion, you can now ensure that:

  • You’re visible on keywords that help in driving conversions throughout the user journey.
  • You’re optimized towards the ones which have the greatest impact at the beginning and in the middle of the journey.

Data-driven attribution is different from the other attribution models in that it uses your conversion data to calculate the actual contribution of each keyword across the conversion path.

Each data-driven model is specific to each advertiser.

There’s a caveat, however.

Data-driven attribution requires a certain amount of data to create a precise model of how your conversions should be attributed.

Because of this, not all advertisers will see an option for data-driven attribution in Google Ads.

As a general guideline, for this model to be available you must have at least 15,000 clicks on Google Search, and a conversion action must have at least 600 conversions, within 30 days.

If you don’t have this volume of data, you can use attribution modeling in Google Analytics to identify your keyword values through the funnel, analyze that manually, and then attribute it back to your activity.
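To make the idea of attribution models concrete, here is a toy sketch of that kind of manual analysis – my own illustration, not Google’s data-driven model (which is learned from your own conversion data):

// Toy sketch: spread one conversion's credit across the keywords in its path
// under two simple models. Not Google's data-driven attribution.
function attribute(path, model) {
  var credit = {};
  path.forEach(function (keyword, i) {
    var share = model === 'lastClick'
      ? (i === path.length - 1 ? 1 : 0) // everything goes to the final keyword
      : 1 / path.length;                // 'linear': equal share for each step
    credit[keyword] = (credit[keyword] || 0) + share;
  });
  return credit;
}

var path = ['generic term', 'brand + product term', 'brand term'];
console.log(attribute(path, 'lastClick')); // only 'brand term' gets credit
console.log(attribute(path, 'linear'));    // each keyword gets ~0.33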

7 Key Takeaways

  • Push some simple changes to your account that will make a big difference.
  • Automate the way you manage bids and improve performance.
  • Integrate data to enhance your bidding strategies.
  • Know why audience data is so important and how to use it.
  • Make generic keywords work harder for you.
  • Boost performance quickly with simple account structure changes.
  • Deploy data-driven attribution that drives performance.





Are your Google text ads getting truncated? Here’s what to consider

This week, Andrea Cruz, digital marketing manager at KoMarketing, noticed text ad headlines and descriptions getting cut off and wondered if this is new.

I looked back at some older screenshots of search results and didn’t see truncation happening very often. But now I’m easily able to replicate the kind of result Andrea saw, including in the first text ad position, as in the example below.

Truncated headlines and descriptions in expanded text ads aren’t new, but it could be that truncation is happening more often lately with certain ad renderings, which frequently include no ad extensions. Is the pendulum swinging back to simpler ads?

Why does ad truncation happen?

One thing to keep in mind is that truncation is about pixels rather than a specific character count, and wider characters use more pixels. In 2016, when expanded text ads were introduced, Google said advertisers should consider limiting headline length to 33 characters to keep them from potentially being truncated. That’s still the suggested length in the help center, even since Google added the third headline option:

“In some situations, Google Ads needs to shorten your text, usually with an ellipsis (“…”). This could happen if your ad text frequently uses wider characters (like “m”) instead of narrower characters (like “i”), because your headline text could be wider than the space available for it on some browser sizes. With most Latin languages, you can avoid this effect by limiting your line’s overall character count to 33 characters total.”

Additionally, if the ad preview in Google Ads shows the full headline, Google says it will generally render completely.

For descriptions, Google doesn’t give specific guidelines, and the preview tool won’t show truncation. Again, pixels matter. In several results I looked at, description truncation happened between 84 and 86 characters, but a description with 91 characters displayed in full on one line because it had a lot of narrow letters.
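Because truncation is about pixel width, a rough way to sanity-check copy before uploading it is to measure its rendered width in the browser. A small sketch of my own – the font and the pixel limit are assumptions, since Google doesn’t publish the exact values it uses:

// Rough sketch: estimate the rendered pixel width of ad copy with canvas
// measureText(). The font and maxPixels values are assumptions, not official.
function textWidthPx(text, font) {
  var ctx = document.createElement('canvas').getContext('2d');
  ctx.font = font; // e.g. '20px Arial' as a stand-in for Google's ad font
  return ctx.measureText(text).width;
}

var headline = 'Wide letters mmm take more room than iii';
var width = textWidthPx(headline, '20px Arial');
var maxPixels = 500; // placeholder threshold, not a published value
console.log(headline.length + ' chars, ~' + Math.round(width) + 'px: ' +
  (width > maxPixels ? 'risk of truncation' : 'likely fits'));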

Is ad truncation happening more often?

It may appear that truncation is happening more often because of the way Google often displays text ads now. The text ads above the organic results often show with just one description line, particularly on desktop.

Consider this screenshot of a results page for the query “car loan” captured last year in July 2018:

A Google search result from 2018.

Now, compare that to a results page served today in which the ads in positions two to four include just one line of description copy (the last ad’s description is truncated) and no ad extensions below them:

In a result from today, only the first ad shows ad extensions. The other ads show just one line of description copy.

I see this shorter ad rendering regularly across various queries, particularly on desktop. And the lack of ad extensions is interesting. Ads at the bottom of the page on mobile and desktop tend to show more description copy as well as ad extensions than ads above the organic results.

Ad rendering changes are constant

Google is always experimenting with the way it displays ads, even within the same results page. In the mobile example below (from today), notice the Expedia ad in the second position has a description that gets truncated and no ad extensions showing with it.

After refreshing that search result page later in the day, Expedia’s ad, still in the second position, appears with a description followed by callout extensions and an app extension, while the Hotwire ad in position three shows with just a description.

We don’t have control over how Google chooses to display our ads from one search result to the next, and it will vary based on device, browser and other contextual signals. It also decides when and what ad extensions to show. But we do have some control over truncation. If you want to avoid having your titles and descriptions cut off, experiment with length.

Something more interesting to watch may be the frequency with which your ad extensions show. It’s interesting to often see simpler ad treatments above the organic results these days.


About The Author

Ginny Marvin is Third Door Media’s Editor-in-Chief, managing day-to-day editorial operations across all of our publications. Ginny writes about paid online marketing topics including paid search, paid social, display and retargeting for Search Engine Land, Marketing Land and MarTech Today. With more than 15 years of marketing experience, she has held both in-house and agency management positions. She can be found on Twitter as @ginnymarvin.
