
Automation layering: How PPC pros retain control when automation takes over

The PPC track at SMX Advanced kicked off with a keynote by Ginny Marvin where she considered the future of an industry where automation is increasingly doing more of the work humans used to do. Her message was that we can’t escape automation so we must find a way to coexist with the machines.

The topic of automation in PPC comes up a lot, but I suspect that when our industry talks about the impact of automation, the discussion mostly centers on automations built by the likes of Google and Microsoft: disruptive (but not necessarily bad) capabilities like Smart Bidding, close variant keywords, responsive search ads, etc.

But nobody ever said that advertisers can’t be disruptors too. They too can build automations to change the game and give themselves a competitive edge.

Having to build your own automations may sound daunting but remember that they don’t have to be cutting-edge like machine learning in order to be useful. In this post, I’ll explain an easy way to get started with your own automations using the principle of “automation layering.”

Automations from the engines are better with human help

In my new book, Digital Marketing in an AI World, I explain that humans plus machines usually perform better than machines alone. This is not a new concept and one most of you have probably come across in some form or other. One specific example I used to share in presentations came from Wired in 2014 and said that, “The National Weather Service employs meteorologists who, understanding the dynamics of weather systems, can improve forecasts by as much as 25 percent compared with computers alone.”

Because of the potential for better results, PPC pros want to remain involved. They have knowledge about the business that could meaningfully impact results. Sometimes there simply is not enough data for a machine learning system to come up with the same insight. So it’s generally agreed upon that humans + machines can outperform machines alone.

Generally, we tend to translate this concept into the PPC world by saying that account managers need to work together with automations from the engines.

When humans work together with automations from the ad engines like Google, the results are generally thought to be better than if the automation didn’t have the help of a smart PPC account manager.

Automations from the engines are better with automations from advertisers

Then I started thinking about the role human PPC managers need to play for the premise to be true that humans + machines outperform machines alone. I realized that the humans in that equation could actually be replaced by machines as well, but in this case, machines that are controlled by the PPC pro and not the ad engine. PPC pros could benefit from the control (since they define the automation) and the time savings (because they don’t need to exert control manually).

So we should try to replace some forms of human control with new layers of automation and see if that delivers the same benefits as humans + machines. If we can write down the steps we take, we can teach a machine to do those steps for us. And it can be a simple rule-based approach which is much simpler to create than something based on machine learning.

Humans don’t need to do repetitive manual work to help the automations from the engines. They can teach their own machines to automate their process.

The concept behind automation layering is not a new idea. In engineering, solutions can be broken down into systems that can themselves be connected to other systems. Each system accepts inputs and returns outputs, and as long as there is agreement over the format of those inputs and outputs, many systems can be strung together and work seamlessly to solve more complex problems.

Likewise, an automation could interact with other automations. In PPC, let’s call this principle “automation layering.”  This is an important concept because it’s the next evolution of what PPC pros have been doing for years: using their own insights to control what Google does. But just like Google is getting ever-more automated, our control over it should also become more automated.

By replacing the manual work done by the PPC expert with an automation that follows their logic, PPC teams can still reap the benefits of having more control over automations created by the ad engines.

Let’s look at why automation layering makes sense in PPC.

Escaping automation is not an option

The reason humans worry about automations created by the engines is that we can’t escape them. They are launched at the engine’s discretion, and whether we like it or not, we have to spend time figuring out how they impact our work. Given how busy the typical PPC manager is, this extra work is not something to look forward to.

New automations may promise great things, but the truth is that success with them depends on experimentation and reskilling, both tasks that take time to do well. To take an example from aviation, cutting corners on reskilling when new automations are launched can lead to disastrous results, as seen with the 737 MAX. Luckily the stakes in PPC are not as high, but I believe the analogy is relevant.

Automation layering for close variants

Some new automations cannot be turned off, so they force us to change how we work with Google Ads. Close variants are a recent example of this type of change. In September of last year, Google redefined what different keyword match types, like exact match, mean.

Some account managers now spend extra time monitoring the search terms triggered for exact match keywords. This is a great form of human control to turn into automation layering: the PPC manager turns their structured logic for checking close variants into an automation that does the checking for them.

There are two specific ways I’ve shared to layer an automation on top of Google’s exact match keywords to keep control when they expand to close variants with similar meaning.

The first way is to simply compare the performance of the close variant with that of the underlying exact keyword. If a user-defined performance threshold is met, the variant can automatically be added as a new keyword with its own bid, or as a negative keyword if its performance is significantly lower. Note that close variants used in conjunction with Smart Bidding should already get the appropriate bid to meet CPA or ROAS targets, but regardless, it can’t hurt to add your own layer of automation to confirm this.

The second way is to use the Levenshtein distance calculation to find how far the close variant is from the exact keyword. It is a simple calculation that adds up the number of text changes required to go from one word to another. Every character added, deleted, or changed adds one point. Hence going from the correct spelling of my company name “Optmyzr” to the common typo “Optmyzer” has a Levenshtein distance of 1 (for the addition of the letter “e”). Going from the word “campsite” to “campground” on the other hand has a score of 6 because 4 letters need to be changed and 2 need to be added.
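
Here is a minimal JavaScript sketch of that calculation, the same sort of helper you could later drop into a Google Ads script. The two test strings are the examples discussed above.

    // Levenshtein distance: number of single-character edits (insert, delete, substitute)
    // needed to turn string a into string b.
    function levenshtein(a, b) {
      var dp = [];
      for (var j = 0; j <= b.length; j++) dp[j] = j;
      for (var i = 1; i <= a.length; i++) {
        var prevDiag = dp[0];
        dp[0] = i;
        for (var k = 1; k <= b.length; k++) {
          var tmp = dp[k];
          var cost = a[i - 1] === b[k - 1] ? 0 : 1;
          dp[k] = Math.min(dp[k] + 1, dp[k - 1] + 1, prevDiag + cost);
          prevDiag = tmp;
        }
      }
      return dp[b.length];
    }

    console.log(levenshtein('Optmyzr', 'Optmyzer'));    // 1
    console.log(levenshtein('campsite', 'campground')); // 6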

Layer your own automation on top of close variants to determine how different the close variant is to the exact match keyword. The Levenshtein distance function can be used to calculate the number of text changes required to go from one text string to another.

With a Google Ads script, we could write our own automation that turns these manual checks into fully automated ones. Because it’s an automation we define ourselves, it’s as powerful as the manual control we used to exert to get the benefits normally associated with humans + machines.
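
To make that concrete, here is a hedged sketch of what such a Google Ads script could look like: it pulls the search query report, flags close variants that drift too far from the keyword that matched them, and logs them for review. The report and field names follow the legacy AWQL search query report and may need adapting to your account, the distance threshold is an arbitrary assumption, and levenshtein() is the helper sketched above.

    function main() {
      var DISTANCE_THRESHOLD = 3; // assumption: flag variants 3+ edits away from their keyword

      var report = AdsApp.report(
        'SELECT Query, KeywordTextMatchingQuery, Cost, Conversions ' +
        'FROM SEARCH_QUERY_PERFORMANCE_REPORT DURING LAST_30_DAYS');

      var rows = report.rows();
      while (rows.hasNext()) {
        var row = rows.next();
        var query = row['Query'];
        var keyword = row['KeywordTextMatchingQuery'];
        if (query === keyword) continue; // identical to the keyword, not a close variant

        if (levenshtein(query, keyword) >= DISTANCE_THRESHOLD) {
          // Candidate for review: add as its own keyword, or as a negative if it underperforms.
          Logger.log('"' + query + '" matched "' + keyword + '" | cost: ' +
                     row['Cost'] + ' | conversions: ' + row['Conversions']);
        }
      }
    }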

Automation layering for Smart Bidding

Other automations, like Smart Bidding, are optional, but given their pace of improvement, it’s only a matter of time before even the most ardent fans of manual PPC won’t be able to make enough of a difference to charge a living wage for their manual bid management services.

The machines are simply better at the math: predicting future conversions and using that expected conversion rate to turn an advertiser’s business goals around CPA or ROAS into a CPC bid the ad auction can use to rank the ad against all others.
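
That math is simple enough to sketch. The numbers below are purely illustrative assumptions, not figures from the article, but they show how a predicted conversion rate turns a CPA or ROAS goal into a max CPC, which is conceptually what Smart Bidding does for every auction.

    var predictedConvRate = 0.04; // assumed: predicted chance this click converts
    var avgOrderValue = 120;      // assumed: expected conversion value of a sale

    // Target CPA: pay at most targetCpa per conversion.
    var targetCpa = 50;
    var maxCpcForCpa = targetCpa * predictedConvRate;                     // 2.00

    // Target ROAS: require conversion value / cost >= targetRoas.
    var targetRoas = 4; // i.e. 400%
    var maxCpcForRoas = (predictedConvRate * avgOrderValue) / targetRoas; // 1.20

    console.log(maxCpcForCpa.toFixed(2), maxCpcForRoas.toFixed(2));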

That said, remember that Smart Bidding is not the same as automated bidding. Part of the bid management process is automated, but there’s still work for humans to do. Things like setting goals and ensuring measurement is working are just two examples of these tasks.

Smart bidding doesn’t mean the entire bid management process is automated. Account managers still need to control dials for seasonality, conversion types, and fluctuating margins. These well-defined processes are great things to automate so they can be layered on Google’s Smart Bidding automation.

Besides dialing in adjustments for seasonality and special promotions, and figuring out how to connect these limited controls to business goals like acquiring new customers, driving store visits, or driving higher repeat sales, there’s still the point that most companies care about profits. Despite what we may think after hearing of Uber’s $1 billion quarterly loss, the reality is that most companies don’t have hoards of cash from VCs and a recent IPO, so profit is what helps these businesses grow. Curiously, Google Ads doesn’t really have a Smart Bidding strategy geared towards profits.

So it’s up to the human PPC pro to bridge that gap and perhaps add some automation layering. One way to drive towards profitable PPC is to take margins into account when setting ROAS goals.

More profitable items (the ones with higher margins) can have lower ROAS targets. Remember, ROAS in Google is “conv value/cost” (i.e., conversion value divided by ad costs). Assuming the conversion value is the cart value of the sale, for an item with a better margin, more of that cart value is product markup. So a lower ROAS can still deliver a profit, whereas for items with low margins, less of the cart value is markup and hence a higher ROAS is needed to break even.
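
A quick worked example of that relationship, with assumed margins rather than real account data: the break-even ROAS is simply the inverse of the margin.

    // profit = conversionValue * margin - adCost >= 0
    // implies conversionValue / adCost >= 1 / margin, i.e. break-even ROAS = 1 / margin.
    function breakEvenRoas(margin) {
      return 1 / margin;
    }

    console.log(breakEvenRoas(0.50)); // 2  -> a 200% ROAS target still breaks even
    console.log(breakEvenRoas(0.10)); // 10 -> a low-margin item needs 1000% ROAS just to break even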

PPC pros could manually assign different products to different smart shopping campaigns with different ROAS targets, but that would be tedious and time-consuming, especially if the margins of existing products were to change due to promotions and sales events. A smarter solution is to apply automation layering and use a tool or script that automatically sends products to the right smart shopping campaigns, where Google’s automations can take over.

Conclusion

The engines are automating many things we used to control by doing them manually: finding new keywords, setting better bids, writing ads. But when the people behind the businesses that advertise on Google get a say, results can be better than if the engine’s automation runs entirely on its own.

Just like Google is adding automations, so should you. Use the concept of automation layering to your advantage to retain the level of control you’re used to while also saving time by letting the machines do your work.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Frederick (“Fred”) Vallaeys was one of the first 500 employees at Google where he spent 10 years building AdWords and teaching advertisers how to get the most out of it as the Google AdWords Evangelist.
Today he is the Cofounder of Optmyzr, an AdWords tool company focused on unique data insights, One-Click Optimizations™, advanced reporting to make account management more efficient, and Enhanced Scripts™ for AdWords. He stays up-to-speed with best practices through his work with SalesX, a search marketing agency focused on turning clicks into revenue. He is a frequent guest speaker at events where he inspires organizations to be more innovative and become better online marketers.


How to use XPath expressions to enhance your SEO and content strategy

30-second summary:

  • As Google increasingly favors sites with content that exudes expertise, authority, and trustworthiness (E-A-T), it is imperative that SEOs and marketers produce content that is not just well written, but that also demonstrates expertise.
  • How do you understand what topics and concerns matter most to your customer base?
  • Can you use Q&As to inform content strategies?
  • XPath notations can be your treasure trove.
  • Catalyst’s Organic Search Manager, Brad McCourt shares a detailed guide on using XPath notations and your favorite crawler to quickly obtain the Q&As in a straightforward and digestible format.

As Google increasingly favors sites with content that exudes expertise, authority, and trustworthiness (E-A-T), it is imperative that SEOs and marketers produce content that is not just well written, but that also demonstrates expertise. One way to demonstrate expertise on a subject or product is to answer common customer questions directly in your content.

But how do you identify what those questions are? How do you understand what topics and concerns matter most?

The good news is that they are hiding in plain sight. Chances are, your consumers have been shouting at the top of their keyboards in the Q&A sections of sites like Amazon.

XPath and how to find customer questions and preferences

These sections are a treasure trove of (mostly) serious questions that real customers have about the products you are selling.

How do you use these Q&As to inform content strategies? XPath notation is your answer.

You can use XPath notations and your favorite crawler to quickly obtain the Q&As in a straightforward and digestible format. XPath spares you from clicking through endless screens of questions by automating the collection of important insights for your content strategy.

What is XPath?

XML Path Language (XPath) is a query language developed by the W3C to navigate XML documents and select specific nodes of data.

The notation XPath uses is called “expressions”. Using these expressions, you can effectively pull any data that you need from a website as long as there is a consistent structure between webpages.

This means you can use this language to pull any publicly available data in the source code, including questions from a selection of Amazon Q&A pages.

This article is not meant to be a comprehensive tutorial on XPath. For that, there are plenty of resources from the W3C. However, XPath is easy enough to learn if you only know the structure of XML and HTML documents. This is what makes it such a powerful tool for SEOs, regardless of coding prowess.

Let’s walk through an example to show you how…

Using XPath to pull customer questions from Amazon

Pre-req: Pick your web crawler

While most of the big names in web crawling – Botify, DeepCrawl, OnCrawl – all offer the ability to extract data from the source code, I will be using ScreamingFrog in the example below.

ScreamingFrog is by far the most cost-effective option, allowing you to crawl up to 500 URLs without buying a license. For larger projects you can buy a license. This will allow you to crawl as many URLs as your RAM can handle.

Step one: Collect the URLs to crawl

For our example, let’s pretend we’re doing research on the topics we should include in our product pages and listings for microspikes. For those unaware, microspikes are an accessory for your boots or shoes. They give you extra grip in wintry conditions, so they are particularly popular among cold-weather hikers and runners.

[Image: Example for finding details using Amazon. Source: https://www.amazon.com/s?k=microspikes]

Here we have a list of 13 question-and-answer pages for the top microspike listings on Amazon.com. Unfortunately, there is some manual work required to create the list.

[Image: List of questions – XPath and creating content]

The easiest way is to search for the topic (that is, microspikes) and pull links to the top products listed. If you have the product’s ASIN (Amazon Standard Identification Number) handy, you can also generate the URLs using the above format, but switching out the ASIN.

Step two: Determine the XPath

From here, we need to determine the XPath.

In order to figure out the proper XPath notation to use to pull in the desired text, we have two main options:

  1. View the source code and determine the XPath yourself
  2. View the rendered source code and copy the XPath directly from Chrome’s Inspect Element tool


You’ll find that the expression needed to locate all questions in an Amazon Q&A page is:

//span[@class="a-declarative"]

Here is the XPath notation broken down:

  • // is used to locate all instances of the expression that follows.
  • span is the specific tag we’re trying to locate. //span will locate every single <span> tag in the source code. There are over 300 of these, so we’ll need to be more specific.
  • @class narrows things down: //span[@class] locates only <span> tags that have a class attribute assigned.
  • @class="a-declarative" dictates that //span[@class="a-declarative"] only locates <span> tags where the class attribute is set to "a-declarative" – that is, <span class="a-declarative">

There is an extra step in order to return the inner text of the specified tag that is located, but ScreamingFrog does the heavy lifting for us.

It’s important to note that this will only work for Amazon Question and Answer pages. If you wanted to pull questions from, say, Quora, TripAdvisor, or any other site, the expression would have to be adjusted to locate the specific entity you desire to collect on a crawl.
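
Before committing to a full crawl, you can also sanity-check an expression directly in the browser. Here is a small sketch for Chrome’s DevTools console using the standard document.evaluate() XPath API; it simply previews what a crawler configured with the same expression would extract.

    // Run in the DevTools console on an Amazon Q&A page.
    var xpath = '//span[@class="a-declarative"]';
    var result = document.evaluate(xpath, document, null,
                                   XPathResult.ORDERED_NODE_SNAPSHOT_TYPE, null);
    var texts = [];
    for (var i = 0; i < result.snapshotLength; i++) {
      texts.push(result.snapshotItem(i).textContent.trim());
    }
    console.log(texts.length + ' matches');
    console.log(texts.slice(0, 10)); // preview the first few extracted strings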

Step three: Configure your crawler

Once you have this all set, you can then go into ScreamingFrog.

Configuration -> Custom -> Extraction


This will then take you to the Custom Extraction screen.


This is where you can:

  • Give the extraction a name to make it easier to find after the crawl, especially if you’re extracting more than one entity. ScreamingFrog allows you to extract multiple entities during a single crawl.
  • Choose the extraction method. In this article, it is all about XPath, but you also have the option of extracting data via CSSPath or regex notation.
  • Place the desired XPath expression in the “Enter XPath” field. ScreamingFrog will even check your syntax for you, providing a green checkmark if everything checks out.
  • Select what you want extracted, be it the full HTML element or the HTML found within the located tag. For our example, we want to extract the text between any <span> tags with a class attribute set to "a-declarative," so we select “extract text.”

We can then click OK.

Step four: Crawl the desired URLs

Now it’s time to crawl our list of Amazon Q&A pages for microspikes.

First, we’ll need to switch the Mode in ScreamingFrog from “Spider” to “List.”

Then, we can either add our set of URLs manually or upload them from an Excel or other supported format.

After we confirm the list, ScreamingFrog will crawl each URL we provided, extracting the text between all <span> tags containing the class attribute set to “a-declarative.”

In order to see the data collected, you just need to select “Custom Extraction” in ScreamingFrog.


At first glance, the output might not look that exciting.

However, this is only because a lot of unneeded space is included with the data, so you might see some columns that appear blank if they are not expanded to fully display the contents.

Once you copy and paste the data into Excel or your spreadsheet program of choice, you can finally see the data that has been extracted. After some clean-up, you get the final result:

[Image: Final list of questions created using XPath]

The result is 118 questions that real customers have asked about microspikes in an easily accessible format. With this data at your fingertips, you’re now ready to incorporate this research into your content strategy.

Content strategies

Before diving into content strategies, a quick word to the wise: you can’t just crawl, scrape and publish content from another site, even if it is publicly accessible.

First, that would be plagiarism, and you should expect to be hit with a DMCA notice. Second, you’re not fooling Google. Google knows the original source of the content, and it is extremely unlikely your content is going to rank well – defeating the purpose of this entire strategy.

Instead, this data can be used to inform your strategy and help you produce high quality, unique content that users are searching for.

Now, how do you get started with your analysis?

I recommend first categorizing the questions; a quick scripted way to do that bucketing is sketched after the list below. For our example, there were many questions about:

  • Sizing: What size microspikes are needed for specific shoe/boot sizes?
  • Proper use: Can microspikes be used in stores, on slippery roofs, while fishing, mowing lawns, or for walking on plaster?
  • Features: Are they adjustable? What material are they made of? Do they come with a carrying case?
  • Concerns: Are they comfortable? Do they damage your footwear or the flooring/ground you’re on? How durable are they?
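
A minimal sketch of that bucketing, if you’d rather not sort 118 questions by hand: the category names come from the list above, but every keyword in the lookup table is an assumption you would tune to the questions you actually exported.

    // Hypothetical keyword buckets -- adjust to the themes in your own export.
    var categories = {
      'Sizing':     ['size', 'fit', 'boot', 'shoe'],
      'Proper use': ['use on', 'walk on', 'roof', 'fishing', 'pavement'],
      'Features':   ['adjustable', 'material', 'carrying case'],
      'Concerns':   ['comfortable', 'damage', 'durable', 'last']
    };

    function categorize(question) {
      var q = question.toLowerCase();
      for (var name in categories) {
        if (categories[name].some(function (term) { return q.indexOf(term) !== -1; })) {
          return name;
        }
      }
      return 'Uncategorized';
    }

    // questions = the column copied out of ScreamingFrog's custom extraction tab
    // questions.forEach(function (q) { console.log(categorize(q) + ' | ' + q); });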

This is an amazing insight into the potential concerns customers might have before purchasing microspikes.

From here, you can use this information to:

1. Enhance existing content on your product and category pages

Incorporate the topics into the product or category descriptions, pre-emptively answering questions shoppers might have.

For our example, we might want to make it abundantly clear how sizing works – including a sizing chart and specifically mentioning types of footwear the product may or may not be compatible with.

2. Build out a short on-page FAQ section featuring original content, answering commonly asked questions

Make sure to implement FAQPage Schema.org markup for a better chance to appear for listings like People Also Ask sections, which are increasingly taking up real estate in the search results.

For our example, we can answer commonly asked questions about comfort, damage to footwear, durability, and adjustability. We could also address if the product comes with a carrying case and how to best store the product for travel.

3. Produce a product guide, incorporating answers to popular questions surrounding a product or category

Another strategy is to produce an extensive one-stop product guide showcasing specific use cases, sizing, limitations, and features. For our example, we could create specific content for each use case like hiking, running in icy conditions, and more.

Even better, incorporate videos, images, charts, and featured products with a clear path to purchase.

Using this approach, your end product will be content that demonstrates expertise and authority on a subject and, most importantly, addresses customer concerns and questions before they even think to ask. This will help prevent your customers from having to do additional research or contact customer service. Thanks to your informative and helpful content, they will be more ready to make a purchase.

Furthermore, this approach also has the potential to lower product return rates. Informed customers are less likely to purchase the wrong product based upon assumed or incomplete knowledge.

Conclusion

Amazon is just the tip of the iceberg here. You can realistically apply this strategy to any site that has publicly accessible data to extract, be that questions from Quora about a product category, Trip Advisor reviews about hotels, music venues, and attractions, or even discussions on Reddit.

The more informed you are about what your customers are expecting when visiting your site, the better you can serve those expectations, motivate purchases, decrease bounces, and improve organic search performance.

Brad McCourt is an Organic Search Manager at Catalyst’s Boston office. 




Five must-knows for advertisers and marketers

30-second summary:

  • 2020 set the stage for one of the most disruptive and fluid years search has ever seen.
  • Local search and Google My Business (GMB) are set to be key focal areas for search advertisers and marketers amid shifts in COVID-era search activity.
  • Google continues to make moves at further integrating ecommerce into search.
  • Manual Text Ads look to be on shaky ground as we move into 2021.
  • Ashley Fletcher, VP of Marketing at Adthena, shares five must-knows for search in 2021.

From algorithm changes to shifts in search activity as a result of COVID-19, 2020 was one of the most disruptive years that the search industry has ever seen. And although positive movements have been made in helping to rein in COVID-19, a “return to normal” still seems a long way off. However, with the COVID-19 vaccine raising the possibility that “non-COVID era” search habits may return, search professionals are hard at work trying to determine which industry changes are here to stay, and which may fade away, as the world begins to get long-overdue COVID relief. This means the landscape of search in 2021 is likely to see just as unpredictable an evolution as it did in 2020.

With that in mind, here are five key areas search advertisers and marketers should pay close attention to as we move into, and through, 2021.

Doubling down on GMB and local search

Remember when Google My Business (GMB) was just a helpful little tool for search advertising and marketing? Those days are now behind us.

Accounting for 33% of how local businesses are ranked, GMB is now a huge factor when it comes to SEO. Moreover, as local becomes a bigger part of the search environment, with more users opting to stay close to home as a result of the COVID-19 pandemic, advertisers and marketers need to optimize their search strategies appropriately and stay abreast of any enhancements to GMB.

Greater consideration for voice search

With 157 million Amazon Echos in homes around the US at the start of 2020, voice search is poised to continue being a massive player in search moving forward. Given how easy it is to use, and the fact that more smart speakers are set to be purchased in the years to come, voice search is likely to go from a secondary search channel to a primary one in short order. Therefore, with this new avenue opening up and PPC having to be rethought as a result, advertisers should begin thinking about how to shift their optimization from traditional keyword search logic to spoken, phrase-centric queries.

Direct buy on Google? Amazon beware

E-commerce is set to be one of the most intriguing areas of search in 2021. For years, Google has been signaling that shopping and e-commerce are key focal areas for its platform, and through the rollout of features such as Smart Shopping — among other things — Google has never been in a better position to drive sales directly from its SERPs. This means that not only should Amazon be on high alert, but traditional retail search advertisers need to seriously consider their search strategies in the year ahead.

The end of the text ad?

Could 2021 be the end of the road for text ads? This has been the question on search pros’ minds, particularly since Google briefly scrapped the ability to create text ads in October — not to mention when the ability to create ETAs (expanded text ads) disappeared from Google Ads dropdown menus on a smaller scale in August. Plus, given the added emphasis being placed on Smart Bidding, it seems that manual text ads could have a limited lifespan at best, and 2021 could be the year we see this search staple wound down entirely.

Being OK with uncertainty

Search advertisers are used to adapting to continuously evolving circumstances. But 2021 could push the term “evolution” to an extreme. From better understanding search patterns during the COVID era to figuring out which trends are here to stay and which are just passing fads, 2021 is going to be a very hard year for search professionals to get their heads around — let alone always get right. With that in mind, it has never been more important for search professionals to lean into both technology and teamwork to make sense of what lies ahead. Moreover, search professionals need to move into 2021 with a whole new perspective on flexibility. Simply put, search advertising is set to chart completely foreign waters in 2021, and by embracing the fact that uncertainty is the new normal, search professionals will likely have a much easier time adapting to these new circumstances.

Closing note

While 2020 presented the search industry with an unprecedented amount of uncertainty, 2021 could represent a period of even greater unpredictability as several foundational changes seem to be set to take place within the space. However, by keeping an eye on these emerging areas and game planning now, search advertisers and marketers will not only be able to avoid potential headaches and growing pains but be able to put themselves in a position to drive success as soon as possible.

Ashley Fletcher is VP of Marketing at Adthena.




28 Ways to supercharge your site

30-second summary:

  • Google plans to roll out the new Core Web Vitals update in early 2021.
  • The overall size, dimensions, load order, and format of your images will drastically affect your PageSpeed score.
  • Loading critical CSS and JS inline can improve the perceived load time of your site.
  • Above-the-fold videos and large background images can be particularly damaging to your Largest Contentful Paint time.
  • A server upgrade and a CDN can improve your server response time and your contentful paint score.
  • Founder of Content Powered, James Parsons, shares an exhaustive list of 28 elements that will supercharge your site for Google’s Core Web Vitals update and Google PageSpeed Insights.

Announced in early 2020, the Core Web Vitals are a set of metrics Google is developing and plans to roll into their overall search algorithm in May of 2021.  Given that it’s almost 2021 now, anyone who wants to get ahead on optimizing their site for this new algorithm update can get to work now.  Thankfully, Google has been very good about publicly disclosing what these new metrics are and how they work.

Armed with that information, it’s possible to build a checklist of action items to check and optimize on your site to ready yourself for the inevitable rollout of these new ranking factors.  Here are 28 such items for that checklist.

A. Image optimization

Images are one of the largest influencing factors in the core web vitals.  All of the web vitals measure the time until some initial rendering, and loading images is the largest source of delay before a page is initially fully loaded.  Thus, optimizing images tends to be the most powerful tool for improving core web vitals.

1. Reduce the Dimensions of Background Images

Background images are rarely fully necessary to a site design and can be a large source of delay in loading a page for the first time.

If you use a background image, reduce how large that image is and optimize it so it loads as close to instantaneously as possible.

2. Minimize or Replace Background Images with Patterns

If you’re not tied to a specific background image, either replace the image with flat colors, a gradient, or even a simple tiled pattern.  Again, the goal is to minimize how many assets need to load before the initial load of the website is complete.  Since background images don’t make a huge impact (and are even less necessary on mobile), minimize or remove them as much as possible.

3. Remove Images on Mobile Above the Fold

Speaking of mobile, the mobile browsing experience is often slower than desktop browsing due to the quality of cell and wireless signals.  Mobile devices are especially susceptible to first input delays and content shifts.


To help avoid that, strive to make as much of your above-the-fold content as possible based on text and other simple elements.  Large images and slideshows above the fold are particularly rough on your score, so remove or move them as much as possible.

4. Implement Lazy Loading

Lazy loading is a common technique for speeding up the initial load of any given page.  With Google’s new metrics on the horizon, it’s no surprise that support for it is quickly becoming a default feature.  WordPress, for example, added native default lazy loading in version 5.5 earlier this year.  Make use of lazy loading for any content, particularly images, that doesn’t need to load above the fold initially.
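
Native lazy loading now covers most image tags, but where you need to control it yourself (older browsers, background images, or custom markup), an IntersectionObserver sketch along these lines is the usual pattern. It assumes images carry a data-src attribute holding the real URL, which is an assumption about your markup, not something from the article.

    // Swap in the real src only when the image approaches the viewport.
    var lazyImages = document.querySelectorAll('img[data-src]');

    var observer = new IntersectionObserver(function (entries, obs) {
      entries.forEach(function (entry) {
        if (!entry.isIntersecting) return;
        var img = entry.target;
        img.src = img.dataset.src;       // trigger the real download
        img.removeAttribute('data-src');
        obs.unobserve(img);
      });
    }, { rootMargin: '200px' });         // start loading shortly before the image scrolls into view

    lazyImages.forEach(function (img) { observer.observe(img); });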

5. Use WebP Images

Another Google initiative, WebP is a new image format developed back in 2010.  It’s a smaller image format with better compression algorithms than your traditional image formats like PNG.

While it hasn’t really picked up widespread traction until recently, it’s becoming more and more valuable as both users and search engines are increasingly concerned with speed and load times.  Support is widespread, even if usage isn’t, so you can more-or-less safely use WebP images as your primary image files.

6. Optimize Image File Sizes

Using a tool to crunch or smush image files to be smaller in file size should be a default part of optimizing images for the web by this point.


If you don’t do it already, make sure you implement a way to process images as part of your blogging workflow moving forward. You’ll also want to make sure you’ve defined the height and width of images to prevent layout shift.
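
A quick way to check that last point is a one-off audit in the browser console; this small sketch simply lists images that are missing explicit width/height attributes and can therefore contribute to layout shift.

    // List images without explicit dimensions (a common cause of Cumulative Layout Shift).
    var undimensioned = Array.prototype.filter.call(
      document.querySelectorAll('img'),
      function (img) { return !img.getAttribute('width') || !img.getAttribute('height'); }
    );

    console.log(undimensioned.length + ' images are missing width/height attributes');
    undimensioned.forEach(function (img) { console.log(img.currentSrc || img.src); });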

B. CSS optimization

CSS has become an increasingly critical part of many site designs, so much so that blocking it makes the web almost unrecognizable.  With so much of a site reliant on CSS for everything from colors to positioning, making sure your code is optimized is more important than ever.

7. Inline Critical CSS

You don’t need to inline every bit of your CSS, though that works as well.  In particular, you want to inline CSS that is critical to the overall design and layout of your theme.


This minimizes the number of individual files a browser needs to call from your server just to load the initial layout and paint the initial content on your site.

8. Minify CSS

CSS is by default a very minimalist language and can operate perfectly well without spaces, indentation, comments, and other text that makes it more user-friendly and easier to develop.  Before uploading new code to your site, run it through a tool to minify it and remove all of that excess cruft that has a microscopic-yet-tangible effect on page loading.

9. Consolidate CSS Files and Code

It can be tempting to store CSS in a variety of files scattered throughout your code, placing it where it seems like it should be rather than where it makes sense to put it.  Remember: what is easiest for a developer is not necessarily the fastest for a user.  Consolidate your CSS, whether it’s inline or in separate files, and only execute specific elements as necessary.

10. Optimize CSS Delivery

CSS is often a late-loading element of site code.  Traditional site design loads the framework for the site, then the content, then the CSS to format it all.  Particularly when CSS is stored in an external file, this delays loading significantly.  Preloading your CSS is a strategy recommended by Google to force the browser to load the CSS and have it ready when it’s needed.

C. JavaScript optimization

JavaScript is one of the biggest sources of code bloat and delay in loading websites. Optimizing your site’s JS can help speed it up tremendously, even when it doesn’t seem like it would have much of an effect based on what you’re doing to it.

11. Minify JS Scripts

Like CSS, JavaScript doesn’t need extraneous spaces and breaks to function.  It also doesn’t need verbose variable names, which are useful for development but can increase the size of scripts by a significant amount.

Run your scripts through a minifier before adding them to your site.


12. Consolidate Scripts and Minimize Usage

Many of the purposes web designers use JavaScript for have been available as features in HTML5 and CSS3 for years now.  Particularly in older websites, a revamp or review of scripts can find alternative, faster ways to do the same things.  Review and optimize, minimize, consolidate, and strip as much JavaScript as you can from your site.

13. Defer or Async Scripts Whenever Possible

Scripts are roadblocks in rendering a website.  When a browser encounters a JS script, it has to process that script before it can continue loading the page.  Since many developers put scripts in their headers, this delays page loading significantly.  Using defer allows the browser to continue loading the page before executing the script, while async allows the script and the page to load simultaneously.  Using these two attributes lets you offset the delay inherent in using scripts and speed up your initial page loads.
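
A small console sketch to find candidates: it flags external scripts referenced without either attribute, which are the ones that block rendering while they load and run (whether each one is safe to defer is still a judgment call).

    // External scripts with neither defer nor async block HTML parsing while they load and run.
    var blocking = document.querySelectorAll('script[src]:not([defer]):not([async])');

    console.log(blocking.length + ' potentially render-blocking scripts');
    blocking.forEach(function (s) { console.log(s.src); });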

14. Remove jQuery Migrate

A recent update to jQuery has led to a lot of old plugins and scripts no longer working.  To buy time and allow webmasters to update their sites, the Migrate module was introduced.  This is essentially a translation module that allows old jQuery to function on sites that utilize a newer version of jQuery.


Perform an audit of your site to see if anything you’re using – particularly old plugins and apps – uses jQuery Migrate.  If so, consider updating or replacing those plugins.  Your goal is to remove usage of the Migrate module entirely, because it’s rather bulky and can slow down websites dramatically.

15. Use Google Hosted JS Whenever Possible

Google offers a range of standard libraries hosted on their servers for use on your website.  Rather than relying on a third party for those libraries or hosting them yourself, use Google’s versions for the fastest possible load times.

D. Video optimization

Videos are increasingly popular as part of the average website, from core elements of content to video-based advertising and everything in between.  They’re also extremely large files, even with partial loading and modern video buffering.  Optimize your use of video as much as possible.

16. Use Image Placeholders for Video Thumbnails

There are plenty of users who browse the web with no desire to watch videos, so forcing videos to load in the background for them is completely unnecessary.  A good workaround is to use an image placeholder where the video would normally load.


The image loads faster and looks like the video player with a loaded thumbnail.  When a user clicks it to start the video, it begins the video load but doesn’t require loading any of the video file or player until that point.

17. Minimize Videos Above the Fold

As with images, video files are extremely heavy, so loading them above the fold is a guaranteed delay on your first content paint.  Push them below the fold; most people want to read a title and introduction before they get to the video anyway.

E. Font and icon optimization

Fonts and icon usage can be a lot heavier on a site’s load times than you might expect. Optimizing them might seem like minuscule detail work, but when you see the impact it can have, you’ll wonder why you never made these minor-yet-impactful optimizations before.

18. Preload Fonts

Similar to scripts, when your website calls for a font that it needs to load, loading that font takes precedence and stops the rest of the code from rendering.

Using a preload command to load the font earlier than necessary helps speed up page loading, as well as preventing the “flash of unstyled text” effect that happens for a brief instant between the text loading and the font styling appearing.


19. Only Use Fonts You Need

Many web fonts and font families load their entire character sets and stylesheets when called, even if your page doesn’t utilize 90% of that content.  Often, you can limit how much you load, though you may need to pay for premium font access.  It can be quite worthwhile if you’re using limited amounts of a given font, or a font that has a particularly large character set included.

20. Use SVG Whenever Possible

SVGs are Scalable Vector Graphics and are a way to create extremely small elements of a page that can nevertheless scale indefinitely, as well as be manipulated individually, to a much greater degree than traditional fonts and icons.  If possible, switch to using SVGs instead of your usual icons.

F. Server optimization

No matter how many optimizations you make to the code of your website, to your images, or to other elements of your site, none of it matters if your server is slow.  The proliferation of web hosting companies and the ongoing development of faster, stronger tech mean that web hosting shows its age very quickly.  Every few years, it can be worthwhile to change or upgrade hosting to faster infrastructure.

21. Upgrade to a Faster Server

You don’t necessarily need to upgrade from a shared host to a dedicated host, though this can help with some of the speed issues inherent in shared hosting.  Even simply upgrading from a slower package to a faster one can be a good use of a budget.

22. Use a CDN

Modern content delivery networks can handle most of the elements of your site faster than your typical web host can in almost every circumstance.  At a minimum, consider using a CDN for your images, videos, and other multimedia.  You can also consider offloading stand-alone script files as well.

23. Preload DNS Queries

Preloading or prefetching DNS queries helps minimize the delay between an asset being requested by the visitor and the display of that asset.

This couples with using a CDN to store assets by loading and resolving the CDN’s domain before it’s called for the first time, further speeding up page load times.


24. Preload Your Cache

Often, a cache plugin or script used on a website triggers when the first visitor arrives to view the page.  That first visitor has a slower experience, but their loads cache the page for future visitors until the cache expires.  Unfortunately, the first visit is often a Google bot crawling your page from your XML sitemap or an internal link, and that means that Google is the first one to experience the slow version of your site.  You can get around this by preloading the cache on your website so Google’s next visit is a guaranteed fast-loading web page.
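
Here is what a cache-warming step could look like as a small Node script (Node 18+ for the built-in fetch): it reads your XML sitemap and requests each URL once so the cached copy exists before the next bot or visitor arrives. The sitemap URL is a placeholder assumption.

    // Warm the page cache by requesting every URL listed in the XML sitemap.
    var SITEMAP = 'https://example.com/sitemap.xml'; // placeholder -- use your own sitemap URL

    async function warmCache() {
      var xml = await (await fetch(SITEMAP)).text();
      var urls = Array.from(xml.matchAll(/<loc>(.*?)<\/loc>/g), function (m) { return m[1]; });

      for (var url of urls) {
        var res = await fetch(url); // the first request populates the cache for later visitors
        console.log(res.status + ' ' + url);
      }
    }

    warmCache();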

25. Consider a Server-Side Cache

Software such as Varnish Cache acts as a server-side cache to further speed up the generation and serving of a cached version of your page, making it as fast as possible with as few server calls as possible.

G. Additional optimization

Anything that didn’t fit in another category has been added here.  These additional optimizations might not apply to your site design, but if they do, taking care of them can be a great boon.

26. Minimize Third-Party Scripts

Webmasters in 2021 will need to strike a balance between site speed optimizations and user engagement tools.

Many plugins, such as social sharing buttons, third-party comment systems, and media embeds all need to execute third-party scripts in order to work, but those scripts slow down the site.  Minimize them as much as possible, and try to find the fastest versions of each.


27. Avoid Pre-Load Filler

A common technique for sites with slower load times is to add a spinner, a loading icon, an animation, or another form of content that loads and displays to indicate to a user that the site is, in fact, loading.  While this can help minimize bounces, it’s a huge hit to the initial loads measured by the core web vitals.  Remove these and work to speed up your site such that you don’t need them.

28. Consider a Site Redesign

When all is said and done, sometimes you need to make so many changes to so many foundational elements of your site that it’s easier to simply scrap your current design and engineer a new one with speed in mind.  Consider it a possibility, and analyze the benefits you’ll get from optimized core web vitals.  No one knows yet how influential those metrics will be on the overall algorithm, but it certainly can’t hurt to optimize for them.

James Parsons is the founder of Content Powered, a blog management & content marketing company. He’s worked as a senior-level content marketer for over a decade and writes for Forbes, Entrepreneur, Inc, and Business Insider.




