Why New Pages Fluctuate in Google SERPs

In May 2018, Matt Southern published an article on Search Engine Journal about ranking fluctuations for new content. John Mueller confirmed that new content bounces around in the search results before finally settling down. What was left unspoken was why new content fluctuates. If Google’s index updates continuously, why is there fluctuation?

I saw a post on a private Facebook Group where someone asked why new pages fluctuated.

Possible reasons that were offered were:

  1. User metrics
  2. A temporary PageRank score that is later updated
  3. Clicks on the listing to test how relevant the page was

Those are great suggestions for why rankings fluctuate for new pages.

There is a patent filed by Google that’s titled, Modifying Search Result Ranking Based on Implicit User Feedback.

It describes monitoring clicks to a web page as well as when a web page is not clicked in the search engine results pages (SERPs), in order to understand if a web page appeals to users. It also discusses using how long a user stays on a site before returning.

Here’s what it says:

“…identifies user selections (clicks) of individual document results and also identifies when the user returns to the results page, thus indicating the amount of time the user spent viewing the selected document result.”

That may contribute to ranking fluctuations. But it’s not a patent about testing new web pages that are added to Google’s search index.
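Still, to make the click-and-dwell-time mechanism in that quote concrete, here is a minimal sketch of how such a signal could be computed. This is not Google’s code; the 30-second “short click” threshold and the scoring are invented purely for illustration.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

# Invented threshold: returning to the results page within 30 seconds is
# treated as a "short click", i.e. the page probably didn't satisfy the searcher.
SHORT_CLICK_SECONDS = 30

@dataclass
class ClickEvent:
    url: str
    clicked_at: float             # when the result was clicked (epoch seconds)
    returned_at: Optional[float]  # when the user came back to the SERP (None = never)

def long_click_ratio(events: List[ClickEvent]) -> Dict[str, float]:
    """Share of clicks per URL where the user stayed past the threshold
    (or never returned) -- a crude implicit-feedback quality signal."""
    totals = {}
    long_clicks = {}
    for event in events:
        totals[event.url] = totals.get(event.url, 0) + 1
        dwell = None if event.returned_at is None else event.returned_at - event.clicked_at
        if dwell is None or dwell >= SHORT_CLICK_SECONDS:
            long_clicks[event.url] = long_clicks.get(event.url, 0) + 1
    return {url: long_clicks.get(url, 0) / count for url, count in totals.items()}

# Usage: a result that keeps getting abandoned quickly earns a low ratio.
events = [
    ClickEvent("example.com/a", clicked_at=0, returned_at=5),
    ClickEvent("example.com/a", clicked_at=60, returned_at=70),
    ClickEvent("example.com/b", clicked_at=0, returned_at=None),
]
print(long_click_ratio(events))   # {'example.com/a': 0.0, 'example.com/b': 1.0}
```

The patent describes far more nuance than this toy captures, but the shape of the signal is the same: selections and return times, aggregated per result.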

Most, if not all, of the papers involving CTR are about improving search results and predicting click-through rates for advertising.

Google’s Indexing System

I believe the answer has to do with how Google stores and updates data. Google’s index has updated continuously since the Caffeine update. The Caffeine update was really about something called Google Instant. In 2010, the software underlying the storage of Google’s data was called BigTable, and the distributed file system that replaced the Google File System was called Colossus.

When talking about indexing, we’re talking about Caffeine and the underlying Percolator system. Here is a Google PDF document detailing how Percolator works; it also discusses the trade-offs between the old batch system, which used MapReduce on the Google File System, and the new distributed system that runs computations in parallel.

An important distinction is that Percolator carries an overhead roughly 30 times greater than traditional database architectures. The trade-off is that it scales linearly: as long as Google can keep throwing hardware at it, the system can keep on scaling.

Here is what the Google document from 2010 says about that overhead:

“We chose an architecture that scales linearly over many orders of magnitude on commodity machines, but we’ve seen that this costs a significant 30-fold overhead compared to traditional database architectures. …how much is fundamental to distributed storage systems, and how much can be optimized away?”

This is a notable section from the Google document about Percolator:

“Percolator applications are structured as a series of observers; each observer completes a task and creates more work for “downstream” observers by writing to the table.

In our indexing system, a MapReduce loads crawled documents into Percolator by running loader transactions, which trigger the document processor transaction to index the document (parse, extract links, etc.). The document processor transaction triggers further transactions like clustering. The clustering transaction, in turn, triggers transactions to export changed document clusters to the serving system.

Percolator applications consist of very few observers — the Google indexing system has roughly 10 observers. Each observer is explicitly constructed in the main() of the worker binary, so it is clear what observers are active.

…at most one observer’s transaction will commit for each change of an observed column. The converse is not true, however: multiple writes to an observed column may cause the corresponding observer to be invoked only once. We call this feature message collapsing, since it helps avoid computation by amortizing the cost of responding to many notifications. For example, it is sufficient for http://google.com to be reprocessed periodically rather than every time we discover a new link pointing to it.”

That’s kind of interesting, isn’t it? It’s talking about a periodic reprocessing rather than processing a page or domain every time a link is discovered.

Could that contribute to the latency in ranking changes, where something is updated but a change in ranking shows up later?
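Message collapsing is easier to picture with a toy example. The sketch below only illustrates the idea described in the quote; the queue, the observer names, and the pipeline order are stand-ins I made up, not Percolator’s actual API. The point is that many notifications for the same document collapse into one pending reprocessing task.

```python
from collections import deque

class CollapsingQueue:
    """Collapses repeated notifications for the same document into one pending
    entry: discovering ten new links to a page queues one reprocessing, not ten."""
    def __init__(self):
        self._pending = set()
        self._order = deque()

    def notify(self, doc_id):
        if doc_id not in self._pending:   # duplicate notifications are dropped
            self._pending.add(doc_id)
            self._order.append(doc_id)

    def drain(self, observer):
        while self._order:
            doc_id = self._order.popleft()
            self._pending.discard(doc_id)
            observer(doc_id)              # the observer may create downstream work

# Toy observers chained like the pipeline in the quote:
# document processor -> clustering -> export to the serving system.
cluster_queue = CollapsingQueue()
export_queue = CollapsingQueue()

def document_processor(doc_id):
    print(f"parse and extract links for {doc_id}")
    cluster_queue.notify(doc_id)

def clustering(doc_id):
    print(f"recluster {doc_id}")
    export_queue.notify(doc_id)

process_queue = CollapsingQueue()
for _ in range(3):                               # three newly discovered links...
    process_queue.notify("http://google.com")    # ...collapse into one pending task

process_queue.drain(document_processor)
cluster_queue.drain(clustering)
export_queue.drain(lambda doc_id: print(f"export {doc_id} to serving"))
```

The practical consequence is exactly what the quote spells out: a page gets reprocessed periodically rather than once per new link, which already builds some delay into how quickly changes show up.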

Here is another Google document about Google’s storage systems (PDF).

The main points described in that document are:

  1. Cluster-level structured storage
  2. Exports a distributed, sparse, sorted map (sketched below)
  3. Splits and rebalances data based on size and load
  4. Asynchronous, eventually-consistent replication
  5. Uses GFS or Colossus for file storage
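To picture what a “sparse, sorted map” that “splits based on size” looks like, here is a deliberately tiny sketch. The row keys, the four-row split threshold, and the split-in-half policy are all invented for illustration; the real tablets also weigh load, not just size.

```python
from bisect import insort

class TabletShard:
    """Toy tablet: a sparse, sorted map of row_key -> value that splits in two
    once it grows past a size threshold, so data can be rebalanced."""
    MAX_ROWS = 4   # invented threshold

    def __init__(self):
        self.keys = []   # row keys kept in sorted order
        self.rows = {}   # sparse: only rows that actually exist are stored

    def put(self, key, value):
        if key not in self.rows:
            insort(self.keys, key)
        self.rows[key] = value

    def maybe_split(self):
        """Return (self, new_upper_shard) when over the limit, else (self, None)."""
        if len(self.keys) <= self.MAX_ROWS:
            return self, None
        mid = len(self.keys) // 2
        upper = TabletShard()
        for key in self.keys[mid:]:
            upper.put(key, self.rows.pop(key))
        self.keys = self.keys[:mid]
        return self, upper

# Usage: rows stay sorted by key; a shard that grows too large splits in half.
shard = TabletShard()
for url in ["a.com", "q.com", "c.com", "m.com", "z.com"]:
    shard.put(url, {"outlinks": []})
lower, upper = shard.maybe_split()   # five rows > four, so the shard splits
```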

Link Graph Algorithms

There are other considerations. I think once you read what I wrote below, you may agree that the reason rankings fluctuate for new content may be how Google’s indexing and link ranking systems work.

A Plausible Explanation

I say this may be the explanation because while we know as a fact there is a ranking fluctuation, there is no explicit statement of exactly what causes this fluctuation. Thus, we can only do our best to research what can be known and make a plausible explanation armed with that knowledge.

A plausible explanation is better than pure speculation because knowledge of what is possible puts you right in the range of what is actually happening.

How Link Graphs are Generated and Maintained

The link graph is a map of the Internet. Any time a new link is created or a new page is published, the link graph changes.

According to a Google patent from 2009, a way to efficiently accomplish the link ranking calculation is to divide the link graph into shards. Each shard takes a turn independently updating and recalculating rankings after something changes.

It’s a practical thing, like breaking a problem up into multiple pieces and handing them out to a crowd of people who in turn work on their little piece of the project then return it to you to put it all back together.

This patent describes a system for ranking links between web pages. What it does is to divide the map of all the web pages in the Internet into pieces called shards. These shards represent little pieces of the problem to be solved independently from the rest of the map of the web, which we sometimes refer to as Google’s index.

What is interesting is that all these calculations are performed in parallel. In parallel means at the same time.

Ranking web pages and links at the same time, on a continuous basis, is what Google’s index, known as Google Instant (or Caffeine or Colossus), is all about.

In the paragraph above I have restated in plain English what the patent describes.

Here is what the 2009 patent states:

“A system, comprising:
multiple computer servers programmed to perform operations comprising: dividing a directed graph representing web resources and links into shards, each shard comprising a respective portion of the graph representing multiple web resources and links associated with the multiple web resources; assigning each of the shards to a respective server, including assigning, to each of the respective servers, data describing the links associated with the multiple web resources represented by the portion of the graph corresponding to the shard assigned to the server; and calculating, in parallel, a distance table for each of the web resources in each shard using a nearest seed computation in the server to which the shard was assigned using the data describing the links associated with the multiple web resources.”

Further down the page the patent describes how the web crawler (diagram element #204) crawls the Internet and stores the information in the data centers (diagram element #208). Then the engine responsible for indexing and ranking (diagram element #206) divides the index into shards at which point everything gets ranked on a continuous basis.


The above paragraph is my plain English translation of what the patent states below:

“The compression, indexing and ranking module 206 can receive a link graph and a list of the nodes of the link graph to be used as seeds. In addition, the module 206 can divide the link graph into shards and compute nearest seed distances using a large number of shard servers as described above.”
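Here is a much-simplified sketch of that idea. The graph, the seed set, the shard boundaries, and the use of threads instead of separate shard servers are all invented for illustration; the point is only that each shard produces its own distance table and the shards are processed in parallel.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy link graph: page -> pages it links to.
LINK_GRAPH = {
    "seed.com": ["a.com", "b.com"],
    "a.com": ["c.com"],
    "b.com": ["c.com", "d.com"],
    "c.com": [],
    "d.com": ["a.com"],
}
SEEDS = {"seed.com"}

def nearest_seed_distances(shard_pages, graph, seeds, max_hops=10):
    """Fewest hops from any seed to each page, reported only for this shard's
    pages -- a stand-in for the patent's 'nearest seed computation'."""
    distances = {seed: 0 for seed in seeds}
    frontier = list(seeds)
    for hop in range(1, max_hops + 1):           # breadth-first search outward
        next_frontier = []
        for page in frontier:
            for target in graph.get(page, []):
                if target not in distances:
                    distances[target] = hop
                    next_frontier.append(target)
        frontier = next_frontier
    return {page: distances.get(page) for page in shard_pages}

# Divide the pages into shards and compute each shard's distance table in parallel.
shards = [["seed.com", "a.com"], ["b.com", "c.com"], ["d.com"]]
with ThreadPoolExecutor() as pool:
    tables = list(pool.map(
        lambda shard: nearest_seed_distances(shard, LINK_GRAPH, SEEDS), shards))
print(tables)   # one distance table per shard
```

Each shard reports distances only for its own pages, so the full picture of the web only emerges as the shards’ tables are brought back together.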

Checkpointing

This is the part of the patent that mentions time. Each shard is independently creating a checkpoint of its current state and updating. What’s interesting is how it is described as being asynchronous.

Asynchronous means that the computations are done independently, at random intervals, at any time. That’s the opposite of synchronous, which means that it has to wait for something else to finish before it can begin updating or processing.

So now we have a description of an index system that is updating on a rolling basis. Each shard updates and recalculates its section of the Internet according to its own time schedule.

The patent states that RAM (memory) is used to store data. The time schedule described in the patent is timed to optimize memory resources.

The above is the plain English version of what is described in the patent.

This is the part that mentions memory and the timing of how re-calculated web page rankings are integrated:

“More specifically, once the leaf table reaches a certain size, it is flushed to a disk file in node identifier sorted order, so that it is sorted the same way the distance table is. If flushing occurs too infrequently or too slowly, the leaf table may grow to a maximum allowed size, causing the server to stop processing incoming updates, so as not to run out of memory. The size of the leaf table is a tradeoff between memory consumption and the amount of data written to disk. The larger the table the higher the chances of updating an existing in-memory leaf entry before it is flushed to disk (leaf disk space).

At predetermined time intervals, or at times of low activity, or after many changes have accumulated, each shard server stores an incremental checkpoint of the distance table and of the leaf table on a reliable data store, e.g., as a distance table increment file and a leaf table increment file on a GFS. The leaf table may be flushed at other times as well.”

Now here is the part of the patent where time is again mentioned in terms of how the updated web page rankings are folded into the main index.

It states that a “checkpoint” represents a change in how web pages are interlinked with each other.

For example, when someone publishes a new web page, the link relationships within that site change. The checkpoint represents that change.

The above is my plain English translation. Below is what the patent describes:

“The term “checkpoint” is used within this specification to describe a data structure that may describe a state change in a portion of the distance table respective to a particular shard server and a particular time interval.

Each checkpoint includes a timestamp and a delta representing the changes to the data from the previous checkpoint. The shard servers will generally write checkpoints to the GFS independently of one another, and their regular interval times may include an element of pseudo-randomness so as to smooth out the demand for GFS resources. Because each shard server determines when a particular checkpoint is written to the GFS, the process is asynchronous.”
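Pulling those two quotes together, the bookkeeping might be imagined roughly like the sketch below. The table size, the checkpoint interval, the jitter range, and the file layout are all invented; the sketch only tries to show why a shard’s updates surface on their own schedule rather than instantly.

```python
import json
import random
import time

class ShardServer:
    """Toy shard server: buffers distance updates in memory, flushes the buffer
    when it grows too large, and writes incremental checkpoints (timestamp plus
    delta) at pseudo-randomly jittered intervals."""
    MAX_LEAF_ROWS = 1000            # invented memory budget
    BASE_INTERVAL_SECONDS = 600     # invented base checkpoint interval

    def __init__(self, shard_id):
        self.shard_id = shard_id
        self.leaf_table = {}                  # in-memory buffer of recent updates
        self.delta_since_checkpoint = {}
        self.next_checkpoint_at = self._schedule_next()

    def _schedule_next(self):
        # Jitter smooths out demand on shared storage, as the patent describes.
        return time.time() + self.BASE_INTERVAL_SECONDS * random.uniform(0.5, 1.5)

    def apply_update(self, node_id, distance):
        self.leaf_table[node_id] = distance
        self.delta_since_checkpoint[node_id] = distance
        if len(self.leaf_table) >= self.MAX_LEAF_ROWS:
            self.flush_leaf_table()

    def flush_leaf_table(self):
        # Flush to disk in node-id order, the same order as the distance table.
        with open(f"leaf-{self.shard_id}.inc", "a") as f:
            for node_id in sorted(self.leaf_table):
                f.write(f"{node_id}\t{self.leaf_table[node_id]}\n")
        self.leaf_table.clear()

    def maybe_checkpoint(self):
        if time.time() < self.next_checkpoint_at or not self.delta_since_checkpoint:
            return
        checkpoint = {"timestamp": time.time(), "delta": self.delta_since_checkpoint}
        with open(f"distance-{self.shard_id}.ckpt", "a") as f:
            f.write(json.dumps(checkpoint) + "\n")
        self.delta_since_checkpoint = {}
        self.next_checkpoint_at = self._schedule_next()
```

Under a scheme like this, a newly published page’s ranking data only propagates when its shard happens to flush and checkpoint, which is exactly the kind of lag that could look like fluctuation from the outside.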

So there you have it. The timing may have something to do with optimizing the use of RAM so that it doesn’t become overburdened. It’s possible that for a site that is constantly being updated, where links are constantly being added, the time it takes to rank may be shorter.

Just my opinion, but I believe that this isn’t necessarily because the page or website is “authoritative” but because the shard that is responsible for updating this part of the index is particularly busy and it needs to create a checkpoint in order to keep running smoothly.

In my experience, rankings tend to stabilize within ten to fifteen days. For a large amount of content, in the tens of thousands of pages, stabilization of ranking positions can take as long as a month.

Why New Pages Bounce in the Search Results

The answer to a seemingly simple question such as “why do new pages fluctuate in Google’s search results” can be extraordinarily complex.

It’s easy to toss out random guesses, such as that Google is tracking clicks and user satisfaction metrics. But there is no evidence in the scientific papers and patents published by Google to support the idea that click tracking is what causes the fluctuation. To choose to believe such a conclusion, without any evidence at all, is an exercise in willful ignorance.

John Mueller simply said that new pages are subject to extreme fluctuations (as noted in the link at the top of this article). The reasons may be many, and I have highlighted several of them in this article.


How to make your website ADA-compliant and win at SEO

30-second summary:

  • The Americans with Disabilities Act (ADA), passed in 1990, now extends to mobile apps and websites.
  • An ADA-compliant website helps more people than those covered by the ADA.
  • There are many SEO benefits, such as increased visibility in Google image search and featured snippets.
  • Co-founder of Ally Digital Media, Abhishek Shah, says, “Responsive websites help with ADA compliance and further improve your website’s overall search presence.”
  • Four ways to make your website ADA-compliant, with a clear outline of both the ADA and SEO benefits of each.

The Americans with Disabilities Act (ADA), passed in 1990, now extends to mobile apps and websites. Specifically, Title III of the ADA takes an official stand on how websites should be made accessible to disabled users. When you look at what’s necessary to make a website ADA-compliant, you will see that the same changes also help improve your site’s SEO.

Elements such as title tags, heading structure, alt text, and responsive design are things all websites should include. Ensuring they are done properly, and in an ADA-compliant way, will maximize your website’s effectiveness.

How ADA accessibility prioritization benefits everyone

Ensuring your website complies with the ADA helps you serve a larger audience and gives a boost to your search engine rankings. This is because most of the necessary components of making your website ADA compliant feed directly into SEO best practices.

After all, the whole point is to make your website easier to view, understand, and navigate. What business doesn’t want all that for their website?

Four ways an ADA-compliant website helps improve your SEO

Here are 4 ADA-compliant must-haves (in no particular order) that will help improve your SEO. This list is by no means comprehensive, but it is a good place to start.

1. Title tags help screen searches and readers

Title tags are very basic SEO. They let the reader, and search engines, know what the page is about. A title tag doesn’t show up on your website. Rather, it appears on the results page of a search engine and in the tab at the top of your web browser.

SEO benefits

Title tags, while basic SEO, are very important. This tag needs to match your user’s intent. For example, when someone googles “best phone” the phrase best phone (or a variation like “best smartphone”) will appear in the title tag.

Writing a title that accurately reflects what the page is about is the best way to get found and clicked on. It’s why a title tag should be specific: “The best Android phones for 2020” is far better than “Why you will want to buy one of these phones.”

ADA benefits

For those who need screen readers to help them use a computer, a specific title tag such as the above example is much more user-friendly. So, it is vital the title tag accurately reflects the page content.

The accessibility guidelines say the title should be “The best Android phones for 2020” instead of “Why you will want to buy one of these phones.”

2. Descriptive alt text

Alt text is not the same thing as a caption. A caption is visible, usually beneath an image, whereas alt text is not visible on the front end of the site. Alt text is a written alternative to a page’s visual elements, including .jpegs, .pngs, and .gifs: a description of an image that lives in the back end of the site.

SEO benefits

Alt text lets search engines know the subject matter of an image. It also helps search engines to better understand the page. Additionally, if you want images to show up in Google, then writing descriptive alt text is a must-have.

ADA benefits

For web users with visual impairment using screen readers, descriptive alt text is read aloud. This helps a visually impaired reader get a better sense of what’s going on, on any given page.

A useful descriptive alt text might be: “woman at café with laptop drinking coffee” 

A useless alt text would be: “SEO tips for freelancers | Get more clients with SEO | Writing your way to success with SEO”
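If you want to audit your own pages for missing alt text, a few lines of Python will do it. This sketch uses the standard library’s HTML parser; how you fetch the HTML, and what counts as a genuinely useful description, is up to you.

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collects <img> tags whose alt attribute is absent or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if not (attrs.get("alt") or "").strip():
            self.missing.append(attrs.get("src", "(no src)"))

finder = MissingAltFinder()
finder.feed('<img src="cafe.jpg" alt="woman at café with laptop drinking coffee">'
            '<img src="hero.png">')
print(finder.missing)   # ['hero.png'] -- the image with no alt text
```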

3. Responsive design

Responsive design has been around since 2012/2013 in one form or another. But it means more than just your website being able to adapt to whichever screen size it finds itself on.

It’s about where your logo sits, how easy your site is to navigate, how easy it is to read, and how quickly it loads.

SEO benefits

Websites that offer a good, functional user experience rank better in search results. User experience isn’t just one ranking factor but an umbrella term for quite a few. Google has noted that mobile visitors tend to abandon pages that take longer than three seconds to load, and slow pages will not rank as well.

How easy content is to read (and how useful it is) is also an important ranking factor.

ADA benefits

Good responsive design puts the user first. It starts from the premise that a website needs to be easy to look at, easy to navigate, and easy to understand.

This is why you need legible text for the visually impaired, quick load times for people with slow internet connections, and straightforward navigation to make it easy for people to get around your website.

4. Proper heading (and subheading) structure

Headings (which show up in the code as <h1> or <h2> or <h3> etc.) define your content’s hierarchy. These headings (and subheadings) work along similar lines to when you wrote essays in school.

Proper heading structure:

  • Goes in order: an h3 doesn’t go directly after an h1.
  • Describes the copy beneath it.
  • Follows a sequence: if your h2 is “4 ways…” then the h3s would be each of those points.

SEO benefits

When your writing is clearly structured it is easier to read and easier to follow. It’s also easier for Google to crawl your content and understand what is most important (starting with the h1, and so on).

Good heading structure can also help your content appear in featured snippets on the search engine results pages (SERPs).
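A quick way to catch a broken hierarchy, such as an h3 jumping in straight after an h1, is to walk the headings in order. The sketch below uses Python’s built-in HTML parser and is only a starting point, not a full accessibility audit.

```python
from html.parser import HTMLParser

class HeadingOrderChecker(HTMLParser):
    """Flags headings that skip a level, e.g. an <h3> directly after an <h1>."""
    def __init__(self):
        super().__init__()
        self.previous_level = 0
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.previous_level and level > self.previous_level + 1:
                self.problems.append(
                    f"<h{level}> follows <h{self.previous_level}> and skips a level")
            self.previous_level = level

checker = HeadingOrderChecker()
checker.feed("<h1>4 ways...</h1><h3>Way one</h3>")
print(checker.problems)   # ['<h3> follows <h1> and skips a level']
```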

ADA benefits

For users who have limited reading comprehension or cognitive impairments, clear and direct headings make it easier to read. Headings and subheadings let a reader know what’s worth reading and what’s worth skipping over.

And just as a reader skips headings, so too can a screen reader, which only reinforces the need for a strong, clear heading structure.

An example of a website that has both good SEO and is ADA compliant is Enviro Safety Products. When you review this site you will see it ticks all the boxes, and provides the user a seamless, friendly experience.

(Screenshot of the Enviro Safety Products website. Source: Enviro Safety Products)

How making your website ADA compliant will help you win at SEO

By applying all the necessary ADA compliant elements to your website, you are helping the one in four Americans with a disability use your website. Additionally, you will also greatly enhance your website’s SEO.

If you would like to know more about how making your website ADA compliant will help you win at SEO, you can throw questions in the comments section below.

Abhishek Shah is the co-founder of Ally Digital Media, a leading voice in digital media and marketing. He advocates for evidence-based marketing strategies to fuel businesses. He can be found on Twitter @abiishek.






How to drive digital innovation necessary during the pandemic

30-second summary:

  • COVID-19 has kept consumers in their homes, which has led to significant spikes in internet use and companies scrambling to digitize in order to meet customers where they are.
  • The ability to quickly develop digital capabilities will continue to be critical for meeting customer needs and ensuring organizations’ survival.
  • To remain competitive, companies must enhance the digital customer experiences they offer through upgraded social media, conversion optimization, better marketing research, an effective internal website search, and fresh customer touchpoints.

Emerging digital technologies like artificial intelligence (AI) and cloud computing enticed leaders with their agility and efficiency. Many companies planned to make digitization a goal for the new decade.

In hindsight, they probably wish they hadn’t waited.

The novel coronavirus upended every aspect of our lives. As businesses and governments around the world try to combat the pandemic, millions of consumers sit inside their homes. And where do people go during a government-mandated lockdown? Online.

The unprecedented shift to remote work and online learning, combined with a dramatic increase in movie streaming, videoconferencing, and social media traffic, has led to significant spikes in internet use. In this same time frame, big tech companies — the businesses at the forefront of digital innovation — have flourished, as have brands that capitalized on the power of social media engagement.

The biggest trick to digitization right now is meeting customers where they are. For example, my company, Teknicks, is working with an online K-12 speech and occupational therapy provider. When schools began transitioning to remote learning, students’ needs changed, too. We helped the provider pivot its value proposition and messaging to accommodate school districts’ new realities. By focusing on teletherapy tools and reassuring parents, we’ve seen substantial growth and brand recognition during the pandemic.

Until we find a vaccine for the novel coronavirus, your customers will likely engage with you through online channels. The ability to develop digital capabilities quickly will continue to be critical for meeting customer needs and ensuring survival for your organization. With that in mind, here’s how you can enhance your digital customers’ experiences:

1. Upgrade your social media

It’s not hard to be good at social media marketing — it’s hard to be great. As you build your audience on websites like Facebook and Instagram, be sure to engage with followers consistently. Create a content calendar mapping out your posts and sharing strategies and stick to it. These platforms are also a great channel for customer service, allowing you to provide personalized support and become instantaneously useful (something that customer support tickets and chatbots never seem to be).

If you already have a sizable engaged audience, it’s time to work on your content strategy. Don’t build your content strategy around keywords. Instead, focus on your audiences’ needs. A truly effective content strategy will be customized for the platform you’re on and will account for the user behavior most characteristic of that platform. Naturally, you will use keywords and phrases that are optimized for discoverability while maintaining authenticity.

One key strategy is to conduct marketing research using a survey. This tactic goes well beyond traditional keyword research and generates content ideas directly from your targeted audience, not a keyword tool. Surveying your prospective customers allows them to tell you what type of content they want to consume, significantly increasing the likelihood of engagement. Often, this research is the key to a successful marketing strategy. I’ll go into more detail below.

2. Focus on and prioritize conversion optimization

Ideally, your website looks good and loads quickly, but those qualities alone don’t make a website great. The user experience that your website offers is ultimately what determines whether customers bounce in droves or actually stick around. Attempting to boost your initial traffic will exponentially increase customer acquisition costs, so improving your conversion rates via website optimization is a more affordable (and profitable) solution.

We often see double-digit increases in conversion rates on our first test. We typically focus on the most trafficked pages to increase the likelihood of big, impactful wins. There is an entire science behind conversion optimization, but the core fundamentals have remained the same for years.

To make sure your website’s architecture is seamless and intuitive, develop a conversion rate optimization strategy that works for you. This will require you to ask visitors for feedback, experiment with different messaging options, and regularly review your analytics, among other things. The idea is to get to know your visitors well. It takes work, but it will pay off over time as the incremental conversion rate increases impact top-line revenue.

3. Conduct marketing research surveys

With the right insights, you can turn every engagement into a memorable and valuable experience for both you and your customers. The best way to get customer insights is to ask. Design a survey of up to 10 questions in a variety of formats along with some screening questions to make sure the feedback you get is actually useful.

When designing, consider your potential customers’ preferences and pain points. For example, if you know your audience is mostly on Instagram, asking “What do you like about social media?” won’t be as effective as “What makes Instagram posts better than Facebook posts?” Once the survey’s drafted, post it to your social channels and send it out to your mailing list. You want to understand which messages resonate with your audience before you spend a cent on marketing. Learning how to conduct marketing research is one of the most important marketing skills you can attain.

Asking individual customers how they feel about various messaging options can give you a goldmine of useful data to help inform the language and design choices you make. Not every customer will choose to participate in a survey, but some will. Show them you appreciate their input by offering a small discount or another incentive once the survey is completed. You’ll be surprised by how many responses you get and how beneficial the precursory information is.

4. Review your internal website search

As much as you’d love for every visitor to spend hours exploring every nook and cranny of your website, most will want to get on with their lives after they’ve found what they came for. To make the process faster, you should offer some sort of internal website search functionality. If you don’t already have one, add a search box to your navigation menu.

Not every website has one, and even the ones that do have very surface-level functions. However, search bars are a valuable asset that can increase internal sessions and conversion. Internal website searchers are 216% likelier to convert, according to WebLinc. Search bars assist your visitors and expand your understanding of user behavior, providing you with the information you need in order to adjust your website accordingly.

Evaluate the effectiveness of your internal search, taking notice of how it finds and organizes the content after a search. Most native search functionality is very basic and just looks for the presence of “search term,” but you may want to test out more advanced filters that help users more effectively find the information they are looking for.

I recommend looking at the search data monthly to see what users have been looking for. Be sure to review which searches yielded zero results and which brought up irrelevant content. Identify areas that can be improved and understand the content gaps where additional content is needed to support the demand.
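If your search tool can export a log of queries and result counts, that monthly review takes only a few lines. The CSV column names below (“query” and “results”) are assumptions; match them to whatever your platform actually exports.

```python
import csv
from collections import Counter

def zero_result_queries(log_path):
    """Count how often each query returned nothing, from a CSV export that is
    assumed to have 'query' and 'results' columns."""
    misses = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["results"]) == 0:
                misses[row["query"].strip().lower()] += 1
    return misses.most_common(20)

# The 20 most frequent searches that found nothing, i.e. your content gaps.
for query, count in zero_result_queries("internal-search-log.csv"):
    print(f"{count:>5}  {query}")
```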

5. Identify new customer touchpoints

Innovation is all about using new technology to improve old processes. While your typical customer journey might depend on your industry and business, chances are good that you can find ways to enhance it with emerging technologies.

Evaluating whether an emerging technology is a fit for your business and whether you should invest in testing it out, starts with (drumroll …) a survey. As we discussed earlier, surveys can answer just about anything you want to know about your target audience. Go ahead and ask your audience if they own or use the emerging tech and validate its place in the customer journey.

Take the new home buying process, for example. David Weekley Homes, the largest privately-held home builder in the U.S., wanted to better understand whether voice-enabled devices can play a role in the customer journey. The company also wanted to propose a voice app idea to the audience and understand how they felt about the emerging technology concept. By conducting a survey, we uncovered that 81% of the respondents would consider the voice app idea to be somewhat to extremely valuable and 70% would possibly to definitely use the voice app if it existed.

The increasing usage of voice search and voice-enabled devices also offers an opportunity for consumer brands to make it easier than ever for customers to find their products. Tide, for example, has capitalized on marketing on Amazon’s Alexa Skills platform to remove a step from the purchasing process. Customers can use the company’s skill to order Tide products without having to pull up the Amazon app or go to the Tide website. In that way, new tech makes an old process (purchasing detergent) more frictionless than ever.

The COVID-19 pandemic has made digital innovation a business imperative. Regardless of your industry, you should look for ways to anticipate and meet customer needs. Your customers expect a seamless digital experience. If you can’t provide it, they won’t have to leave their homes to find someone else that can.

Nick Chasinov is the founder and CEO of Teknicks, a research-based internet marketing agency certified by Google in Analytics, Tag Manager, and a Google Premier AdWords partner.




Core Web Vitals, E-A-T, or AMP?

30-second summary:

  • The biggest Google update of the year is called the Page Experience update.
  • Core Web Vitals are part of that update, and they are definitely ranking factors to keep in mind, especially when optimizing images.
  • AMP is no longer the only way to get a “Top Stories” feature on mobile. Starting in 2021, any news webpage can become a “Top Story”.
  • Combining AMP’s privacy concerns and cost of operation might mean that AMP will disappear within a couple of years.
  • E-A-T is not a ranking factor right now, and we don’t know if it will become one in the future.

2020. What a year. History is happening around us, and Google? Well, Google keeps on revamping its search algorithms. Over the years, there have been many major algorithm updates, as Google worked to keep us on our toes. 2020 was no different: in one fell swoop, we got news of the Page Experience update and of changes to AMP, all while the debate about whether or not you need E-A-T for ranking rages on. How do Core Web Vitals stand to change the search game in 2021?

Let’s go over each of these innovations and see which will change the way we do SEO, and which will fade into obscurity sooner rather than later.

1. Importance of core web vitals for SEO

Core Web Vitals are part of the Page Experience update and, by far, caused the biggest ruckus.

There’s a lot to learn about Core Web Vitals, but they boil down to the three biggest issues on our webpages:

  1. LCP — Largest Contentful Paint, which deals with the loading speed of the largest single element on the page.
  2. FID — First Input Delay, the page’s reaction time to the first user input (whether the user clicks, taps, or presses a key).
  3. CLS — Cumulative Layout Shift, a measure of how much the page’s content, mostly visual content, jumps around while the page loads.

How core web vitals influences rankings

Of course, some SEO experts think that the entire Page Experience update is nothing special and could even “[…] distract […] from the core mission of communication and storytelling.”

And, sure, most of Page experience update is simply an assembly of things we’ve known for a while: use HTTPS, be mobile-friendly, control your page speed, and so on.

But Core Web Vitals are a bit different and can influence SEO practice in unexpected ways. A key factor that’s already changing rankings is Cumulative Layout Shift.

As most SEO experts know, for a while an important part of image optimization was using the decoding="async" attribute on the <img> tag to avoid losing page speed while rendering the page.

Using decoding="async" could lead to some seriously janky pages if coders didn’t specify the height and width of every single image to be rendered. Some websites did it anyway; for example, Wikipedia has a predefined space for images created ahead of time on most of its pages.

(Screenshot: a Wikipedia page with space for images reserved ahead of time.)

But as SEO experts we didn’t have to worry about pages being jumpy all too much, as that didn’t influence the rankings. Now with CLS being formally announced as a ranking factor, things will change for a whole slew of websites and SEO experts.

We’ll need to make sure that every webpage is coded with CLS in mind, with the needed space for every image defined ahead of time, to avoid the layout shifts.
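One simple place to start is listing the images on a page that have no explicit dimensions, since those are the ones most likely to shift the layout as they load. The sketch below uses Python’s built-in HTML parser and is a rough audit, not a substitute for measuring CLS in the field.

```python
from html.parser import HTMLParser

class UnsizedImageFinder(HTMLParser):
    """Collects <img> tags missing width or height attributes."""
    def __init__(self):
        super().__init__()
        self.unsized = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if "width" not in attrs or "height" not in attrs:
            self.unsized.append(attrs.get("src", "(no src)"))

finder = UnsizedImageFinder()
finder.feed('<img src="logo.svg" width="120" height="40">'
            '<img src="hero.jpg" decoding="async">')
print(finder.unsized)   # ['hero.jpg'] -- no reserved space, a likely CLS culprit
```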

The verdict

Overall, of course, it’s too early to tell, and more work by SEOs around the web needs to be done here. However, it seems that if you aren’t used to focusing on technical SEO, Core Web Vitals becoming ranking signals might not influence your day-to-day work at all.

However, if you are conducting complicated technical SEO, then Core Web Vitals will definitely change the way you work in as-yet unexpected ways.

2. Importance of AMP for SEO

AMP’s relevance today is kind of an open question. While it’s always been great as a quick-and-easy way to increase page speed, privacy concerns have been voiced over and over again since the technology’s very inception.

But in 2020, significant changes are afoot, since, within the same Page Experience update, Google announced that there’s finally no requirement for us to create AMP pages to occupy the “Top Stories” SERP feature.

That’s a pretty huge step for anybody trying to accrue as many SERP features as they can, and, in particular, for news websites.

(Screenshot: the “Top Stories” SERP feature on mobile.)

How AMP influences rankings

If we believe John Mueller’s words, then AMP is not a ranking factor. Seems plain and simple enough. But of course, things aren’t so simple, because AMP comes with pretty significant gains in page speed, and speed is an important ranking factor.

Thanks to AMP’s pre-rendering combined with some severe design limitations, AMP webpages often really do win in page speed, even if not in ranking as is.

The “Top Stories” SERP feature, however, was a huge benefit to using an AMP for any news agency with a website, and it’s easy to understand why. Just look at how much of the page is occupied by the “Top Stories” results.

Not only do “Top Stories” automatically appear at the very top of the SERP, but they also sport the logo of the website posting them, standing out even more from the boring old blue-link results.

This means that for a few years now, news websites have essentially been forced into using AMP to get into the “Top Stories” SERP feature on mobile, since it absorbs a whole lot of clicks.

On the other hand, it takes quite a lot of resources to support AMP versions of the webpages, because you are basically maintaining a whole additional version of your website.

Added to which, a page that’s been properly optimized for speed might not need AMP for those speed gains at all.

The verdict

While it’s tough to imagine that AMP will fade away completely within the next couple of years, AMP’s privacy issues combined with the cost of maintaining it might spell the end of it being a widely used practice.

Now, with the “Top Stories” becoming available to non-AMP pages, there’s virtually no reason to jeopardize the users’ security for speed gains you could get by proper optimization.

3. Importance of E-A-T for SEO

Expertise. Authority. Trust. All perfectly positive words and something we should all strive for in our professional lives. But what about search optimization?

Coming straight from Google’s Quality Rater Guidelines, E-A-T has been the talk of the town for a good while now. Let’s dive in and see how it might change the way we optimize for search.

How E-A-T influences rankings

For most of us, it doesn’t really.

Sure, the Quality Rater Guidelines provide valuable insights into Google’s ranking process. However, E-A-T is one of the less important factors we should be focusing on, partly because these are nebulous, abstract concepts, and partly because Google doesn’t exactly want us to.

As Google’s official representatives informed us, E-A-T is not in itself a ranking factor.

Receiving follow-up questions, Google’s John Mueller then reiterated that point, and Ben Gomes, Google’s VP of search engineering, confirmed that quality raters don’t influence any page’s rankings directly.

However, in practice, we often see that the so-called YMYL websites already can’t rank without having some expertise and authority established. A very popular example is that it’s virtually impossible to rank a website providing medical advice without an actual doctor writing the articles.

The problem here is that expertise, authority, and trustworthiness are not easily interpreted by the search algorithms, which only understand code.

And, at the moment, there seems to be no surefire way for Google to transform these signals into rankings, except to read the feedback of their quality raters before each algorithm update.

The verdict

While using E-A-T to rank websites might sound like an inarguable benefit for the searcher, there are a couple of concerns that aren’t easily solved, namely:

  1. Who exactly will be determining the E-A-T signals, and according to which standard?
  2. The introduction of such factors creates a system where the smaller and newer websites are punished in rankings for not having the trustworthiness that they couldn’t realistically acquire.

Responding to both of these concerns requires time and effort on the search engine’s side.

As things stand right now, E-A-T is not something to keep in mind while doing day-to-day SEO operations.

Let’s imagine a fantastical scenario where a webmaster/SEO expert has some free time. Then they might want to work on E-A-T, to try and stay ahead of the curve.

On the other hand, there simply isn’t any proof that Google will actually use E-A-T. Or that, even if used, these signals will become major ranking factors. For this reason, E-A-T shouldn’t be your priority ahead of traditional SEO tasks like link building and technical optimization.

Additionally, consider this. The entire Quality Rater Guidelines is 168 pages long. However, a comprehensive explanation of what E-A-T is and why it might be calculated a certain way will take many more pages than that.

Conclusion

As of the time of this writing, the Core Web Vitals seems to be the most important ranking news to come out in 2020 in practical terms. However, search is an extremely volatile field: what worked two weeks ago may not work today, and what works today might not work for most of us.

Matters are further complicated because we’re fighting an uneven battle: it’s simply not in search engines’ best interest to give us a full and detailed picture of how ranking works, lest we abuse it.

This is why it’s crucial to keep our hand on the pulse of optimization news and changes occurring every single day. With constant efforts from our SEO community to work out the best way to top rankings, it’s possible for us to close that gap and know for sure which trends are paramount, and which we can allow ourselves to overlook.

Aleh Barysevich is Founder and CMO at SEO PowerSuite and Awario.




