
How to prepare for a JS migration


An 80 percent decrease in organic traffic is every business’s nightmare. Unfortunately, that scenario can become reality if a website migration is done incorrectly; instead of improving the current situation, it ends in catastrophe.

Source: http://take.ms/V6aDv

There are many types of migrations, such as changing, merging or splitting domains, redesigning the website, or moving to a new framework.

Web development trends clearly show that the use of JavaScript has been growing in recent years and that JavaScript frameworks are becoming increasingly popular. We can expect more and more websites to rely on JavaScript in the future.

Source: https://httparchive.org/reports/state-of-javascript

As a consequence, SEOs will be faced with the challenge of migrating to JavaScript frameworks.

In this article, I will show you how to prepare for the migration of a website built with static HTML to a JavaScript framework.

Search engines vs. JavaScript

Google is the only search engine able to execute JavaScript and “see” elements like content and navigation even when they are powered by JavaScript. However, there are two things you always need to remember when considering a move to a JS framework.

Firstly, Google uses Chrome 41 for rendering pages. This is a three-year-old browser that does not support all the modern features many JavaScript websites rely on. Even though Google can render JS websites in general, some important parts may not be discovered because they rely on technology that Chrome 41 can’t process.

Secondly, executing JavaScript is an extremely heavy process, so Google indexes JS websites in two waves. The first wave indexes the raw HTML, which in the case of JS-powered websites is often an almost empty page. During the second wave, Google executes the JavaScript and can “see” the additional elements loaded by JS. Only then is the full content of the page ready for indexing.

Because of these two factors, if you decide to move your current website to a JavaScript framework, you always need to check whether Google can efficiently crawl and index it.

Migration to a JS framework done right

SEOs may not like JavaScript, but that doesn’t mean its popularity will stop growing. We should prepare as well as we can and implement the modern framework correctly.

Below you will find information that will help you navigate through the process of changing the current framework. I do not provide “ready-to-go” solutions because your situation will be the result of different factors and there is no universal recipe. However, I want to stress the elements you need to pay particular attention to.

Cover the basics of standard migration

You can’t count on the miracle that Google will understand the change without your help. The whole process of migration should be planned in detail.

I want to keep the focus on JS migration for this article, so if you need detailed migration guidelines, Bastian Grimm has already covered this.


Understand your needs in terms of serving the content to Google

This step should be done before anything else. You need to decide how Google will receive the content of your website. You have two options:

1. Client-side rendering: This means that you are relying entirely on Google for rendering. However, if you go for this option, you accept some inefficiency. The first important drawback of this solution is the deferred indexing of your content due to the two waves of indexing mentioned above. Secondly, some things may not work properly because Chrome 41 does not support all modern features. And last, but not least, not all search engines can execute JavaScript, so your JS website will seem empty to Bing, Yahoo, Twitter and Facebook.

Source: YouTube

2. Server-side rendering: This solution relies on an external or additional mechanism that renders the JS website, creates a static snapshot, and serves it to search engine crawlers. At the Google I/O conference, Google announced that serving a separate version of your website only to the crawler is fine. This is called Dynamic Rendering, which means that you can detect the crawler’s User Agent and send it the server-side rendered version. This option also has its disadvantages: creating and maintaining additional infrastructure, possible delays if a heavy page is rendered on the server, and possible issues with caching (Googlebot may receive a stale version of the page).
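
To make dynamic rendering more concrete, here is a minimal sketch for an Express app. The bot list, the RENDERER_URL and the example domain are illustrative assumptions, not a complete or production-ready setup.

```js
// Minimal dynamic rendering sketch for an Express app (assumes Node 18+ for global fetch).
// The bot list and RENDERER_URL are illustrative assumptions.
const express = require('express');

const app = express();
const BOT_UA = /googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit/i;
const RENDERER_URL = 'https://render.example.com'; // hypothetical prerender/Rendertron-style service

app.use(async (req, res, next) => {
  if (!BOT_UA.test(req.headers['user-agent'] || '')) return next(); // real users get the JS app
  try {
    // Ask the rendering service for a static HTML snapshot of the requested URL.
    const target = `${RENDERER_URL}/render?url=${encodeURIComponent(`https://www.example.com${req.originalUrl}`)}`;
    const snapshot = await fetch(target);
    res.status(snapshot.status).send(await snapshot.text());
  } catch (err) {
    next(); // fall back to the client-side rendered version if rendering fails
  }
});

// The rest of the app serves the normal client-side rendered pages.
app.use(express.static('dist'));
app.listen(3000);
```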


Before migration, you need to decide which of these two options suits your website.

If the success of your business is built around fresh content (news, real estate offers, coupons), I can’t imagine relying only on the client-side rendered version. It may result in dramatic delays in indexing so your competitors may gain an advantage.

If you have a small website and the content is not updated very often, you can try to leave it client-side rendered, but before launching the website you should test whether Google really does see the content and navigation. The most useful tools for doing so are Fetch as Google in Google Search Console and the Chrome 41 browser.

However, Google officially stated that it’s better to use Dynamic Rendering to make sure they will discover frequently changing content correctly and quickly.

Framework vs. solution

If your choice is to use Dynamic Rendering, it’s time to answer how to serve the content to the crawlers. There is no one universal answer. In general, the solution depends on the technology AND developers AND budget AND your needs.

Below you will find a review of the options you have from a few approaches, but the choice is yours:

  • I need as simple a solution as possible.

I would probably go for prerendering, for example with prerender.io. It’s an external service that crawls your website, renders your pages and creates static snapshots to serve when a specific User Agent makes a request. A big advantage of this solution is that you don’t need to create your own infrastructure.

You can schedule recrawling and create fresh snapshots of your pages. However, for bigger and frequently changing websites, it might be difficult to make sure that all the pages are refreshed on time and show the same content both to Googlebot and users.

  • I need a universal solution and I follow the trends.

If you build the website with one of the popular frameworks like React, Vue, or Angular, you can use one of the server-side rendering methods dedicated to a given framework. Here are some popular matches: Next.js for React, Nuxt.js for Vue, and Angular Universal for Angular.

Using one of these frameworks on top of React or Vue results in a universal application, meaning that the exact same code can be executed both on the server (server-side rendering) and in the client (client-side rendering). It minimizes the content-gap issues you could have if you rely on creating snapshots and heavy caching, as with prerendering.
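
As an illustration, a minimal Next.js-style page might look like the sketch below; the route, API URL and response fields are hypothetical. The point is simply that the same component renders on the server for the first request and hydrates in the browser afterwards.

```js
// pages/products/[id].js: minimal Next.js sketch; the API URL and fields are illustrative.
export async function getServerSideProps({ params }) {
  // Runs on the server for every request, so crawlers receive fully rendered HTML.
  const res = await fetch(`https://api.example.com/products/${params.id}`);
  const product = await res.json();
  return { props: { product } };
}

export default function ProductPage({ product }) {
  // The same component is reused in the browser for subsequent client-side navigation.
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```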

  • I need a universal solution and I don’t use a popular framework.

It may happen that the framework you are going to use does not have a ready-to-use solution for building a universal application. In this case, you can build your own rendering infrastructure. That means installing a headless browser on your server that will render all the subpages of your website and create the snapshots that are served to the search engine crawlers. Google provides a solution for that: Puppeteer is a library that does a similar job to prerender.io, except that everything happens on your own infrastructure.
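
A minimal Puppeteer sketch for generating one snapshot might look like this; in practice you would loop over your URLs and cache the output, and the example URL is illustrative.

```js
// Minimal snapshot generation with Puppeteer (npm install puppeteer).
const puppeteer = require('puppeteer');

async function renderSnapshot(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // Wait until network activity settles so JS-injected content is present in the DOM.
  await page.goto(url, { waitUntil: 'networkidle0' });
  const html = await page.content(); // serialized DOM, ready to serve to crawlers
  await browser.close();
  return html;
}

// Usage sketch: render one page and inspect the static HTML.
renderSnapshot('https://www.example.com/category/shoes')
  .then((html) => console.log(`${html.length} bytes rendered`));
```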

  • I want a long-lasting solution.

For this, I’d use hybrid rendering. It’s said that this solution provides the best experience for both users and crawlers, because both receive a server-side rendered version of the page on the initial request. In many cases, serving an SSR page is faster for users than executing all the heavy files in the browser. All subsequent user interactions are served by JavaScript. Crawlers do not interact with the website by clicking or scrolling, so for them every page view is a new request to the server, and they always receive an SSR version. It sounds good, but it’s not easy to implement.


The option you choose will depend on many factors, like technology, developers and budget. In some cases you may have a few options, but in many cases the restrictions will leave you with only one viable choice.

Testing the implementation

I can’t imagine a migration without creating a staging environment and testing how everything works. Migration to a JavaScript framework adds complexity and additional traps that you need to watch out for.

There are two scenarios. If for some reason you decided to rely on client-side rendering, you need to install Chrome 41 and check how the website renders and works in it. One of the most important points of an audit is checking for errors in the console in Chrome DevTools. Remember that even a small error in processing JavaScript may result in rendering issues.

If you decided to use one of the methods of serving the content to the crawler, you will need to have a staging site with this solution installed. Below, I’ll outline the most important elements that should be checked before going live with the website:

1. Content parity

You should always check if users and crawlers are seeing exactly the same content. To do that, you need to switch the user agent in the browser to see the version sent to the crawlers. You should check for general discrepancies in rendering. However, to see the whole picture you will also need to check the DOM (Document Object Model) of your website. Copy the source code from your browser, then change the User Agent to Googlebot and grab the source code as well. Diffchecker will help you see the differences between the two files. Look especially for differences in the content, navigation and metadata.
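
A quick way to grab both versions for the diff is a small script like the sketch below; it assumes Node 18+ for the global fetch, the URL is illustrative, and the Googlebot User-Agent string is the publicly documented one.

```js
// Content parity spot check: fetch the same URL as a browser and as Googlebot (Node 18+).
const URL_TO_CHECK = 'https://www.example.com/category/shoes'; // illustrative URL

const userAgents = {
  browser: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
  googlebot: 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
};

async function fetchAs(label) {
  const res = await fetch(URL_TO_CHECK, { headers: { 'User-Agent': userAgents[label] } });
  return res.text();
}

(async () => {
  const [browserHtml, googlebotHtml] = await Promise.all([fetchAs('browser'), fetchAs('googlebot')]);
  // A large size difference is a first hint of a content gap; paste both into Diffchecker for detail.
  console.log('browser bytes:', browserHtml.length, '| googlebot bytes:', googlebotHtml.length);
})();
```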

An extreme situation is when you send an empty HTML file to Googlebot, just as Disqus does.


This is what their SEO Visibility looks like:

Source: http://take.ms/Fu3bL

They’ve seen better days. Now the homepage is not even indexed.

2. Navigation and hyperlinks

To be 100 percent sure that Google sees and crawls your internal links and passes link juice through them, you should follow the clear recommendations on implementing internal links shared at the Google I/O 2018 conference.


If you rely on server-side rendering methods, you need to check whether the HTML of a prerendered version of a page contains all the links that you expect; in other words, whether it has the same navigation as your client-side rendered version. Otherwise, Google will not see the internal linking between pages. Critical areas where you may have problems are faceted navigation, pagination, and the main menu.

3. Metadata

Metadata should not depend on JS at all. Google says that if you load the canonical tag with JavaScript, they probably will not see it in the first wave of indexing, and they will not re-check this element in the second wave. As a result, the canonical signals might be ignored.

While testing the staging site, always check if an SSR version has the canonical tag in the head section. If yes, confirm that the canonical tag is the correct one. A rule of thumb is always sending consistent signals to the search engine whether you use client or server-side rendering.

While checking the website, always verify that both the CSR and SSR versions have the same titles, descriptions and robots instructions.
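
If you already captured the CSR and SSR HTML (as in the content-parity sketch above), a rough spot check of the head tags can be as simple as the following; the regexes are a shortcut for a quick comparison, not a substitute for a proper HTML parser.

```js
// Rough metadata parity check on two HTML strings (regexes are only a spot-check shortcut).
function extractMeta(html) {
  const pick = (re) => {
    const match = html.match(re);
    return match ? match[1].trim() : '';
  };
  return {
    title: pick(/<title[^>]*>([^<]*)<\/title>/i),
    canonical: pick(/<link[^>]+rel=["']canonical["'][^>]*href=["']([^"']+)["']/i),
    robots: pick(/<meta[^>]+name=["']robots["'][^>]*content=["']([^"']+)["']/i),
  };
}

// Usage sketch with the two HTML strings captured earlier:
// console.log(extractMeta(browserHtml), extractMeta(googlebotHtml));
```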

4. Structured data

Structured data helps the search engine to better understand the content of your website.

Before launching the new website, make sure that the SSR version displays all the elements you want to mark up with structured data and that the markup is included in the prerendered version. For example, if you want to add markup to the breadcrumb navigation: first, check that the breadcrumbs are displayed in the SSR version; second, run the page through the Rich Results Test to see if the markup is valid.
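
For breadcrumbs, the markup could be a BreadcrumbList in JSON-LD injected by the SSR template, as in the sketch below; the names and URLs are illustrative. The important part is that the same script tag appears in the prerendered HTML served to crawlers.

```js
// Minimal BreadcrumbList JSON-LD built in the SSR template (names and URLs are illustrative).
const breadcrumbJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'BreadcrumbList',
  itemListElement: [
    { '@type': 'ListItem', position: 1, name: 'Home', item: 'https://www.example.com/' },
    { '@type': 'ListItem', position: 2, name: 'Shoes', item: 'https://www.example.com/shoes/' },
    { '@type': 'ListItem', position: 3, name: 'Running shoes' },
  ],
};

// Injected into the <head> of the server-rendered HTML so crawlers see it without executing JS.
const structuredDataTag =
  `<script type="application/ld+json">${JSON.stringify(breadcrumbJsonLd)}</script>`;
```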

5. Lazy loading

My observations show that modern websites love lazy loading images and content (e.g. products), with the additional elements loaded on a scroll event. It might be a nice feature for users, but Googlebot can’t scroll, so these items will not be discovered.

Seeing that so many webmasters were having problems implementing lazy loading in an SEO-friendly way, Google published best-practice guidelines for lazy loading. If you want to load images on scroll, make sure you support paginated loading. This means that as the user scrolls, the URL should change (e.g., by adding pagination identifiers: ?page=2, ?page=3, etc.) and, most importantly, the URL should be updated with the proper content, for example by using the History API.
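
A minimal sketch of this pattern is shown below; it assumes an endpoint like /products?page=N that returns an HTML fragment, plus sentinel and list element IDs that are purely illustrative.

```js
// Paginated lazy loading sketch: load the next page of products when a sentinel
// scrolls into view and keep the URL in sync via the History API.
// The /products?page=N endpoint and the element IDs are illustrative assumptions.
let currentPage = 1;
const list = document.getElementById('product-list');
const sentinel = document.getElementById('load-more-sentinel');

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;
  currentPage += 1;
  const res = await fetch(`/products?page=${currentPage}`);
  list.insertAdjacentHTML('beforeend', await res.text()); // server returns an HTML fragment
  // Update the address bar so each chunk of content has its own crawlable URL.
  history.pushState({ page: currentPage }, '', `?page=${currentPage}`);
});

observer.observe(sentinel);
```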

Do not forget about adding rel="prev" and rel="next" markup in the head section to indicate the sequence of the pages.

Snapshot generation and cache settings

If you decided to create a snapshot for search engine crawlers, you need to monitor a few additional things.

You must check that the snapshot is an exact copy of the client-side rendered version of your website. You can’t load additional content or links that are not visible to a standard user, because it might be assessed as cloaking. If the process of creating snapshots is not efficient (for example, if your pages are very heavy and your server is not that fast), it may result in broken snapshots, and you will end up serving partially rendered pages to the crawler.

There are some situations when the rendering infrastructure must work at high speed, such as Black Friday, when you want to update the prices very quickly. You should test the rendering in extreme conditions and see how much time it takes to update a given number of pages.

The last thing is caching. Setting the cache properly will help you maintain efficiency, because many pages can be served quickly, directly from memory. However, if you do not plan the caching correctly, Google may receive stale content.

Monitoring

Monitoring post-migration is a natural step. However, in the case of moving to a JS framework, sometimes there is an additional thing to monitor and optimize.

Moving to a JS framework may affect web performance. In many cases the payload increases, which may result in longer loading times, especially for mobile users. A good practice is to monitor how your users perceive the performance of the website and compare the data before and after migration. To do so, you can use the Chrome User Experience Report.
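
One way to pull that field data programmatically is the Chrome UX Report API; the sketch below assumes you have an API key and simply queries origin-level metrics, so treat it as a starting point rather than a full monitoring setup.

```js
// Sketch: query the Chrome UX Report API for origin-level field data (Node 18+ for fetch).
// Assumes a valid API key; metric names follow the CrUX API's snake_case convention.
const API_KEY = 'YOUR_API_KEY';

async function getCruxData(origin) {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ origin }),
    }
  );
  const data = await res.json();
  // e.g. data.record.metrics.largest_contentful_paint.percentiles.p75
  return data.record ? data.record.metrics : data;
}

getCruxData('https://www.example.com').then((metrics) => console.log(metrics));
```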


It will show whether the Real User Metrics have changed over time. You should always aim to improve them and load the website as fast as possible.

Summary

Migration is always a risky process, and you can’t be sure of the results. The risks can be mitigated if you plan the whole process in detail; with all types of migrations, planning is as important as the execution. If you take part in a migration to a JS framework, you need to deal with additional complexity: you need to make additional decisions and verify additional things. However, as web development trends continue to head in the direction of using JavaScript more and more, you should be prepared to face a JS migration sooner or later. Good luck!


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About The Author

Maria Cieslak is a Senior Technical SEO Consultant at Elephate, the “Best Small SEO Agency” in Europe. Her day-to-day work involves creating and executing SEO strategies for large international structures and pursuing her interest in modern websites built with JavaScript frameworks. Maria has been a guest speaker at SEO conferences in Europe, including 2018’s SMX London, where she has spoken on a wide range of subjects, including technical SEO and JavaScript. If you are interested in more information on this subject, you should check out Elephate’s “Ultimate Guide to JavaScript SEO”.






How to drive digital innovation necessary during the pandemic


30-second summary:

  • COVID-19 has kept consumers in their homes, which has led to significant spikes in internet use and companies scrambling to digitize in order to meet customers where they are.
  • The ability to quickly develop digital capabilities will continue to be critical for meeting customer needs and ensuring organizations’ survival.
  • To remain competitive, companies must enhance the digital customer experiences they offer through upgraded social media, optimized conversion strategies, better marketing research, an effective internal website search, and fresh customer touchpoints.

Emerging digital technologies like artificial intelligence (AI) and cloud computing enticed leaders with their agility and efficiency. Many companies planned to make digitization a goal for the new decade.

In hindsight, they probably wish they hadn’t waited.

The novel coronavirus upended every aspect of our lives. As businesses and governments around the world try to combat the pandemic, millions of consumers sit inside their homes. And where do people go during a government-mandated lockdown? Online.

The unprecedented shift to remote work and online learning, combined with a dramatic increase in movie streaming, videoconferencing, and social media traffic, has led to significant spikes in internet use. In this same time frame, big tech companies — the businesses at the forefront of digital innovation — have flourished, as have brands that capitalized on the power of social media engagement.

The biggest trick to digitization right now is meeting customers where they are. For example, my company, Teknicks, is working with an online K-12 speech and occupational therapy provider. When schools began transitioning to remote learning, students’ needs changed, too. We helped the provider pivot its value proposition and messaging to accommodate school districts’ new realities. By focusing on teletherapy tools and reassuring parents, we’ve seen substantial growth and brand recognition during the pandemic.

Until we find a vaccine for the novel coronavirus, your customers will likely engage with you through online channels. The ability to develop digital capabilities quickly will continue to be critical for meeting customer needs and ensuring survival for your organization. With that in mind, here’s how you can enhance your digital customers’ experiences:

1. Upgrade your social media

It’s not hard to be good at social media marketing — it’s hard to be great. As you build your audience on websites like Facebook and Instagram, be sure to engage with followers consistently. Create a content calendar mapping out your posts and sharing strategies and stick to it. These platforms are also a great channel for customer service, allowing you to provide personalized support and become instantaneously useful (something that customer support tickets and chatbots never seem to be).

If you already have a sizable engaged audience, it’s time to work on your content strategy. Don’t build your content strategy around keywords. Instead, focus on your audiences’ needs. A truly effective content strategy will be customized for the platform you’re on and will account for the user behavior most characteristic of that platform. Naturally, you will use keywords and phrases that are optimized for discoverability while maintaining authenticity.

One key strategy is to conduct marketing research using a survey. This tactic goes well beyond traditional keyword research and generates content ideas directly from your targeted audience, not a keyword tool. Surveying your prospective customers allows them to tell you what type of content they want to consume, significantly increasing the likelihood of engagement. Often, this research is the key to a successful marketing strategy. I’ll go into more detail below.

2. Focus on and prioritize conversion optimization

Ideally, your website looks good and loads quickly, but those qualities alone don’t make a website great. The user experience that your website offers is ultimately what determines whether customers bounce in droves or actually stick around. Attempting to boost your initial traffic will exponentially increase customer acquisition costs, so improving your conversion rates via website optimization is a more affordable (and profitable) solution.

We often see double-digit increases in conversion rates on our first test. We typically focus on the most trafficked pages to increase the likelihood of big, impactful wins. There is an entire science behind conversion optimization, but the core fundamentals have remained the same for years.

To make sure your website’s architecture is seamless and intuitive, develop a conversion rate optimization strategy that works for you. This will require you to ask visitors for feedback, experiment with different messaging options, and regularly review your analytics, among other things. The idea is to get to know your visitors well. It takes work, but it will pay off over time as the incremental conversion rate increases impact top-line revenue.

3. Conduct marketing research surveys

With the right insights, you can turn every engagement into a memorable and valuable experience for both you and your customers. The best way to get customer insights is to ask. Design a survey of up to 10 questions in a variety of formats along with some screening questions to make sure the feedback you get is actually useful.

When designing, consider your potential customers’ preferences and pain points. For example, if you know your audience is mostly on Instagram, asking “What do you like about social media?” won’t be as effective as “What makes Instagram posts better than Facebook posts?” Once the survey’s drafted, post it to your social channels and send it out to your mailing list. You want to understand which messages resonate with your audience before you spend a cent on marketing. Learning how to conduct marketing research is one of the most important marketing skills you can attain.

Asking individual customers how they feel about various messaging options can give you a goldmine of useful data to help inform the language and design choices you make. Not every customer will choose to participate in a survey, but some will. Show them you appreciate their input by offering a small discount or another incentive once the survey is completed. You’ll be surprised by how many responses you get and how beneficial the precursory information is.

4. Review your internal website search

As much as you’d love for every visitor to spend hours exploring every nook and cranny of your website, most will want to get on with their lives after they’ve found what they came for. To make the process faster, you should offer some sort of internal website search functionality. If you don’t already have one, add a search box to your navigation menu.

Not every website has one, and even the ones that do have very surface-level functions. However, search bars are a valuable asset that can increase internal sessions and conversion. Internal website searchers are 216% likelier to convert, according to WebLinc. Search bars assist your visitors and expand your understanding of user behavior, providing you with the information you need in order to adjust your website accordingly.

Evaluate the effectiveness of your internal search, taking notice of how it finds and organizes the content after a search. Most native search functionality is very basic and just looks for the presence of “search term,” but you may want to test out more advanced filters that help users more effectively find the information they are looking for.

I recommend looking at the search data monthly to see what users have been looking for. Be sure to review which searches yielded zero results and which brought up irrelevant content. Identify areas that can be improved and understand the content gaps that need additional content to support the demand.

5. Identify new customer touchpoints

Innovation is all about using new technology to improve old processes. While your typical customer journey might depend on your industry and business, chances are good that you can find ways to enhance it with emerging technologies.

Evaluating whether an emerging technology is a fit for your business, and whether you should invest in testing it out, starts with (drumroll…) a survey. As we discussed earlier, surveys can answer just about anything you want to know about your target audience. Go ahead and ask your audience if they own or use the emerging tech and validate its place in the customer journey.

Take the new home buying process, for example. David Weekley Homes, the largest privately-held home builder in the U.S., wanted to better understand whether voice-enabled devices can play a role in the customer journey. The company also wanted to propose a voice app idea to the audience and understand how they felt about the emerging technology concept. By conducting a survey, we uncovered that 81% of the respondents would consider the voice app idea to be somewhat to extremely valuable and 70% would possibly to definitely use the voice app if it existed.

The increasing usage of voice search and voice-enabled devices also offers an opportunity for consumer brands to make it easier than ever for customers to find their products. Tide, for example, has capitalized on marketing on Amazon’s Alexa Skills platform to remove a step from the purchasing process. Customers can use the company’s skill to order Tide products without having to pull up the Amazon app or go to the Tide website. In that way, new tech makes an old process (purchasing detergent) more frictionless than ever.

The COVID-19 pandemic has made digital innovation a business imperative. Regardless of your industry, you should look for ways to anticipate and meet customer needs. Your customers expect a seamless digital experience. If you can’t provide it, they won’t have to leave their homes to find someone else that can.

Nick Chasinov is the founder and CEO of Teknicks, a research-based internet marketing agency certified by Google in Analytics and Tag Manager, and a Google Premier AdWords partner.




Core Web Vitals, E-A-T, or AMP?


30-second summary:

  • The biggest Google update of the year is called the Page Experience update.
  • Core Web Vitals are part of that update, and they are definitely ranking factors to keep in mind, especially when optimizing images.
  • AMP is no longer the only way to get a “Top Stories” feature on mobile. Starting in 2021, any news webpage can become a “Top Story”.
  • The combination of AMP’s privacy concerns and its cost of operation might mean that AMP will disappear within a couple of years.
  • E-A-T is not a ranking factor right now, and we don’t know if it will become one in the future.

2020. What a year. History is happening around us, and Google? Well, Google keeps on revamping its search algorithms. Over the years, there have been many major algorithm updates, as Google worked to keep us on our toes. 2020 was no different: in one fell swoop, we got the news about the Page Experience update and AMP, all while the debate about whether or not you need E-A-T for ranking rages on. Where do Core Web Vitals stand in changing the search game in 2021?

Let’s go over each of these innovations and see which will change the way we do SEO, and which will fade into obscurity sooner rather than later.

1. Importance of Core Web Vitals for SEO

Core Web Vitals are part of the Page Experience update and, by far, have caused the biggest ruckus.

There’s a lot to learn about Core Web Vitals, but they boil down to the three biggest issues on our webpages:

  1. LCP — Largest Contentful Paint, which measures the loading speed of the largest single content element on the page.
  2. FID — First Input Delay, which measures how quickly the page reacts to the first user input (whether the user clicks, taps, or presses a key).
  3. CLS — Cumulative Layout Shift, which measures how much the content of the page jumps around while it loads, mostly because of visual content (a brief measurement sketch follows this list).
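
For reference, all three metrics can be collected from real users with Google’s web-vitals npm package; the sketch below uses the v2-era getCLS/getFID/getLCP functions (newer releases rename these to onCLS/onFID/onLCP) and an illustrative /analytics endpoint.

```js
// Field measurement sketch using the web-vitals package (v2-style API).
// The /analytics endpoint is an illustrative assumption.
import { getCLS, getFID, getLCP } from 'web-vitals';

function sendToAnalytics(metric) {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  // sendBeacon keeps working during page unload, which is when CLS is typically finalized.
  navigator.sendBeacon('/analytics', body);
}

getCLS(sendToAnalytics);
getFID(sendToAnalytics);
getLCP(sendToAnalytics);
```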

How Core Web Vitals influence rankings

Of course, some SEO experts think that the entire Page Experience update is nothing special and could even “[…] distract […] from the core mission of communication and storytelling”.

And, sure, most of the Page Experience update is simply an assembly of things we’ve known for a while: use HTTPS, be mobile-friendly, control your page speed, and so on.

But Core Web Vitals are a bit different and can influence SEO practice in unexpected ways. A key factor that’s already changing rankings is Cumulative Layout Shift.

As most SEO experts know, for a while an important part of image optimization was using the decoding="async" attribute on the <img> tag to avoid losing page speed while rendering the page.

Using decoding="async" could lead to some seriously janky pages if coders didn’t specify the height and width of every single image to be rendered. Some websites specified dimensions anyway; for example, Wikipedia has a predefined space for images created ahead of time on most of its pages.


But as SEO experts, we didn’t have to worry too much about pages being jumpy, as that didn’t influence rankings. Now, with CLS formally announced as a ranking factor, things will change for a whole slew of websites and SEO experts.

We’ll need to make sure that every webpage is coded with CLS in mind, with the needed space for every image defined ahead of time, to avoid the layout shifts.
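
In practice, that means reserving the image’s box before the bytes arrive, as in the small sketch below; the container selector and the 1200×630 intrinsic size are illustrative assumptions.

```js
// Reserve layout space for an image before it loads so CLS stays low.
// The container selector and the 1200x630 intrinsic size are illustrative assumptions.
const img = document.createElement('img');
img.src = '/images/hero.jpg';
img.alt = 'Hero image';
img.decoding = 'async'; // decode off the main thread, as before
img.width = 1200;  // intrinsic width reserves horizontal space immediately
img.height = 630;  // intrinsic height lets the browser compute the box before the image arrives
document.querySelector('.article-header').appendChild(img);
```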

The verdict

Overall, of course, it’s too early to tell, and more work by SEOs around the web needs to be done here. However, it seems that if you aren’t used to focusing on technical SEO, Core Web Vitals becoming ranking signals might not influence your day-to-day work at all.

However, if you are conducting complicated technical SEO, then Core Web Vitals will definitely change the way you work in as-yet unexpected ways.

2. Importance of AMP for SEO

AMP’s relevance today is something of an open question. While it has always been great as a quick and easy way to increase page speed, privacy concerns have been voiced over and over again since the technology’s very inception.

But in 2020, significant changes are afoot, since, within the same Page Experience update, Google announced that there’s finally no requirement for us to create AMP pages to occupy the “Top Stories” SERP feature.

That’s a pretty huge step for anybody trying to accrue as many SERP features as they can, and, in particular, for news websites.


How AMP influences rankings

If we believe John Mueller’s words, then AMP is not a ranking factor. Seems plain and simple enough. But of course, things aren’t so simple, because AMP comes with pretty significant gains in page speed, and speed is an important ranking factor.

Thanks to AMP’s pre-rendering combined with some severe design limitations, AMP webpages often really do win in page speed, even if not in ranking as is.

The “Top Stories” SERP feature, however, was a huge benefit to using an AMP for any news agency with a website, and it’s easy to understand why. Just look at how much of the page is occupied by the “Top Stories” results.

Not only do “Top Stories” automatically appear at the top of the SERP, but they also sport the logo of the website posting them, standing out even more from the boring old blue-link SERP.

This means that for a few years now, news websites have essentially been forced into using AMP to get into the “Top Stories” SERP feature on mobile, since it absorbs a whole lot of clicks.

On the other hand, it takes quite a lot of resources to support AMP versions of the webpages, because you are basically maintaining a whole additional version of your website.

Added to that, a page that’s been properly optimized for speed might not need AMP for those speed gains at all.

The verdict

While it’s tough to imagine that AMP will fade away completely within the next couple of years, AMP’s privacy issues combined with the cost of maintaining it might spell the end of it being a widely used practice.

Now, with the “Top Stories” becoming available to non-AMP pages, there’s virtually no reason to jeopardize the users’ security for speed gains you could get by proper optimization.

3. Importance of E-A-T for SEO

Expertise. Authority. Trust. All perfectly positive words and something we should all strive for in our professional lives. But what about search optimization?

Coming straight from Google’s Quality Rater Guidelines, E-A-T has been the talk of the town for a while now. Let’s dive in and see how it might change the way we optimize for search.

How E-A-T influences rankings

For most of us, they don’t really.

Sure, the Quality Rater Guidelines provide valuable insights into Google’s ranking process. However, E-A-T is one of the less important factors we should be focusing on, partly because these are nebulous, abstract concepts, and partly because Google doesn’t exactly want us to.

As Google’s official representatives informed us, E-A-T is not in itself a ranking factor.

Receiving follow-up questions, Google’s John Mueller then reiterated that point, and Ben Gomes, Google’s VP of search engineering confirmed that quality raters don’t influence any page’s rankings directly.

However, in practice, we often see that the so-called YMYL websites already can’t rank without having some expertise and authority established. A very popular example is that it’s virtually impossible to rank a website providing medical advice without an actual doctor writing the articles.

The problem here is that expertise, authority, and trustworthiness are not easily interpreted by the search algorithms, which only understand code.

And, at the moment, there seems to be no surefire way for Google to transform these signals into rankings, except to read the feedback of their quality raters before each algorithm update.

The verdict

While using E-A-T to rank websites might sound like an inarguable benefit for the searcher, there are a couple of concerns that aren’t easily solved, namely:

  1. Who exactly will be determining the E-A-T signals, and according to which standard?
  2. The introduction of such factors creates a system where the smaller and newer websites are punished in rankings for not having the trustworthiness that they couldn’t realistically acquire.

Responding to both of these concerns requires time and effort on the search engine’s side.

As things stand right now, E-A-T is not something to keep in mind while doing day-to-day SEO operations.

Let’s imagine a fantastical scenario where a webmaster/SEO expert has some free time. Then they might want to work on E-A-T, to try and stay ahead of the curve.

On the other hand, there simply isn’t any proof that Google will actually use E-A-T. Or that, even if used, these signals will become major ranking factors. For this reason, E-A-T shouldn’t be your priority ahead of traditional SEO tasks like link building and technical optimization.

Additionally, consider this. The entire Quality Rater Guidelines is 168 pages long. However, a comprehensive explanation of what E-A-T is and why it might be calculated a certain way will take many more pages than that.

Conclusion

As of the time of this writing, the Core Web Vitals seems to be the most important ranking news to come out in 2020 in practical terms. However, search is an extremely volatile field: what worked two weeks ago may not work today, and what works today might not work for most of us.

The matters are further complicated because we’re fighting an uneven battle: it’s simply not in search engines’ best interest to give us a full and detailed picture of how ranking works, lest we abuse it.

This is why it’s crucial to keep our hand on the pulse of optimization news and changes occurring every single day. With constant efforts from our SEO community to work out the best way to top rankings, it’s possible for us to close that gap and know for sure which trends are paramount, and which we can allow ourselves to overlook.

Aleh Barysevich is Founder and CMO at SEO PowerSuite and Awario.






Partial match domains: How to optimize and use them effectively


30-second summary:

  • Partial match domains refer to when your domain name partially includes the main keyword that you are trying to rank for.
  • There are many aspects that make it different from how the exact match domain works.
  • Tudor Lodge Consultants share a quick guide to help you succeed at partial match domains, understand the caveats, and optimize effectively.

Partial match domains refer to when your domain name partially includes the main keyword that you are trying to rank for.

They are commonly used by SEO professionals to gain an advantage when it comes to ranking in the search engines, or by business owners whose company name is closely linked to the services they offer or the area they work in.

Examples of partial matches include having vital keywords like “insurance”, “loans”, or “casino” in the domain name, or adding words like “hub”, “network”, or “quick” to the beginning or end of the domain, such as casinohub.com, everydayinsurance.com or quickmoney.com.

This is different from an exact match domain (EMD), which stipulates the exact keywords you are trying to rank for in your domain name, e.g., carinsurance.com, plumbing.com, bestcasinos.com.

Content created in partnership with Tudor Lodge Consultants.

Why can partial match domains be an issue?

Historically, having an exact match or partial match domain was a sure-fire way to rank top for your target keywords – only for Google to weigh this down considerably in recent years as a way to make SEO positions more ‘earned’ rather than ‘gained.’

Partial match and exact match domains have been shown to have a higher click-through rate (CTR) in search results, largely because they mention the exact words that the customer is looking for. Unsurprisingly, these domains can be worth thousands and are put on sale through the likes of GoDaddy and 123 Reg.

Whilst having a partial match domain can be an advantage for SEO, there is always the risk of exposing your business to a Google penalty, especially as Google’s guidelines become stricter and give preference to brands that demonstrate good use of content, link building, varied traffic sources, and user experience.

Although you may demonstrate very good SEO results initially, you may find yourself compromised during the next algorithm update, which could have a massive impact on your website and its rankings – and make it very challenging to recover from the penalty. Not to mention, the financial implications to you and your client.

Therefore, being conscious of partial matches and how they work for SEO is of vital importance.

When partial match domains are high risk

Partial matches are high risk when optimizing in an industry that is highly competitive and prone to algorithm updates, such as casino or gambling, loans and credit, finance and insurance, web hosting, FX, and more.

Reason 1: There is a risk that you may use too many keywords in your URL, metadata, and content, which Google may deem keyword stuffing and penalize in the next update.

Reason 2: You may be generating links back to the site, but getting your brand name linked back to the site might be considered overkill if it mentions high-risk words like “casino”, “loans”, or “insurance” too often.

When partial match domains are low risk

Partial match domains are low risk when targeting local SEO searches (that is, a location) or when the keywords are not competitive.

After all, if you have the domain name malibu-hairdressers.com, there are only going to be a handful of hairdressers in the Malibu area to compete against, and this is a viable name for a company in that area. Also, local SEO searches are not often affected by algorithm updates, which makes them a safer bet, and you can always gain good, free exposure through the three results that feature on Google Local Listings.

If your keywords are not competitive and you are more or less the only player in your industry, you should be low risk, since you may not need many optimizations to get to position one on Google, and keyword stuffing does not come into play as much.

In addition, if your website is an information resource, you are trying to capture lots of search phrases and not heavily relying on just a few that might be struck by an algorithm. A website that is full of guides or news, should generate content and links more naturally, even though it has a partial match domain. Successful examples of sites like this include searchenginewatch.com, moneyadviceservice.co.uk, and smcrcompliance.com.

How to optimize partial match domains

1. Be as natural as possible

If you have a partial match domain and are already optimizing it, try to make the SEO as natural as possible. Create good quality content guides or blog posts and when getting links, drive them towards these pages, not your money pages.

If you are linking back to money pages, use anchor text like ‘read more’ or ‘find out more’ to hyperlink back to them. Try to steer clear of exact match or partial match anchor text, as this could be seen as too spammy. It’s not too late to contact the sites behind the links you have generated so far and ask for them to be redirected to safer pages or blog posts on your website. This approach may take longer but will be much safer and more effective long-term.

2. Manage your keyword stuffing

Try to avoid using the main keyword, like “casino” or “insurance”, too often. One of the simplest ways is to mention it only once in the meta title, the meta description, and the URL.

Rather than: quickcarinsurance.com/car-insurance

Use: quickcarinsurance.com/car

3. Try to avoid using one from the start

If you can avoid using a partial match domain from the start, this would be ideal. As SEOs, we never know what is round the corner with Google’s guidelines, but we can certainly see the trend of brands taking center stage on page one. So with this in mind, try using a brand name if you can.

Be clever with your domain name: you do not necessarily have to include the money word to get the value of a high click-through rate. You can be smart with your domain choices, such as the company Fetch.com, which is a pick-up delivery app, or Paw.com for dog accessories, or GetIndemnity.co.uk, the large business insurance broker. Think of good synonyms or words connected to the brand, without compromising your Google ranking.

4. Manage the expectations of your client

The majority of SEO clients want quick results, even though we really need six to 12 months (or longer) to show the full impact of our work. When pitching to a client with a partial match or exact match domain, you need to manage expectations that there might be a fall in rankings during the course of a year due to an algorithm change – and you may need to make changes for this to recover. Someone with a long-term view on their SEO will appreciate this, but someone who wants quick results will not and will likely demand their money back before you know it.


