How to prepare for a JS migration
An 80 percent decrease in organic traffic is every business's nightmare. Unfortunately, that scenario can become reality if a website migration is done incorrectly: instead of improving the current situation, it leads to catastrophe.

Source: http://take.ms/V6aDv

There are many types of migrations, such as changing, merging or splitting the domains, redesigning the website or moving to a new framework.

Web development trends clearly show that the use of JavaScript has been growing in recent years and that JavaScript frameworks are becoming increasingly popular. We can expect more and more websites to rely on JavaScript in the future.

Source: https://httparchive.org/reports/state-of-javascript

As a consequence, SEOs will be faced with the challenge of migrating to JavaScript frameworks.

In this article, I will show you how to prepare for the migration of a website built with static HTML to a JavaScript framework.

Search engines vs. JavaScript

Google is the only search engine able to execute JavaScript and "see" elements like content and navigation even when they are powered by JavaScript. However, there are two things you always need to remember when considering a move to a JS framework.

Firstly, Google uses Chrome 41 for rendering pages. This is a three-year-old browser that does not support all the modern features that advanced websites rely on. Even though Google can render JS websites in general, some important parts may not be discovered because they depend on technology that Chrome 41 can't process.

Secondly, executing JavaScript is an extremely heavy process, so Google indexes JS websites in two waves. The first wave indexes the raw HTML, which for JS-powered websites is often an almost empty page. During the second wave, Google executes the JavaScript and can "see" the additional elements loaded by JS; only then is the full content of the page ready for indexing.

Because of these two factors, if you decide to move your website to a JavaScript framework, you always need to check whether Google can efficiently crawl and index it.

Migration to a JS framework done right

SEOs may not like JavaScript, but that doesn't mean its popularity will stop growing. We should prepare as much as we can and implement modern frameworks correctly.

Below you will find information that will help you navigate through the process of changing the current framework. I do not provide “ready-to-go” solutions because your situation will be the result of different factors and there is no universal recipe. However, I want to stress the elements you need to pay particular attention to.

Cover the basics of standard migration

You can't count on a miracle: Google will not understand the change without your help. The whole migration process should be planned in detail.

I want to keep the focus on JS migration for this article, so if you need detailed migration guidelines, Bastian Grimm has already covered this.


Understand your needs in terms of serving the content to Google

This step should be done before anything else. You need to decide how Google will receive the content of your website. You have two options:

1. Client-side rendering: This means you rely entirely on Google for rendering. If you choose this option, you accept some inefficiency. The first important drawback is delayed indexing of your content, due to the two waves of indexing mentioned above. Secondly, things may not work properly, because Chrome 41 does not support all modern features. And last but not least, not every search engine or social crawler can execute JavaScript, so your JS website may look empty to Bing, Yahoo, Twitter, and Facebook.


2. Server-side rendering: Here an additional mechanism or component renders the JS website, creates a static snapshot, and serves it to search engine crawlers. At the Google I/O conference, Google announced that serving a separate version of your website only to crawlers is fine. This is called Dynamic Rendering: you detect the crawler's User-Agent and send it the server-side rendered version. This option also has disadvantages: you have to create and maintain additional infrastructure, rendering a heavy page on the server can introduce delays, and caching issues can mean Googlebot receives a stale version of the page.
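To make this concrete, here is a minimal sketch of dynamic rendering as an Express middleware. It assumes a Node.js/Express stack, and `renderWithHeadlessBrowser` is a hypothetical helper that returns a rendered snapshot; treat it as an illustration, not a production setup.

```js
// Minimal dynamic rendering sketch (assumes Express;
// renderWithHeadlessBrowser is a hypothetical helper that
// returns a server-side rendered snapshot of the page).
const express = require('express');
const app = express();

// Crawlers that should receive the rendered snapshot.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit/i;

app.use(async (req, res, next) => {
  const ua = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(ua)) {
    // Crawler detected: send the pre-rendered HTML.
    const html = await renderWithHeadlessBrowser(req.originalUrl);
    return res.send(html);
  }
  next(); // regular users receive the client-side rendered app
});
```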


Before migration, you need to decide whether you need option 1 (client-side rendering) or option 2 (server-side rendering).

If the success of your business is built around fresh content (news, real estate offers, coupons), I can't imagine relying only on the client-side rendered version. It may result in dramatic delays in indexing, giving your competitors an advantage.

If you have a small website whose content is not updated very often, you can try leaving it client-side rendered, but before launching you should test whether Google really does see the content and navigation. The most useful tools for this are Fetch as Google in Google Search Console and the Chrome 41 browser.

However, Google officially stated that it’s better to use Dynamic Rendering to make sure they will discover frequently changing content correctly and quickly.

Framework vs. solution

If your choice is to use Dynamic Rendering, it’s time to answer how to serve the content to the crawlers. There is no one universal answer. In general, the solution depends on the technology AND developers AND budget AND your needs.

Below you will find a review of the options you have from a few approaches, but the choice is yours:

  • I need as simple a solution as possible.

I'd probably go for pre-rendering, for example with prerender.io. It's an external service that crawls your website, renders your pages, and creates static snapshots to serve whenever a specific User-Agent makes a request. A big advantage of this solution is that you don't need to build your own infrastructure.

You can schedule recrawls to create fresh snapshots of your pages. However, for bigger, frequently changing websites it can be difficult to ensure that all pages are refreshed on time and show the same content to both Googlebot and users.
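For an Express-based site, wiring up prerender.io is typically done through its prerender-node middleware; the token below is a placeholder, and this is only a sketch of the usual setup.

```js
// Sketch: serving prerender.io snapshots to crawlers from an Express app.
// npm i prerender-node ('YOUR_TOKEN' is a placeholder).
const express = require('express');
const prerender = require('prerender-node');

const app = express();
app.use(prerender.set('prerenderToken', 'YOUR_TOKEN'));
// ...the rest of the app serves the normal client-side bundle.
```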

  • I need a universal solution and I follow the trends.

If you build the website with one of the popular frameworks like React, Vue, or Angular, you can use a server-side rendering method dedicated to that framework. Here are some popular matches: Next.js for React, Nuxt.js for Vue, and Angular Universal for Angular.

Using one of these frameworks on top of React or Vue results in a universal application: the exact same code can be executed both on the server (server-side rendering) and in the client (client-side rendering). This minimizes the content-gap issues you could face if you relied on snapshots and heavy caching, as with pre-rendering.
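As an illustration, here is a minimal universal page, assuming a recent version of Next.js; the data is a placeholder. The same component renders on the server for the crawler's initial request and hydrates in the browser for users.

```jsx
// pages/product.js: minimal Next.js sketch of a universal page.
export default function Product({ name }) {
  // Rendered on the server for the initial request, reused in the browser.
  return <h1>{name}</h1>;
}

// Runs on the server for every request, so crawlers receive complete HTML.
export async function getServerSideProps() {
  return { props: { name: 'Example product' } }; // placeholder data
}
```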

  • I need a universal solution and I don’t use a popular framework.

It may happen that the framework you plan to use has no ready-made solution for building a universal application. In that case, you can build your own rendering infrastructure: install a headless browser on your server that renders all the subpages of your website and creates the snapshots served to search engine crawlers. Google provides a tool for this: Puppeteer, a library that does a similar job to prerender.io, except that everything happens on your own infrastructure.
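A minimal snapshot generator with Puppeteer could look like the sketch below; waiting for the network to go idle is one common heuristic for "the JS has finished loading content".

```js
// Sketch: rendering a snapshot of a JS page with Puppeteer (npm i puppeteer).
const puppeteer = require('puppeteer');

async function renderSnapshot(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // networkidle0 waits until network activity settles, so JS-loaded
  // content should be present in the DOM.
  await page.goto(url, { waitUntil: 'networkidle0' });
  const html = await page.content(); // serialized, fully rendered HTML
  await browser.close();
  return html;
}
```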

  • I want a long-lasting solution.

For this, I'd use hybrid rendering. It is said to provide the best experience to both users and crawlers, because both receive a server-side rendered version of the page on the initial request. In many cases, serving an SSR page is faster for users than executing all the heavy files in the browser. All subsequent user interactions are handled by JavaScript. Crawlers do not interact with a website by clicking or scrolling, so for them every page view is a new request to the server, and they always receive an SSR version. It sounds good, but it is not easy to implement.


The option you choose will depend on many factors, such as technology, developers, and budget. In some cases you may have a few options, but in many cases restrictions will narrow the decision down to a single choice.

Testing the implementation

I can’t imagine a migration without creating a staging environment and testing how everything works. Migration to a JavaScript framework adds complexity and additional traps that you need to watch out for.

There are two scenarios. If for some reason you decided to rely on client-side rendering, you need to install Chrome 41 and check how the site renders and works in it. One of the most important points of such an audit is checking for errors in the console in Chrome DevTools. Remember that even a small error in processing JavaScript may cause rendering issues.

If you decided to use one of the methods of serving the content to the crawler, you will need to have a staging site with this solution installed. Below, I’ll outline the most important elements that should be checked before going live with the website:

1. Content parity

You should always check whether users and crawlers see exactly the same content. To do that, switch the User-Agent in your browser to see the version sent to the crawlers and verify the general discrepancies in rendering. However, to see the whole picture you also need to check the DOM (Document Object Model) of your website: copy the source code from your browser, then switch the User-Agent to Googlebot and grab the source code again. Diffchecker will help you see the differences between the two files. Look especially for differences in the content, navigation, and metadata.
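A quick way to grab both versions for comparison is to fetch the page twice with different User-Agent headers, as in this sketch (Node 18+ for the global fetch; the bot string is Googlebot's published UA):

```js
// Sketch: fetch a page as a regular browser and as Googlebot, then compare.
const BOT_UA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

async function fetchHtml(url, userAgent) {
  const res = await fetch(url, { headers: { 'User-Agent': userAgent } });
  return res.text();
}

async function compare(url) {
  const [userHtml, botHtml] = await Promise.all([
    fetchHtml(url, 'Mozilla/5.0'),
    fetchHtml(url, BOT_UA),
  ]);
  console.log('user bytes:', userHtml.length, 'bot bytes:', botHtml.length);
  // Paste both outputs into Diffchecker for a detailed line-by-line diff.
}
```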

An extreme situation is when you send an empty HTML file to Googlebot, just as Disqus does.


This is what their SEO Visibility looks like:

Source: http://take.ms/Fu3bL

They’ve seen better days. Now the homepage is not even indexed.

2. Navigation and hyperlinks

To be 100 percent sure that Google sees, crawls, and passes link equity through your internal links, follow the clear recommendation for implementing them shared at the Google I/O conference in 2018.


If you rely on server-side rendering methods, you need to check whether the HTML of a prerendered page contains all the links you expect; in other words, whether it has the same navigation as your client-side rendered version. Otherwise, Google will not see the internal linking between pages. Critical areas where you may have problems are faceted navigation, pagination, and the main menu.
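The gist of that recommendation: links must be real anchors with href attributes, not click handlers. A small DOM sketch of the difference:

```js
// Crawlable: a real <a> element with an href Googlebot can follow.
const good = document.createElement('a');
good.href = '/category?page=2';
good.textContent = 'Next page';

// Not crawlable: the URL only exists inside a click handler,
// so the crawler has nothing to follow.
const bad = document.createElement('span');
bad.textContent = 'Next page';
bad.addEventListener('click', () => {
  window.location.href = '/category?page=2';
});
```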

3. Metadata

Metadata should not depend on JS at all. Google says that if you load the canonical tag with JavaScript, they probably will not see it in the first wave of indexing and will not re-check this element in the second wave. As a result, the canonical signals might be ignored.

While testing the staging site, always check whether the SSR version has the canonical tag in the head section. If so, confirm that it is the correct one. A rule of thumb is to always send consistent signals to the search engine, whether you use client- or server-side rendering.

While checking the website, always verify that both the CSR and SSR versions have the same titles, descriptions, and robots instructions. A sketch of how this metadata can be emitted server-side follows below.
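As a sketch, in a Next.js setup the canonical tag and robots instructions would be emitted in the server-rendered head so they arrive in the raw HTML; the URL and texts are placeholders.

```jsx
// Sketch (assuming Next.js): metadata emitted during server-side rendering.
import Head from 'next/head';

export default function ProductPage() {
  return (
    <>
      <Head>
        <title>Example product</title>
        <meta name="description" content="Placeholder description" />
        <link rel="canonical" href="https://example.com/product" />
        <meta name="robots" content="index, follow" />
      </Head>
      <main>{/* page content */}</main>
    </>
  );
}
```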

4. Structured data

Structured data helps the search engine to better understand the content of your website.

Before launching the new website, make sure that the SSR version displays all the elements you want to mark up with structured data and that the markup is included in the prerendered version. For example, if you want to add markup to the breadcrumb navigation: first, check that the breadcrumbs are displayed in the SSR version; second, run the test in the Rich Results Tester to confirm that the markup is valid.
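For reference, breadcrumb markup is usually a BreadcrumbList in JSON-LD; the sketch below uses placeholder names and URLs and should end up in the server-rendered HTML.

```js
// Sketch: BreadcrumbList JSON-LD to include in the prerendered HTML.
const breadcrumbJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'BreadcrumbList',
  itemListElement: [
    { '@type': 'ListItem', position: 1, name: 'Home', item: 'https://example.com/' },
    { '@type': 'ListItem', position: 2, name: 'Laptops', item: 'https://example.com/laptops' },
  ],
};

// Injected as a script tag during server-side rendering:
const jsonLdTag =
  `<script type="application/ld+json">${JSON.stringify(breadcrumbJsonLd)}</script>`;
```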

5. Lazy loading

My observations show that modern websites love to load images and content (e.g., products) lazily, fetching the additional elements on a scroll event. It might be a nice feature for users, but Googlebot can't scroll, so these items will never be discovered.

Seeing that so many webmasters struggle to implement lazy loading in an SEO-friendly way, Google published best-practice guidelines for it. If you want to load images on scroll, make sure you support paginated loading: as the user scrolls, the URL should change (e.g., by adding pagination identifiers: ?page=2, ?page=3, etc.) and, most importantly, the URL should be updated with the proper content, for example by using the History API, as sketched below.
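A minimal sketch of that pattern: append the next chunk on scroll and push the new paginated URL with the History API. `nearBottom` and `loadMoreProducts` are hypothetical helpers.

```js
// Sketch: paginated lazy loading with addressable URLs.
let page = 1;

window.addEventListener('scroll', async () => {
  if (nearBottom()) {                // hypothetical: user is near the page end
    page += 1;
    await loadMoreProducts(page);    // hypothetical: fetch and append content
    // Update the address bar so each chunk has its own URL, e.g. ?page=2.
    history.pushState({ page }, '', `?page=${page}`);
  }
});
```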

Do not forget to add rel="prev" and rel="next" markup in the head section to indicate the sequence of the pages.

Snapshot generation and cache settings

If you decided to create a snapshot for search engine crawlers, you need to monitor a few additional things.

You must check that the snapshot is an exact copy of the client-side rendered version of your website. You can't load additional content or links that are not visible to a standard user, because that might be assessed as cloaking. And if the snapshot-creation process is inefficient, for example because your pages are very heavy or your server is slow, it may produce broken snapshots, and you will end up serving, say, partially rendered pages to the crawler.

There are situations when the rendering infrastructure must work at high speed, such as Black Friday, when you want to update prices very quickly. You should test rendering under extreme conditions and see how long it takes to update a given number of pages.

The last thing is caching. Setting the cache properly helps you maintain efficiency, because many pages can be served quickly straight from memory. However, if you do not plan caching correctly, Google may receive stale content.
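One simple way to balance speed and freshness is a TTL on each snapshot: the sketch below keeps snapshots in memory and re-renders once they are older than the window. `renderSnapshot` is the hypothetical renderer from the Puppeteer example, and the 15-minute TTL is an arbitrary assumption.

```js
// Sketch: snapshot cache with a freshness window.
const cache = new Map();
const TTL_MS = 15 * 60 * 1000; // assumption: 15 minutes counts as fresh

async function getSnapshot(url) {
  const hit = cache.get(url);
  if (hit && Date.now() - hit.createdAt < TTL_MS) {
    return hit.html; // fresh enough to serve straight from memory
  }
  const html = await renderSnapshot(url); // re-render when missing or stale
  cache.set(url, { html, createdAt: Date.now() });
  return html;
}
```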

Monitoring

Monitoring post-migration is a natural step. However, in the case of moving to a JS framework, there are additional things to monitor and optimize.

Moving to a JS framework may affect web performance. In many cases the payload increases, which can mean longer load times, especially for mobile users. A good practice is to monitor how your users perceive the performance of the website and to compare the data from before and after the migration. The Chrome User Experience Report is useful for this.


It shows whether the real-user metrics have changed over time. You should always aim to improve them and to load the website as fast as possible.

Summary

Migration is always a risky process, and you can't be sure of the results. The risks can be mitigated if you plan the whole process in detail; for every type of migration, planning is as important as the execution. A migration to a JS framework brings additional complexity: more decisions to make and more things to verify. However, as web development trends continue to move toward heavier use of JavaScript, you should be prepared to face a JS migration sooner or later. Good luck!


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About The Author

Maria Cieslak is a Senior Technical SEO Consultant at Elephate, the “Best Small SEO Agency” in Europe. Her day-to-day work involves creating and executing SEO strategies for large international organizations and pursuing her interest in modern websites built with JavaScript frameworks. Maria has been a guest speaker at SEO conferences in Europe, including 2018’s SMX London, where she spoke on a range of subjects, including technical SEO and JavaScript. If you are interested in more on this subject, check out Elephate’s “Ultimate Guide to JavaScript SEO”.





Marketing strategies during COVID-19 times


30-second summary:

  • If a site has never been promoted, there have been no SEO audits, and there are a lot of errors in the code, then link building will not help.
  • If a company is set up to work in messengers, it can keep receiving subscribers, even without active support.
  • SMM is less about direct sales and more about building a link between user and brand, and content plays a primary role in this.
  • The main task now is to build an approximate plan for returning to normal and to maintain communication with clients and the team.
  • In this article, we discuss marketing strategies during COVID-19, covering contextual advertising, SEO, email, social networks, and more.

Not everyone was as lucky in the pandemic as mask manufacturers, food delivery companies, Zoom, and others. Offline businesses had to decide either to quickly move all their work online or to scale back the business in all directions. In this article, we will discuss marketing strategies during COVID-19.

Simply moving to the Internet is not enough: you need to understand what you are doing, and under the influence of coronavirus, marketing strategy is changing rapidly. I am going to explain what happens with contextual advertising, SEO, messengers, email, and social networks.  

You’ll find out:

  • How the demand structure in these channels is changing, 
  • How to work with them now, 
  • Whether marketing can be paused without affecting the company, 
  • Which marketing strategies are best to choose, even if there’s a shortage of money. 

PPC marketing in COVID-19 times

The trend in PPC is as follows: due to the overall decline in demand, we can see a decrease in search demand, and in search advertising in particular. Previously, PPC search and product advertising were the main source of sales: people looked for something in search, saw ads, and bought. Contextual (display) advertising has always been an auxiliary marketing tool, used for creating demand, branding, and remarketing.

But now the trend has shifted a little: if you look at last month's stats, banner ads and videos have become the main source of advertising traffic. This is because people are using streaming services and watching movies, TV series, and courses, and banner ads are everywhere.

As for conversions, again, in most cases there has been a fall. I would even say that conversion has not so much fallen as lengthened. In the past, a person coming from an ad would buy a simple product at once, and a more complicated product within a few clicks.

Now this funnel has lengthened. It has become even more important to work with email and chatbots and to maintain communication through other advertising channels.

Where the decision-making period in some areas used to be three to four days, it can now grow to two to three weeks. I assume this is due to decreased purchasing power, because many people are on leave or out of work. Those who have money are in no hurry to spend it, because it is not clear what will happen tomorrow. Even those who intend to buy something have started to buy less.

Conclusion 

Advertising has been cut back sharply, and this reduces advertising competition. Clicks are cheaper, so you can get cheaper traffic. And if you are ready to work with these long-conversion leads, you can get them more cheaply than before.

SEO in COVID-19 times 

In most cases, organic traffic drops: a site's positions may still be good, but traffic falls heavily. Demand in organic search has decreased, but again, it depends on the niche. In some niches demand has increased: someone selling masks and antiseptics had little demand and then suddenly got plenty. In others, such as tire fitting, there is no change in demand, because people need to change their tires anyway.

Traffic is falling in everything that concerns business services. Take furniture, for example: for private use, demand has not fallen much, but for offices it has fallen to almost zero; even those who had ordered before have stopped.

If there is no immediate need for "hot" clients and the business is on pause, it is better to keep at least a minimal investment in SEO, social networks, and maintenance, and to stop contextual advertising.

How can your business save money while working on SEO? 

There are free sites and directories where you can place links at no cost. If there is no money, this can save a good part of the budget.

If a company has not worked with content before, this is a direction worth following. Content is a relatively low-cost tool: even if you don't write it yourself, hiring a copywriter (for example, at growyourstaff) is not that expensive. Content can help SEO a lot, and it is much cheaper than buying links or working on technical optimization.

If a site has never been promoted, there have been no SEO audits, and there are a lot of errors in the code, then link building will not help. If the site has been worked on before, you now just need to reduce budgets: buy fewer links or look for cheaper and free sources.

Pay more attention to content: by publishing new articles you cover more keywords and more search queries. For example, if you sell laptops, write "how to clean a laptop", "how to pick a laptop for gaming", and so on. A person looking for information lands on an article, reads the advice, and can subscribe to your mailing list or social networks and become a regular customer. In this way you can reduce the SEO budget.

Messenger marketing and COVID-19 times

Facing forced closure, some businesses could not afford to keep their marketing budget at the same level. Accordingly, some suspended their work with messengers, as with other lead-generating channels. The reason was not only that companies had no money left, but also that, for businesses tied to offline services, quarantine meant there was nothing to sell and no one to sell to.

Is it worth stopping messenger activity altogether if the company is very tight on money? If not, how can you reduce the budget for this channel with minimal damage?

Work with messengers falls into several categories. In terms of generating leads, traffic, and conversions, it is the same as targeted advertising and other lead-gen channels: if the business is closed, it makes no sense to generate hot leads.

But messengers themselves can be used in many different ways. For example, the funnel does not have to sell quickly: it can be long and work on engagement and nurturing, not only with potential clients but also with existing ones. It makes sense to maintain such a funnel whether the business is operating right now or not. When the company opens its doors again, it will be able to sell to the same people: no one has forgotten them, and you have been communicating and maintaining relationships throughout this period. That is important.

If the company is set up to work in messengers

The company may continue receiving subscribers even without support and even with advertising campaigns suspended. As a budget cut, however, it is possible to suspend work with the contractor; in this situation, that is a normal solution, and nothing should break for a few months. In this way you can cut the budget on new development, testing, and active lead generation.

SMM in COVID-19 times 

It's bad for those affected by the offline downturn. If offline is closed, the whole company stops working, and SMM stops too. For example, they manage the account themselves and only have enough capacity for some content activity; as for advertising, since nobody can come to them and they cannot operate, they do not maintain it.

Those who are forbidden to work and cannot receive clients in the office have cut advertising on social networks and other channels to almost zero. Others, on the contrary, have increased it, like VR clubs. They launched a new service: previously they had an offline location, which is now closed, and the equipment was sitting idle, so they set up a rental service and are actively developing it.

How has traffic changed in campaigns that keep running during quarantine?

In some campaigns traffic has increased, but this is because the auction has emptied out. On Facebook, the price of a click has dropped severalfold, simply because many competitors have left. The price goes down, the number of clicks goes up, and traffic grows accordingly.

In general, traffic has increased in some places and decreased in others; it all depends on the area in which you work. If you can move online painlessly enough, you will see only minimal reductions, and there may even be growth.

Is it possible to save money on social networking?

In general, it is better not to reduce the number of posts but to make them better in quality. If the advertising budget has decreased, focus on content.

SMM is less about direct sales and more about building a link between user and brand, and content plays a primary role in this. If you lose content, the connection is broken.

In paid promotion, you can save on whatever does not bring results right now: it can be scaled back. In terms of conversions, everything should now be actively connected to analytics. Look through the data: does the campaign bring you additional conversions after the click and the application? If results are down dramatically, turn it off.

Content takes little time, especially if the company approached it correctly from the start, with a content strategy, a content plan, and so on. If all of that is in place, nothing stops you from handing it to one employee working remotely from home, writing according to the ready-made plan. With a strategy in place, there is nothing difficult about continuing to write, and it is not that expensive.

Email marketing during COVID-19 times

At the moment, the channel's main task is to keep in touch with clients and not give false hopes. Therefore, the only dynamics that matter are the unsubscribe rate and the number of remaining "live" users, for example, those who opened an email in the last 60 days.

I would advise you not to stop and not to panic; the situation will be resolved one way or another, and the brand will either withstand it or not. The main task now is to build an approximate plan for returning to normal and to maintain communication with clients and the team.

Stopping everything is like shutting down a blast furnace: it is easier to build a new one than to restart the old one. So it is definitely worth reducing volume, pausing retention campaigns, and reviewing the basic onboarding messages.

It is important that customers know you are alive and in control of the situation on your side. Therefore, informational digests and regular alerts when the situation changes should not be stopped. We have all already learned how to wash our hands and have heard from every CEO, so if you have something specific to say, be sure to write about it.

Evelina Brown is an internet marketing expert, trainer, and founder of marketing courses who has been involved in brand development and creation since 2012.



