

How to recognize a lost cause in SEO projects and request for proposals (RFPs)



Like many of my search colleagues, I genuinely enjoy seeing my search engine optimization (SEO) projects succeed. I like seeing client sites in Position Zero. I like seeing client listings in the People Also Ask section of a search engine results page (SERP). I especially like seeing top search listings with rich snippets, such as recipes, because I know how technically challenging they can be.

Even a second-page listing pleases me. People often feel that not being on the first page of search results is a bad thing. I don’t. Reason? I’ve seen plenty of sites get great conversions with listings on the second page of search results. Serious, knowledgeable searchers know to dig deeper into search results when they want reliable, timely, and accurate information.

And this brings me to this article’s topic: lost causes or no-win situations in SEO. I am not perfect. I make mistakes. Sometimes, I get involved with an SEO project without initially knowing that I’m proverbially walking into a lost-cause situation.

I hope that sharing my experiences will help my fellow SEOs.

Content management system doesn’t support SEO

First, I do not agree with some of Google’s statements about BERT. You see, BERT is all about words, writing, linguistics, keywords (yes, I said it) and context.

When searchers arrive at a site via a web search engine, they usually do not arrive via the site’s home page. They tend to land on a page in the middle of the site. This concept is known as Inceptor’s Pyramid.

Figure 1: Inceptor’s Pyramid. The inverted pyramid illustrates that most entries into a website are not via the home page.

Searchers want to know that they arrived on the right page and the right website. For these reasons, communicating aboutness on web pages near the top of the screen is extremely important. Validating information scent is also important for individuals performing search queries. They want to see their keywords, or query words, on the page.

In other words, the landing page should validate and support searcher mental models. If websites can achieve these three items, they normally receive qualified search engine traffic over time.

In summary, any content management system should support the following items:

  • Inceptor’s Pyramid
  • Aboutness
  • Information scent
  • User/searcher mental models

Question: If a CMS does not support these four items, is this SEO situation a lost cause?

Answer: Maybe.

The aforementioned four items are SEO principles. They are not flavor-of-the-month SEO concepts. They were fundamental concepts 20 years ago. They are fundamental concepts now.

I understand that search engine algorithms change. Before BERT, there was RankBrain. How many Google updates have we seen over the years? Medic, Panda, Penguin, Top Heavy, Hummingbird, Pirate, Pigeon – to name a few. Even with all of these changes, these four SEO principles still apply to all types of web documents.

After I sign a non-disclosure agreement (NDA), I will want to know if a prospect’s CMS will support our firm’s SEO recommendations.

What I look for is willingness to:

  • Modify the existing CMS to be more search-engine friendly,
  • Purchase a different (and less expensive) CMS that better accommodates searchers and search engines, or
  • Create a custom CMS that is 100% tailored for website owners, their target audience(s), and technology.

I do not give away specific recommendations in a proposal. I simply want to know whether or not: (a) implementing our recommendations is possible and (b) client teams will support our recommendations.

Website design doesn’t support SEO

This type of situation is a tricky one. Many organizations use pre-formatted design templates. Website owners might like a particular design. They might like the “cool features” on them that they believe will elevate their site in terms of UX (user experience). In the usability industry, aesthetic design refers to the aesthetic-usability effect:

Definition: The aesthetic-usability effect refers to users’ tendency to perceive attractive products as more usable. People tend to believe that things that look better will work better — even if they aren’t actually more effective or efficient.

In addition, many design-template sellers make the claim that their designs are search-engine friendly. I find that claim to be somewhat misleading. Most of the time, it simply means that the design template creates URLs (web addresses) that search engines can crawl.

Search-engine friendly design involves far more than a URL structure. Color matters. How text is formatted and placed is important. A visual hierarchy that supports task completion is critical for both SEO and usability. Ten design templates might be reasonable for a small site; an enterprise site should have considerably more.

Question: If website owners are too attached to a site design that you know is not completely search-engine friendly, is this SEO situation a lost cause?

Answer: It depends on the site design.

Some design elements are easy to modify via cascading style sheets (CSS), such as modifying the color, font/typeface, alignment and other formatting.

Some design elements are more difficult to modify, such as navigation elements that appear in the wrong place…where users/searchers do not expect to see them. Some navigation elements are not in the design at all, like location-based breadcrumb links and fat footers.

My answer to this question is similar to the answer I gave for an allegedly search-engine friendly CMS. After I sign a non-disclosure agreement (NDA), I will want to know if a prospect is willing to modify and/or add design templates.

What I look for is willingness to:

  • Modify the existing design to be more search-engine friendly,
  • Purchase a different (and less expensive) site design that better accommodates searchers and search engines, or
  • Create a custom design that is 100% tailored for website owners, their target audience(s), and technology.

I do not give away specific design recommendations in a proposal. Again, I want to know whether or not client teams will support implementing our design recommendations. Sometimes, I use one example of design modification to measure how intense (or not) the pushback is. I can do this during pre-bid phone calls.

Let’s use location-based breadcrumb links as an example. Reason? Even now, I constantly encounter resistance to: (a) their usage, and (b) the way they should be formatted.

My firm is very much in favor of location-based breadcrumb links because one to two lines of text communicates a lot of very important information to both human users and technology. Furthermore, when implemented properly, they communicate context…something that is important for the BERT algorithm.

Do you know from whom I’ve heard the most complaints? Designers. The most common complaint? “They ruin our design.”

Well, that is not exactly true. The issue isn’t the presence of breadcrumb links. The issue is their design.

Breadcrumb links can be formatted in multiple ways. (But if you want them to appear in search listings, you will have to follow breadcrumb structured data guidelines.) It’s the design team’s job to make them look good and to follow usability, UX, and search engines’ guidelines.
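
For teams that want breadcrumbs to qualify for search listings, the required markup is small. Here is a minimal sketch, using Python only to assemble the JSON-LD; the page names and URLs are invented for illustration:

```python
import json

def breadcrumb_jsonld(crumbs):
    """Build a schema.org BreadcrumbList for a list of (name, url) pairs.

    Following Google's breadcrumb structured-data guidelines, each crumb
    becomes a ListItem with a 1-based position.
    """
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i,
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }

# Hypothetical location-based breadcrumb trail: Home > Recipes > Desserts
trail = [
    ("Home", "https://www.example.com/"),
    ("Recipes", "https://www.example.com/recipes/"),
    ("Desserts", "https://www.example.com/recipes/desserts/"),
]
markup = breadcrumb_jsonld(trail)
print(json.dumps(markup, indent=2))
```

The resulting object would be embedded in the page inside a script tag of type application/ld+json; how the visible breadcrumb trail looks remains entirely up to the design team.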

If the design team is unwilling to modify content to make it more user-friendly and search-engine friendly? And if other teams kowtow to the design team’s decisions? In all likelihood, the proposal will be a lost cause until all teams learn to work together to accomplish common goals.

Information architecture doesn’t support SEO

According to the former Information Architecture Institute, information architecture (IA) is organizing and labeling content so that it is: (a) easier to use, and (b) easier to find.

Ideally, information architecture should PRECEDE (come before) website design and development. Many web design and development issues arise because the content isn’t organized well in the first place. Furthermore, a website’s labeling system won’t be as effective because the content isn’t organized properly.

I am an educated, trained, and experienced information architect. I can practically look at a website and tell whether its information architecture and corresponding navigation system are problematic. Before I submit a proposal, though, I want answers to two questions.

Questions: Are website owners willing to modify information architecture? Will there be one or more executive and/or managerial champions to ensure that recommendations will be implemented by design, development and content teams?

Answer: Both answers should be a resounding YES.

I once had a client with a successful e-commerce site. However, search traffic had leveled off and was beginning to decrease.

What was great about working with this large organization is that they had very talented UX/usability staff. This group knew when to use specific tests to solve specific problems. They knew how to use qualitative data to understand quantitative data in context.

For example, they learned that a faceted classification scheme was the best IA for organizing the major portion of their website. They even minimized duplicate content delivery to both human users and search engines.

Figure 2: A facet taxonomy allows an item to be assigned to multiple taxonomies (sets of attributes), enabling the classification to be ordered in multiple ways, rather than in a single, predetermined order (as in a strict hierarchy). This definition is from one of my favorite books: Introduction to Cataloging and Classification by Arlene G. Taylor.
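
To make the idea concrete, here is a toy sketch of faceted classification in Python. The catalog entries and facet names are invented, and a real system would sit on an actual taxonomy store, but the principle is the same: each facet is an independent attribute set, so an item can be reached through any combination of them.

```python
def facet_filter(items, **facets):
    """Return items matching every requested facet value.

    Each facet (color, material, ...) is an independent taxonomy,
    so items can be found through any combination of orderings
    rather than a single, predetermined hierarchy.
    """
    return [
        item for item in items
        if all(item.get(facet) == value for facet, value in facets.items())
    ]

# Invented catalog entries, each classified along several facets at once
catalog = [
    {"name": "Tote bag", "color": "red", "material": "canvas", "price_band": "low"},
    {"name": "Satchel", "color": "red", "material": "leather", "price_band": "high"},
    {"name": "Backpack", "color": "blue", "material": "canvas", "price_band": "mid"},
]

red_canvas = facet_filter(catalog, color="red", material="canvas")
print([item["name"] for item in red_canvas])  # → ['Tote bag']
```

Mixing in a facet that is not a true, independent attribute of the items breaks exactly this property, which is what confused users in the case described below.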

The problem? The web development team took it upon themselves to mix in a facet that had little to do with the classification system. You could even observe the confused expressions on users’ faces (test sessions were videotaped) after the facet was added.

The solution was simpler than anticipated: remove the unusual facet and repurpose its links as contextual navigation (upsells, in particular). We could even show that the search-traffic decrease corresponded with the addition of the confusing facet.

We were able to present this solution to all of the company’s teams (marketing, content, design, development, UX, etc.). I admit I was surprised when we dealt with the development team. They did not argue. They did not debate or ask for further clarification. It seemed as if they were in agreement.

Boy was I wrong. The answer came to me in an email. “We don’t believe you.”

Nothing I said to this team mattered despite my years of experience. To them, I didn’t know what I was talking about. Other teams agreed with our solution…just not the development team.

Was this particular SEO situation a lost cause? Yes, it was. This company clearly needed an executive champion and a development manager to ensure that all tickets from the UX/usability team were implemented properly.

Until that happened, there was little else we could do for this company. The problem was the architecture. The solution was to fix the architecture.

I understand that many SEO firms specialize in link development. Websites can get instantaneous, allegedly high-quality link development in a short amount of time. That solution sounds attractive to many design, development, and architecture teams. Team members don’t have to admit that there might be issues with how they implemented findability best practices.

A website is a form of communication to both human users and technology. On large sites, teams should work together to ensure they are communicating clear and consistent messages to human and technology users. There’s usually give-and-take on all sides.

Overall, though, I hope all readers understand that we are all on the same team, even outside consultants. We want what’s best for the users of your site.

Shari Thurow will be speaking at the SMX East session, “Making The Pitch: Putting Your Best Foot Forward Without Giving Away The Goods” on Nov. 13.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

About The Author

Shari Thurow is the Founder and SEO Director at Omni Marketing Interactive and the author of the books Search Engine Visibility and When Search Meets Web Usability. Shari is currently a contributing editor for the ASLIB Journal of Information Management. She also served on the Board of Directors of the Information Architecture Institute (IAI) and the User Experience Professionals Association (UXPA).



Google desktop favicon search results study



In 2019, Google introduced a new format in mobile search results which included a small “favicon” icon from the website, to the left of the snippet. This week, Google announced that a similar format was being launched in desktop results.

In September, Yard carried out a user study of the mobile results, which concluded that some users think favicon results are ads. When the desktop results launched, we carried out a similar study to identify the effect of the changes.


We asked a set of 250 users to look at 12 images of desktop search results and answer the question “Does this image contain any adverts?” with a “yes” or “no.” These results were compared to results of the old format, without favicons, for the same search queries.

Search query: Car insurance

There were 4 paid ads at the top of the search results. In the old format, 73% of users identified that there was advertising on the page. In the new “favicon” format, this changed slightly to 71%.

Search query: Online programming courses

“Online programming courses” was the query used as an example by Google so it seemed right to test it thoroughly. We looked at three versions – with a single ad at the top, with no ads and with a single ad at the bottom.

Single ad at the top

Users identified that there was advertising on the page 67% of the time with the new favicon format, an increase from 63% on the old format.

Single ad at the bottom

There was no change for the result where the ad was at the bottom of the page. This suggests that the new “Ad” marker might be clearer to users, given it’s more visible at the top of the page than at the bottom.

No ads

The most significant change found was when there were no ads at all. The number of users who identified that there were no ads dropped from 65% to 57% with the new format.
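
The study does not report significance tests, so treat the following as my own back-of-the-envelope check rather than part of the study: a standard two-proportion z-test on the “no ads” result, assuming 250 independent respondents per format.

```python
from math import sqrt, erf

def two_proportion_z(p1, p2, n1, n2):
    """Two-sided z-test for a difference between two proportions."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# "No ads" variant of the study: 65% -> 57% with 250 users per format
z, p = two_proportion_z(0.65, 0.57, 250, 250)
print(round(z, 2), round(p, 3))
```

At p ≈ 0.067, the headline “no ads” drop is suggestive but falls just short of the conventional 0.05 threshold, so the smaller percentage shifts elsewhere in the study should be read with similar caution.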

Search query: home insurance

The pattern of users seeing ads that weren’t there was found consistently throughout the study. The percentage of users correctly identifying that there were no ads here dropped from 62% to 55% for “home insurance.”

Search query: Flights to New York

The biggest drop we noted was for “flights to new york,” where only 58% correctly identified that there were no ads on the old format; this plummeted to 42% with the new format.


The effect that we noted on mobile, where recognition of advertising remains the same or even increases a little with the introduction of favicons, is also seen on desktop. It is possible that the number of people recognizing advertising from the correct signals (e.g. “ad” markers) drops, as it is clear that some users think the favicons indicate ads.

It seems likely also that some users may have missed the black ad markers due to their similar appearance to favicons but this is difficult to confirm using this methodology.

The number of people who identify ads on Google search results was alarmingly low with the old format. The new format doesn’t seem to have changed that percentage significantly but has made identification of the actual ads less accurate. In other words, people think more things are ads when they are not and may identify real ads less often.

With this change on mobile and desktop, Google has reached the limits of hiding “ad” markers and has chosen instead to change the paradigm, making natural results appear more like paid results.

The logical conclusion of this approach is that the idea that search results are paid for is normalized, gradually increasing the amount of advertising space which is considered acceptable to users.

That assumes, of course, that we should set aside Hanlon’s razor, which tells us never to attribute to malice that which can be adequately explained by stupidity. In this case, Google is smart enough to know exactly what it is doing and has tested these changes thoroughly before rolling them out.


About The Author

Richard Falconer is an experienced SEO and managing director of Yard, a UK-based global agency that produces brand performance through data.



OneSearch, Verizon Media’s new search engine, sounds awfully familiar



Verizon Media has launched OneSearch, which appears to be a direct DuckDuckGo competitor, capitalizing on heightened awareness around privacy and tracking issues. The new search engine will serve results powered by Bing and run contextual ads rather than behaviorally targeted ads that rely on cookies.

Microsoft and Verizon Media. OneSearch’s organic search results are provided by Microsoft’s Bing search engine and its contextual ads will be served by Microsoft Advertising (formerly Bing Ads) under the Verizon Media and Microsoft Advertising partnership. However, at the time of publishing, ads do not appear to be showing on search results pages.

Privacy features. OneSearch does not employ cookies to track online behavior for targeting or retargeting users with ads.

The search engine encrypts search terms entered by the user and builds the search URL using those encrypted search terms. The encryption key expires after one hour, which may stop third parties from accessing search history via browser history after the one-hour period.

Search terms are only encrypted, and search history links only expire, when “Advanced Privacy Mode” is turned on; the mode is enabled by default.
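
OneSearch has not published its encryption scheme, so the sketch below is purely illustrative: one way to build URL tokens from encrypted search terms whose keys lapse after an hour, using only Python’s standard library. The secret, the hour-bucket key derivation, and the hash-based XOR keystream are all assumptions for illustration, not a description of the real implementation.

```python
import base64
import hashlib
import hmac
import time

SECRET = b"server-side secret"  # hypothetical; the real key management is not public

def _hour_key(bucket):
    """Derive a key that is only valid for one clock hour."""
    return hmac.new(SECRET, str(bucket).encode(), hashlib.sha256).digest()

def _keystream(key, length):
    """Stretch a key into a pseudo-random byte stream of the given length."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < length:
        stream += hashlib.sha256(stream).digest()
    return stream[:length]

def encrypt_query(query, now=None):
    """Encrypt a search term into a URL-safe token tied to the current hour."""
    bucket = int((now or time.time()) // 3600)
    data = query.encode()
    stream = _keystream(_hour_key(bucket), len(data))
    cipher = bytes(a ^ b for a, b in zip(data, stream))
    return bucket, base64.urlsafe_b64encode(cipher).decode()

def decrypt_query(bucket, token, now=None):
    """Return the query only while its hour-bucket key is still current."""
    if int((now or time.time()) // 3600) != bucket:
        raise ValueError("link expired: hourly key no longer available")
    cipher = base64.urlsafe_b64decode(token)
    stream = _keystream(_hour_key(bucket), len(cipher))
    return bytes(a ^ b for a, b in zip(cipher, stream)).decode()
```

The point of the hour bucket is that a search URL copied out of browser history stops being decryptable once the hour rolls over, which matches the behavior the announcement describes.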

Trending Now. When a user clicks on the search box, a list of “Trending Now” searches appears (before the user begins inputting their search terms).

“Trending Now uses search logs to determine if there are entities (people, places or things) for which searches have been peaking recently,” a Verizon Media spokesperson told Search Engine Land. “We compare the number of searches for that entity within a short, recent, period of time with the average baseline for that entity over an extended period of time. A pool of those that are showing the most dramatic uplift are selected, and we rotate through that pool.”
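
The description above translates into a simple uplift ranking. The sketch below is a guess at the shape of such a system, with invented counts and window sizes; Verizon Media has not published the real parameters.

```python
def trending_pool(search_counts, recent_hours=6, baseline_days=30, pool_size=3):
    """Rank entities by how sharply recent search volume exceeds baseline.

    search_counts maps entity -> (recent_count, baseline_count). The
    baseline covers a much longer window, so both counts are normalized
    to an hourly rate before comparison. The window sizes and pool size
    are illustrative assumptions.
    """
    uplifts = {}
    for entity, (recent, baseline) in search_counts.items():
        recent_rate = recent / recent_hours
        baseline_rate = baseline / (baseline_days * 24)
        uplifts[entity] = recent_rate / max(baseline_rate, 1e-9)
    ranked = sorted(uplifts, key=uplifts.get, reverse=True)
    return ranked[:pool_size]

# Invented counts: (searches in the last 6 hours, searches in the last 30 days)
counts = {
    "grammy nominations": (1200, 4000),
    "weather": (900, 110000),
    "playoff schedule": (2500, 21000),
    "tax deadline": (300, 2000),
}
print(trending_pool(counts))
```

Note how a perennially popular query like "weather" never trends under this scheme: its recent rate is high, but no higher than its long-term baseline.
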
Why we care. OneSearch offers another search option for privacy-oriented users. DuckDuckGo, which also serves contextual search ads that do not rely on cookies, has gained in popularity over the years and will appear on Google’s search choice screen for Android users in the EU, which may increase its share of the mobile search market. OneSearch will have to differentiate itself if it seeks to overtake similar competitors such as DuckDuckGo, let alone more well-known search engines.

The way that OneSearch leverages trending searches also lends itself to the content that other Verizon-owned companies, such as Yahoo, produce.

“We crawl the web for content, which includes Verizon Media websites, and pull the most recent, relevant content from that pool,” a Verizon Media spokesperson told Search Engine Land. “If you tend to see Yahoo content, that just means Yahoo has a good depth of recent, relevant articles that pertain to the content.”

About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.



Case study: The true value of informational content for e-commerce SEO



The true value of informational content for e-commerce SEO is often difficult to prove. Some content marketers and SEOs are convinced that informational content can serve as a link-worthy asset that will attract natural links from other websites. The acquired links are then supposed to help improve the SEO performance of the entire online shop.

But how do we isolate factors to demonstrate that it is indeed the informational content that contributes to a website’s overall SEO performance, and not other optimizations and developments? This article presents a case that supports the theory of informational content helping commercial pages rank better and generate additional sales.

Are you in a hurry? Jump straight to the TL;DR summary of this article.

Background: SEO strategy and performance

The e-commerce site we are looking at in this case study was first launched in 2011, but the current SEO strategy was only implemented in 2017. In a nutshell, it consists of the following activities:

Technical SEO:

  • Making the website’s content (60,000 product pages and 80 category pages) crawlable and indexable for search engines
  • Heavy focus on improving page speed for users and search engine crawlers

Content:

  • Reorganizing the existing informational content pages and adding new ones
  • Adding contextual internal links from informational content to category pages

Active link building has never played a role in this website’s SEO efforts. All links that are currently pointing to the domain were either by-products of other marketing activities and partnerships or were generated naturally by people voluntarily linking to the website’s content.

This approach has yielded great results for the business over the past two and a half years. Since the implementation of the new SEO strategy, the domain’s visibility in Google’s SERPs, as calculated by Sistrix, has grown significantly. The growth was only interrupted by a major setback caused by the so-called Medic Update in the summer of 2018, but it recovered during the March 2019 Core Update:

The growth in visibility has translated into an increase in sales generated via organic search from EUR 359k in 2016, the year before the SEO strategy was implemented, to EUR 914k in 2019:

The revenue numbers for organic search traffic were tracked with Google Analytics and calculated with the help of a custom attribution model that takes into account not only the last interaction before the sale, but also all previously tracked interactions of a user with the website.
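
The exact weighting of the custom model is not disclosed. A linear model is one common way to factor in every tracked interaction rather than just the last click, and can be sketched as follows (the conversion paths are invented for illustration):

```python
from collections import defaultdict

def linear_attribution(conversions):
    """Split each sale's revenue evenly across every tracked interaction.

    conversions: list of (revenue, [channel, channel, ...]) tuples.
    A last-click model would credit only path[-1]; a linear model credits
    every touch equally. The real model in this case study may weight
    interactions differently.
    """
    credit = defaultdict(float)
    for revenue, path in conversions:
        share = revenue / len(path)
        for channel in path:
            credit[channel] += share
    return dict(credit)

# Invented conversion paths for illustration
sales = [
    (90.0, ["organic", "direct", "organic"]),
    (60.0, ["paid", "organic"]),
    (30.0, ["direct"]),
]
print(linear_attribution(sales))
```

Under last-click attribution the same data would credit "organic" with only 60.0; spreading credit across the path is what lets informational landing pages show any revenue contribution at all.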

Using the same attribution method, we can also show the contributions of the different page types as landing pages to the overall revenue generated through organic search. In 2019, the informational content pages only attracted traffic worth 2.36% of the overall SEO revenue (screenshot from the Google Analytics Model Comparison Tool):

One area in which the informational content pages have shown a very good performance is the natural acquisition of backlinks. Out of the ten pages on the website that have the highest number of links from other domains pointing to them, five are informational content pages, four are category pages and one is the home page.

Looking at the available information, it is hard to tell which role the informational content pages really play for the great overall SEO performance of the shop. Their direct contribution to sales is small, but they do have a decent share of the website’s backlinks pointing to them. Do these backlinks really help the commercial shop pages rank better and generate more sales through organic search traffic?

Removing all informational content pages

In late 2019, a drastic business decision brought about some major changes for the company. Despite the great performance over the past few years, the shop is a relatively small player within a larger organization. The company belongs to a corporation that also owns a large, direct competitor of the shop. In order to cut costs, it was decided to move the entire handling of the online shop over to that direct competitor.

The new owners are planning to replace the entire shop with their own systems, and they only want to keep the domain name and the logo of the current shop. They have made it very clear that they are not planning to use any of the informational content that has been built up over the past years. Because of this, the company that is currently still in control of the shop decided to remove the content and save it for other projects that they might work on in the future.

Roughly 25 informational content pages were removed and the URLs were redirected to the shop’s home page. The redirects were not implemented in the hope that rankings would be transferred to the home page. In a hopeless situation, setting up redirects instead of 404 or 410 status codes was just a spontaneous decision without any specific motivation. No other changes were made to the shop in the weeks before and after the informational content was removed.

Impact on the website’s rankings

For the first few days after the informational content pages were removed, Google seemed to be quite forgiving. Some of the URLs of the removed pages kept ranking although they were redirecting all traffic to the shop’s home page. This might also be due to the fact that the pages were not crawled immediately, so it took Google a while to detect and process all changes.

Ten days after the content was removed, all rankings for the directory that previously contained the informational content pages were completely gone:

Please note that the daily visibility curve in the above screenshot might not represent the drop with complete accuracy. Although Sistrix calculates the visibility daily, it seems that they do not scrape the SERPs for every single one of the millions of keywords in their database every day, so ranking changes might only have an impact on the daily visibility graph a few days after they occur.

Interestingly, the overall visibility of the domain also took a serious blow, with losses far bigger than just the visibility of the removed directory. Within three weeks after the removal of the informational content pages, the shop had lost almost one third of its overall visibility, although the removed content previously only made up roughly 1% of the domain’s visibility:

The home page and several category pages lost lots of their page 1 rankings for commercial intent search queries with high search volumes. The following screenshot from the Sistrix ranking changes report shows an extract of the most important rankings that were lost after the informational content pages were removed:

Please note that the shop did not sell sunglasses and that the language was not English. The keywords and the URLs in the above screenshot were changed in order to protect the identity of the business, but the search intents were maintained and all other numbers in the screenshot (search volume, CPC, positions, etc.) are real.

Are the ranking drops and visibility losses of the home page and category pages directly linked to the removal of the informational content pages, and if so, how and why?

Possible reasons for the ranking drop

The most convincing theory for why the rankings of the home page and category pages dropped after the informational content pages were removed is related to links. As mentioned above, the informational content pages had a fair amount of links from other domains pointing to them. Within their content, there were also contextual links pointing to the home page and category pages, which were supposed to pass on the relevance of the backlinks pointing to the informational content pages to pages that could cater to more commercial search intents.

After the removal of the informational content pages, the backlinks pointing to them lost all of their relevance and the internal links pointing from the informational content pages to other pages on the domain were also completely lost.

However, there are a number of other factors that might play a role and there are also some unknowns that should be addressed.

The URLs of the informational content pages were redirected to the home page, which probably results in the backlinks pointing to the original pages no longer passing all of their relevance to the redirect targets, as their content is completely different from the originally linked pages. Google officials have confirmed that redirects to less relevant pages can be treated as so-called “soft 404s.”

The question arises whether this situation would have played out differently if the URLs of the informational content pages had given back 404 or 410 status codes instead of redirecting to the home page. Would a backlink to a 404 page or to a URL that gives back a 410 status code lose less relevance than a “soft 404” caused by a redirect to a non-matching target?

It would seem that in this particular case, it would not make a difference if the pages gave back a 404 or 410 status code instead of redirecting to the home page. The only way to save some of the relevance of the backlinks might have been to redirect each removed URL to a similar piece of content, which was not an option in this case.
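
The practical takeaway, redirect each removed URL to genuinely similar content when it exists and otherwise return a 410, can be expressed as a small decision helper. The similarity measure, the threshold, and the URLs below are all invented for illustration; a real migration would compare page content, not URL strings.

```python
from difflib import SequenceMatcher

def retirement_plan(removed_url, candidate_urls, threshold=0.6):
    """Pick a close-match redirect target for a removed URL, or signal 410.

    Matching on URL-path similarity is a crude stand-in for real content
    similarity, and the 0.6 threshold is an arbitrary illustration value.
    A redirect to a non-matching target risks being treated as a soft 404.
    """
    best, best_score = None, 0.0
    for candidate in candidate_urls:
        score = SequenceMatcher(None, removed_url, candidate).ratio()
        if score > best_score:
            best, best_score = candidate, score
    if best_score >= threshold:
        return ("301", best)
    return ("410", None)

live = ["/guides/choosing-a-mattress", "/category/mattresses", "/"]
print(retirement_plan("/guides/choosing-a-mattress-topper", live))
print(retirement_plan("/blog/company-picnic-2018", live))
```

The first removed URL has a near-duplicate topic still live and gets a 301 to it; the second has no close match, so signaling that the content is permanently gone is the honest answer.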

It might also be tempting to connect this case to theories about the topical relevance of the entire domain or the more recent SEO buzzword “E-A-T”. One might argue that informational content has a value in itself that goes beyond backlinks and internal links passing on relevance to commercial pages. While these ideas should not be discarded entirely, they are quite vague and even more difficult to prove.

Another unknown that needs to be addressed are external factors that might have had an impact on the website’s rankings. The ranking changes that happened after the informational content pages were removed might be coincidental and not directly related to the removal of the content. It is always difficult to completely exclude the possibility of Google algorithm changes or updates being the real cause of observed ranking changes.

The next section of this case study delivers more indicators that support the belief that the ranking changes were not coincidental, but indeed related to the removal of the informational content pages.

Recovery after putting the content back

About three weeks after removing the informational content pages, the company that was still in control of the website decided to put the entire content back temporarily. This decision was made for the sake of SEO science, in order to test whether the ranking loss was indeed related to the removal of the informational content pages. They also wanted to hand over the website in the best possible state, even though the new owners had not changed their minds about using the informational content on the new website.

The following screenshot of the daily Sistrix visibility shows what happened to the domain’s overall visibility in Google’s search results after the informational content pages were put back:

Three weeks after putting all informational content pages back in place, the overall visibility of the website has fully recovered and gone back to the level it had before the informational content pages were removed. The home page and category pages regained most of the top rankings for commercial intent queries that they had before the removal of the content, except for some slight changes that are within the usual range you would expect over a period of six weeks, due to normal fluctuations and seasonality.

What do we learn from this case?

The main takeaway from observing this case is that the data strongly suggests that the informational content pages do indeed help the home page and category pages rank better for queries with commercial intent.

The exact distribution of factors remains unclear, but it seems likely that links from other domains pointing to the informational content pages together with internal links pointing from the informational content pages to commercial pages play a major role in the improved SEO performance of the commercial pages.

TL;DR summary

  • An e-commerce site with 60,000 product pages, 80 category pages and 25 informational content pages removed all informational content pages and redirected the URLs to the home page.
  • Before their removal, the informational content pages only generated 2.36% of sales via organic search traffic, but a significant share of the domain’s backlinks pointed to them.
  • After the removal of the informational content pages, the shop lost about one-third of its overall visibility as the home page and category pages lost most of their top rankings for commercial intent search queries.
  • About three weeks later, the informational content was put back, in order to test if the ranking drops were indeed related to the removal of the content.
  • Another three weeks later, the domain’s visibility was fully recovered and the home page and category pages regained their good rankings for commercial intent search queries.
  • The data suggests that the informational content pages do indeed help the home page and category pages rank better for search queries with commercial intent.
  • The most likely explanation lies in the relevance of backlinks from other domains pointing to the informational content pages, which is then passed on to commercial pages via contextual internal links.


About The Author

Eoghan Henn is a freelance technical marketer and the co-founder of searchVIU, a website migration SEO tool provider. He also teaches digital marketing at the University of Santiago de Compostela. Before moving to Spain and starting his current roles, he worked as a consultant for digital marketing agencies in Germany and Belgium. His areas of specialization include international and technical SEO, Google Tag Manager implementations and web analytics.


