Google has come a long way since it was first introduced to the public in 1998.
As the search engine has evolved, so has the SEO industry.
But in that time, we as an industry have also moved further away from considerations of Google’s actual ranking factors and become more concerned with digital strategy and long-term thinking.
Overall, this is a positive change.
However, it would be a mistake to neglect factors where strong evidence of an influence on the search results exists.
A great deal has changed in recent years and we haven’t really been discussing how those changes may have affected the ranking factors.
Let’s revisit what may be happening with the algorithm and discuss a few potential new factors as well.
Note that while we reference correlation studies, experiments, and Google patents as evidence for some of these potential factors, nothing short of confirmation from a Google representative should be considered proof that a factor exists.
1. HTTPS
In August of 2014, Google announced that the use of SSL would be considered a positive ranking factor.
At the time they said that the change would impact “fewer than 1 percent of global queries,” but they were very clear that “over time, we may decide to strengthen it” in an effort to encourage all websites to use HTTPS.
Google’s Gary Illyes later said that HTTPS could act as a “tie-breaker.”
In August of 2017, Google Chrome started marking any page with a form as “Not secure” in the browser if it did not use HTTPS.
This indicates that Google is still concerned with HTTPS and may continue to increase its influence as a ranking factor in the future.
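For a quick audit of your own pages, a simple check is whether a host completes a TLS handshake at all. This is a minimal sketch using only the Python standard library, not anything resembling Google’s own verification; the host name is a placeholder.

```python
import socket
import ssl

def serves_https(host, timeout=5):
    """Return True if the host completes a valid TLS handshake on port 443."""
    ctx = ssl.create_default_context()  # verifies certificate and hostname
    try:
        with socket.create_connection((host, 443), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version() is not None  # e.g. "TLSv1.3"
    except (OSError, ssl.SSLError):
        # unreachable host, refused connection, or failed certificate check
        return False

# serves_https("example.com") would return True for a correctly
# configured HTTPS site, False otherwise
```

Unreachable hosts and certificate failures both return False, so the check errs on the strict side.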
2. Mobile-Friendly Update
In April of 2015, Google began rolling out a mobile-friendly update, designed to identify pages that were difficult to use from mobile devices and remove them from search results on mobile devices.
Various successor updates have increased the importance of this factor since.
3. Mobile Interstitials
In January 2017, Google began rolling out a penalty for sites that use intrusive pop-ups and interstitial pages that interfere with the mobile user experience.
4. Mobile-First Indexing
Google has introduced mobile-first indexing, meaning that for a large portion of sites, Google indexes the mobile version of a page rather than the desktop version, and uses it for all search results (on both mobile and desktop devices).
Google notified sites that were mobile-first via the Google Search Console.
5. Safe Search
Because Google blocks certain types of adult content by default, some pages won’t show up in search results unless safe search is turned off.
Safe search exists primarily to block pornographic results, but bear in mind that explicit language could result in your pages being blocked from certain search results.
6. Domain Age
Former Googler Matt Cutts previously confirmed that domain age is a factor, but past a relatively short period of time, it isn’t an important factor.
For example, a domain that is a year old and one that is six months old will rank virtually identically, all else being equal.
Only brand new sites will be viewed with more scrutiny due to the possibility that they could be spam sites.
7. Domain Registration Length
A Google patent suggests that Google has considered factoring in how far in advance you have registered your domain.
Domains that have been registered further into the future can be assumed to be more serious business endeavors, while those registered only for the short term may be more likely to be spam sites used for link schemes or similar manipulation.
8. Public WHOIS
Cutts also pointed out in the past that if the WHOIS data for a site is kept private, that can be a sign that it is operating as part of a link scheme or is otherwise untrustworthy.
While private WHOIS data on its own is unlikely to result in a penalty, it may limit your ability to rank, especially if it is combined with other traits that look nefarious to the search engines.
From here, it’s a reasonable extension to think that if the listed WHOIS owner has previously been penalized by Google, new sites with the same WHOIS data will likely face difficulties ranking.
9. Content Depth
Cutts has confirmed that giving the search engines more information to work with can help with SEO.
Many page-one results run to more than 2,000 words – and the number one position averages roughly 2,450 words.
While it is clear that word count itself can’t be a ranking factor, the comprehensive nature of content seems to be a factor that Google takes into account, either directly or indirectly.
10. Heading Tags
Since heading tags were specifically developed to provide a hierarchy of information for the underlying document, there is reason to believe that Google interprets them as such, rather than simply as bigger words.
The most important of these is, of course, the H1 tag, which identifies what the page is about.
SEO experiments suggest that using the H1 tag does indeed give a slight rankings boost, but it’s important to keep in mind that its practical significance is limited.
Including heading tags from the get-go is useful, but revamping an entire site to include them is likely a poor use of resources.
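To see what a crawler can extract from that hierarchy, here is a minimal sketch using only Python’s standard-library HTML parser. The markup fed to it is a made-up example; real crawlers are, of course, far more sophisticated.

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect (level, text) pairs for h1-h6 tags, in document order."""

    def __init__(self):
        super().__init__()
        self.outline = []      # e.g. [(1, "Page title"), (2, "Section")]
        self._current = None   # heading level currently open, or None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._current = int(tag[1])

    def handle_endtag(self, tag):
        if self._current and tag == f"h{self._current}":
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.outline.append((self._current, data.strip()))

parser = HeadingOutline()
parser.feed("<h1>Ranking Factors</h1><h2>HTTPS</h2><h2>Mobile</h2>")
# parser.outline → [(1, 'Ranking Factors'), (2, 'HTTPS'), (2, 'Mobile')]
```

A well-formed outline (one H1, nested H2s and H3s) gives the search engine an unambiguous map of the page, which is the point of the hierarchy.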
11. Meta Description Length
While not technically a ranking factor, meta descriptions are important because of their ability to convince users to click through and visit the page.
Google’s cutoff for meta descriptions is variable, but the most common truncation point appears to be roughly 300-325 characters.
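A length check is easy to automate. This sketch uses the ~300-character figure above as an observed approximation, not an official limit, and the 70-character lower bound is an arbitrary editorial threshold of our own.

```python
META_DESC_LIMIT = 300  # observed truncation point, not an official limit

def check_meta_description(description, limit=META_DESC_LIMIT):
    """Flag meta descriptions that risk truncation in the search results."""
    length = len(description)
    if length > limit:
        return f"Too long ({length} chars): may be truncated after ~{limit}"
    if length < 70:  # arbitrary editorial minimum, not a Google rule
        return f"Short ({length} chars): consider adding detail"
    return f"OK ({length} chars)"

print(check_meta_description("Learn how Google ranking factors have evolved."))
# → Short (46 chars): consider adding detail
```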
12. Internal PageRank Distribution
The foundation of the Google search engine was PageRank, an estimation of the probability that a random internet user would stumble upon a web page if all they did was click on links at random.
Setting up your site’s internal link architecture to maximize the chances that a user will be able to find any given page is still highly important, not only for rankings, but for usability.
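The random-surfer model is concrete enough to compute. This is a textbook power-iteration sketch of PageRank on a hypothetical three-page site, not Google’s production algorithm; the 0.85 damping factor is the value from the original PageRank paper.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns the steady-state probability of a random surfer landing on
    each page, following links with probability `damping` and jumping
    to a random page otherwise."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical internal link structure
site = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(site)
# "home" accumulates the most rank because every other page links to it
```

The same intuition applies to internal linking: pages buried many clicks deep receive a vanishingly small share of the surfer’s attention.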
13. Authoritative Backlinks
Everybody takes this factor for granted, but it’s worth reminding ourselves of its meaning within the framework of PageRank: the probability that somebody would stumble on your page if all they did was click random links.
As a result, links from pages that also had a lot of inbound links, which in turn had plenty of inbound links, and so on, had a stronger impact on rankings.
Google has clearly reevaluated the way links are afforded value, but the core principle remains the same: the goal of backlinks was always to determine how likely it would be for somebody to come across your site naturally.
While its models have grown far more sophisticated, the rationale behind PageRank is likely still the driving force behind how Google analyzes links.
14. Link Anchor Text
Google factors the anchor text of a backlink into the rankings and uses that information to determine whether a search result is relevant for a query.
However, in response to abuse, Google has made it clear that anchor text manipulation (e.g., article marketing or press releases with keyword-rich backlinks) can result in a penalty, and algorithms such as Penguin have been built to combat this kind of manipulation.
Efforts to stuff keywords into anchor text are generally frowned upon and considered dangerous in modern SEO, so the goal should always be to include anchor text that is useful, informative, and relevant for the user.
15. Backlink Stability
A Google patent suggests that Google may be affording more influence to links that have stood the test of time, since spam links such as “followed” sponsored links, spam comments, other non-editorial links, and similar forms of manipulation often result in links that will eventually be removed.
16. Link Context
The semantic content surrounding a link is something that Google has been considering using since at least 2004.
The methods by which this information is put to use have undoubtedly grown far more sophisticated since that patent was published.
17. Link Relevance
Google acquired the “Hilltop algorithm” in 2003, a method of identifying how relevant a page was for a given keyword based on whether it had backlinks coming from an “expert” page.
The expert pages were curated lists containing links to pages on specific topics, and the pages that received the most links from such expert pages were considered authorities.
Undoubtedly this has been heavily altered since its introduction, but the core idea is that links from pages about a related topic indicate that your page is also relevant to that topic, even if they don’t necessarily target identical keywords.
18. Image Alts
When an image is used as a link, Google appears to treat its alt text much like anchor text. In other words, if somebody references your image with HTML, the alt attribute they use may behave like anchor text for the link.
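To make that concrete, this sketch extracts (href, alt) pairs from linked images using Python’s standard-library parser – the signal an alt attribute may carry in this situation. The markup is a made-up example.

```python
from html.parser import HTMLParser

class ImageLinkAlts(HTMLParser):
    """Collect (link target, image alt text) pairs for <img> tags
    that appear inside <a> elements."""

    def __init__(self):
        super().__init__()
        self.pairs = []
        self._href = None  # href of the currently open <a>, if any

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            self._href = attrs.get("href")
        elif tag == "img" and self._href:
            self.pairs.append((self._href, attrs.get("alt", "")))

    def handle_endtag(self, tag):
        if tag == "a":
            self._href = None

p = ImageLinkAlts()
p.feed('<a href="https://example.com/page">'
       '<img src="chart.png" alt="ranking factors chart"></a>')
# p.pairs → [('https://example.com/page', 'ranking factors chart')]
```

An image link with an empty alt attribute gives the search engine nothing to work with, which is one more reason to write descriptive alt text.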
19. Outbound Links
Cutts once cryptically stated that:
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.”
While it’s clear that linking out can’t be a PageRank boost, it can be a trust signal if the sites you link to are trustworthy, and this seems to have been the case for a long time.
20. Query Deserves Freshness
This classifier is used by the search engine to identify if the query the user is searching for indicates that they would be more interested in fresh content.
This means that for some industries and topics, the more recently you have updated your content, the better your rankings will be.
Keep an eye on the search rankings for keywords you are targeting to determine if they are frequently updated.
21. Content Age
This is essentially the opposite of freshness.
According to a Google patent, for certain types of queries, a page may be more trustworthy due to the fact that it has been up for longer, as well as due to the ways in which it has been updated since it was first introduced.
22. Keyword in Title
Google’s John Mueller has confirmed that the title is a ranking factor, although he downplays the idea that it is the most critical part of a page.
23. Keyword in URL
A Google patent suggests that Google may be using keyword data within the URL for a page as a method of determining, in part, what the page is about.
Cutts has also made it clear that an excessive number of keyword variations included within the URL could count against you.
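One simple way to reason about this factor is to tokenize a URL path the way a parser might. This is a rough sketch of our own – Google’s actual URL parsing is not public – and the URL is a made-up example.

```python
import re
from urllib.parse import urlparse

def url_keywords(url):
    """Split a URL path into lowercase keyword tokens on common separators."""
    path = urlparse(url).path
    return [token for token in re.split(r"[/\-_.]+", path.lower()) if token]

print(url_keywords("https://example.com/seo/google-ranking-factors"))
# → ['seo', 'google', 'ranking', 'factors']
```

A path that tokenizes into a few clear, relevant words is the goal; one that yields a long run of near-duplicate keyword variations is exactly the pattern Cutts warned against.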
While ranking factors alone should not guide your SEO strategy, it’s important to remind ourselves which factors the search engines may be considering when they rank our sites in order to develop a strategy best equipped to perform well in the current digital landscape.