
Google’s John Mueller on Why Some Sites Rank



In a Google Webmaster Hangout, Google’s John Mueller was asked why content published on an established site tended to rank higher. The publisher asked why articles on the site consistently received “top Google rankings.”

There is no simple way to answer this question. Google’s John Mueller offered a nuanced explanation of why Google trusted some sites enough to consistently rank them at the top.

Google’s John Mueller discussed the effort necessary to make an authoritative site that Google trusts enough to rank consistently at the top.

The publisher asked whether the success was due to a lack of competition or, in her words, “is it somehow even though each individual site is a sub site of the main site, any blogging gets you ranked because” of the website itself.

John Mueller responded that it’s not directly related to the domain.

“It’s more a matter of your putting out new content… that’s relevant for people who are searching at the moment and that we’re ranking them based more on that.

That’s something that we often see from various sites like, should I be blogging, should I write… ten articles a day or five articles a day?

…from our point of view it’s not a matter of going out and blogging and creating so many articles a day… but more a matter of… you have some really fresh content here, some of this content is really relevant for some searchers at the moment so we’ll show that.

…it’s not that blogging itself is something that makes the site rank higher or makes the content rank higher. It’s just you happen to have some new content here that happens to be relevant so we’ll show that.”

There’s an approach to content that seems to focus on quantity and quality but leaves out the part about relevance. A common mistake I see in site audits is chatty and conversational content, like you might hear at the water cooler.

For certain situations, content that is focused on relevance to a person’s situation, their goals or aspirations is more appropriate. I believe that’s what John Mueller was getting at when he encouraged the publisher to create content that is relevant to searchers at the moment they are searching.

I think it’s worth pointing out that he didn’t say to be relevant to the keywords. He encouraged the publisher to create content that is relevant to the searcher. 

John Mueller went on to focus on the blogging part of the question, whether blogging was the secret behind the site’s top ranking.

But that answer might not have been what the questioner was hoping for. She appeared to be focused on whether the domain itself, perhaps some kind of authority, was powering the rankings.

Thus, the publisher asked again in an attempt to get John Mueller to focus on whether or not the domain itself was powering the rankings.

She asked:

“…so it’s completely independent of the domain that I’m blogging on? There’s a lot going on on that website… or does that have no effect? If I… start my own dot com and was blogging, it would have the same effect?”

John Mueller responded,

“Pretty much… there are always some kind of supplemental effects with regard to us being able to find the content quickly, us being able to understand that this website is generally creating high quality content. So there is some amount of… additional information that we collect for the website as a whole.”

This is interesting because it expands on his previous statement that you just can’t create content and expect it to rank. Here he adds that there is a process whereby Google gains an understanding that the site is a good resource to rank. He alludes to “additional information” that Google collects in order to make the determination that a site is creating high quality content.

What might he be referring to? Google’s algorithm has so many moving parts to it that it could be any number of things.

Just as an example of the complexity involved, there’s a patent filed in 2012 called “Classifying Sites as Low Quality Sites” that discusses a number of factors Google could use to create a “link quality score,” which in turn could be used to classify an entire site as low quality.

The patent classifies inbound links to a site as Vital, Good, and Bad.

According to the patent, Google could then use this link rating system to lower a website’s chance of ranking:

“The system decreases ranking scores of candidate search results identifying sites classified as low quality sites.”

The above is an example of a patent that may or may not be in use at Google. The point is that there are many ways a site can be ranked, from links to the content itself. The reference to “additional information” could point to any number of things, including the plethora of ranking factors themselves.
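To make the patent’s mechanics concrete, here is a purely illustrative sketch of how rated links could roll up into a site-level score. The weights, the cutoff, and the function names are assumptions made for the example; the patent publishes none of them.

```javascript
// Purely illustrative: the patent describes Vital/Good/Bad link ratings and a
// site-level "link quality score," but publishes no weights or thresholds.
const RATING_WEIGHTS = { vital: 1.0, good: 0.5, bad: -1.0 }; // assumed values

// Average the weights of a site's rated inbound links.
function linkQualityScore(links) {
  if (links.length === 0) return 0;
  const total = links.reduce((sum, link) => sum + RATING_WEIGHTS[link.rating], 0);
  return total / links.length;
}

// Classify the whole site as low quality when the score falls below a cutoff,
// which (per the quote above) would then decrease its ranking scores.
function isLowQualitySite(links, cutoff = 0) {
  return linkQualityScore(links) < cutoff;
}

// Example: two bad links outweigh one good link.
console.log(isLowQualitySite([
  { rating: 'good' }, { rating: 'bad' }, { rating: 'bad' },
])); // true
```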

Google’s John Mueller goes on to say,

“So it’s not that you could just create random URLs on the web and put your blog post up there and we would find them magically and rank them number one.

It kind of does require some amount of structure within that so that we can understand that over time actually this is pretty good content and we should check it regularly to make sure that we don’t miss any of the updates.”

At this point the publisher tried a third time to get Mueller to say that there is something about the domain itself that helps posts published on it rank better.

“Okay, so there is something to the domain itself and that it’s got your attention.”

Google’s John Mueller declined to say that was the case. He offered that it’s not possible for him to comment on a site he hasn’t looked at.

He then suggested that she could choose to build her own site, but that it would take time for the site to get established. He positioned it as a choice between taking the time to establish something of her own for the long run or taking the easy route and publishing her articles on the established website.

Takeaways:

  1. It’s not enough to just create content.
  2. Content must be relevant to a user at the moment they are searching.
  3. Top rankings do not come right away.

Watch the Webmaster Hangout here.


Images by Shutterstock, Modified by Author
Screenshots by Author, Modified by Author


Google Search Console image search reporting bug June 5-7



Google posted a notice that between June 5 and June 7 it was unable to capture data for image search traffic. This is just a reporting bug and did not impact actual search traffic, but the Search Console performance report may show drops in image search traffic for that date range.

The notice. The notice read, “June 5-7: Some image search statistics were not captured during this period due to an internal issue. Because of this, you may see a drop in your image search statistics during this period. The change did not affect user Search results, only the data reporting.”

How do I see this? If you log in to Google Search Console, open your performance report and then filter by clicking on the “search type” filter. You can then select image from the filters.

Here is a screenshot of this filter:

How To Filter By Image Traffic in Google Search Console

Why we should care. If your site gets a lot of Google Image search traffic, you may notice a dip in your traffic reporting within Google Search Console. You may not have noticed a similar dip in your other analytics tools. That said, Google said this is only a reporting glitch within Google Search Console and did not impact actual traffic to your website.
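If you pull Search Console data programmatically, the same reporting gap should surface there too. Here is an untested sketch against the public searchAnalytics.query endpoint; the example.com property is a placeholder and you would need to supply a valid OAuth access token:

```javascript
// Untested sketch using the Search Console API (v3, searchAnalytics.query).
// SITE_URL is a placeholder property; supply your own verified property.
const SITE_URL = encodeURIComponent('https://www.example.com/');

async function imageClicksByDate(accessToken) {
  const res = await fetch(
    'https://www.googleapis.com/webmasters/v3/sites/' + SITE_URL + '/searchAnalytics/query',
    {
      method: 'POST',
      headers: {
        Authorization: 'Bearer ' + accessToken,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        startDate: '2019-06-01',
        endDate: '2019-06-10',
        dimensions: ['date'],
        searchType: 'image', // same filter as the UI's "search type" > image
      }),
    },
  );
  const data = await res.json();
  return data.rows || []; // one row per date; expect a dip around June 5-7
}
```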


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.


Facebook Changes Reach of Comments in News Feed



Facebook announced a change to its algorithms that will affect the reach of comments on a post. Comments that have specific quality signals will be ranked higher. Low quality comment practices may result in less reach.

Comment Ranking in News Feeds

Facebook noted that not only are posts ranked in news feeds; comments are ranked as well.

Posts with comments that have positive quality signals will be seen by more people. Posts with low quality signals will have their news feed reach reduced.

Facebook Comment-Quality Signals

Facebook noted that its updated comment algorithm has four features:

  1. Integrity signals
  2. User indicated preferences
  3. User interaction signals
  4. Moderation signals

Integrity Signals

Integrity signals are a measure of authenticity. Comments that violate community standards or that qualify as engagement bait are negative signals. Comments that violate community standards are said to be removed.

Engagement Bait

Facebook describes four types of engagement bait:

1. React Baiting

Encouraging users to react to your post

2. Follow and Share Baiting

This is described as telling visitors to like, share or subscribe.

3. Comment Baiting

Encouraging users to comment with a letter or number is given as an example.

4. Monetization Baiting

This is described as asking for “stars” in exchange for something else, which could include something trivial like “doing push ups.”

User Indicated Preferences

This is a reference to user polls that Facebook conducts in order to understand what users say they wish to see in comments.

User Interaction Signals

These are signals related to whether users interact with a post.

Moderation Signals

This refers to how users hide or delete comments made on their posts.

Here is how Facebook describes it:

“People can moderate the comments on their post by hiding, deleting, or engaging with comments.

Ranking is on by default for Pages and people with a lot of followers, but Pages and people with a lot of followers can choose to turn off comment ranking.

People who don’t have as many followers will not have comment ranking turned on automatically since there are less comments overall, but any person can decide to enable comment ranking by going to their settings. (See more details here.)”

Facebook Targeting Low Quality Comments

One of the stated goals of this update is to hide low quality posts from people’s Facebook feeds and to promote high quality posts by people you might know.

This is how Facebook described it:

“To improve relevance and quality, we’ll start showing comments on public posts more prominently when:

  • The comments have interactions from the Page or person who originally posted; or
  • The comments or reactions are from friends of the person who posted.”
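Facebook publishes the signal families but not how they combine. Purely as an illustration of the idea (every field name and weight below is an invention for the example, not Facebook’s code), a ranking along these lines would score each comment and surface the highest first:

```javascript
// Illustrative only: weights and field names are assumptions for the example.
function commentScore(c) {
  let score = 0;
  if (c.violatesStandards || c.isEngagementBait) score -= 10; // integrity signals
  score += 0.1 * (c.likes + c.replies);                       // user interaction signals
  if (c.fromOriginalPoster || c.fromFriendOfPoster) score += 5; // prominence rules quoted above
  if (c.hiddenByPoster) score -= 10;                          // moderation signals
  return score;
}

// Rank a post's comments for display, highest score first.
function rankComments(comments) {
  return comments.slice().sort((a, b) => commentScore(b) - commentScore(a));
}
```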

Read Facebook’s announcement here: Making Public Comments More Meaningful

Images by Shutterstock, Modified by Author




Build your PPC campaigns with this mini campaign builder script for Google Ads



Need to quickly build a campaign or add keywords to an existing one? This script will do the work for you!

All you need to do is input a few keywords and headlines in a spreadsheet and BAM! You’ve got yourself the beginnings of a great campaign.

I’m a firm believer in Single Keyword per Ad Group (SKAG) structure – it increases ad/keyword relevance and therefore improves quality score, makes CPCs cheaper, gets you a higher ad rank and a better CTR.

Sadly, building out SKAG structures is a pretty time-consuming endeavor. You can’t implement millions of keywords and ads without PPC tech powering your builds.

But if a client just needs a couple of new keywords after updating their site with new content, this script is a quick and easy solution.

And that’s exactly what I love about PPC. There’s a special place in my heart for simple scripts anyone can use to achieve tasks that are otherwise repetitive or near-impossible.

What does the script do?

This tool will save a lot of time with small-scale builds where you know exactly which keywords and ad copy you need, for example when you’re adding a few keywords to an existing campaign.

You input your campaign name, keywords, headlines, descriptions, paths and final URL, and it will output three tabs for you: one with keyword combinations, one with negatives, and one with ads to upload to Google Ads Editor.

It creates one exact match and one broad match modifier campaign, and adds the keyword list as exact match negatives in the broad campaign to make sure that search terms that match exactly go through the exact keyword.
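To make that funneling concrete, here is a minimal sketch of the exact/BMM split. The “_” delimiter, the row layout, and the function name are assumptions; the actual script’s internals may differ:

```javascript
// Sketch of the keyword expansion described above, not the actual script's code.
function buildKeywordRows(campaignName, delimiter, keywords) {
  const keywordRows = [];
  const negativeRows = [];
  keywords.forEach((kw) => {
    const adGroup = kw; // SKAG: one keyword per ad group
    // Exact match keyword goes into the exact campaign.
    keywordRows.push([campaignName + delimiter + 'Exact', adGroup, '[' + kw + ']']);
    // Broad match modifier version goes into the broad campaign.
    const bmm = kw.split(' ').map((w) => '+' + w).join(' ');
    keywordRows.push([campaignName + delimiter + 'BMM', adGroup, bmm]);
    // Exact negative in the broad campaign funnels exact matches to the exact campaign.
    negativeRows.push([campaignName + delimiter + 'BMM', '[' + kw + ']']);
  });
  return { keywordRows, negativeRows };
}

// Example: buildKeywordRows('Widgets', '_', ['blue widgets']) yields the keywords
// [Widgets_Exact, blue widgets, [blue widgets]] and [Widgets_BMM, blue widgets,
// +blue +widgets], plus the negative [Widgets_BMM, [blue widgets]].
```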

I’m sure you’re dying to give it a whirl, so let’s get cracking!

How do you use it?

Make a copy of this spreadsheet (note: you’ll need to authorize the script to run). You’ll find all the instructions there as a future reminder.

Once you’ve got the spreadsheet ready, input the following:

  • The campaign name
  • The campaign name delimiter to distinguish between broad and exact campaigns
  • Headline 1 (if this cell is not specified, then it will be the same as the keyword)
  • Headline 2
  • Optionally, headline 3
  • Description 1
  • Optionally, description 2
  • Optionally, path 1 and path 2
  • The final URL
  • The keywords (you can keep going outside of the box with these!)

You’ll see a handy character counter which will go red if you exceed the character limit. Bear in mind that the tool assumes you’re using it correctly, so you’ll need to make sure you stay within the limits!
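The limits being enforced are the standard expanded text ad caps: 30 characters per headline, 90 per description and 15 per path. A minimal sketch of the check behind the counter (the helper name is an assumption):

```javascript
// Google Ads expanded text ad character limits (as of 2019).
const CHAR_LIMITS = { headline: 30, description: 90, path: 15 };

// The spreadsheet's counter cell effectively performs this check,
// turning red when the text no longer fits.
function withinLimit(text, field) {
  return text.length <= CHAR_LIMITS[field];
}

console.log(withinLimit('Buy Blue Widgets Online', 'headline')); // true (23 chars)
console.log(withinLimit('This headline is definitely much too long to serve', 'headline')); // false
```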

You can also optionally create a second ad variant by choosing the part of your text you want to vary (e.g., headline 2 or description 2) and inputting the copy. Otherwise, just select “None” from the dropdown menu.

Once you’re done, click the gigantic “Go!” button and wait for the magic to happen.

It will generate three tabs labelled “Keywords,” “Negatives” and “Ads.” If you want to run the script again with different keywords, make sure you save these tabs elsewhere or rename them to prevent the script from overwriting them.

Finally, you can paste these tabs into Editor and update all the relevant settings and adjustments. Job done!

DOWNLOAD: You’ll need to authorize the script to run after you make a copy of this spreadsheet.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Daniel Gilbert is the CEO at Brainlabs, the best paid media agency in the world (self-declared). He has started and invested in a number of big data and technology startups since leaving Google in 2010.
