
SMX Advanced Overtime: Your questions answered about webspam and penalties

Frédéric Dubut and Fili Wiese speaking at SMX Advanced in Seattle in June. This session was so popular they will be teaming up again to talk about the latest news with Bing and Google penalties and algorithms at SMX East in New York on Nov. 13.

Frédéric Dubut (lead of the spam team at Bing) and I spoke together in a first-ever Bing and ex-Google joint presentation at SMX Advanced about how Google and Bing approach webspam, penalties and algorithms. We did not have time to address every question from the attendees during the Q&A, so we wanted to follow up here. Below are the questions submitted during our session about Google and Bing penalties, along with our responses.

Q: Did the disavow tool work for algo penalties or was it mostly for manual action?

A: The disavow tools from Bing and Google most definitely help with manual spam actions. In fact, the disavow tool is crucial for resolving link-related manual spam actions/penalties. At the same time, if your website has a history of active link building, the disavow tools are also a great way of getting rid of those low-quality links that you can’t remove and that now violate the Google or Bing Webmaster Guidelines. While there is no such thing as an algorithmic penalty on Google’s side, disavow link data may be used by both Bing and Google as a potential data point for testing the various algorithms that power rankings.
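
For reference, Google’s disavow file is a plain text file with one entry per line: a line starting with # is a comment, domain: disavows an entire host, and a bare URL disavows a single page. (Bing’s tool takes entries through the Bing Webmaster Tools interface instead.) Here is a minimal Python sketch that assembles such a file; every domain and URL in it is a hypothetical placeholder.

```python
# Minimal sketch: assemble a disavow file in Google's documented format.
# All domains and URLs below are hypothetical placeholders.
disavow_entries = [
    "# Spammy directory; owner never responded to removal requests",
    "domain:spammy-directory.example",
    "# Single paid link we could not get taken down",
    "https://blog.example/sponsored-post.html",
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(disavow_entries) + "\n")
```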

Q: Thoughts or tips on combating spam users’ posts on UGC sections of a site? (reviews, forums, etc.)

A: Vigilance is key while combating user-generated spam and monitoring communities for brand protection purposes. There are some quick and easy ways of mass reviewing posts or limiting abuse. For example, use CSRF tokens, or batch-review user submissions by loading the last 100 posts onto one page, skimming them to find the abusive ones, then moving on to the next 100, and so on. You can also decide to always review any post containing a link before publishing it, or use commercial tools like Akismet or reCAPTCHA to limit spammer activity. If you don’t think you can commit any resources at all to moderating your UGC sections, you may also consider not allowing the posting of any links. It is important to remember that no tool will stop human ingenuity, which is why committing resources, including training for the employees involved, is a must if the risk associated with user-generated spam is to be reduced. An automated first pass can help, as sketched below.
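
The “review any post with a link first” policy, for instance, takes only a few lines of code. This is a minimal Python sketch; needs_moderation() is an illustrative helper you would wire into your own moderation queue.

```python
import re

# Hold back any user submission that contains a link so a human
# can review it before it is published.
LINK_PATTERN = re.compile(r"https?://|www\.", re.IGNORECASE)

def needs_moderation(post_text: str) -> bool:
    return bool(LINK_PATTERN.search(post_text))

posts = [
    "Great article, thanks!",
    "Cheap pills at www.spam.example - buy now",
]
for post in posts:
    action = "queue for review" if needs_moderation(post) else "publish"
    print(f"{action}: {post}")
```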

Q: How can you tell if someone buys links?

A: It is all about intent and trends. In general, it doesn’t take a thorough manual review of every single link to detect something suspicious. Most often, one quick look at the backlink data is enough to raise suspicions, and a detailed review of the backlink profile then delivers the smoking gun. One classic trend is sketched below.
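
As a concrete example of such a trend, an unnaturally dominant commercial anchor text is a classic red flag. Here is a minimal Python sketch; the anchor list is a hypothetical stand-in for an export from your backlink tool, and the 40% threshold is an arbitrary illustration, not an official cutoff.

```python
from collections import Counter

# Hypothetical anchor-text export from a backlink tool.
anchors = [
    "cheap blue widgets", "cheap blue widgets", "cheap blue widgets",
    "example.com", "click here", "cheap blue widgets",
]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    share = n / total
    flag = "  <-- suspiciously dominant" if share > 0.4 else ""
    print(f"{anchor!r}: {share:.0%}{flag}")
```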

Q: With known issues regarding JavaScript indexing, how are you dealing with cloaking, since the fundamentals of most SSR and dynamic rendering solutions seem to mirror cloaking? Is it hard to tell malicious cases from the others?

A: Actually, if we focus on the intent, that is, why a certain solution was put in place, it is rather easy. In a nutshell, if something is done to deceive search engines and substantially different content is displayed to bots versus users, that is cloaking, which is a serious violation of both the Google and Bing Webmaster Guidelines. However, if you want to avoid the risk of being misunderstood by search engine algorithms and at the same time provide a better user experience with your JavaScript-rich website, make sure that your website follows the principles of progressive enhancement. A quick self-check is sketched below.
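
For a rough self-check of your own server, comparing the HTML returned for a crawler user-agent against a browser user-agent can surface unintended differences. This minimal Python sketch uses a placeholder URL; note that it only tests user-agent handling, that a byte-for-byte comparison will also flag benign dynamic content such as timestamps, and that it says nothing about IP-based cloaking.

```python
import urllib.request

URL = "https://www.example.com/"  # placeholder: one of your own pages

def fetch(user_agent: str) -> str:
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

browser_html = fetch("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
bot_html = fetch("Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)")

# Naive comparison: treat a mismatch as a prompt to investigate,
# not as proof of cloaking.
if browser_html != bot_html:
    print("Responses differ by user-agent; worth a closer look.")
else:
    print("Same response for both user-agents.")
```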

Q: Can a site be verified in GSC or BWT while a manual penalty is applied?

A: Definitely. In the case of Bing Webmaster Tools, if you want to file a reconsideration request and don’t have an account yet, we highly recommend creating one in order to facilitate the reconsideration process. In the case of Google Search Console, you can log in with your Google account, verify your site as a domain property and see if any manual actions are applied anywhere on your domain.

Q: Is there a way that I can “help” Google find a link spammer? We have received thousands of toxic backlinks with the anchor text “The Globe.” If you visit the site to look for contact info, they ask for $200K to remove the backlinks, so we spend a lot of time disavowing.

A: Yes, absolutely. Google Webmaster Guidelines violations, including link spamming, can be reported to Google through a dedicated channel: the webspam report. In addition, there are the Google Webmaster Help forums, which are also monitored by Google Search employees and where bringing such issues to their attention stands an additional chance of triggering an investigation.

To report any concern to Bing, including violations to Bing Webmaster Guidelines, you can use this form.

Q: Does opening a link in a new tab (using target=_blank) cause any issues / penalties / poor quality signals? Is it safe to use this attribute from an SEO perspective or should all links open in the current tab?

A: Opening a link in a new tab has zero impact on SEO. However, think about the experience you want to give your users when you make such decisions, as links opening in new tabs can be perceived as annoying at times.

Q: Should we be proactively disavowing scraper sites and other spam-looking links that we find (not part of a black hat link building campaign)? Does the disavow tool do anything beyond submitting leads to the spam team? Or are those links immediately discredited from your backlink profile once that file is updated?

A: Definitely, if this is a significant part of your website’s backlink profile. Spam links need to be dealt with in order to mitigate the risk of a manual penalty, algorithms being triggered or even undesirable Google or Bing Search team attention. The disavow tool primarily serves as a backlink risk management tool for you, enabling you to distance your website from shady backlinks. However, a submitted disavow file is merely a suggestion for both Google and Bing and not a very reliable lead for active spam fighting. Whether search engines abide by the submitted disavow file, use it in part or ignore it entirely is up to each search engine.

Q: How is a cloaking penalty treated? At the page level or sitewide? Can it be treated algorithmically, or purely manually?

A: Cloaking is a major offense to both Google and Bing, given its utterly unambiguous intent: deceiving the search engine and the user. Both engines target cloaking in several complementary ways – algorithmically, with manual penalties, as well as with other means of action. The consequence of deceptive user-agent cloaking is typically complete removal from the index. Google and Bing try to be granular in their approach; however, if a website’s root is cloaking or the deception is too egregious, the action will be taken at the domain level.

Q: If you receive a manual penalty on pages on a subdomain, is it possible that it would affect the overall domain? If so, what impact could be expected?

A: It is possible indeed. The exact impact depends on the penalty applied and how it impairs a website’s overall SEO signals once it has manifested itself. This is something that needs to be investigated on an individual site level. If you end up in a situation where you have a penalty applied to your website, your rankings will be impaired and your site’s growth limited. The best course of action is to apply for reconsideration with the search engine in question.

Q: Do Bing and Google penalize based on inventory out-of-stock pages? For example, I have thousands of soft 404s on pages like these. What do you suggest is the best way to deal with products that go out of stock on large e-commerce sites?

A: No, neither Google nor Bing penalizes sites with large volumes of 404 Not Found pages. Ultimately, when you have any doubt about the legitimacy of a specific technique, just ask yourself if you’d be comfortable sharing it with a Google or Bing employee. If the answer is no, then it is probably something to steer clear of.

The problem here is that with a lot of soft 404s, search engines may trust your server and/or content signals significantly less. As a result, this has the potential to have a major impact on your search visibility. One of the best ways to deal with out-of-stock items is to use smart 404s, which offer users a way to still find suitable available alternatives to the currently unavailable item while serving a 404 HTTP status code or noindex to users and bots alike; a minimal example follows below. Talk to an SEO professional to discuss the best strategy for your website, because a number of additional factors (e.g., the size of the website, the available products and the duration of unavailability) can have a big impact on picking the right SEO strategy.
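
To illustrate the smart-404 pattern, here is a minimal sketch using Flask; the framework choice, the product catalog and the alternatives shown are all assumptions for the example. The out-of-stock URL keeps serving a genuine 404 status code while the page body points users to available alternatives.

```python
from flask import Flask, make_response, render_template_string

app = Flask(__name__)

# Hypothetical catalog; in practice this comes from your database.
PRODUCTS = {
    "red-shoes": {"in_stock": False, "alternatives": ["blue-shoes"]},
    "blue-shoes": {"in_stock": True, "alternatives": []},
}

NOT_FOUND_PAGE = """<h1>Sorry, {{ slug }} is no longer available</h1>
<p>You might like:</p>
<ul>{% for alt in alternatives %}
  <li><a href="/product/{{ alt }}">{{ alt }}</a></li>
{% endfor %}</ul>"""

@app.route("/product/<slug>")
def product(slug):
    item = PRODUCTS.get(slug)
    if item is None or not item["in_stock"]:
        alternatives = item["alternatives"] if item else []
        html = render_template_string(
            NOT_FOUND_PAGE, slug=slug, alternatives=alternatives)
        return make_response(html, 404)  # honest 404 status, helpful body
    return f"Product page for {slug}"
```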

Have more questions?

Do you have more questions for us? You are in luck because at SMX East this year we will present the latest about Bing and Google penalties and algorithms. Be sure to join us at SMX East!


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Fili is a renowned technical SEO expert and ex-Google engineer who was a senior technical lead in the Google Search Quality team. At SearchBrothers he offers SEO consulting services with SEO audits and SEO workshops, and successfully recovers websites from Google penalties. Fili is also a frequent speaker at SMX and other online marketing events.




See the ‘top signals’ informing your Google Ads bidding strategies

Google’s smart bidding strategies use a host of signals to inform bids with each auction. Now, Google is starting to show which signals drive performance as it optimizes bids for people more or less likely to convert.

Top signals. The signals shown might include device type, location, day of week, time of day, keywords, remarketing and Customer Match lists and potentially some other signals. You might also see combinations of signals such as time and keyword. Signals in red are less likely to convert in that strategy, while signals in green are more likely to convert.

Top signals for portfolio bidding strategies now show in Google Ads.

Where to see top signals reporting. The top signals will show in the bid strategy report. Keep in mind that this report is only available for portfolio bid strategies. The bid strategy report is located under Tools > Shared Library > Bid Strategies; then select a portfolio strategy.

Google said top signals will show for Target CPA and Maximize conversions on Search, but you may be able to see them for other portfolio strategies as well; the example above is for eCPC, in fact.

Why we care. Understanding which contextual signals have particular influence on your automated bidding can give you insights into your target customers and potentially inform your strategy. For example, if you see a keyword being “down signaled,” it may just be a poor match for that particular bid strategy, or perhaps there are ad or landing page optimizations you could make to improve its likelihood to convert.

You might also see trends that can inform other marketing efforts such as email send times. The screenshot above, for example, shows weekends are a strong signal. That could make weekends a better time than weekdays to test email flights.



About The Author

Ginny Marvin is Third Door Media’s Editor-in-Chief, running the day to day editorial operations across all publications and overseeing paid media coverage. Ginny Marvin writes about paid digital advertising and analytics news and trends for Search Engine Land, Marketing Land and MarTech Today. With more than 15 years of marketing experience, Ginny has held both in-house and agency management positions. She can be found on Twitter as @ginnymarvin.




Google’s ‘Duplex on the web’ enables the Assistant to buy movie tickets for you

Google is making it possible to use the Assistant (via Duplex) to buy movie tickets online. Back in May at Google I/O, the company announced that it was expanding the AI-powered Duplex beyond restaurant reservations to booking rental cars and buying movie tickets.

Duplex on the web. With what the company calls “Duplex on the web,” users will be able to use the Google Assistant for new reservation and purchase categories. Movies is the latest example.

Android users in the U.S. or U.K. can ask the Assistant for movie showtimes or search movies in the Google app. The Assistant will then lead searchers through a “buy tickets” process that involves theater selection, movie times and, if available, seat selection. A payment card saved in Chrome is needed for the purchase to work in this case.

Expanding to many more categories. It’s not clear that users will prefer this process to manually booking tickets. However, it illustrates how Google is bringing the sophistication of its Duplex technology to the broader mobile internet.

It’s also not clear how much back-end integration publishers need to do to enable this; I suspect not that much. Regardless, I’m sure Google has a roadmap that extends to many other categories where online scheduling, reservations and basic transactions are involved.

Rand Fishkin has been speaking, including at SMX East, about how Google has evolved from “everyone’s search engine to everyone’s competitor” and the SEO implications of this. My view is a bit different.

Why we should care. Google has now talked repeatedly about “helping users get things done” in search and with the Google Assistant. This is about making search more transactional and owning the transaction. Google is doing this in shopping and across the board in local (e.g., food ordering).

Google is trying to remove friction and compress the process between search and a sale. It’s handing that process off less and less to third parties and site owners. This helps Google 1) improve the consumer experience, 2) keep users within its system, 3) create a closed loop for analytics and 4) generate fees or revenue from commerce, which has implications for smart speakers.

If these capabilities (i.e., Duplex on the web) take off, publishers and brands will need to be partnered or integrated with Google actions/services or risk losing the transaction to a competitor. It will also mean that Google owns the customer.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes about the connections between digital and offline commerce. He previously held leadership roles at LSA, The Kelsey Group and TechTV. Follow him on Twitter or find him on LinkedIn.




Google Search Console adds Product results filters to performance report

Google announced it has added new filters to the performance report within Google Search Console to show you how well your product results are doing in search. Google now captures and displays click and impression data when rich results are shown based on your use of product rich result markup.

The report. Find this data under the Performance report by clicking on “search appearance” and then on “product results.” You’ll see clicks and impressions and can further segment by device, geography and queries.

What it looks like. Here is a screenshot of the report:

What is a product rich result? Below is a screenshot of what a product rich result looks like, but you can learn more about this in this developer document. Product rich results typically show product ratings, price, availability and some description information. Note that product rich results are not new, just the report in Search Console.

An example of a product rich result in Google search results.
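
If your pages are not yet eligible for product rich results, the prerequisite is schema.org Product structured data. Below is a minimal, hypothetical JSON-LD sketch generated with Python; every value is a placeholder, and Google’s developer documentation remains the authority on required and recommended properties.

```python
import json

# Hypothetical schema.org Product markup; all values are placeholders.
product_jsonld = {
    "@context": "https://schema.org/",
    "@type": "Product",
    "name": "Example Hiking Boot",
    "description": "Waterproof boot for day hikes.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "119.99",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

# Emit the script tag to embed in the product page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(product_jsonld, indent=2))
print("</script>")
```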

Why we care. The more data the better for SEOs and publishers, and this gives us more granular data on the impact of adding product rich result markup to our pages. Google said the report will show how much traffic comes from experiences with rich data like price and availability, how shopping traffic changes over time, and which shopping queries your website shows up for.


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.


