Ex-Googler Talks Penguin, SEO, Spam, & More


November 20th, 2012

Andre Weyher, an ex-member of Matt Cutts’ webspam team, was recently interviewed by Jayson DeMers on Search Engine Journal. Here’s what he had to say about Penguin, SEO, and spam.


1. What was your role on Matt Cutts’ team, and how long were you a part of it? 

The spam team is a pretty large organisation within Google. It consists of many people working towards one goal: keeping the organic search results free of poor-quality sites and penalising the ones that got their ranking through techniques that are against the Google guidelines. Within the spam team, people usually get their own speciality. I was responsible for content quality and backlink profiles. I’ve been with Google for 4.5 years, two of those in Matt Cutts’ team.

2. What’s Google’s process for determining when to apply a manual penalty to a website based on its inbound link profile?

When reviewing a profile, the spam fighter would look at the quality of the pages where the links are hosted and the anchors used in the links. If the profile and anchors are not consistent with what a “natural” profile would look like, action would be taken. Let’s take the example of a travel website: if there are 100,000 links coming in and 90,000 of them use an anchor like “cheap flights” or “book flight”, it would straight away arouse suspicion, because this would never be the case if the links were natural. The quality of the pages linking in is of critical importance. Are they authentic? Or do they purely exist to host the links?
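To make that anchor-distribution check concrete, here is a minimal Python sketch of the idea. It is my own illustration, not Google’s implementation, and the 60% threshold is an arbitrary placeholder:

```python
from collections import Counter

def anchor_concentration(anchors, threshold=0.6):
    """Flag a backlink profile in which one anchor text dominates.

    `anchors` is a list of anchor-text strings, one per inbound link.
    The 0.6 threshold is an arbitrary illustration, not a Google figure.
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    top_anchor, top_count = counts.most_common(1)[0]
    share = top_count / total
    return share > threshold, top_anchor, share

# The travel-site example from the interview: 90,000 of 100,000
# inbound links carry the same commercial anchor.
anchors = ["cheap flights"] * 90_000 + ["my favourite travel blog"] * 10_000
suspicious, anchor, share = anchor_concentration(anchors)
print(suspicious, anchor, f"{share:.0%}")  # True cheap flights 90%
```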

3. How does Google’s Penguin algorithm determine what domains to penalize? 

First of all, it’s important to stress that being affected, or, as people commonly put it, “slapped” by Penguin, is not the same as a penalty. It’s just a new, unfortunately disadvantageous ranking. A penalty is more severe. Penguin has been specifically designed to combat the most commonly used blackhat SEO techniques. The most obvious element it focuses on is ranking due to a large number of poor-quality backlinks, but it also takes into account spammy on-page techniques like keyword stuffing and over-optimization of tags and internal links.
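The on-page side can be illustrated the same way. Below is a toy keyword-density check in Python; real ranking systems are far more sophisticated, and the 5% cutoff is purely illustrative, not a published Google figure:

```python
import re

def keyword_density(text, keyword):
    """Return the fraction of words in `text` that match `keyword`.

    A crude on-page signal: an unnaturally high density is one of the
    classic symptoms of keyword stuffing.
    """
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

page = "cheap flights cheap flights book cheap flights today cheap flights"
density = keyword_density(page, "cheap")
print(f"{density:.0%}", "stuffed" if density > 0.05 else "ok")  # 40% stuffed
```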

4. How does Google spot blog networks and/or bad neighborhoods?

Search engines rely on website fingerprinting to identify clusters of ownership. If a particular website is relying on techniques that do not abide by the guidelines, it’s likely that the other sites owned by the same person are doing the same.
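The interview doesn’t say which fingerprints Google actually uses, but the clustering idea can be sketched with assumed signals such as a shared analytics ID or hosting IP. The signals and domains in this Python sketch are hypothetical:

```python
from collections import defaultdict

def cluster_by_fingerprint(sites):
    """Group sites that share any fingerprint value.

    `sites` maps a domain to a set of fingerprint strings, e.g. an
    analytics account ID or hosting IP. These particular signals are
    illustrative guesses, not confirmed Google inputs.
    """
    index = defaultdict(set)
    for domain, prints in sites.items():
        for fp in prints:
            index[fp].add(domain)
    # Any fingerprint shared by 2+ domains suggests common ownership.
    return [domains for domains in index.values() if len(domains) > 1]

sites = {
    "blog-a.example": {"UA-12345", "203.0.113.7"},
    "blog-b.example": {"UA-12345", "198.51.100.2"},
    "unrelated.example": {"UA-99999", "192.0.2.1"},
}
print(cluster_by_fingerprint(sites))  # [{'blog-a.example', 'blog-b.example'}]
```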

5. What’s the best way to recover a website that has been sent a notification via Google Webmaster Tools of manual spam action? 

That very much depends on the type of penalty that has been applied. There are two scenarios here: one regarding the quality of the content on the page itself, the second regarding the links coming in to it. In the first case it’s “merely” a question of adding value to your site. In most of these cases the penalty would be applied to a site that has affiliate links but does not offer the user any added value apart from clicking away to a third party.

In the second case it’s a bit tougher. If you have been relying on poor-quality link building, you have to get rid of as many bad links as you can. This used to be a very time-consuming and difficult process, but luckily the new disavow tool in WMT has made this much easier. You do have to be very careful with what you choose to disavow!
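For reference, the disavow tool accepts a plain-text file with one entry per line: a full URL to disavow a single page, or a domain: prefix to disavow every link from a site, with # marking comments. The domains below are placeholders:

```
# Example disavow.txt (the domains below are placeholders).
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a single page:
http://link-farm.example/paid-links.html
```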

6. What’s the best way to recover a website affected by Google Penguin? 

I wish there were an easy, straightforward answer I could give here, but the only thing I can recommend is to have a very critical look at your website and try to figure out what it is that Google saw that was not entirely in line with the guidelines. From what I have seen since I left the team, a lot of webmasters are relying on techniques that they know are risky. After Penguin it’s very difficult to get away with it, so my advice would be to stop any grey activity and focus on creating compelling content and leveraging social signals. These have become very important.

7. What are some of the biggest misconceptions or myths you’ve seen about “bad links” and link profile penalties in the SEO community? 

Some of the biggest misconceptions I have seen out there include “directories are altogether bad” and “anything below a certain PR is considered spammy by Google”. I see a lot of people panicking and cutting off the head to cure the headache due to lack of knowledge. The most dangerous one of all, I would say, is the opinion that if an automated link building scheme is expensive, it must be good. Google has made it very clear that it wants links to be a sign of a real reason to link, an AUTHENTIC vote of confidence if you will. Anything that is paid for is not considered quality by Google, and participating in it puts your site at risk!

8. What do SEOs need to know right now to prepare for future link profile-related algorithm updates? 

I think the best way of preparing yourself for future updates is to build an SEO strategy that depends on smart on-page techniques and internal linking on one side, and relationship-based link building on the other. This means that the links you obtain should come from a source that has a genuine reason to link to your site. The relevance of your linking partner to the topic of your site is the key!


I have also started my own blog, where I will be taking questions about SEO and online marketing; you can find it at http://netcomber.com/blog/. I invite everyone to challenge me with their questions.
