Google has a lot of stuff in the works that will have a direct impact on webmasters and the search engine optimization community. In a seven-minute “Webmaster Help” video, Google’s Matt Cutts (sporting a Mozilla Firefox shirt), ran down much of what Google’s webspam team has planned for the coming months, and what it all means for webmasters. It involves the Penguin update, the Panda update, advertorials, hacked sites, link spam, and a lot more.
Are you paying close attention to Google’s algorithm updates these days? Are you looking forward to the updates, or are you afraid of what they will bring?
Cutts is careful to note that any of this information is subject to change, and should be taken with a grain of salt, but this is pretty much the kind of stuff they have planned at the moment.
We already knew the Penguin update was on the way, and he touches on that.
“We’re relatively close to deploying the next generation of Penguin,” says Cutts. “Internally we call it ‘Penguin 2.0,’ and again, Penguin is a webspam change that’s dedicated to try to find black hat webspam, and try to target and address that. So this one is a little more comprehensive than Penguin 1.0, and we expect it to go a little bit deeper and have a little bit more of an impact than the original version of Penguin.”
Google recently changed its updating strategy for Panda. Webmasters used to anxiously await coming Panda updates, but Google has turned it into a rolling update, meaning that it will continue to update often and regularly, to the point where anticipating any one big update is not really possible any longer. On top of that, Google stopped announcing them, as it just doesn’t make sense for them to do so anymore.
That doesn’t mean there isn’t Panda news, as Cutts has proven. It turns out that the Panda that has haunted so many webmasters over the last couple years may start easing up a little bit, and become (dare I say?) a bit friendlier.
Cutts says, “We’ve also been looking at Panda, and seeing if we can find some additional signals (and we think we’ve got some) to help refine things for the sites that are kind of in the border zone – in the gray area a little bit. And so if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, then that will help sites that have previously been affected (to some degree) by Panda.”
Sites And Their Authority
If you’re an authority on any topic, and you write about it a lot, this should be good news (in a perfect world, at least).
“We have also been working on a lot of ways to help regular webmasters,” says Cutts. “We’re doing a better job of detecting when someone is more of an authority on a specific space. You know, it could be medical. It could be travel. Whatever. And try to make sure that those rank a little more highly if you’re some sort of authority or a site, according to the algorithms, we think might be a little more appropriate for users.”
Also on the Google menu is a bigger crackdown on advertorials.
“We’ve also been looking at advertorials,” says Cutts. “That is sort of native advertising – and those sorts of things that violate our quality guidelines. So, again, if someone pays for coverage, or pays for an ad or something like that, those ads should not flow PageRank. We’ve seen a few sites in the U.S. and around the world that take money and do link to websites, and pass PageRank, so we’ll be looking at some efforts to be a little bit stronger on our enforcement as far as advertorials that violate our quality guidelines.”
“There’s nothing wrong inherently with advertorials or native advertising, but they should not flow PageRank, and there should be clear and conspicuous disclosure, so that users realize that something is paid – not organic or editorial,” he adds.
Queries With High Spam Rates
Google will also be working harder on certain types of queries that tend to draw a lot of spam.
Cutts says, “We get a lot of great feedback from outside of Google, so, for example, there were some people complaining about searches like ‘payday loans’ on Google.co.uk. So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including for example, some more pornographic queries, and some of these changes might have a little bit more of an impact on those kinds of areas that are a little more contested by various spammers and that sort of thing.”
Denying Value To Link Spam
Google will continue to be vigilant when it comes to all types of link spam, and has some new tricks up its sleeve, apparently.
Cutts says, “We’re also looking at some ways to go upstream to deny the value to link spammers – some people who spam links in various ways. We’ve got some nice ideas on ways that that becomes less effective, and so we expect that that will roll out over the next few months as well.”
“In fact, we’re working on a completely different system that does more sophisticated link analysis,” he adds. “We’re still in the early days for that, but it’s pretty exciting. We’ve got some data now that we’re ready to start munching, and see how good it looks. We’ll see whether that bears fruit or not.”
Hopefully this won’t lead to a whole lot of new “fear of linking” from webmasters, as we’ve seen since Penguin first rolled out, but that’s probably wishful thinking.
Google intends to get better on the hacked sites front.
“We also continue to work on hacked sites in a couple different ways,” says Cutts. “Number one: trying to detect them better. We hope in the next few months to roll out a next-generation site detection that is even more comprehensive, and also trying to communicate better to webmasters, because sometimes they see confusion between hacked sites and sites that serve up malware, and ideally, you’d have a one-stop shop where once someone realizes that they’ve been hacked, they can go to Webmaster Tools, and have some single spot where they could go and have a lot more info to sort of point them in the right way to hopefully clean up those hacked sites.”
Clusters Of Results From The Same Site
There have been complaints about domain clustering in Google’s results, and Google showing too many results from the same domain on some queries.
Cutts says, “We’ve also heard a lot of feedback from people about – if I go down three pages deep, I’ll see a cluster of several results all from one domain, and we’ve actually made things better in terms of – you would be less likely to see that on the first page, but more likely to see that on the following pages. And we’re looking at a change, which might deploy, which would basically say that once you’ve seen a cluster of results from one site, then you’d be less likely to see more results from that site as you go deeper into the next pages of Google search results.”
“We’re going to keep trying to figure out how we can give more information to webmasters…we’re also going to be looking for ways that we can provide more concrete details, [and] more example URLs that webmasters can use to figure out where to go to diagnose their site.”
So Google has a lot of stuff in the works that SEOs and webmasters are going to want to keep a close eye on. It’s going to be interesting to see the impact it all has. Given that Google makes algorithm changes every day, this has to be far from everything they have in the works, but I guess the video makes up for the lack of “Search Quality Highlights” from Google in recent months. I’m still wondering if those are ever coming back. They were, after all, released to keep Google more transparent.