This is bizarre, and totally open to abuse. What kind of data do you think could be gleaned from a Google Docs form where anyone can anonymously submit URLs...
More importantly, why does Google even resort to this kind of public submissions system?
- To show face and let search professionals feel involved in cleaning up spam
- Because they want to pattern match between their computed "spam" types and what search professionals are calling "spam" to make adjustments and/or re-educate the industry.
- ... because they genuinely don't have enough data and don't know where to move next
Whatever the reason, it's a weird move. What next?
A tweet from Matt Cutts saying they've always been doing this. True, https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1
But this time it's an active call to action (Googlers don't tweet and blog to a community regularly asking them to submit spam reports).
And the anonymity is definitely an issue. If Google uses G+, authorship and social signals to apportion weight to links... they can definitely use the wealth of information they have about us as users to decide whether our spam reports are kosher or not.
Even if that were optional I'd be happy :)
Oh, come on, Ed! So much drama over another tool, similar to their forums but easier to use right after an update, that makes it easier for the webspam team to hear feedback. They clearly won't act on most submissions (just like with the disavow tool). And it's not like they HAVE to resort to this move; they couldn't care less. I'm assuming it's because some "SEOs" will start pointing to flaws in the algo update and, for some reason, don't know how to report that on Google's forums. It's just more convenience for them. Nothing more than that...
It's free work, and less PR for them to deal with. Google acknowledges that there will be flaws. If someone abuses the system, the shame is on us, not on Google.
Bloody hell. If they're having to resort to this in their efforts to fight spam, they're a lot further behind than I thought they were.
Yeah, we were expecting a more stable Google after this Penguin release, but it seems we're in the same boat as last April. The sites I'm seeing in top spots are mind-blowing.
I'd say it's a mistake to judge the submission form at face value. There'll be a lot of clever stuff going on below the surface. Here's what I'd be doing with that data if I were Google.
1) Waiting until I've got a shedload of data and then analysing it for patterns - i.e. which URLs/sites keep cropping up? (There's a rough sketch of this after the list.) That'll get rid of any domains which have been dropped in there by people hoping to get rivals penalised. It's really hard for a small group of people to mess with large-scale crowdsourced data.
2) Analyse the link profiles of the sites which do keep cropping up and see if any similarities arise (this will detect any link networks which are still helping sites rank).
3) Manually review the sites if need be.
4) Blitz any link networks that have been discovered, and roll any other useful takeaways into a Penguin 2.1 update.
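Purely to illustrate step 1, here's a minimal Python sketch of that frequency analysis. The submission rows, domains, and threshold are all made up for illustration; nobody outside Google knows how they'd actually process the form.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical export of the form's rows: (spammy_url, serp_url).
submissions = [
    ("http://spam-widgets.example/page1", "https://www.google.com/search?q=widgets"),
    ("http://spam-widgets.example/page2", "https://www.google.com/search?q=widgets"),
    ("http://rival-site.example/", ""),  # lone drive-by report
]

# Count reports per domain rather than per URL, so repeat offenders surface
# even when the individual pages reported differ.
domain_counts = Counter(urlparse(url).netloc for url, _ in submissions)

# Made-up threshold: a domain reported only once or twice looks more like a
# rival-takedown attempt than genuine large-scale crowdsourced signal.
MIN_REPORTS = 2
candidates = {d: n for d, n in domain_counts.items() if n >= MIN_REPORTS}
print(candidates)  # {'spam-widgets.example': 2}
```

The point is simply that one-off malicious reports wash out of aggregated data, while genuinely spammy domains keep resurfacing.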
It's a smart move, and I'd argue one any white hat SEO worth his/her salt should be helping out with.
Anyone else feel like we're in 1984? Reporting fellow SEOs for "thoughtcrimes"?
Going off Ed's question about why Google would resort to this sort of public submissions system: getting SEOs to buy into this type of scheme can definitely lead the industry into a state of complacency. With this process working at scale, SEOs will report one another left and right whenever there's any deviation from the norm. That gives Google's webspam team greater power and authority over movement in the SERPs, and SEOs less wiggle room for anything but the status quo.
Now this isn't inherently a bad thing; Google's done pretty well at "doing no evil" so far. Regardless, the parallels to having a Thought Police, especially when Google has such a large market share in the search vertical, are something all SEOs should keep an eye on in the long run. Simply looking at the short-term implications of such a tool isn't enough for an industry where our paychecks depend on Google's algo changes.
This is very disturbing. C'mon Google. Don't you know people will just abuse this form?
Interesting, but is there a source on this? Has this been published to the SEO community or did an intrepid investigator just find it?
I don't think this is particularly bizarre - Cutts has often had some kind of backchannel feedback available for major updates: special inboxes, or careful attention paid to certain Webmaster World threads. They certainly have the manpower within their army of search quality evaluators to handle these submissions. They might not have any goal here except to root out the edge cases.
The fact that this form is completely anonymous is upsetting. Anyone can report anyone for any reason. If you're going to report spam you should at least be required to stand behind your report.
Submitted all of my competitors. I'm the best SEO eva, only way they can outrank me is if they're spamming. The proof is in the pudding. :)
I believe the potential for abuse of the form might be high, but the potential for harm coming from that abuse is very low. It's not like they'll take each submission as hard truth. What's interesting is that the spammy URL is the only required field, while the Google result URL is secondary.
The result URL gives them MUCH more information to cross-reference against, and those who are using the tool properly are more likely to actually provide that additional data. Makes you wonder whether the people spamming their competitors' sites into the doc will even be given any weight.
Either way, I highly doubt they'll be taking the data they collect at face value.
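To make that concrete, here's a tiny Python sketch of what weighting by the optional field could look like. The rows, weights, and scoring are pure guesswork on my part, just the general shape of the idea:

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical form rows: (spammy_url, google_result_url). Only the first
# field is required on the real form; the second is optional.
reports = [
    ("http://thin-affiliate.example/buy", "https://www.google.com/search?q=buy+widgets"),
    ("http://thin-affiliate.example/buy", ""),
    ("http://rival.example/", ""),  # bare report, no corroborating SERP
]

scores = defaultdict(float)
for spam_url, result_url in reports:
    # Invented weights: a report that bothers to include the SERP it was
    # found on looks more like genuine use of the tool than a drive-by.
    weight = 1.0 if result_url else 0.25
    scores[urlparse(spam_url).netloc] += weight

print(dict(scores))
# {'thin-affiliate.example': 1.25, 'rival.example': 0.25}
```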
Hey, Inbound.org's latest redesign of the submission looks an awful lot like Goog... oh dear.
You guys are acting like Google will immediately de-index any site sent through that form. More likely, it's the same situation they had with Penguin: there were sites on the line in terms of what the algorithm was flagging to devalue, and they later tweaked it to be a bit more lenient so that borderline sites weren't penalized. Similarly here, they probably have sites that are on the line in terms of spamminess, and they can use data like this to cross-reference with what their algorithm is reporting and increase their certainty in the results. So I'd say that even if the form is abused, you're probably just giving sites already on the edge a little push :-)
The fact that they have resorted to this tells me all I need to know about the current state of the algorithm: after the release of Penguin 2.0, the spam is still there and the update has not had the dramatic effect they were looking for.
They've resorted to getting the community to submit its own findings, which I imagine they'll manually review before dishing out penalties. It's worrying that even after all this time and all the updates, spam is still winning.
You'd think it'd be open to abuse, but couldn't their WMT webspam report be misused just the same?
Crowdsourcing spam reports to fill in the gaps in the algorithm update makes some sense, but I think Google will regret allowing anonymity. Having to login to WMT would at least cause people to stop and think a minute. Being anonymous allows for reporting of "SPAM" meaning Site Positioned Above Mine, and will lead to mountains of useless reports.
Hey Guys, the site which I reported for spam is now removed from search result. Thanks!