Many of the changes we make are so subtle that very few people notice them. But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.
Translation: “We see you people trying to game our system by doing a bunch of bogus link crap. Stop it.”
The always-sharp Marco Arment had a post earlier today about Google’s increasingly spam-filled, decreasingly useful search results. He writes, in part:
… it’s now nearly impossible to find good results for many commonly asked types of queries.
Part of what exacerbates this is the apparent explosion recently of cheap-“content” sites that try to answer every search query ever asked. Like affiliate-marketing spam, much of it seems to be generated by humans (technically — I wouldn’t classify them as such), but it’s functionally useless: sites like About.com, eHow, and countless clones with .info domain names that promise to address every niche question and informational topic, but whose content lacks all quality and substance.
When I read this earlier this morning, I said, “Good points” and tucked it away. This afternoon I did some Googling for great “OS X WYSIWYG HTML editors” (wow, that’s a mouthful). First result: webdesign.about.com:
#1 : Dreamweaver is one of the most popular professional web development software packages available. It offers power and flexibility to create pages that meet your needs. I use it for everything from JSP, XHTML, PHP, and XML development. It is a good choice for professional web designers and developers…
What a joke. Way to pull crap right off the box from Adobe. Marco suggests the solution is for Google to change their algorithm. Which would work for a while, and then we’d be right back to square one.
Plus, I’ll admit as a developer that it’s hard not to try to game the system, at least a little. Small business owners used to be able to open up a store on Main Street and bam, they’d probably be the only hardware, jewelry or grocery store in miles. Now, these same people are fighting a war with the likes of Amazon and losing miserably. How can you not try to get them up to the top at least a little? What hope do they have? Sure, they need to be relevant, but they only used to have to be relevant in their small town. Now you want them to be relevant across the globe?
Maps and Places are still good options for some searches because they’re largely human-driven and the verification techniques work relatively well. That said, a number of web designers and other service providers in Indianapolis list their location as “Washington and Meridian Streets”, the dead center of town, despite actually being based in Carmel or some other suburb. How can I report those people?
Maybe Google needs to be a little less automated and a little more human?
My suggestion would be for Google to get more customizable. I know I can use filters and operators within the search box, but typing those out is a pain every time I want to do a search. Why can’t I just tell Google, “Never show me results from about.com, ehow.com, etc.”? A simple blacklist feature in my account settings could go a long way.
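To make the idea concrete, here’s a minimal sketch of what such a per-account blacklist filter might look like. Everything here is hypothetical — the domain list, the result format, and the function names are all made up for illustration:

```python
from urllib.parse import urlparse

# Hypothetical per-user blacklist, as it might live in account settings.
BLACKLIST = {"about.com", "ehow.com"}

def is_blocked(url, blacklist=BLACKLIST):
    """Return True if the result's host is a blacklisted domain."""
    host = urlparse(url).netloc.lower()
    # Match the domain itself and any subdomain (e.g. webdesign.about.com).
    return any(host == d or host.endswith("." + d) for d in blacklist)

def filter_results(results, blacklist=BLACKLIST):
    """Drop blacklisted results before they ever reach the results page."""
    return [r for r in results if not is_blocked(r["url"], blacklist)]

results = [
    {"url": "http://webdesign.about.com/od/htmleditors/tp/osx.htm"},
    {"url": "http://www.barebones.com/products/bbedit/"},
]
print(filter_results(results))  # only the barebones.com result survives
```

The subdomain check matters: blocking `about.com` should also kill `webdesign.about.com`, which is exactly the result I complained about above.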
Or how about a Digg/Reddit-style system where a person can upvote or downvote sites for specific searches? So if this blog shows up while you’re searching for a recipe, you can push me down. But if I show up and offer you something useful on a “web development” search, vote me up.
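The key detail in that idea is that votes are scoped to the query, not the site as a whole. A toy sketch of what per-query vote reranking might look like — again, all names and the scoring formula are assumptions, not anything Google actually does:

```python
from collections import defaultdict

# Hypothetical vote store: (query, domain) -> net votes (upvotes - downvotes).
votes = defaultdict(int)

def vote(query, domain, delta):
    """Record an upvote (+1) or downvote (-1) for a domain on one query."""
    votes[(query, domain)] += delta

def rerank(query, results):
    """Re-sort results: base relevance score nudged by per-query net votes."""
    return sorted(results,
                  key=lambda r: r["score"] + votes[(query, r["domain"])],
                  reverse=True)

vote("recipe", "thisblog.example", -2)          # pushed down on recipe searches
vote("web development", "thisblog.example", 3)  # voted up where it's useful

results = [{"domain": "thisblog.example", "score": 5.0},
           {"domain": "other.example", "score": 6.0}]
print(rerank("recipe", results)[0]["domain"])           # other.example
print(rerank("web development", results)[0]["domain"])  # thisblog.example
```

Because the votes key on `(query, domain)`, the same site can sink for one kind of search and float for another — which is the whole point.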