
Google hints at demoting “low-quality” sites in favor of “high-quality” ones

In various talks, conferences, and patent filings, Google has hinted it’s preparing to weight search rankings even more heavily on “quality”.

Google has already done a bit of this for years. It used to be that every website was treated the same: a link was a link, and the more links pointed to your site, the higher you ranked. Today, pages still rise in Google’s rankings based on the number of other websites that link to them. But to combat spammy “link farms” (sites that exist just to link to other sites), Google excludes those sites from its calculations and sometimes penalizes people for using them. Instead, Google prefers to see links to your site from other sites that have inherent authority.
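To make the link-counting idea concrete, here’s a toy sketch in the spirit of PageRank, the algorithm Google originally built on. The sites, links, and numbers here are invented for illustration; Google’s real system is far more complex.

```python
# Toy link-based ranking in the spirit of PageRank (illustration only).
def rank(links, iterations=20, damping=0.85):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            for t in targets:
                # Each page passes a share of its own score to what it links to.
                new[t] += damping * score[page] / len(targets)
        score = new
    return score

# Hypothetical link graph: many sites link to nytimes.com, giving it
# authority; it links once to competitor.org. Two small blogs link to you.
links = {
    "fan1.com": ["nytimes.com"], "fan2.com": ["nytimes.com"],
    "fan3.com": ["nytimes.com"], "fan4.com": ["nytimes.com"],
    "fan5.com": ["nytimes.com"],
    "nytimes.com": ["competitor.org"],
    "smallblog1.net": ["yoursite.org"],
    "smallblog2.net": ["yoursite.org"],
}
scores = rank(links)
# One link from a heavily linked-to hub outweighs two links from tiny
# blogs: competitor.org ends up scoring higher than yoursite.org.
```

This is the dynamic described above: where a link comes from matters more than how many links you have.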

So imagine you have a website about Abraham Lincoln and so does your competitor. You both launch on the same day at the same time. But the White House and The New York Times decide to link to your competitor’s site and not yours. Those sites carry far more inherent quality and authority, so your competitor’s site now far outranks yours in Google’s results.

You could compete by getting a lot of smaller sites to link to you, but it’s known that The New York Times, an educational institution (with a .edu domain), or a government website (with a .gov) carries inherently more authority.

In the future, it seems Google is poised to take its rankings even further by analyzing the quality of the content and not just the domain name. So if you start getting every random small blog you can find to link to your site, it may not help much.

What counts as low quality content?

A site that consists mostly of boilerplate content, or of redundant material from other sites (meaning someone has copied or re-posted material from an older source), will get dinged as “low quality” or “duplicate content”.

The goal is to make sure original authors and original work are rewarded with higher rankings and higher viewership.

This also means content duplicated internally across a website, like in a footer or sidebar, can be seen as “duplicate content” and is only counted by Google once. The repeated instances, while useful for humans, will be ignored by Google.
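A rough way to picture how duplicate content gets detected: break each page’s text into overlapping word sequences (“shingles”) and measure the overlap between pages. This is a simplified sketch with made-up sentences, not Google’s actual method, and the threshold is arbitrary.

```python
# Sketch of near-duplicate detection via word shingles and Jaccard overlap.
def shingles(text, n=3):
    """All overlapping n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard similarity of two texts' shingle sets, between 0 and 1."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Our mission is to preserve historic door knockers across Indiana"
reposted = "Our mission is to preserve historic door knockers across Indiana and beyond"
fresh    = "Brass door knockers corrode quickly in humid climates without a lacquer coat"

sim_dup = similarity(original, reposted)  # high overlap: looks like a re-post
sim_new = similarity(original, fresh)     # near zero: reads as original work
```

Tacking a few extra words onto copied text barely moves the overlap score, which is why a sentence or two of added commentary isn’t enough to make a re-post “original”.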

What does this mean for my site?

If your website currently receives few to no links from other sites, or if it’s linked to mostly by large content aggregators or by sites of little to no use, Google will likely discount those links and thus lower your site’s value.

So an example might be a nonprofit’s website that is linked to by a few small blogs and some other small partners, like a local church or other small nonprofit. You know these kinds of sites when you see them: they have little to no information and the information that is presented isn’t of much use to most viewers. Things like mission statements and generic “welcome to my website” paragraphs won’t cut it.

Social websites like Facebook and Twitter do not count. Posting links on those services will not modify your rankings in Google search results up or down (but may yield human visitors, which is never a bad thing). The exception is when Google notices “social signals”. If your Tweet or status update goes viral, that will trigger Google to increase your site’s overall ranking.

What can be done to get ahead of the curve?

I’m looking long-term at this and thinking Google has either already quietly started to do this or will fully implement this vision within the next year or two.

If you’re reliant on your website for leads, customers, or donors, there are some steps worth taking now:

  1. Avoid text-based images. We try very hard to push back when people send us an image that’s really just a bunch of text on top of a photo, but sometimes that falls on deaf ears. Text embedded in an image has zero search value because Google can’t read the words inside it. It’s like reading a book where each page is a picture of another book’s page.
  2. Set up a blogging schedule. Almost every site we build today has a blogging mechanism available or included. Treat it as part of your routine and write with your audience in mind.
  3. Write in your blog information that is important to your audience. If your business is about door knockers, don’t bother writing about your vacation or last week’s Game of Thrones. Write about the products, the finishes, the metals, how they’re made, applications and use cases, how to repair them, and other information related to just door knockers.
  4. Be specific and avoid the obvious. If a reasonable human being could read your post and end it by saying, “Duh”, your writing is too obvious and will be drowned out by competitors. So if your site is about gardening, don’t write a whole post about the importance of watering your flowers when it’s hot. But if you wrote about what actually happens when flowers go unwatered in the heat, that could be unique and authoritative. For instance, “Here’s what happens to a petunia’s root system when it goes without water for just one 90-degree day”.
  5. Do not re-post material from other sites. If there’s a news story that morning in the newspaper or on a TV news site, merely copying part or all of it will be of little use. That will be dinged as “low quality” and “duplicate content”. Google can identify the original source based on the reported server time a story was published. While the story may be worth discussing on your site, you’ll need to write more about why it’s important, what it means, or what further details exist before it counts as a unique piece of work. And that has to be substantive: an extra sentence or two is not useful.
  6. If your site is a bit broader — like for a writer — consider how you can focus in on a niche. If that doesn’t seem doable and you’re not writing about a specific book or piece of work, focus on writing pieces that generate interest from other writers, bloggers, and sites to get them to link to you in large numbers.
  7. Keep at it. Sites are like wine: they get better with age. Google rewards older sites. Plus, there’s value in quality and quantity.
  8. Ensure mobile friendliness. We do this by default on every website we build today and have gone back to redo older sites that pre-dated the iPhone (earlier than 2007). But sites that aren’t mobile-friendly can be demoted so far in mobile results that they effectively don’t display to a person Googling from a phone or tablet; you’d mainly appear in desktop searches.
  9. Using secure connections can be a signal for a higher ranking, but currently isn’t a huge boost. We’re looking at ways to use SSL connections (https:// instead of http://) on all of our websites, but this does cost money for the security certificates.
  10. Ensure speed and performance. The web community has largely not cared about speed and performance of page load times, but we have kept it in mind. This is why most of our sites don’t use flashy elements or animations. We’re looking at ways to further improve page load times, as Google’s hinted at rewarding faster, leaner sites in the future.
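On the SSL point in step 9: once a security certificate is installed, the usual companion step is redirecting all plain-HTTP traffic to the HTTPS address. On an Apache server with mod_rewrite enabled (a common shared-hosting setup, assumed here), that’s a few lines in the site’s .htaccess file:

```apache
RewriteEngine On
RewriteCond %{HTTPS} off
# 301 = permanent redirect, so search engines update their index to https://
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The permanent (301) redirect matters for rankings: it tells Google the https:// address is the one true copy of each page rather than a duplicate of the http:// version.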

In part, this is a lot like writing papers in school. Don’t copy or plagiarize, and if you do copy, cite your sources and do it purposefully. Do it regularly and consistently and the audience and search traffic will follow.




Justin has been around the Internet long enough to remember when people started saying “content is king”.

He has worked for some of Indiana’s largest companies and for state government, has taught college-level courses, and about 1.1M people see his work every year.

You’ll probably see him around Indianapolis on a bicycle.
