Google applies automatic filters that adjust a site’s position in SERPs according to its behaviour.
By behaviour I mean three things: the site’s content, its on-page optimisation and its off-page optimisation.
Experienced SEOs say there are hundreds of such filters, but Google has never officially announced them, nor even acknowledged their existence. Because of that, the filters’ behaviour is determined on a purely empirical basis, and their names vary widely according to the personal tastes of the people who coin them.
Despite the various rumours and uncertainties, there are a few filters that have more or less proven their existence and are widely recognised by many SEOs and webmasters.
A good (though rather speculative) review of Google filters is given in this blog post.
In this article I will mention only four Google filters (in my opinion, the most prominent):
- Google Sandbox filter
- Google -N filter
- Google overoptimisation filter
- Google duplicate content filter
Strictly speaking, even that major categorisation above is logically incorrect, as you can’t compare apples with horses. What on Earth am I talking about?! Well, it’s simple: there are filters classified by what they do and filters classified by what they are for. If we accept that classification, then there are actually only two kinds of filters:
- “Stay in the basement” and
- “Go down by so many floors”
What I mean is this: some filters (like the Google Sandbox) do not allow your site to climb high enough to be visible in SERPs. Most frequently this means a 200th or even 600th position in SERPs for your targeted keywords. The other kind of filter makes your site drop in the rankings by N positions. The most frequent are the -30, -60 and -950 penalties; however, there is a whole range of other “-N” Google filters or penalties.
What are Google Filters for?
I believe that all conceivable reasons for Google filters or penalties can be summarised by five major factors:
- Low Trust Rank (a new site or site linking out to spam/dodgy sites like those advertising porn, casino, pharmacy, using link farms, doorways or other “black-hat” methods)
- On-page overoptimisation (keyword, heading and meta tag stuffing)
- Duplicate content issues (either copied from other sites or due to dynamically generated pages that strongly overlap in content with each other)
- Off-page overoptimisation (reckless link-building policy: spikes in link building rate; heavily overused anchor text, paid links etc.)
- Server performance (slow page loading, long and frequent outages)
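On the server-performance factor: you can get a rough feel for how your pages are doing by timing a few requests yourself. Below is a minimal sketch in Python; the 2-second threshold is my own illustrative choice, not any limit Google has published.

```python
import time
import urllib.request

SLOW_THRESHOLD = 2.0  # seconds; a hypothetical cut-off, pick your own

def fetch_time(url, timeout=10):
    """Return how long a single GET request to url takes, in seconds."""
    start = time.perf_counter()
    urllib.request.urlopen(url, timeout=timeout).read()
    return time.perf_counter() - start

def flag_slow(samples, threshold=SLOW_THRESHOLD):
    """Given a list of measured load times, return the ones over the threshold."""
    return [t for t in samples if t > threshold]
```

Collect `fetch_time("http://www.example.com/")` samples at different times of day; if `flag_slow` keeps returning hits, it is worth talking to your host before worrying about anything more exotic.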
How to lift Google filters?
Plainly speaking, you have to remove the cause that triggered the filter in the first place. Let me just list the most important do’s and don’ts that will help you avoid or lift a Google filter:
- Don’t do keyword or meta tag stuffing; texts should read naturally
- Avoid copying and pasting others’ content
- Remove or “noindex” pages with duplicate content (this could be quite an issue on blogs, where the same posts appear on different pages sorted by categories, tags, in archives etc.)
- Build your links steadily over time; avoid sudden spikes in link popularity
- Do not link out to spam sites or sites that link to spam (in particular - adult, gambling, pharmacy and other dubious quality sites)
- Don’t engage in any “black-hat” activity (doorways, link farms, cloaking etc.)
- Use a quality hosting provider that offers high-performance, fast servers with short and infrequent outages (performance stats for most hosts are publicly available; just search Google)
- Try to acquire (by link-bait, negotiation etc.) links from topically related, trustworthy sites that are SERP leaders in their area.
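On the duplicate-content point: a quick way to estimate how strongly two of your pages overlap is to compare their word shingles with Jaccard similarity. This is a generic text-similarity technique, not anything Google has documented about its own filters, and the 0.5 threshold below is purely my own illustrative assumption.

```python
def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word tuples) of a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

def looks_duplicate(a, b, threshold=0.5):
    """A hypothetical cut-off: flag page pairs that overlap too much."""
    return jaccard(a, b) >= threshold
```

Run pairs of your own pages (say, a post page versus its tag-archive copy) through `looks_duplicate`; pairs that score high are the first candidates for removal or “noindex”.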