How can businesses identify that they are subject to a penalty?
The obvious way is if they have warnings reported in Google Webmaster Tools.
Beyond this, keep a close eye on your organic search traffic and rankings to spot any recent drops in performance (Searchmetrics' SEO Visibility report is normally very useful for this, especially for comparisons against competitors in your sector).
One thing I would highly recommend is link cleanups and frequent auditing for any site, whether penalised or not. You want to make your profile as clean as possible; prevention is definitely better than cure!
The telltale signs of something odd happening to a site are drops in rankings and/or traffic. There are certain steps every SEO takes when identifying the possible cause.
In my case, I’d first check whether any changes were made to the site recently. After ruling out abnormalities on the website itself, such as blocking Google from crawling or indexing the site (you would be surprised how often that happens!), we may assume that we're dealing with a Google penalty.
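Those crawl/index abnormalities can be checked programmatically before blaming Google. A minimal sketch in Python (standard library only; the helper names and sample values are my own, and the meta-robots regex is deliberately simplified):

```python
# Sanity checks for accidental crawl/index blocking -- a common false
# alarm before assuming a Google penalty.
import re
from urllib.robotparser import RobotFileParser

def is_blocked_by_robots(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Return True if the robots.txt rules would stop `agent` fetching `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, url)

# Simplified: assumes name= comes before content= in the meta tag.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex', re.I)

def has_noindex(html: str) -> bool:
    """Return True if the page carries a meta-robots noindex directive."""
    return bool(NOINDEX_RE.search(html))

if __name__ == "__main__":
    robots = "User-agent: *\nDisallow: /\n"
    print(is_blocked_by_robots(robots, "https://example.com/page"))        # True
    print(has_noindex('<meta name="robots" content="noindex,follow">'))    # True
```

In practice you would fetch each page's HTML and your live robots.txt (e.g. with `urllib.request`) and run key URLs through both checks.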
Now we need to investigate the type of penalty we're dealing with. Is it a manual action or an algorithmic penalty?
To answer that, go to Webmaster Tools and check for any messages under “Manual Actions”. If there are none, then we are more likely dealing with an algorithmic penalty, and further investigation takes place (is it Panda or Penguin related?).
It is also important to analyse how the penalty affects your site. Is it a site-wide (hopefully not!) or a partial penalty? A simple check of indexing can give some early answers.
Julia Logan (aka Irish Wonder):
Depending on the penalty, it used to be possible to identify Penguin and Panda with high likelihood if the date of the traffic loss/drop coincided with known updates.
Combined with the site and/or backlink profile analysis confirming the initial guess, the diagnostics could be pretty precise and relatively straightforward (unless the dates of two updates coincided and/or more than one penalty affected the site in question).
However, as Google has announced that Penguin will now become a "rolling" update rather than one tied to fixed dates, the date-oriented approach is no longer applicable, at least for Penguin, so we have to rely solely on immediate diagnostics of the site, its link profile and the nature of the traffic loss.
Finally, let's not forget that there are other kinds of penalties besides those from Penguin and Panda, and depending on the site vertical and nature they can also apply.
So sites in a number of highly competitive verticals should keep an eye on things like the Payday Loans 'update' (I put 'update' in quotes as I believe this to be a periodic manual cleaning effort rather than an algorithmic update - but nevertheless its effect on targeted sites can be devastating); local sites should watch for Pigeon and the like, etc.
The most straightforward way to tell whether there is a penalty in place is to log into the Webmaster Console.
The catch? There might be some debate around the semantics but not all 'penalties' are included in the Webmaster Console reports.
These are harder to be sure of, but as a rule of thumb: if a large chunk of text from a page, searched for as an exact match (in quotes), does not rank in first position, then there’s a problem.
If you have seen a dip in traffic, first try to be certain it’s the result of an algorithm change:
Check other channels to rule out market conditions.
Check analytics tracking codes are present on all pages and working properly.
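The tracking-code check above can be automated by crawling a sample of pages and flagging any whose HTML lacks the analytics snippet. A rough sketch, assuming classic "UA-…" property IDs (an assumption; adjust the pattern for gtag.js or Tag Manager):

```python
# Flag pages that are missing the analytics tracking ID. The regex
# matches classic "UA-XXXXXX-X" property IDs -- adapt to your setup.
import re

TRACKING_RE = re.compile(r"UA-\d{4,10}-\d{1,4}")

def pages_missing_tracking(pages: dict) -> list:
    """Given {url: html}, return URLs whose HTML lacks the tracking ID."""
    return [url for url, html in pages.items() if not TRACKING_RE.search(html)]

sample = {
    "/": "<script>ga('create', 'UA-123456-1');</script>",
    "/about": "<html><body>No snippet here</body></html>",
}
print(pages_missing_tracking(sample))  # ['/about']
```

A page with a missing snippet will report zero traffic in analytics, which is easily mistaken for an algorithmic hit.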
Use analytics to isolate the problem:
Look at individual pages e.g. home vs. deep pages.
Look at page types as a group (e.g. all categories vs. all products).
Look at pages grouped around topic areas – e.g. dresses vs. jackets.
The aim of this is to isolate the area that the algorithm has affected. Is it:
Page-type specific.
Once we know the problem area we can then start to rule in or out certain factors by looking at:
Poor quality sources – article sites, press release sites, too many low-quality directories, comment spam.
Keyword-rich anchors.
Manufactured / out of context links – aggressive guest posting, aggressive product gifting for links.
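The link factors above can be quantified with a quick anchor-text audit: a high share of keyword-rich "money" anchors relative to branded and URL anchors is a classic Penguin risk signal. A sketch with hypothetical data; the money-term list (and whatever threshold you apply to the resulting share) are my own assumptions, not Google-published figures:

```python
# Classify backlink anchor texts as keyword-rich ("money") or branded/
# navigational ("other") and report the money-anchor share.
from collections import Counter

def anchor_profile(anchors, money_terms):
    """Return (Counter of classes, fraction of money anchors)."""
    counts = Counter(
        "money" if any(t in a.lower() for t in money_terms) else "other"
        for a in anchors
    )
    total = sum(counts.values())
    return counts, (counts["money"] / total if total else 0.0)

anchors = ["cheap dresses", "Example Ltd", "example.com",
           "buy dresses online", "click here"]
counts, share = anchor_profile(anchors, money_terms={"cheap", "buy", "dresses"})
# 2 of 5 anchors are keyword-rich (share = 0.4)
```

If the money share creeps toward a majority of your profile, the factors listed above deserve a much closer look.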
It’s worth noting here that some companies expect to return to their prior standings once the penalty is lifted, which is absolutely not the case.
If your site was ranking well under the false pretence of a load of unnatural links, then even after clean up and removal of all actions applied to the site, you will not have the authority you once had – therefore rankings will be much lower than previously.
You will only rank where you once did if you can regain the authority you once had (through earning it, not manufacturing it).
With the right tracking tools you can see a penalty almost immediately. There is a distinct difference in the pattern of a site's decline between a manual penalty and an algorithmic hit.
A penalty will most likely affect the whole site or whole sections of it. In a recent penalty on a major UK bank, just a few sections of the site were hit, as opposed to the whole site.
This shot shows the effect of a penalty on a gambling site. You can see a drastic fall.
The obvious answer here would be to check Webmaster Tools for the accompanying message that will flag up, but in reality this doesn’t always hold true.
There can be issues with expired messages, someone else clearing the message from the account or people simply not noticing. There’s a substantial difference between a manual penalty and an algorithmic filter inhibiting your site.
A manual action is easier to spot because of the notice in WMT, while an algorithmic filter requires a bit more investigation:
Traffic. Anyone tracking their site’s progress should be using Google Analytics to monitor traffic levels. If there is an anomalous drop in traffic then this should be the first sign of a need for investigation.
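That "anomalous drop" check can be automated by comparing each day's organic sessions against a trailing average. The 7-day window and 40% drop threshold below are illustrative assumptions, not fixed rules:

```python
# Flag days where sessions fall sharply below the trailing average --
# a crude first-pass detector for penalty-shaped traffic drops.
def flag_traffic_drops(sessions, window=7, drop=0.4):
    """Return indices where sessions fall more than `drop` (fraction)
    below the average of the previous `window` days."""
    flags = []
    for i in range(window, len(sessions)):
        baseline = sum(sessions[i - window:i]) / window
        if baseline and sessions[i] < baseline * (1 - drop):
            flags.append(i)
    return flags

daily = [1000, 980, 1020, 990, 1010, 1000, 995, 400, 390, 410]
print(flag_traffic_drops(daily))  # [7, 8, 9] -- the drop on day 7 persists
```

A flagged date that lines up with a known algorithm update date is a strong hint about the cause.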
Visibility. This is a method of tracking how often your site is seen and involves using a metric to measure which terms your site is ranking for against the average search volumes for these terms. If you regularly monitor your visibility across a range of keywords then any major fluctuations should be noted and investigated.
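A minimal version of such a visibility metric weights each tracked keyword's search volume by an estimated click-through rate for its current ranking position. The CTR table below is an illustrative assumption, not published data; commercial tools use their own curves:

```python
# Toy visibility score: sum of (estimated CTR at position) x (monthly
# search volume) over all tracked keywords. CTR figures are assumed.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_score(rankings):
    """rankings: iterable of (position, monthly_search_volume) pairs.
    Positions outside the CTR table contribute nothing."""
    return sum(CTR_BY_POSITION.get(pos, 0.0) * vol for pos, vol in rankings)

week_before = [(1, 1000), (3, 500), (8, 2000)]
week_after = [(4, 1000), (9, 500), (15, 2000)]
# The slide from positions 1/3/8 to 4/9/15 shows up as a sharp score drop.
print(visibility_score(week_before), visibility_score(week_after))
```

Tracked weekly, a sudden collapse in this score across many keywords is exactly the kind of fluctuation worth investigating.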
Brand SERP. If you’re worried about a limit to your ranking ability across the whole site then a quick search of your brand term should help to identify this.
If your social accounts, a competitor, or a similar brand name is ranking above your site then the cause of this may be a penalty/algorithmic filter.
Duplicate content. For ecommerce sites that supply a product feed to their stockists, or even those that don’t, you need to be careful that your content is not being duplicated across multiple sites.
The Panda updates mean that you can be pulled up on duplicate content, whether that’s other sites utilising your content or multiple similar pages on your own site.
It’s important to set a regular task each month (or quarter, depending on content) to run a representative sample of your site’s pages through a tool like Copyscape, which identifies any duplications of your content across the web and grades their risk level. From this you can gauge whether a search engine would see you as the content originator, or whether the duplication could be a serious risk to your ranking ability.
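Copyscape covers duplication across the web; for the "multiple similar pages on your own site" case, a quick shingle-based similarity check makes a reasonable first pass. The 5-word shingle size and any similarity threshold you choose are illustrative assumptions:

```python
# Jaccard similarity over word shingles -- a simple near-duplicate
# detector for comparing pages on your own site.
def shingles(text, n=5):
    """Set of all n-word sequences in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of two pages' word shingles (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "red summer dress with floral print and short sleeves"
page_b = "blue summer dress with floral print and short sleeves"
print(round(similarity(page_a, page_b), 2))  # 0.67 -- near-duplicate territory
```

Running every page-body pair (or a sample) through this flags thin product variants before Panda does.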
Backlink Profile. The Penguin updates look at the backlink profile of your site. If Google perceives the links to your site as manipulated, this can drastically hinder the site's performance.