It sounds like a nightmare: you wake up one morning and your good rankings have disappeared; your page can no longer be found on Google. But do not panic! Very often you can find the reason, or at least several candidate reasons. Today I will show you the most common causes of ranking losses.
1. Google Panda
The first Google Panda update was launched on August 12, 2011, and many more Panda updates have followed since then.
The update is not named after the animal but after Navneet Panda, the Google engineer responsible for it. On July 22, 2015, the last official update, Panda 4.2, was rolled out. In January 2016, Google announced that Panda is now part of the so-called core algorithm. This means that no separate updates will be released in the future: Panda is part of the algorithm and works in real time, rather than "only" at intervals of several months.
Google Panda is a pure on-page update and has nothing to do with links at all. Affected above all are news pages with low-quality content and arbitrage models (thin affiliate sites and price comparison sites), but also pages with little editorial content, a lot of duplicate content, or content that is also available on other websites.
Today it mainly takes a healthy dose of honesty: does your website really offer value to the user? The thousandth shopping portal for fashion may be a revolutionary model in your eyes – but (usually) not in Google's.
Panda, or its successor in the core algorithm, is very hard to recover from. It usually requires great effort, especially with regard to the quality of your website's content. This often strikes at the core of the entire business model: pages that were previously profitable because of their low costs usually stop being profitable once all the "anti-Panda" measures are applied. The "shopping club for fashion" is no longer worthwhile if you have to run it with quality content.
Anyone who wants to get out of Panda should focus on the user: how can you design the site so that users will be happier with it in the future? In addition, you should eliminate duplicate content as best you can – but the most important thing remains the user focus.
This official Google video shows some of the site types that can be affected, even if it is mostly about low-quality content in general.
2. Google Penguin
Google Penguin was rolled out for the first time on April 24, 2012. The last major update (Penguin 4.0) followed in early October 2016.
Penguin is an algorithmic update that penalizes websites with excessive or poor-quality link building. While Penguin 1.0 targeted anchor text almost exclusively, Penguin 2.0 was already more granular. Many pages were stuck in a large Penguin filter between June 2013 and October 2016 – more than three years.
Affected pages were massively pushed down for important keywords as of the rollout date. If you had built too many links with the anchor text "couch", Penguin could suddenly drop you from position 1 or 2 to position 90 or even further back. Since there were only four updates in four years, and in my view only Penguin 2 and 4 had really big effects, it was of course particularly bitter when you got caught.
Bad and purchased links should be removed or disavowed where possible. My advice is to remove these bad links. And before the mob chases me through the streets again: yes, clean link building with genuine, great cooperations and without buying links is of course still fine. Even if I never see it in the wild …
Remove the following types of links, too:
— Links from non-indexed sites
— Links from websites with low PageRank
— Links from irrelevant and untrustworthy sites
— Links from blog networks
— Site-wide links
— Links with disproportionate anchor-text usage
— Links from article directories and forums
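One of the criteria above, disproportionate anchor-text usage, is easy to check yourself if you export your backlinks from a link tool. The following is a minimal sketch (the backlink data and the 30% threshold are illustrative assumptions, not a Google-documented limit):

```python
from collections import Counter

def flag_overused_anchors(links, threshold=0.3):
    """Flag anchor texts that make up more than `threshold` of all
    backlinks -- a common symptom of unnatural link building."""
    anchors = [anchor.lower() for _, anchor in links]
    counts = Counter(anchors)
    total = len(anchors)
    return {a: c / total for a, c in counts.items() if c / total > threshold}

# Hypothetical backlink export: (source URL, anchor text)
backlinks = [
    ("https://blog-a.example/post", "Couch"),
    ("https://forum-b.example/thread", "couch"),
    ("https://site-c.example/page", "couch"),
    ("https://news-d.example/story", "our homepage"),
    ("https://dir-e.example/listing", "example.com"),
]

print(flag_overused_anchors(backlinks))  # "couch" makes up 3/5 = 60%
```

Any anchor the function flags is a candidate for the removal or disavow list; a natural link profile is dominated by brand and URL anchors, not money keywords.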
3. Warning about Unnatural Links
Here the diagnosis is simple: you receive a message about "unnatural inbound links" in Google Webmaster Tools (now Search Console). A Google employee has manually looked at the links pointing to the website and suspects that links were purchased to improve its ranking. As a rule, it takes only a few days until the first ranking losses are felt. In most cases this affects certain directories or documents, more rarely the entire website. Depending on the degree of the penalty, the impact can range from barely noticeable to massive.
This measure from Google is manual and can therefore only be resolved manually. First, you should create a complete link list, then go through each individual link and check it for naturalness.
Google employees often know very well which links were bought and which were not. Write politely and ask the webmasters to remove the links. If a webmaster does not react, you can devalue the links using the Disavow tool (in Google Webmaster Tools).
Build a list of domains and make it publicly accessible, for example in Google Docs. This list is then referenced in the so-called reconsideration request, which you also submit in Google Webmaster Tools. A Google employee will then check whether appropriate action has been taken to ensure that the website meets the webmaster guidelines. In the request, you should definitely write which guidelines you broke, that you are sorry, and that you will never do it again. If the request is successful, you will receive the message "Manual spam actions revoked" and should see better rankings soon. If it is rejected, that is not the end of the world: you usually get a list of three concrete example links that still violate the guidelines. It is important that you remove not only these three links but many more!
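For the Disavow tool itself, the upload is a plain-text file with one entry per line: a `domain:` prefix disavows an entire domain, a bare URL disavows a single page, and lines starting with `#` are comments. A minimal sketch (the domains here are placeholders, not real offenders):

```text
# Sites that ignored our link-removal requests
domain:spammy-directory.example
domain:paid-links.example
# A single URL can also be disavowed
https://blog-network.example/guest-post-123
```

Disavowing whole domains is usually safer than listing individual URLs, since link sellers tend to link from many pages of the same site.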
4. Poor User Experience
You are optimizing for a particular keyword and the rankings climb nicely towards page 1, but the page keeps falling back to position 10 or to page two. In that case the page offers a poor user experience and does not match the search intent.
Think about what on your page might not match the search intent and try to cover as many related search terms as possible.
5. Obsolete Website
Your domain loses rankings slowly but steadily. There are no big crashes, rather a slow slide into insignificance.
This is where it gets tricky. The examples I have seen so far had different reasons for this "slow death". As a rule, user behavior plays a very important role: the page is clicked less and less, the design may no longer be up to date, and/or the content is no longer good enough. Declining brand searches (searches for "domain.de") can also be a bad signal. And of course it can also be because more and more old links are being devalued or dropped.
6. Technical Problems
Your domain loses rankings seemingly at random: a specific directory drops out, or only specific keywords are affected.
The reasons can be complex. Nowadays Google can react very sensitively to on-page errors, especially regarding structure and markup. Common culprits include:
- Loading time of the website or individual pages
- Rendering problems, or content (parts) that are not crawlable
- Canonical tags that are not (or no longer) set correctly
- Server-side problems in general
- and much more!
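Canonical problems in particular are easy to audit once you have the raw HTML of a page. The sketch below uses only the Python standard library to check that a page has exactly one canonical tag pointing where you expect; the sample HTML and URLs are illustrative assumptions:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical" and "href" in a:
                self.canonicals.append(a["href"])

def check_canonical(html, expected_url):
    """Return (ok, found): ok is True only if exactly one canonical
    tag exists and it points to the expected URL."""
    parser = CanonicalFinder()
    parser.feed(html)
    return (parser.canonicals == [expected_url], parser.canonicals)

page = ('<html><head>'
        '<link rel="canonical" href="https://example.com/page">'
        '</head><body></body></html>')
print(check_canonical(page, "https://example.com/page"))
```

Run over a full crawl of your site, a check like this quickly surfaces pages whose canonical tag is missing, duplicated, or pointing at the wrong URL – exactly the kind of quiet technical error that costs rankings without any obvious crash.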
Let SEO Discovery Help You Recover from a Ranking Drop
Many ranking losses have one thing in common: the webmaster should seriously reconsider the way he earns money on the internet. The days of cheap link building and bad content are long gone. Especially in big teams, you should get together and fundamentally discuss how you define SEO in the first place. On-page work and link building alone will no longer get you there. You need a good website, good content, good products, and top SEO services that were created not only for search engines.
If you are suffering ranking losses but do not know why: we will gladly help you!
Currently serving as the Managing Director of SEO Discovery, Mandeep Singh is an ardent reader, learner, and writer. He embarked on a career in digital marketing decades ago and has been providing solutions to businesses ever since. He also takes time to keep readers posted on what he has learned over his years of experience, because he believes in sharing his expertise.