Google penalties: Why you can be penalized by Google and how to avoid losing search engine rankings


When trying to optimize a website, regardless of the marketing strategy we follow, we must be aware of the guidelines Google requires.

Not everything comes down to reaching the top rankings and getting backlinks; satisfying Google matters a great deal for a website, because Google's main goal is to give users access to accurate, unique content written by "experts". That is why Google continually modifies and improves its algorithms, so that every site gets the exposure it deserves.

Unfortunately, that is where our concern appears: we can be penalized. Sometimes these penalties are deserved, if the technique used to optimize a website is illegitimate or does not meet Google's requirements. However, even if this is your case and you know you are doing something wrong, not everything is hopeless.

What is a Google penalty?

A Google sanction is reflected in a negative impact on a website's search rankings because the site does not comply with updates to Google's algorithms and/or a manual review.

The penalty may be a byproduct of an algorithm update, or a deliberate penalty at the hands of Negative SEO / Black Hat SEO.

If your web traffic has suffered a sudden drop and your position in the rankings has therefore diminished, it is probably because you have received a penalty from Google.

Recognizing a Google sanction or penalty

Nobody likes to be disliked by Google, so the requirements of its algorithms should be followed to the letter. To do that, we must start with the reasons why Google can penalize you. Here are eleven of them.

Before starting… What is Black Hat SEO or Negative SEO?

Negative SEO or Black Hat SEO is the set of unethical practices or techniques used to improve the ranking of your own site, or to sabotage the position of a competitor's site in the search engines.
Among the most popular black hat SEO techniques are:
  • Copying content from other sites
  • Buying links to your site
  • Linking excessively from spam sites
  • Hiding words or text from the user
  • Publishing low-quality content with unnatural use of keywords
  • etc.

One of the best known cases is that of Trendhim, a Danish men's accessories company that a year ago was the victim of a negative SEO attack, in which more than 8,500 links pointed to its website under the term "porn". This is the most widely used Negative SEO practice: convincing Google that a page is pornographic in order to lower its position in the rankings and, therefore, reduce its sales and increase those of a competitor.


1) Google Penguin Penalty: Buy links

Among all the owners of a website, how many have never purchased a single link during the life of their site? Few.

We all believe that buying links is fine, that there is nothing wrong with it, because Google can never know whether we paid for a link or not. But are we right? The answer is no. Buying links is the same as manipulating our backlink profile, which can be understood as Negative SEO in reverse, since in this case we are voluntarily buying links to our own website, with anchor texts controlled by us.

Google has taken strict measures to combat these actions, which harm the presence and prestige of every website. Since 2012, Google Penguin has been responsible for demoting sites with unnatural links, links from low-authority sites and directories, and over-optimized anchor texts.

2) Google Panda Penalty: Duplicate content

Google Panda penalties appeared in 2011 and deal with everything related to content.

Again, what is Google's purpose? To offer the best content to users. Is duplicate content the best content? Clearly not. The more duplicate content a website has, the further Google will drop that site. Google considers duplicate content to be of little use to users, and ranks it accordingly.

Plagiarism is not all that Google takes into account when analyzing the content of a website; content quality plays an equally or even more important role. We must try to be unique and innovative when it comes to publishing content. Why invest time and effort into something that users can already find on other sites?

These are the main reasons a website's PageRank falls: if the content is not good, searchers get a bad image of the site. It is important to ensure that the content is unique and well written; to do this you can use tools such as Copyscape, CopyGator or Plagium.


Duplicate content does not only refer to tracing and copying something verbatim. Another of the Google Panda penalties, one few are aware of, is content spinning: simply taking something written one way and rewriting it so it sounds different but says the same thing. Although many do not believe it, this is duplicate content, and it is penalized by Google.
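As a rough illustration of how copied or lightly spun text can be detected, here is a minimal sketch that compares two passages using overlapping three-word "shingles" and Jaccard similarity. This is a common textbook technique, not Google's actual method, and the sample texts and any threshold you apply are illustrative assumptions.

```python
# Sketch: estimate how similar two passages are using word "shingles"
# (overlapping 3-word windows) and Jaccard similarity. Illustrative
# only; Google's duplicate detection is far more sophisticated.

def shingles(text, n=3):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "buying links is the same as manipulating our backlink profile"
copied = "buying links is the same as manipulating our backlink profile today"
print(round(jaccard(original, copied), 2))  # close to 1.0 means near-duplicate
```

A score near 1.0 flags near-duplicates; heavily spun text scores lower but often still well above unrelated prose, which is why word-for-word checkers like Copyscape are only a first line of defense.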

3) Google Panda Penalty: Poor-quality content

The quality of your web content is essential to reach top positions in the search engines and to generate natural-looking backlinks. The more information you give your readers, the more quality you will deliver.
When including content on your website, keep in mind that it should be as informative as possible. Try to cover every relevant point so that readers do not have to consult other sites for more information.

If your website contains quality content, you will show your readers that you are an expert in your industry, and that will keep them coming back to your site for information. The same goes for Google. Google notices the quality of your content and the effort you put into offering the best to your readers, and since you share the same goal, it will gladly reward your rankings in the search engines.

Try to understand that there are thousands of websites and pieces of content available in every niche on the Internet. However, Google can only rank the 10 best websites for any topic on the first page of search results. The rest of the web pages are buried in the later results.

As a rule of thumb, our friend Panda does not like pages with fewer than 300 words.

4) Cloaking: Distorted content

The technique of cloaking consists of distorting the content that search engines access when they read the site: creating a content layer and making the search engines believe that your website contains different content, often by using deceptive redirects.

This technique serves Google's crawlers content different from what users see, and it is harshly sanctioned by Google when detected.
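Since cloaking means serving crawlers different HTML than users, a crude self-check is to fetch your own page twice, once with a normal browser User-Agent and once with Googlebot's, and compare the two bodies. The sketch below assumes this simple approach; the 0.90 similarity threshold is an illustrative assumption, not any official value.

```python
import difflib
import urllib.request

def similarity(html_a, html_b):
    """Ratio in [0, 1] of how similar two HTML documents are."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()

def fetch_as(url, user_agent):
    # Fetch a URL while pretending to be a specific client.
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(url, threshold=0.90):
    # Compare the page a browser sees against the page Googlebot sees.
    browser = fetch_as(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    bot = fetch_as(url, "Mozilla/5.0 (compatible; Googlebot/2.1; "
                        "+http://www.google.com/bot.html)")
    return similarity(browser, bot) < threshold
```

Note that dynamic pages (personalization, ads, A/B tests) legitimately vary between requests, so a low similarity score is a prompt for manual review, not proof of cloaking.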

5) Keyword stuffing: Excessive use of keywords

Again, this is one of the most common black hat SEO techniques. Google analyzes the keywords present in each page and acts accordingly. If the number of keywords is unusual, the site will be penalized, since a high keyword density is a sign that the content is of poor quality.

Technically this is called keyword stuffing, and what Google seeks by penalizing these practices is content that is useful and valuable to the user, written in natural language that is easy to read and understand. Thus, text that uses synonyms and related words will be much more valuable to Google than text that constantly repeats keywords without meaning.
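To make the idea of keyword density concrete, here is a minimal sketch that measures how often one keyword appears relative to the total word count. The sample text is invented, and Google publishes no official density threshold, so treat any cutoff you apply as an assumption.

```python
import re

def keyword_density(text, keyword):
    # Percentage of words in `text` that are exactly `keyword`.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical example of stuffed copy: "cheap" is 4 of 13 words (~31%).
text = ("Cheap shoes here. Cheap shoes, best cheap shoes online, "
        "buy cheap shoes now.")
print(round(keyword_density(text, "cheap"), 1))  # prints 30.8
```

A density that high reads as unnatural to a human long before any algorithm flags it, which is the point: write for readers, and the numbers take care of themselves.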

This is important both when generating content and when generating backlinks to a website. Five years ago everything was based on the number of links, not their quality. This has now changed, and Google rewards the quality of the backlinks received more than their quantity.

Bringing this to the plane of keywords: until recently, a technique widely used by SEO specialists to position a page was to use the same anchor text for all links, to show Google exactly what we wanted to rank for. However, since the Google Penguin update, overuse of a single anchor text is heavily penalized.

That is why it is always advisable to avoid using the same keyword in every anchor text. To monitor your anchor texts and link profile in more detail, it is advisable to use tools like Ahrefs.


6) Google Penguin: Link exchange

Link exchange involves placing a link to another site on your website and, in return, that site placing a link to your website on theirs.

Currently, link exchanges are not as effective as they used to be. Search engines have become smarter and these links have become detectable. By participating in link exchanges, you run the risk of being penalized or even banned by the search engines. Again, this can be understood as a technique to manipulate PageRank.

7) Slow speed – Slow response time

Humans tend to wait three or four seconds for a page to load; if it has not loaded by then, they do not return to the site. Something similar happens with Google.

Web pages that take too long to load tend to have lower rankings. This is usually due to unoptimized images, numerous ads, etc. These sites bother both users and search engines, and consequently their SEO suffers.

Nothing tests our patience more than a slow website. Currently, no one has time to wait minutes for a page's content to load. A slow website automatically makes users abandon it and look for faster alternatives. To remedy this, you can add caching or a CDN.

A tool you can use to check whether your site's loading speed is adequate is the W3C's, where you load your page's URL and it gives you information about the elements to optimize to speed it up and reach the standards Google requires.
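As a complement to those tools, a quick local heuristic is to scan a page's HTML for elements that commonly slow it down. The sketch below (my own simplified assumption, not a real audit tool) counts images declared without explicit dimensions, which cause layout reflows, and external scripts, which add network requests.

```python
from html.parser import HTMLParser

class WeightScan(HTMLParser):
    """Crude heuristic scan for potentially heavy page elements."""

    def __init__(self):
        super().__init__()
        self.unsized_images = 0   # <img> without width AND height
        self.scripts = 0          # external <script src=...>

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not ("width" in attrs and "height" in attrs):
            self.unsized_images += 1
        elif tag == "script" and "src" in attrs:
            self.scripts += 1

# Hypothetical page fragment: one unsized image, one sized, one script.
html = ('<img src="hero.jpg"><img src="logo.png" width="80" height="40">'
        '<script src="app.js"></script>')
scan = WeightScan()
scan.feed(html)
print(scan.unsized_images, scan.scripts)  # prints: 1 1
```

This catches only a fraction of real performance problems (it says nothing about file sizes or server response times), but it is a cheap first pass before running a full audit.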


8) Broken links – 404 errors

Google always takes care to ensure that the content it offers its users is up to date, which is why it takes into account every error on each website, even the most hidden ones. These hidden errors can be, for example, 404 errors or broken links. Such errors signal the website's inability to provide a user-oriented experience.

If the links are not kept up to date, Google will assume that you do not care about the user experience. To remedy this, it is important to review the links within the site periodically.

In Google Search Console, you can find the crawl errors Google encountered on your website and work on correcting them. Another highly recommended tool for this task is Screaming Frog: just paste the URL of your site and it returns the status of each of the pages that compose it. If you find a 404, you should identify its cause and fix it so it does not appear again.
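For a do-it-yourself version of that check, the sketch below extracts every `<a href>` from a page's HTML and shows how you could then ask for each URL's HTTP status over the network. It is a minimal assumption-laden stand-in for a real crawler, not how Screaming Frog actually works.

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in a document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def status_of(url):
    # Requires network access; returns e.g. 200, or 404 for a broken link.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print(extract_links('<a href="/about">About</a><a href="/x">X</a>'))
# prints: ['/about', '/x']
```

In practice you would resolve relative paths against your domain before calling `status_of` on each link, and log anything that returns 404 for repair or redirection.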


9) Comment spam

Most websites have an automatic system for detecting unwanted comments, but some comments still slip through, so it is important to stay aware of the comments your site receives at all times; if they are spam, the site will be harmed.

Always check users' comments: make sure they do not link to sites unrelated to the content you have shared, that they do not contain excessive numbers of links, etc. If they do, you can remove them and/or mark them as spam, so that Google sees that you care about your site's content and keep it up to date.

10) Deceptive page structure: Overuse of H1 titles

Carefully organized content helps SEO; nobody doubts that. There is no better content than well-structured content that is visible to users, but that does not mean we have to fill our content with H1 tags.

The header tag <h1> in HTML will generally be the title of a post or other emphasized text on the page. It is usually the largest text that stands out, and therefore it says the most about the content.

Overuse of H1 tags can make Google believe that what we are trying to do is stuff keywords into the content, and that we are manipulating its algorithms. Not to mention that it may lead to another penalty we have already discussed: excessive use of keywords.

As a rule, every page of your site should have exactly one H1.
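The one-H1-per-page rule is easy to verify automatically. Here is a minimal sketch that counts `<h1>` tags in a page and flags anything other than exactly one; the sample HTML fragments are invented for illustration.

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def h1_ok(html):
    # True when the page follows the one-H1-per-page rule.
    counter = H1Counter()
    counter.feed(html)
    return counter.h1_count == 1

print(h1_ok("<h1>Title</h1><h2>Sub</h2>"))  # prints: True
print(h1_ok("<h1>One</h1><h1>Two</h1>"))    # prints: False
```

Running such a check across every template of a site catches the common case where a theme wraps both the logo and the post title in H1 tags.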

11) Lack of mobile optimization

All of us who have a website have realized that most of our users access it from mobile devices. Fewer and fewer search from a computer.

Formerly we only worried about our site being optimized for computers, but now Google gives great importance to mobile optimization. This is a fact: users prefer mobile; after all, it is simpler and always at hand.

That is why, if a website is not optimized for mobile devices, Google will interpret it as a sign that the site does not care about the user experience, and once again the site will be penalized.
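One quick (and admittedly crude) first signal of mobile readiness is whether a page declares a responsive viewport meta tag. The sketch below only checks for that tag; its absence does not prove a problem, and real mobile-friendliness depends on layout, tap targets and font sizes, which this assumption-level check ignores.

```python
from html.parser import HTMLParser

class ViewportFinder(HTMLParser):
    """Detect a <meta name="viewport"> tag in an HTML document."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True

def mobile_viewport_present(html):
    finder = ViewportFinder()
    finder.feed(html)
    return finder.has_viewport

# Hypothetical responsive page head.
page = ('<head><meta name="viewport" content="width=device-width, '
        'initial-scale=1"></head>')
print(mobile_viewport_present(page))  # prints: True
```

For a proper verdict, Google's own mobile-friendliness tooling tests rendering on real device profiles rather than just inspecting the markup.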

After analyzing these 11 of the many reasons that can lead Google to penalize a website, the rest is in our hands. We must be attentive to the requirements of Google's algorithms at all times if we want a website to rank in good condition.

Google’s goal is not to penalize websites, but to offer the best experience and benefit all those who offer quality, well-written content without plagiarism. In short, all websites that help promote Google’s objective will truly benefit in one way or another.

 

________________

About the Author

Sara López Alaguero is Marketing Manager at Trendhim. She loves writing, fashion and communications. Sara studied Computer Engineering at VIA UC and is an expert in Online Marketing, SEO and link building. In her spare time she works as an editor for websites and newspapers, as well as writing a personal journal recounting her experiences and hobbies: sara-lopez.com. She has a passion for computing, and devotes her free time to delving into this field and learning as much as possible.