You often hear it said that duplicate content is bad for SEO and that it is punished by search engines like Google.
Some even say the punishment can go as far as removing the site from the search results entirely, making it effectively impossible to find that way.
Is that really true?
The search engines themselves are not much help: they are often quite opaque about their rules, sometimes saying one thing but doing another in practice.
In this post, we clarify what we know about the impact that publishing repeated content can have on SEO.
The short answer is "yes, it can", but not in the way you are thinking.
Extreme cases aside, your site is safe as long as the amount of duplicate content is small. But what characterizes an extreme case? Something like what happened to a company that hired a rather careless public relations (PR) agency: the agency did not bother to write a press release; it simply copied the text from the company's homepage and sent it to several outlets.
Many of these outlets published the text unchanged. Google's algorithms then started firing alarms, since the same text was appearing on many sites within a short time. To Google, this looked like spam, and the pages carrying the duplicate content were penalized in the search results.
In short, a few duplicate posts on your blog will not harm your site's ranking. Remember that Google is a company with over 50,000 employees that, among other things, builds cars that drive themselves and kites that generate wind energy. In other words, it has plenty of smart people working there. Google is smart enough to know that your site is not malicious just because it has one duplicate post among 50 other original, quality pieces of content.
Back in 2014, the update to Google's algorithm named Panda refined the organic results shown on the search page, favoring content relevant to the user. Publications with poor or repeated information lost visibility.
The main problem with "non-malicious" duplicate content is that search engines do not know which version of the content to display; after all, if the original content is not useful to the user, an identical copy will not be either.
So if you do not tell Google which version is the correct one to display, it will choose one of the versions itself, possibly the one that was indexed first: the original. And if many external links point to that version, its chances increase even more.
Besides choosing which content to display in the search results, Google also needs to determine which version will receive the authority from other sites that link to one of the versions of the content.
Again, if you do not tell Google which version should receive this authority, it may attribute it to the wrong version, or even dilute the authority among the various versions, harming the content's placement in the search results. This directly affects your ranking and reduces the number of visitors reaching your pages.
Duplicate content is often generated by the content management platform itself, such as WordPress, without you even knowing.
Here are some examples of what Google considers duplicate content:
There are several ways to "teach" the search engines how to handle your duplicate content, so that you concentrate authority on the version you want:
Also known as a 301 redirect, this is set up directly on the server so that users never see the page in question: they are automatically redirected to another, specified page.
By doing this, the search engines understand that all the authority the page has should be transferred to the page it redirects to.
This method is often used when a company is changing domains and does not want to lose the authority it has already earned.
But remember: any redirect involves some loss of authority. You can minimize the effect by doing it the right way. There are also WordPress plugins that make this easier for those who are not comfortable with code.
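As a rough sketch, on an Apache server a permanent redirect can be declared in the site's .htaccess file (the paths and domain below are hypothetical):

```apache
# Hypothetical example: permanently redirect an old URL to its new location.
# Uses Apache's mod_alias Redirect directive; place in the site's .htaccess file.
Redirect 301 /old-post/ https://www.example.com/new-post/
```

Visitors and crawlers requesting /old-post/ receive an HTTP 301 status and land on the new URL, which signals to search engines that the move is permanent.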
While permanent redirects are set up on the server, canonical tags are inserted directly into the page's HTML code.
Basically, the tag specifies the "canonical" version of the content, that is, the URL of the original. That way, all the authority from incoming links goes to the specified URL.
This option is often used when you want to republish an old post or publish a guest post in a different place.
To avoid confusing Google, do not use links on your site that point to different URLs leading to the same page.
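For illustration, the canonical tag described above is a single line placed inside the page's head element (the URL below is hypothetical):

```html
<!-- Hypothetical example: placed in the <head> of the duplicate page,
     telling search engines which URL is the original version. -->
<link rel="canonical" href="https://www.example.com/original-post/" />
```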
This tag allows the search engine to crawl the page, but not include it in its search results.
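Assuming a standard robots meta tag, it might look like this in the page's head element:

```html
<!-- Hypothetical example: the page can be crawled and its links followed,
     but it will not appear in search results. -->
<meta name="robots" content="noindex, follow" />
```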
When republishing a piece, like a guest post, you can place a link to the original article at the end.
That way, Google knows that this is the URL of the original content.
When you republish something from another site, it also helps to rewrite the content in your own words so that your page receives authority.
The link to the original content will help Google understand that the two are related.
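A minimal sketch of such an attribution link, placed at the end of the republished post (URL and blog name are hypothetical):

```html
<!-- Hypothetical example: credit line linking back to the original article. -->
<p>This article was originally published on
  <a href="https://www.example.com/original-post/">Example Blog</a>.</p>
```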
Never use post-listing pages where each post appears in full.
Show only the first few words or a summary of each post; by showing posts in their entirety, you would be duplicating content across different URLs, which, as we have seen, hurts your ranking.
Read more in the post 4 reasons not to display full posts on your blog's homepage.
Duplicate content is a concern for any SEO strategy, but there are well-established techniques to tell the search engines which version of the material is the correct one.
Moreover, except in very extreme cases, worrying that Google or another search engine will punish a domain over a bit of duplicate content is not supported by practical experience.