The Truth About Duplicate Content and SEO

Why it’s not worth copying third-party content

 

You often hear it said that duplicate content is bad for SEO and that it gets punished by search engines like Google.

Some say the punishment for this practice can go as far as removing the site from the search results entirely, making it effectively impossible to find that way.

Is that really true?

The search engines themselves are not much help: they are often opaque about the rules, sometimes saying one thing but doing another in practice.

In this post, we clarify what we know about the impact that publishing duplicate content can have on SEO.

Can duplicate content on my website harm the SEO of my entire domain?

The short answer is “yes, it can”, but not in the way you are thinking.

Excluding extreme cases, your site is safe as long as the amount of duplicate content is small. But what characterizes an extreme case? Something like what happened to a company that hired a rather sloppy public relations (PR) agency: the agency did not bother to write a press release; it simply copied the text from the company’s homepage and sent it to several media outlets.

Many of these outlets published the text unchanged. Google’s algorithms then began firing alarms, since the same text was appearing on many sites in a short period. For Google, this was a sign of spam, and the pages carrying the duplicate content were penalized in the search results.

In short, a few duplicate posts on your blog will not harm your site’s ranking. Remember that Google is a company with over 50,000 employees that, among other things, builds self-driving cars and kites that generate wind energy. In other words, it has plenty of smart people working there. Google is smart enough to know that your site is not malicious just because it has one duplicate post among 50 other pieces of original, quality content.

But why is duplicate content a problem for SEO?

Back in 2014, an update to Google’s algorithm, named Panda, refined the organic results shown on the search page, prioritizing content relevant to the user. Publications with poor or repeated information lost visibility.

The main problem with “non-malicious” duplicate content is that search engines do not know which version of the content to display; after all, if the original content is not useful to the user, an identical copy will not be either.

So if you do not tell Google which version of the content is the correct one to display, it will choose one of the versions on its own, possibly opting for the version that was indexed first: the original. And if many external links point to that version of the page, the chances of it being chosen increase even more.

In addition to choosing which content to display in the search results, Google also needs to determine which version will receive the authority passed by other sites that link to one of the versions of the content.

Again, if you do not tell Google which version should receive this authority, it may attribute it to the wrong version, or even dilute the authority among the various versions, harming the content’s placement in the search results. This directly affects your ranking and reduces the number of visitors reaching your pages.

You may have duplicate content on your site without knowing it

Duplicate content is often generated by the content management platform itself, such as WordPress, without you knowing.

Here are some examples of what Google considers duplicate content:

  • Domain with and without www: http://seu_dominio.com.br and http://www.seu_dominio.com.br are considered two different sites by Google. Therefore, every page on the site that can be accessed both with and without the www counts as duplicate content for Google.
  • The same content accessible at different URLs: it is very common for blog posts to be available at their own unique URL and also at other URLs that only list posts from a particular category.
  • Print version of a page: some sites generate a specific version of each page for printing. When accessed at a URL different from the original, this type of content also represents duplication to the search engines.

How to deal with duplicate content?

There are several ways to “teach” the search engines how to handle your duplicate content so that you concentrate authority on the version you want:

Permanent redirects

Also known as 301 redirects, these are configured directly on the server and ensure that users never see the page in question; they are automatically redirected to another specified page.

By doing this, the search engines understand that all the authority the page has accumulated should be transferred to the page it redirects to.

This method is often used when a company is changing domains and does not want to lose the authority it has already earned.

But remember: any redirect involves some loss of authority. You can minimize the effect by doing it the right way, as in the sketch below. There are also WordPress plugins that make this easier for those who are not comfortable with code.
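
As an illustration only, here is a minimal sketch of a 301 redirect in an .htaccess file, assuming an Apache server with mod_rewrite enabled and using the same placeholder domain as the examples above to send all non-www traffic to the www version:

    # Redirect every request for the non-www domain to the www domain (301 = permanent)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^seu_dominio\.com\.br$ [NC]
    RewriteRule ^(.*)$ http://www.seu_dominio.com.br/$1 [R=301,L]

A single page can be redirected in a similar way with one line, for example Redirect 301 /old-page/ http://www.seu_dominio.com.br/new-page/ (the paths here are hypothetical).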

Canonical tags

While permanent redirects are configured on the server, canonical tags are inserted directly into the page’s HTML code.

Basically, the tag specifies the “canonical” version of the content, i.e., the URL of the original content. Thus, all the authority from incoming links goes to the specified URL.

This option is often used when you want to republish an old post or publish a guest post in a different place.
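
As a minimal sketch, the duplicate page would carry a tag like this inside its head, pointing to the original; the URL below is a placeholder for illustration:

    <head>
      <!-- Tells search engines that the original version of this content
           lives at the URL below -->
      <link rel="canonical" href="http://www.seu_dominio.com.br/post-original/" />
    </head>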

Consistency of internal links

As we mentioned above, many pages can be accessed through more than one URL, for example http://seu_dominio.com.br or http://www.seu_dominio.com.br.

To avoid confusing Google, do not use links on your site that point to different URLs leading to the same page.

Tag “noindex, follow”

This tag allows the search engine to crawl the page, but not to include it in its search results.
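
A minimal sketch of how this looks in a page’s HTML, using the standard robots meta tag:

    <head>
      <!-- Ask search engines not to index this page,
           but still follow the links it contains -->
      <meta name="robots" content="noindex, follow" />
    </head>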

Link to the original article

When republishing a piece of content, such as a guest post, you can place a link to the original article at the end.

That way, Google knows that this is the URL of the original content.

A summary in your own words plus a link to the original article

When you reproduce something from another site, it helps to rewrite the content in your own words so that your page receives the authority.

The link to the original content will help Google know that they are related.

Index pages with shortened posts

Never use listing pages where each post appears in full.

Show only the first few words or a summary of each post; by showing the post in its entirety you would be duplicating content across different URLs, which, as we have seen, harms your ranking. A sketch of one way to do this in WordPress follows below.
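
In WordPress, for example, one common option (assuming your theme prints post content on listing pages) is to insert the platform’s “More” tag into the post body; everything above it appears on index and category pages, and the full text only at the post’s own URL:

    <p>Opening paragraph, shown on the blog's index and category pages.</p>

    <!--more-->

    <p>The rest of the post, shown only at the post's own URL.</p>

Depending on the theme, showing the post excerpt on listing pages achieves the same effect without the tag.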

Read more in the post 4 reasons not to display full posts on the home page of your blog.

Conclusion

Duplicate content is a problem for any SEO strategy, but there are well-established techniques to tell the search engines which version of the material is the correct one.

Besides, except in very extreme cases, the fear that Google or another search engine will punish a domain for a bit of duplicate content is not supported by practical experience.