
Duplicate Content: Two Ways to Avoid SEO Problems

By: Guest | July 23, 2012

This week our guest posts focus on the latest trends in SEO.

Today: Duplicate content by Derek Mabie. 

Duplicate content is every webmaster’s nightmare.

It’s exactly what it sounds like: A portion of text that exists on multiple URLs, making it difficult for search engines to determine how to rank them in order of importance.

Google will ultimately choose one URL to show in the results page – but it may not be the page you want users to see.

The bigger issue, though, is Google may prematurely give up on crawling your site if it keeps encountering instances of duplicate content. No one likes doing the same job multiple times, and Google is no exception.

How Does Content Get Duplicated?

It seems as though this would be a pretty easy fix. All you have to do is refrain from copying and pasting words into another page or post on your site, right? Not exactly. Many times, duplicate content isn’t intended to manipulate the search engines. For example, printer-friendly versions of pages are duplicate content. These cases are not confusing to users who understand the purpose – but to the search bots, you are wreaking havoc!

We recently discovered a client had identical content on two URLs that read like this:

www.website.com/page/subpage

www.website.com/Page/subpage

The only difference between the two URLs is the capitalization of a single letter. To search engines, however, these are two completely separate pages, competing against each other in the search engine results pages (SERPs).

Avoid Duplicate Content 

Class, repeat after me: ca-non-ih-cole-eye-zay-shun. Canonicalization, easier done than pronounced, is the act of conveying to search engines which single version of the duplicate content rules supreme. This means one URL will be indexed and the identical versions won’t compete to outrank each other in the results pages.

1. 301 Redirect

Sometimes pages serve different purposes, but contain similar chunks of content. An example would be inserting an “Our Philosophy” paragraph into the footer of every page on your site. Other times, multiple pages serve the exact same purpose, the only difference being their web addresses (URLs). This conundrum is common during website redesigns, but it can be resolved by assigning 301 redirects to the outdated URLs. By doing so, you guide both users and search engines to the page you actually want them to visit.

Home pages often require 301 redirects. For example, each of these URLs may serve the same home page:

http://website.com

http://www.website.com

http://www.website.com/index.html

These pages are all identical, but they are seen as completely separate entities to search engines. This can be a mess if inbound links are distributed to each of these versions. The value from the links will be spread thin instead of adding up to boost domain authority. Adding a 301-redirect, thus identifying one URL as the master, will harness the authority and increase the potency of link juice from outside sources.
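On an Apache server, this kind of consolidation can be sketched in an .htaccess file. This is a hedged example, not a drop-in fix: directive support depends on your host, and www.website.com stands in for your own domain.

```apache
# Send the bare domain to the preferred www version (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]

# Redirect a single duplicate URL, such as the capitalized page
# from the earlier example (requires mod_alias)
Redirect 301 /Page/subpage /page/subpage
```

Nginx, IIS, and other servers offer equivalent directives.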

301 redirects serve two purposes: They can retire an old URL (if the site is being redesigned), or they can consolidate multiple versions (such as the home page examples shown above).

2. Rel=Canonical

Rel=canonical is a link element that can be inserted into the head of the non-preferred pages.

<link rel="canonical" href="http://www.example.com/best-url"/>

The URL you do want Google to use as the preferred version of content will go in place of www.example.com/best-url, ensuring that every duplicate piece of content will then internally link to the correct version. Search engines appreciate this clarity.
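Placed in context, the tag sits inside the <head> of each duplicate page. A sketch, reusing the client example from earlier:

```html
<!-- On the duplicate page, e.g. http://www.website.com/Page/subpage -->
<head>
  <title>Subpage</title>
  <!-- Tells search engines which version to index -->
  <link rel="canonical" href="http://www.website.com/page/subpage"/>
</head>
```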

What if Other Sites are Using My Content?

Duplicate content is really only an issue when it exists on your own domain. Other websites may use your articles as syndicated content – and that is just fine. In fact, Google has stated that as long as the syndicated material links back to the original source, it will not interfere with your site’s ability to rank.

To play it extra safe, we recommend asking those who use specific content to include a “noindex” meta tag in the header of that page. This communicates to search engine spiders that the page should not be included in their index.
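The tag itself is a single line in the syndicating page's head. A minimal sketch (the name "robots" targets all crawlers; individual bots can be named instead):

```html
<head>
  <!-- Ask search engines not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>
```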

Hopefully you have a clearer understanding of what duplicate content is, the harm it can cause, and what measures you can take to canonicalize the crap out of it.

The beauty of practicing “white hat” SEO is that your refinements are done according to the search engines’ recommendations. As their algorithms improve to better reflect the needs of users, your website will in turn transform into a user-friendly domain.

Derek Mabie is president of his St. Louis-based digital marketing agency, Evolve Digital Labs. His “white hat” SEO efforts allow him to empower brands through search. You can follow him on Twitter at @derekmabie. Download Evolve’s SEO Guide for Beginners to learn more about the industry.



25 Comments on "Duplicate Content: Two Ways to Avoid SEO Problems"


ginidietrich
3 years 9 months ago

This is super helpful; thank you! So, you’re saying content from Spin Sucks can be syndicated and it won’t hurt our rankings as long as the other site links to the original article?

Lisa Gerber
3 years 9 months ago

@ginidietrich Ha! That’s exactly what I thought when Derek and his team queried to write this post! Cool, huh? And this is true, @derekmabie?

adamtoporek
3 years 9 months ago

Great stuff. I wish I had read this post last week. I just turned down a very nice offer to reproduce a post due to SEO/duplication concerns. I have also shied away from syndication sites for the same reason. To draft off @ginidietrich’s question — as long as a site includes the original link, that is enough, even without the “noindex” tag? Thanks!


Shonali
3 years 9 months ago

Just a TERRIFIC post. Thank you so much!

John_Trader1
3 years 9 months ago

 @Shonali I agree, this is great information. SEO is for nerds and geeks. Which is why I LOVE IT!!!! Thanks Derek!

WorldTravelMom
3 years 9 months ago

Although I am definitely not an SEO nerd or geek, this was a very useful post. I was confused about the syndication part of it but this was the clarity I was looking for.

derekmabie
3 years 9 months ago

Hi all, thanks for the kind words on the post. Sorry I was absent yesterday; unfortunately I am a bit under the weather. Yes, syndication is okay, as long as the authorship is maintained.

Check out this duplicate content explanation page, straight from the goog: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66359

Think about press releases: the major news sites would be hit for duplicate content on a regular basis if that were the case.

Let me know if you have any other questions; I will try to stay on top of it today. Thanks again for the kind words and social shares!







rhonda hurwitz
3 years 6 months ago

I came late to this conversation, but appreciate this post  — thanks for the clarity on duplicate content.

