How to Find & Fix Duplicate Content Issues on eCommerce Websites


35 Comments
  1. Hi,
    It's a good article. My question is how to handle the following situation.
    Say I have a product:
    Sony Headphones XYZ (this product is over-ear, water- and dust-proof, and durable)
    and say I have written the following posts about headphones:
    1. Sports Headphones
    2. Durable Headphones
    3. Over-Ear Headphones

    Since product XYZ qualifies for all three posts, I would like to mention it in each of them. So now I have the following questions:
    1. Would this create duplicate content?
    2. How should I deal with it?

    Note: kindly reply to me by email too.
    Best Regards

  2. Hi!
    Thank you so much for this guide

    According to the study conducted by Moz, Google’s display titles max out (currently) at 600 pixels: https://moz.com/learn/seo/title-tag

    In 2020, the meta description length is 920 pixels, which equates to 158 characters on average. Rather than cutting off at an exact character count, Google truncates long descriptions at the nearest whole word and adds an ellipsis.

    Use a tool like the SERP Pure optimization tool (https://www.serpure.com) to preview meta titles and meta descriptions and keep them under the limits.
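
    For reference, here is roughly what a title and description within those limits might look like in a page's <head> (the values are purely illustrative):

      <head>
        <!-- display titles max out around 600 pixels, roughly 60 characters -->
        <title>Wireless Over-Ear Headphones | Example Store</title>
        <!-- descriptions truncate around 920 pixels, roughly 158 characters -->
        <meta name="description" content="Shop wireless over-ear headphones with noise cancelling and 30-hour battery life. Free shipping on orders over $50.">
      </head>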

    Thank you, and please keep up the great stuff! :)

  3. Thanks for the great article. I have a quick question. While doing an SEO audit, I received a warning stating that all the product pages of my ecommerce website are orphaned pages. How do I fix this issue for an ecommerce website with 1,000 products? Is there a way to interlink product pages so that Googlebot doesn't tag them as orphaned?

    • It sounds like your internal linking may not be in place to reach paginated category page results. I would also look at your category structure; spreading a bit wider there (adding more categories and subcategories) could help provide the internal links your products need, as in the sketch below. You should not orphan your product pages!
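
      A minimal sketch of that internal linking, with hypothetical URLs; the idea is that every product page is reachable through a crawlable, paginated category page:

        <!-- /category/headphones, page 1 of N -->
        <ul>
          <li><a href="/products/sony-xyz-headphones">Sony XYZ Headphones</a></li>
          <li><a href="/products/acme-sport-headphones">Acme Sport Headphones</a></li>
        </ul>
        <a href="/category/headphones?page=2">Next page</a>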

  4. Hi,
    What if you have a business that serves multiple locations, and you have duplicate content for each location EXCEPT for the location names?

    So, let’s say you have a 500 word page about “Real Estate Law in Kansas City,” and then a nearly identical page (on another subdirectory or subdomain) on “Real Estate Law in St. Louis”. Then you change any and all geographic references.

    Is this duplicate content? Will you be penalized?

    • Jonathan,
      Our advice would be to make it unique content. For example, how do the laws differ between KC and STL in regard to real estate? Are you a member of a bar association or other professional group in each city?

      Also, you won't be penalized per se, but you might not see your rankings go in the right direction either.

  5. I have a question.
    What if a client has two websites with the same domain name on two different ccTLDs, like .com and .com.au?
    These two websites are duplicates of each other.
    Also, they are selling products on popular third-party websites, so the product descriptions are the same everywhere, which is duplicate content.

    I plan to do separate keyword research and write on-page content for the .com and .com.au versions of the website landing pages.
    I am also going to set the target locations separately in Webmaster Tools.

    But what about the product descriptions? There are a lot of products, and the same description appears on both websites as well as on the third-party sites.

    Adding in-depth content for the products on the client's own websites is an option, but since there are two of them, what would be the right approach?

    Looking for help with this case.
    Thanks

  6. Hello Dan Kern,

    I want to know: are excerpts of posts on my site's homepage duplicate content?
    Does it hurt SEO?

    Thank you!

    • Hello Harley,

      If that is the “only” content on your homepage, it is not good for SEO. However, if your homepage is sort of like a blog home page where you have excerpts from recent posts, that is fine so long as you have other content unique to that page.

  7. OK, so I understand the importance of original, unique content. I also understand the tactic of keeping any ‘duplicate’ content hidden from search engines.

    However, I have several ecommerce clients who sell third-party products which are ALSO sold by other merchants, so the product titles have been identified as ‘duplicate’. But hiding them defeats the purpose of having the products on the website at all.

    Furthermore, some of the products are very basic and very similar in nature (e.g. a ‘rose gold cake topper’ versus a ‘glitter cake topper’). So how reasonable is it to expect the client to generate original, unique content for each? But again, hiding the products makes no sense either.

    What to do…?

    • It’s as simple as this: if your client sells the same products as everyone else who sources their stock from that distributor or manufacturer, what is to differentiate their pages from everyone else’s? If you were Google, how would you know which ones to show in the search results when there are dozens or hundreds of virtually identical results?

      If the client doesn’t care how those pages rank, then they shouldn’t be in the index. If the client does care, then they need unique titles and product descriptions.

      I sympathize with the issue of writing copy for products that are very similar. We handle this in several different ways. One option is to combine the products on one page and allow the visitor to select their option via a drop-down.

      Sometimes that doesn’t make sense for the situation, so here are some other options. You could choose one version of the product to be “canonical” and point the rel=”canonical” tag on the other product pages to that one (see the snippet below); this way you only have to write unique copy for one of them.

      Of course, we also have some clients where it makes sense to get creative and hire a copywriter to highlight the small differences between the versions in unique copy on each product page.
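
      For reference, the canonical approach means each variant page carries a tag like this in its <head> (the URL here is hypothetical):

        <!-- in the <head> of the glitter cake topper page, pointing to the chosen canonical version -->
        <link rel="canonical" href="https://www.example.com/cake-toppers/rose-gold-cake-topper">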

  8. I work with a reseller website that resells market research reports published by market research companies. Our website has 3 lakh (300,000) reports from different publishers. The report/product descriptions provided by the publishers are usually duplicate content. How do we solve this issue?

  9. I found this article extremely useful! You’ve highlighted for me several areas that I need to improve upon on my website. Thank you so much!

  10. While trying to find a solution for my “problem”, I stumbled upon this great article. Trying to understand it all, I hope you still want to answer this question: our real estate board issues a monthly update on the market, and I post this update on my site. Can I then use a canonical URL pointing to the real estate board, even though the board only publishes it on their site as a PDF? Or how should I go about this, as many other agents do the same (without rewriting)?

  11. Hi Dan, great article mate 🙂 Have there been any updates to this since last year, given the changes happening with the Google algorithm updates?

    • Hi David, yes there was recently a “quality” update launched by Google. More info here, here and here.

      Glenn Gabe is seeing “thin content” as a big culprit. We don’t know anything directly from the source (Google), but the content quality issues appear to be similar to Panda. Everything in this article still applies. You only want to have high quality (non-duplicative, deep and authoritative) content indexed in Google. Hope this helps!

  12. Excellent article. I have a question. I am building an ecommerce site with more than 1,000 products. I simply do not have the time or the budget to write unique product descriptions and/or specifications for each product. I realise that if I did, I would certainly be in good standing with Google, but realistically this is almost impossible. Do you have any suggestions?

    • This is a common scenario, David. It reminds me of a mantra that the Chief Digital Officer (of a publishing company I worked for previously) had about eCommerce sites: “If you can’t write a unique description for this product tailored to our audience, then don’t put it on the store (website).” So, I would first suggest revisiting your belief that you don’t have the time/budget for unique product descriptions, and consider how to make the investment. Your product pages and category pages are the foundation of your store, and their quality will set the foundation for your success online.

      There are numerous copywriting services that can help you scale the copywriting. Consider checking out Copypress and similar services (a quick Google search will turn up more). Consider hiring an intern or two, or even family members, and having them rewrite the duplicate product descriptions. If you have a store with an unusually large number of products (e.g. 10,000 or more), then consider rewriting/improving the top 10-50% that get the most organic search traffic (improve what’s already working to make it work better).

      As a last resort, you could set your product pages to “noindex,follow” (via a meta robots tag, shown below) if they don’t get any organic search traffic due to duplicate content (and your inability to improve them for various reasons). Google’s own John Mueller has stated that if you don’t plan to improve low quality content, then either delete it or set it to noindex until you can improve it. In that case, you would focus your copywriting efforts on improving your category pages (optimizing meta titles, meta descriptions and on-page intro descriptions of 100 words for target keywords) and really building out your strategic content marketing efforts (blog posts, video, infographics, etc.) in order to create content about related topics people are searching for, and increase your search engine visibility/discoverability in that manner. Hope this helps!
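
      For reference, that last-resort meta robots tag is a single line in the <head> of each affected product page (a sketch of the option described above, not a blanket recommendation):

        <!-- "noindex" keeps the page out of Google's index; "follow" still lets crawlers follow its links -->
        <meta name="robots" content="noindex,follow">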

  13. Excellent post, thanks for putting this info together in a clean and easy to read format. Bookmark’d for later use. 🙂

    I’ve been looking for creative ways to check for duplicate content on very large websites, > 100,000 pages. Siteliner and Copyscape are too expensive to make it worthwhile. Any suggestions?

    • Hello Dan,

      There are affordable tools to do checks for duplicate content internally. For example, you could use Screaming Frog to crawl your site and report on duplicate titles, descriptions and other issues typically caused by technical reasons for duplicate content. Other types of internal duplicate content (such as copied and pasted text or duplicate product descriptions on different products) are a little more difficult to catch without a full content audit.

      With regard to external duplication, we typically start with a Copyscape check of 1,000 pages from various sections of the site, which gives us a general idea as to the scale and cause of external duplicate content problems. From there it tends to be an issue of fixing the problem more than identifying more pages with the symptom.

  14. I’m currently up to my eyeballs in duplicate content so this post is a serious life saver!

    Thanks for sharing!

  15. When did robots.txt start supporting wildcard entries?

    > Note also that globbing and regular expression are not supported in either the User-agent or Disallow lines. The ‘*’ in the User-agent field is a special value meaning “any robot”. Specifically, you cannot have lines like “User-agent: *bot*”, “Disallow: /tmp/*” or “Disallow: *.gif”.
    Source: https://www.robotstxt.org/robotstxt.html

    • James, Google has been obeying wildcard directives in the robots.txt file for several years. You can verify this by using them in the robots.txt testing tool in Google Webmaster Tools. The syntax may not be technically supported according to other organizations’ standards, but as SEOs we tend to think more about how the syntax is treated by search engines. According to Google’s Developer Help page:

      Google, Bing, Yahoo, and Ask support a limited form of “wildcards” for path values. These are:

      * designates 0 or more instances of any valid character
      $ designates the end of the URL
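
      For example, a robots.txt file using both wildcards might look like this (the paths here are hypothetical):

        User-agent: *
        # '*' matches zero or more characters anywhere in the path
        Disallow: /*?sessionid=
        # '$' anchors the pattern to the end of the URL
        Disallow: /*.gif$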

  16. Great article indeed, Dan. It’s a wonderful resource for ecommerce site owners and staff at SEO companies. A small correction though: the Panda algorithm was launched in February of 2011, not 2012. Thanks again.

  17. That’s an awesome list and an even better checklist for all webmasters in eCommerce. Thank you very much for sharing! Greetings from Switzerland!

  18. Dan this is an Epic post. It is tough to cover so much ground at once without sacrificing depth, but you’ve managed to strike that balance here. Good stuff!

About The Author

Dan Kern (Alumni)

Dan Kern has more than 10 years of experience in technical SEO, content strategy, on-page optimization and link building, utilizing white hat best practices to drive targeted customers and meet business goals.
