
Duplicate content is a block of text that is identical, or nearly identical, to content elsewhere, whether on the same site or on a different domain.

Google is not a fan of duplicate content because it degrades the user experience: repeated content is neither unique nor interesting.

It also muddies site structure and degrades the browsing experience on e-commerce sites when people see the same information in multiple places.

How is Duplicate Content Seen by Google?

Google and other search engines run bots that crawl sites regularly to index and understand what each site and page is about. They look at your site’s content and meta tags, which is where many sites have duplicate content. That information is then stored in Google’s index so it can be retrieved quickly for searchers.

Areas where duplicate content often goes unnoticed include:

  • Home Page Duplicates (http://www.example.com vs. http://example.com)
  • Headers
  • Introductions to Products & Product Descriptions
  • Page Closings
  • Meta Titles & Descriptions
  • Hidden Pages
  • Scraped Content (Content stolen from other sites)
  • Staging Servers for Building Sites
  • Secure Pages (http:// vs https://)
  • International Sites

Google does not consider the following as duplicate content:

  • Copied Snippets on Forums
  • Products Shown in a Different Order in a Store Listing Search
  • Images, PDFs, and Printer-Only Web Pages

At first glance, it doesn’t seem as if there is a lot of possibility for duplicate content on your site until you look at the details of each page.

What are the Implications of Duplicate Content on Your SEO Value?

Search engines look at duplicate content negatively because it makes it difficult to decide which page is the most relevant to send searchers to: they have to guess which version is the original, or the best one to direct traffic to. As a result, link juice may be split across the duplicates and your rankings could drop, so it pays to stay on top of duplicate content.

Tools to Diagnose Your Duplicate Content

If your site is larger than five pages, it can be difficult to identify where the duplicate content lies. These are some tools that you can use:

  • Google Webmaster Tools—Check HTML Improvements & Crawl Errors, since 404s can look like duplicate content when different URLs return the same error page
  • Moz—Must subscribe to MozPro
  • Run a site: search and compare the number of pages returned to the number of pages in your sitemap
  • Screaming Frog—Identifies Meta Tags & Duplicates

It’s best to use a variety of tools and cross-reference their results, since no single tool catches every error on its own.
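To complement these tools, you can run a quick self-check of your own. Below is a minimal sketch in Python (standard library only) that flags pages sharing the same meta title, one of the most common duplicates mentioned above. It assumes you have already fetched each page’s HTML; the URLs and markup here are purely illustrative.

```python
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: dict mapping URL -> HTML source.
    Returns {title: [urls]} for titles used on more than one page."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        seen[parser.title.strip()].append(url)
    return {t: urls for t, urls in seen.items() if len(urls) > 1}

# Hypothetical pages for illustration
pages = {
    "http://example.com/a": "<html><head><title>Widgets</title></head></html>",
    "http://example.com/b": "<html><head><title>Widgets</title></head></html>",
    "http://example.com/c": "<html><head><title>Gadgets</title></head></html>",
}
print(find_duplicate_titles(pages))
```

Any title that comes back with more than one URL is a candidate for a rewrite, as discussed below.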

What Can You Do When Other Websites Steal Your Content?

If, after using these tools, you find that the duplicate content is not on your site, it is likely the result of someone scraping your content and publishing it on their own site. Another reason you don’t want your content on another site: if a low-quality site is linking to yours, your site looks worse to Google. It’s quality through association.

First, make sure it isn’t syndicated content. Then go through these tools to find out where the duplicate content is published:

  • Keep Track of Link Trackback Notifications on Your Site
  • Look at Webmasters’ Links to Your Site
  • Copyscape—Identifies Other Sites That Have Duplicate Content With Your Site

Once you’ve identified that your content is duplicated on another site, there are a couple of ways to address the situation.

  • Use an RSS Footer Plugin to Credit Yourself When Content is Scraped
  • Contact the Site & Ask Them to Remove It

If you can’t tell who the site owner is, a Whois lookup can surface their contact details.

What Can Be Done to Fix Duplicate Content?

In Webmaster Tools, choose a preferred domain (www.example.com vs. example.com) under Site Settings. For duplicate content pages, don’t just block the page in robots.txt; it’s better to use a canonical tag, which tells Google which page should receive the link juice despite the duplicate content.
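As a sketch, a duplicate page can point Google at the preferred version with a canonical link in its head section. The URLs below are placeholders:

```html
<!-- Placed on the duplicate page, e.g. a filtered product listing -->
<head>
  <link rel="canonical" href="http://www.example.com/shoes" />
</head>
```

With this tag in place, the duplicate can stay visible to visitors while search engines consolidate ranking signals on the canonical URL.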

If you decide you must delete the page, be sure to create a 301 redirect in your .htaccess file so you don’t leave a 404 error lingering. Otherwise, if the page is useful to visitors, reword the text to make it unique from the duplicate page. This is often the solution for duplicate meta titles and descriptions.
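On an Apache server, such a redirect is a one-liner in .htaccess; the same file can also fix the www vs. non-www duplicates mentioned earlier. The paths and domain here are placeholders:

```apache
# Permanently redirect a deleted page to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Send non-www traffic to the preferred www domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```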

For blogs and articles that run beyond one page, use pagination tags to direct the link juice to the first page and let Google know the other pages are connected to it.

As a last resort, you can remove the page through Webmaster Tools, though this carries the risk that the removal is irrevocable and the page is gone for good.

Checking in on duplicate content about once a month is a great way to catch new errors on your site and will keep your visitors and Google happy.

If you need help tackling the duplicate content on your site or fixing any errors, then contact us today for a FREE SEO Diagnostic Report.
