

Dropped Search Engine Rankings Caused by Duplication

By Ross Dunn

CEO, StepForth Web Marketing Inc.



Again and again my sales and consulting staff find themselves explaining that duplicate content can and will negatively affect search engine rankings, and it is heartbreaking to see clients rebuilding rankings lost to such a simple mistake. As a result, I felt it was time to write this article and, hopefully, dispel the misconceptions held by many website owners.

Why write an entire article on something as simple as duplicate content? Probably because it is not as simple as it sounds: many website owners find themselves in the grey area of duplication, where they don't know for sure whether they are risking valuable rankings.

The following is a breakdown of the most common duplicate content issues we see, each framed as a question – hopefully making this article a little easier to read. After all, I have no illusions that reading up on duplicate content rules is exciting.

Duplicate Websites

Definition: a duplicate website is a website that has many, if not all, of the same pages as another live website.

Note: the following questions assume a person who owns two websites that are duplicates of one another.

Q: “Why is a duplicate website such a bad idea?”

A: The major search engines are constantly trying to improve the quality of their results in an effort to serve users the best content available. When duplicate content is indexed by search engine spiders, valuable time and processing power are wasted. As a result, search engines have blocked sites carrying duplicate content from their databases, ultimately favouring the site that either had the content first or, I believe, the one with the greater online history. In addition, the major search engines have been left with a bad taste after dealing with so much duplicate content created by spammers over the past several years. Consequently, posting a duplicate website is an offense that can quite literally blacklist a domain; there are few things the search engines dislike more than being gamed by spammers.


Q: “What should I do with my duplicate website, then? Just delete it?”

A: Deleting the site is the only option, unless you want to turn it into an entirely new website with unique content and a unique purpose. That said, when deleting the website you can still ensure the effort you put into promoting the old site does not go to waste: point the old domain to your new website's domain using a 301 redirect. A 301 ("Moved Permanently") is an HTTP response code that Google and the other search engines will 'see' when they visit the old site. It essentially tells them that the content from the old site can now be found on the new site and that this is a permanent forwarding of all traffic. 301 redirects are by far the best way to minimize your losses when shutting down a website that may still have traffic or inbound links.
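For readers whose old site runs on an Apache server, the redirect can be set up in a few lines of configuration. This is only a sketch: the domain names are placeholders, and it assumes the mod_rewrite module is enabled; other servers, or your hosting control panel, will have their own equivalents.

```apache
# Place in the old site's .htaccess file (hypothetical domains).
# Every request to old-example.com is permanently forwarded (R=301)
# to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```

Forwarding each page to its matching path on the new site, rather than dumping everything on the home page, preserves the value of inbound links that point at inner pages.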


Q: “Which website should I shut down? Is there anything I should consider first?”

A: Yes, it is very important that you keep the website that has the most backlinks and has been online the longest. Google tends to favour entrenched websites: they have been around a while, are well backlinked, and overall appear to have a positive history.


Q: “Will using a 301 redirect pass on the benefit of the deleted site's link popularity?”

A: Link popularity is passed on to the other website when a 301 is used, but how much that transfer benefits the website seems to vary on a case-by-case basis. Usually the variation appears only when popularity from one domain is passed to another with different content or a different topic. In this case, since the link popularity is being redirected to an identical website, I expect the benefit to be virtually lossless.

Duplicate Content

Definition: content appearing within a website that is duplicated elsewhere on the same website or elsewhere on the Internet.


Q: “I need content for my website; can I just copy content from industry journals and benefit from that quality content?”

A: No. Aside from the copyright concerns of using content that is not yours, your rankings (if they exist) would suffer, because the major search engines would very likely detect the duplicate content. At the very least, the page you create may be flagged as duplicated and ignored; it could even devalue your site's overall credibility. Credibility is a critical component of Google's algorithm, so sites with less credibility tend to have a harder time staying ('sticking', if you will) in a particular ranking.


Q: “How much of my page should be unique? Is there a standard ratio or percentage you can share?”

A: There is no industry-standard formula, but if I had to state a percentage I would say a minimum of 70% of the page should be completely unique to thwart any concerns of duplication. You may be able to get away with less than 70% unique content, but I would suggest that is playing with fire. Either way, this statistic is moot, since every page you create needs to be built as a powerful resource; after all, search engines are only a small part of the plan – you also need visitors to like what they see and buy your product or service!
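To make the idea of a uniqueness percentage concrete, here is a minimal sketch of one way to estimate how much of a page's text is unique relative to another document, by comparing overlapping three-word "shingles". Both the shingle approach and the 70% threshold are illustrative assumptions on my part; the search engines do not publish how they actually measure duplication.

```python
def shingles(text, n=3):
    """Return the set of overlapping n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def unique_ratio(page_text, other_text):
    """Fraction of the page's shingles not found in the other text."""
    page = shingles(page_text)
    if not page:
        return 1.0  # an empty page has nothing duplicated
    other = shingles(other_text)
    return len(page - other) / len(page)

original = "search engines reward pages that offer genuinely original content"
copied = "search engines reward pages that copy text from other sites"
print(round(unique_ratio(copied, original), 2))  # prints 0.62
```

In this toy example the second sentence shares its opening phrase with the first, so only about 62% of its shingles are unique – below the 70% rule of thumb discussed above.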

Conclusion

I am sure I didn't cover every question regarding duplicate content, but I am fairly certain I touched on the most common questions we see at StepForth. If you would like to submit a duplicate content question, or any other SEO question, please email it to ceo@stepforth.com and I will endeavour to respond as soon as possible, likely in an article or an SEO blog posting.

See Ross Dunn's Sources listing on page 188 of Sources #60, or online at www.sources.com.