Copied content is scattered across the Web, but many of those duplicates are perfectly acceptable: writers often quote paragraphs from other sources, or reuse content with the author's permission. In a video covered by Search Engine Land, Matt Cutts, head of Google's webspam team, said that 25 to 30 percent of Web content is duplicated elsewhere, but that Google only downgrades the small group of offenders guilty of massive copying.
“For the most part, duplicate content is not really treated as spam,” Cutts said in the video. Instead, Google groups content it has flagged as similar under the heading of “alternate sources.” According to Cutts, content creators who are worried about SEO shouldn’t stress about duplicate content hurting their search ranking.
But even though original and duplicate content might appear on the same page of Google’s search results, the Google Scraper Report is intended to separate maliciously copied or spammy duplicate pages from articles that simply have a lot in common (such as two articles that quote the same source). The tool won’t eliminate copied content, but it should help push original articles to the top of search results and downgrade sites that copy and paste legitimate authors’ work to siphon traffic.
This new tool isn’t the only weapon content creators have against lifted content. Authors can still file copyright claims through Google’s Digital Millennium Copyright Act tool and request removal of offending sites. The Google Scraper Report, however, is useful for alerting the search engine to fraudulent activity and ensuring that copied content isn’t rewarded with Web traffic. Copied content remains a serious problem for authors, but by using every tool available to identify these sites, they can make copied-and-pasted content a less profitable endeavor for those who pursue it.