Flattering flaws


It’s ironic that I have to thank Turnitin for bringing Retraction Watch to my attention.

Retraction Watch is a blog written by Adam Marcus and Ivan Oransky. It aims to report on retractions made in science journals.  Scientific knowledge is not static, but it does tend to develop slowly. New knowledge is gained as connections are made, as instruments become better calibrated, as new methods or techniques are tried, as exceptions and anomalies become known and are investigated.  Even revolutionary AHA! insights are built on what has gone before, what is already known.  The catch-phrase, often attributed to Isaac Newton, is “Standing on the shoulders of giants” – we build on what has gone before.

Normally, new knowledge does not mean that the old knowledge, what we used to think, believe or “know,” must be corrected or retracted.  Often it is left for an information seeker to do a thorough job of searching the literature, including online literature, to make sure the information found is solid and up-to-date. We looked at just how thorough we need to be in an earlier post, “Getting it wrong.”

However, it is not unknown for scientific papers to be falsified, for results to be fabricated, for experiments to be misreported, for methods to be shown to be flawed, for researchers to plagiarise their material. Such behaviour is often fraudulent. Whether it is deliberate or accidental, flawed research can lead to misunderstanding and to years of wasted research when trying to replicate or build upon results. In medicine and many other disciplines, flawed research can be fatal.  When such flaws become known, academic papers are retracted.

Occasionally, such retraction is headline news, as was the case with Hwang Woo-suk’s flawed cloning research. More usually, retraction is reported quietly in the journal in question, and few are any the wiser.  Enter Retraction Watch, alerting the scientific community to retractions published in the journals monitored.

The problem of research not being published because it fails to replicate earlier studies, or fails to produce the desired results, is something else again, and outside the scope of this particular blog post. Failure to publish can sometimes be as damaging as the publication of flawed research. A good starting-place for those wanting to pursue this line of thought is Ben Goldacre’s Bad pharma, and his TED talk “Battling bad science.”

Where does Turnitin come in?

Well, in December 2012, iThenticate, the company behind Turnitin, published a press release to promote a survey conducted by iThenticate which investigated plagiarism in scholarly studies.  The press release was headlined, “Survey Shows Plagiarism a Regular Problem in Scholarly Research” (iThenticate Press Release, 2012 December 5)  and was designed to promote the findings of their “2012 Survey Highlights: Scholarly Plagiarism” (iThenticate 2012).

Both the full press release and the survey highlights are “interesting” and are worth investigating further – and I’ll be doing just that in other blog posts in the near future. Here, I want to concentrate on just a single paragraph.

In the course of the press release, the PR office draws attention to a survey finding, that researchers seem reluctant to use “plagiarism detection software,” and goes on to note:

Editors at scholarly publications were the exact opposite, with a majority reporting routinely checking submitting authors’ work for plagiarism. The web site Retraction Watch estimates that the number of retractions in scholarly publications doubled between 2010 and 2011 (iThenticate Press Release, 2012 December 5).

Thus my discovery of the Retraction Watch blog.  Thank you, Turnitin.

I used the term “ironic” – in fact there are several ironies. One irony is that I have major concerns about Turnitin.com, so to owe them a thank you is irony indeed.

Another irony is that the press release writer makes no reference to the source/s of his (or her) statement: “the number of retractions in scholarly publications doubled between 2010 and 2011.”  It does not come out of the Survey Highlights paper, which seems to be all that is available about the survey as published on the iThenticate website.

I would like to know more.  How many papers were retracted in 2010? How many in 2011?  How many journals were monitored by Retraction Watch over those two years, and how many papers were published in total? If twice as many papers were published in 2011 as in 2010, then a doubling of the number retracted may not be as much of a cause for concern as iThenticate suggests. What is the source of iThenticate’s claim?
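To see why raw counts can mislead, here is a toy calculation with entirely made-up figures (they come from neither iThenticate nor Retraction Watch): if publication output doubles, the retraction count can double while the retraction rate stays flat.

```python
# Hypothetical figures for illustration only -- not real publication data.
retractions_2010, papers_2010 = 200, 1_000_000
retractions_2011, papers_2011 = 400, 2_000_000  # count doubled, but so did output

rate_2010 = retractions_2010 / papers_2010
rate_2011 = retractions_2011 / papers_2011

print(f"2010 retraction rate: {rate_2010:.4%}")  # 0.0200%
print(f"2011 retraction rate: {rate_2011:.4%}")  # 0.0200% -- unchanged despite doubled count
```

This is exactly why the denominator matters: a headline about doubled counts tells us nothing about the rate unless we also know how many papers were published.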

I cannot find a statement regarding 2010 figures in Retraction Watch, and given that the blog started up in August 2010, just over half-way through the year, any figures may not cover the complete year.  There are figures available for the full year 2011 (Oransky, 2011, December 30) and the near-full year 2012 (Oransky, 2012, December 24), and they show that the number of retractions in each of these years was about the same.  Any increase in retractions, let alone a doubling, is simply not evident in these two full years.

It’s irritating not to be able to track down iThenticate’s shock-horror claim, to verify it and to find out more.  The representative of a company which claims to help students use other people’s material in academically acceptable ways should surely have a better notion of academic referencing. Granted, the press release is not an academic paper, and so might not be expected to cite all its sources in pukka MLA (or any other) style. On the other hand, since Retraction Watch is not mentioned anywhere in the Survey Highlights, the claim appears to be an interpolation by the press release writer – so perhaps we could and should expect a little more care?

There are several other points to be made, several more ironies.

Retraction Watch’s 24 December 2012 blog post makes the point that the year of retraction is NOT an indication of when the offence occurred.  It shows only when the publishing journal acted upon perceived problems with the paper and/or study. Many retractions are made for older papers (Oransky, 2012 December 24).  So, once again, we cannot say that the rate of retraction is increasing, let alone doubling.

And again… let’s just take another look at that paragraph from the press release:

Editors at scholarly publications were the exact opposite, with a majority reporting routinely checking submitting authors’ work for plagiarism. The web site Retraction Watch estimates that the number of retractions in scholarly publications doubled between 2010 and 2011 (iThenticate Press Release, 2012 December 5).

In the context of the report, of the survey, of the headline, of the press report, the implication is that the rate of plagiarism is doubling, year on year.

In the right-hand column of the Retraction Watch blog there is a drop-down index which enables one to navigate to “Retraction posts by author, country, journal, subject, and type.”  At the time of writing, RW gives reasons for the retraction of 657 papers.  Some papers have more than one reason for retraction, and it is clear that many times, the publisher gives no reason, so none is recorded.  (It would be useful to know how many papers are retracted without reason shown.)

[Image: Retraction Watch – causes for retraction]


Of the 657 papers retracted with reason recorded, only 105 are retracted because of plagiarism issues – just about 16%.
Reasons for retraction which rival or exceed plagiarism in frequency are:

duplication (126)
papers for which the findings are not reproducible (124)
image manipulation (124)
faked data (104)

Text-matching software might detect duplication, but it won’t detect any of the other three main causes for retraction.  Duplication occurs when an author submits substantially the same paper to two or more journals (or, occasionally, submits the same paper to the same journal more than once). Duplicated papers are a problem, but if the science and the investigation and the write-up are sound, then they might not be in quite the same class as plagiarism. One should not submit substantially the same material to multiple publishers without declaring it, but duplication is not an attempt to pass someone else’s work off as one’s own; the work is one’s own.

While 16% of retractions being due to plagiarism is a lot, is too many (and it rises to 35% if we include retractions for duplicate submissions), there is still nothing to suggest that the number of retractions due to plagiarism has doubled.  The shock-horror of that press release is again diluted.  We should be far more concerned by the prevalence of bad science and fraud, and iThenticate’s text-matching software is NOT going to discover manipulated images, falsified figures or flawed methodology.
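The percentages above are easy to verify; the figures are those read off the Retraction Watch drop-down index (657 retractions with a recorded reason), so they are that blog’s post counts rather than official totals.

```python
# Figures as read from the Retraction Watch categorization drop-down.
total_with_reason = 657
plagiarism = 105
duplication = 126

plagiarism_share = plagiarism / total_with_reason
combined_share = (plagiarism + duplication) / total_with_reason

print(f"Plagiarism alone: {plagiarism_share:.0%}")              # 16%
print(f"Plagiarism plus duplication: {combined_share:.0%}")     # 35%
```

Even on the most generous reading, roughly two-thirds of retractions with a stated reason involve problems that no text-matching tool could catch.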

As suggested in an earlier post (Go figure), we are sometimes blinded by numbers and statistics, and it pays to investigate more deeply.  It is probably even more important when the statistics are produced, and the studies conducted and interpreted, by companies and organisations with a vested interest in the results.

In the opening pages of Bad pharma, Ben Goldacre makes much the same point:

Before we get going, we need to establish one thing beyond any doubt: industry-funded trials are more likely to produce a positive, flattering result than independently-funded trials … this is one of the most well-documented phenomena in the growing field of ‘research about research’ (Goldacre, 1).

iThenticate and Turnitin fund their own studies.  Think on.

These aren’t the only reservations I have about the press release and the survey report. To be continued…



Goldacre, B. (2012). Bad pharma. London: Fourth Estate.

iThenticate (2012). 2012 Survey Highlights: Scholarly Plagiarism. iThenticate. Retrieved from http://www.ithenticate.com/Portals/92785/docs/plagiarism-survey-results-120412.pdf

iThenticate (2012, December 5). “Survey Shows Plagiarism a Regular Problem in Scholarly Research, Publishing, But Action to Prevent Falls Short” [Press release]. iThenticate. Retrieved from http://www.ithenticate.com/press-releases/plagiarism-survey-2012/

Oransky, I. (2011, December 30). “The Year of the Retraction: A look back at 2011.” Retraction Watch.  Retrieved from http://retractionwatch.wordpress.com/2011/12/30/the-year-of-the-retraction-a-look-back-at-2011/

Oransky, I. (2012, December 24). “How many retractions were there in 2012? And, some shattered records.” Retraction Watch.  Retrieved from http://retractionwatch.wordpress.com/2012/12/24/how-many-retractions-were-there-in-2012-and-some-shattered-records/

7 thoughts on “Flattering flaws”

  1. Interesting, thanks for bringing this to our attention. We weren’t aware of the release, and are not sure where the doubling claim comes from. As you point out, we launched in August 2010.

    One place to find trends like this over time is at Neil Saunders’ site: http://pmretract.heroku.com/byyear His plots show that retractions are certainly on the rise, as do Thomson Scientific’s numbers as reported by Nature: http://www.nature.com/news/2011/111005/full/478026a.html The number of retractions in 2011 was ten times the number in 2001, with only a 44% increase in the number of papers published. So there is something going on, but we’re not aware of any doubling from 2010 to 2011.

    Incidentally, we’re grateful you made use of our categorization drop-down menu, as others have. Just two notes: a) We do not claim to be comprehensive, and in fact usually leave plagiarism and duplication retractions at the bottom of the priority pile because there are so many (and other stories are more interesting). So the figures there may not represent overall figures. b) The numbers in parentheses are the numbers of posts in each category, not the number of retractions. Some are about more than one, while others are follow-ups on retractions we’ve already covered.

    Thanks again for your interest.

    Ivan Oransky
    Retraction Watch

    • Thanks, Ivan.

      Thanks for the clarifications and corrections, and for the links and hints for further exploration. Explore, I shall.

      This just confirms the need to cite one’s sources, so that others can follow up on where you got the evidence.

      Thanks, John

