The problem with plagiarism, as with any activity that those who indulge in it prefer to keep secret, is that we don’t know how prevalent it is. We therefore can’t say how effective the measures we take to prevent or detect it are, or whether what we do really makes a difference.
Turnitin, probably the best known of the various online text-matching services (aka plagiarism checkers or plagiarism detectors), tries – possibly needs – to have it both ways. The company tries to show that more and more students at all levels of education are plagiarising, so schools need to buy its detection services, and it also tries to show that schools which use its services have reduced levels of plagiarism.
Now Turnitin has published a research study, “Turnitin Effectiveness: Plagiarism Prevention in U.S. High Schools.” The company regards it as so telling that it has revamped its home page to publicise the study:
The page links directly to an interactive map which purports to give a state-by-state record, over 8 years, of Turnitin’s effectiveness in the US high schools which use the service.
The report shows that Turnitin has not been equally effective all over the country, and even suggests that plagiarism has increased in some states over the years. But the trend is, in most states, for the incidence of unoriginal material in schools using the service to have decreased over time. In particular, the detection rate in Massachusetts has fallen by more than 80% over the past 8 years.
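For a sense of what such a fall implies year on year, here is a minimal sketch of the compounding arithmetic – assuming (my reading, not a calculation the study spells out) that the “more than 80%” is a cumulative fall spread evenly over the 8 years:

```python
# If detections in Massachusetts fell by 80% over 8 years, the implied
# *average annual* decline follows from compounding:
# remaining fraction after 8 years = annual_factor ** 8 = 0.20
cumulative_remaining = 0.20              # 100% - 80% of the original level remains
years = 8
annual_factor = cumulative_remaining ** (1 / years)
annual_decline = 1 - annual_factor       # average fraction lost per year

print(f"average annual decline: {annual_decline:.1%}")
# → average annual decline: 18.2%
```

An 80% cumulative fall therefore needs a decline of roughly 18% every year – a useful yardstick when the study later quotes average annual rates.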
It sounds impressive. It looks impressive.
Until you read the study.
The study is flawed, on several counts.
The first is that the study considered only the incidence of papers with an Overall Similarity Index (OSI) of 50–100% – papers with at least half their content flagged as unoriginal. There is no accounting for papers with an OSI of less than 50%.
Papers with 50% or more unoriginal content are unoriginal indeed. The study notes that even such a high rate “does not necessarily equate to plagiarism.”
I suggest it shows, at the least, a great deal of unoriginal thought and borrowing. It is good to see that the number of such papers is much reduced in schools using Turnitin in most states.
The study puts forward no proof that the reduction in such high levels of unoriginal content is due to Turnitin, only supposition. There is no accounting for other factors: greater teacher awareness of plagiarism, better training of teachers and better teaching, greater student awareness of the conventions of academia, better training of and learning by students, any combination of the above, and maybe other factors too. There is no comparison with schools not using Turnitin. There might even be a shift in cultural mores: the Josephson Institute’s biennial survey on the Ethics of American Youth for 2012 shows that the number of students admitting to plagiarism had dropped since the previous survey. Donald McCabe’s latest study (admittedly of college-level students) also showed a drop in the number of students admitting to plagiarising (for more details of these studies, see Go Figure!)
Additionally, it would be good to know the extent of plagiarism – “real” plagiarism, not just unoriginal content – in the 0–50% range, and whether there is more, or less, than in previous years. But Turnitin does not tell us this, even for the high schools which use its service.
The study tries to demonstrate that Turnitin is effective in reducing plagiarism. It also tries to demonstrate that Turnitin (and other plagiarism detection and prevention services) has a greater impact in reducing high-rate unoriginal writing than whatever is happening in schools which do not use these services. Maybe. Maybe not. The study tries, and fails, to make the case.
This is where the study really does fall down.
There is a whole section which addresses the question “How Would Schools Using Turnitin Compare With Schools That Do Not Use a Plagiarism Prevention Service?” The argument here is based on conjecture and assumption, built on false premises. It should come as no surprise that the study declares that Turnitin works wonders. When you can manipulate the figures and misinterpret the evidence, you too can work wonders. Smoke and mirrors and flashy technology…
Normally, a study which attempts to show that A is different to B would study two groups, A and B. A study which attempts to show that A is more effective than B would measure both A and B over time (and possibly C as well), and try to isolate other factors which might have a bearing on any differences in the changes measured.
This study does neither. “B” does not exist:
Unfortunately, there has not been a comparable study on the levels of unoriginal writing in schools that do not use plagiarism prevention services. Most of the data available is in the surveys referenced in the introduction of this paper. All of these studies point to a steady rise in plagiarism.
We have an expression in England which fits this piece of flim-flam perfectly: “Yerwhat?”
The introduction mentions two studies. The statement “All of these studies point to a steady rise in plagiarism” is grammatically wrong. Even more seriously, it is factually wrong.
The introduction states:
Plagiarism seems to make headlines on an almost daily basis, and the news is often unsettling. Research supports the view that plagiarism is on the rise. In a 2011 survey by the Pew Research Center of 1,055 college presidents, 55 percent said that plagiarism has increased. Of the college presidents who believed plagiarism to be on the rise, 89 percent attributed the cause to computers and the Internet.1 In another study of high school plagiarism, Don McCabe of Rutgers University asked 24,000 students at 70 high schools if they plagiarized. 58 percent of students admitted to plagiarism.2
The two sources are given in full in the endnotes. Let’s look at each in turn.
1 Pew Research Center. The Digital Revolution and Higher Education, Pew Research Center, 2011. Web. September 9, 2013. <http://pewinternet.org/Reports/2011/College-presidents/Summary/Findings.aspx>
The mention of the Pew Research Center study (2011) is irrelevant on two counts. In the first place, it discusses perceptions of plagiarism. Reporting perceptions that plagiarism has increased is not the same as showing that plagiarism has increased. The study makes no measure of plagiarism, reported or self-reported; it is an opinion poll of college presidents, not a study of incidence or incidents. The other reason this study is irrelevant is that it was carried out at college level. It says nothing about the prevalence of plagiarism at high school level, nor about whether it is increasing or not.
Aside: this article from the UK Guardian provides an analogous lesson in the difference between public perception and statistical “fact”: Public perception of crime higher despite falling figures, report says.
As a further aside, there is much concern in the UK today about the sexual abuse of children, and the implication is that the internet has increased the availability of, interest in, and levels of child pornography. At the same time, the number of historical cases of sexual abuse of children, especially but not exclusively in religious and welfare institutions, suggests that there is nothing new. It is not necessarily the internet which has increased the rate of abuse. It is our awareness that has increased and, being aware, we discover, realise and report more, both currently and years after the event.
The use of the second source, a report of a study by Donald McCabe, is worth investigating in detail. It also illustrates the point that we don’t reference just to be honest (and avoid plagiarism): we reference so that the reader can follow up on the sources which informed our own knowledge, can see how well we have conducted our own research, can judge how (and how accurately) we have used the source material.
2 McCabe, Donald. Students’ Cheating Takes a High-Tech Turn, Rutgers University, 2010. Web. September 9, 2013. <http://www.business.rutgers.edu/media/coverage/students-cheating-takes-high-tech-turn>
The impression given is that McCabe’s study took place in, or at least was published in, 2010. Wrong. The link given here leads to a press cutting on the Rutgers site. Follow that link and we reach, not a research study, not an academic paper, but a news item in the Denver Post, Students’ cheating takes a high-tech turn, which mentions McCabe’s survey of 24,000 students in 70 high schools, and notes that 58% of these students had admitted to plagiarism.
And, oh yes, despite the reference in the Turnitin study, Donald McCabe did not write the Rutgers press alert, and he did not write the Denver Post article either. The cutting is a slightly edited version of the Denver Post article, which was written by Jeremy Meyer (2010). Moreover, McCabe is quoted as saying that “cheating on tests in high school is on the rise.” An increase in the rate of cheating on tests does not necessarily equate to a rise in the rate of plagiarism.
In passing, Shay Maunz (2013), writing in the Charleston Daily Mail this week, reported on the press release announcing the effectiveness study. In this story, McCabe’s study is updated to 2013: “A Rutgers study this year found that 58 percent of students admitted to plagiarism.” So new myths and misunderstandings arise.
(The advertisement for WriteCheck is ironic. Not only is WriteCheck the “only Checker that uses Turnitin”; WriteCheck is a sister company of Turnitin. See Authentic Authenticity.)
When was McCabe’s study actually published? Not 2013. Not 2010. In fact, the literature includes various mentions of the study’s findings, but there seems to be no publicly available version. (Please let me know if you discover otherwise.) The first mention of these data appears to be in a paper by Daniel Wueste (2008), Unintended Consequences And Responsibility. Wueste includes a table which originated in a presentation McCabe made at Clemson University on the occasion of the CAI moving from Duke to Clemson. Wueste notes in his paper that he was given permission by Don McCabe to use the chart in his paper.
And here we see that McCabe’s studies took place over several years to 2007, six years ago. It is hardly current.
The table gives us a statement: many high school students cheat. There is nothing here about the rate at which high school students are plagiarising, nor whether the rate is increasing or decreasing.
So the premise that “All (sic) of these studies point to a steady rise in plagiarism” is false. This section of the Turnitin Research Study is built on false premises: an irrelevant, inconclusive and meaningless study of college presidents’ perceptions of plagiarism amongst a completely different population, and a misreported and out-of-date study of high school students which gives us a baseline but does not address increase or decrease in the practice of plagiarism.
But let’s imagine that these studies do point to a steady rise. Let’s.
While we are at it, let’s imagine how great the rise, the growth rate, is.
Chart 3 shows the average annual reduction in unoriginal work for Turnitin users (5.6%) and compares it against three hypothetical rates for levels of unoriginal writing in high schools that do not use plagiarism prevention technology.
False premises. Hypothetical rates of increase. Do these really demonstrate how much more effective Turnitin is?
And don’t forget, these are papers with more than 50% unoriginal content.
The discussion suggests that, assuming an increase of 5% per year and starting from the same rate in 2005, the prevalence of unoriginal content of 50% and higher could today be twice as high in schools which do not use Turnitin (or another plagiarism prevention service) as it is in schools using Turnitin.
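The compounding behind that “twice as high” suggestion is easy to check. A minimal sketch, taking the study’s 5.6% average annual reduction for Turnitin users and its hypothetical 5% annual increase for non-users at face value, over the study’s 8-year span (both rates are the study’s figures; the non-user rate is pure assumption, not a measurement):

```python
# Two hypothetical trajectories from a common 2005 baseline (the study's premise):
# Turnitin schools falling 5.6% per year vs. non-user schools rising 5% per year.
baseline = 1.0                                # same starting prevalence in 2005
years = 8
turnitin = baseline * (1 - 0.056) ** years    # ≈ 0.63 of the baseline
non_user = baseline * (1 + 0.05) ** years     # ≈ 1.48 of the baseline

ratio = non_user / turnitin
print(f"hypothetical non-user / Turnitin ratio after {years} years: {ratio:.2f}")
# → hypothetical non-user / Turnitin ratio after 8 years: 2.34
```

The arithmetic does yield a factor of roughly two – but the result follows entirely from the assumed 5% increase. Plug in an assumed decrease for non-user schools and the same compounding tells the opposite story.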
So, while we’re at it, let’s imagine that increased awareness and understanding among teachers and students has resulted in decreased rates of unoriginal writing in schools not using Turnitin (or another plagiarism prevention service). Why not? It only takes imagination.
I would be especially interested in seeing the data for schools which take externally moderated national and international examinations. But for the moment, I’ll just have to imagine. Just imagine.
What then does the study show? That Turnitin is effective, possibly. Whether schools which use it do better than schools which do not … we just do not know. Despite this study.
But in postscript I would mention one further assumption – the assumption that Turnitin works, that it is effective at uncovering unoriginal writing. And this, as their own advertising demonstrates (see Carried Away and Never Mind the Quality…), is palpably an assumption too far.
Maunz, S. (2013, September 25). Study says fewer WV students are plagiarizing. Charleston Daily Mail. Retrieved from http://www.dailymail.com/News/statenews/201309250110
Meyer, J. (2010, May 27). Students’ cheating takes a high-tech turn. Denver Post. Retrieved from http://www.denverpost.com/news/ci_15170333
Pew Research Center. (2011). The Digital Revolution and Higher Education. Pew Research Center. Retrieved from http://pewinternet.org/Reports/2011/College-presidents/Summary/Findings.aspx
Rutgers University (2010). Students’ Cheating Takes a High-Tech Turn. Rutgers University. Retrieved from http://www.business.rutgers.edu/media/coverage/students-cheating-takes-high-tech-turn
Turnitin. (2013, September). Turnitin Effectiveness: Plagiarism Prevention in U.S. High Schools: Research Study. Turnitin. Retrieved from http://pages.turnitin.com/rs/iparadigms/images/Turnitin-Effectiveness-SE-National.pdf
Wueste, D. (2008, Fall). Unintended Consequences And Responsibility. Teaching Ethics, 13-24. Retrieved from https://webprod1.uvu.edu/ethics/seac/Wueste-Presidential%20Address%20-%20Unintended%20Consequences%20and%20Responsibility.pdf