Smile, please – it’s for real


I came across this news item in the i newspaper (page 13 of the 29 August 2018 edition, a short article by John von Radowitz). The article reports on a study in which “Scientists showed 20 goats unfamiliar photos of the same human face looking happy or angry;”  they found that “goats preferred to interact with the smiling face.”

It sounds fun, it sounds odd, it almost sounds improbable.

Two things struck me immediately.  The first was that phrase, “unfamiliar photos.”  When you’re a goat, who’s to say whether a photo is familiar or unfamiliar?

The second was a memory – a memory of the academic paper Feline Reactions to Bearded Men. You might remember it: the researchers claimed to have held cats in front of photos of bearded men and observed their reactions. The paper suggests that “Cats do not like men with long beards, especially long dark beards.”

The cats “paper” was first published in 1999, maybe earlier. It is frequently used in website evaluation exercises to make students aware that web pages which look authentic may be outright hoaxes.

The name of the site – Improbable Research – is often cited as a warning signal (though as this is the site responsible for the annual Ig Nobel Prizes, a very real event, one might not be so sure). The biggest giveaway in the cats paper is probably the bibliography, which includes entries for Pat Boone, Madonna, Yul Brynner, Sinead O’Connor, Mary Quant, Arnold Schwarzenegger and the if-only Dr Seuss (responsible for the paper “Feline Responses to Hats”). How much of a giveaway that is, 20 years on, might be questionable; many of the names are probably unknown Continue reading

Not just CRAAP – 3


[In part 1 of this three-part article, we looked at Wineburg and McGrew’s study, which suggests that a fresh look at the way we evaluate web pages and sites could be valuable.]
[In part 2, we looked at a rebuttal of Wineburg and McGrew’s study – and rebutted the rebuttal.]
[In this third part, we look at reasons why we may need a compromise between the “old” and the “new” ways of evaluating pages and sites online.]

In my last two posts, I discussed a study by Sam Wineburg and Sarah McGrew into the different methods of search-and-find employed by three distinct groups: professional fact-checkers, professional historians and first-year undergraduates. The researchers found that the methods and thinking processes of the historians and the students differed from those of the fact-checkers – and that those methods could be among the reasons why many of them made incomplete analyses of the sites they visited and drew flawed conclusions.

In one particular task, a comparison and evaluation of two articles which both dealt with bullying, the researchers found that the historians and students tended to spend a long time considering the articles themselves; some never left the target sites, while others eventually looked elsewhere. By contrast, the fact-checkers spent very little time on the target pages – sometimes just seconds; they all quickly looked elsewhere, often outside the publishing sites. That is not necessarily (at least in my eyes) a concern. What does concern me is that the evaluations made by the two groups were very different. Continue reading

Not just CRAAP – 2


In part 1 of this three-part article, I discussed a study by Sam Wineburg and Sarah McGrew into the different methods of search-and-find employed by three distinct groups: professional fact-checkers, professional historians and first-year undergraduates. The researchers found that the methods and thinking processes of the historians and the students differed from those of the fact-checkers – and that those methods could be among the reasons why many of them made incomplete analyses of the sites they visited and drew flawed conclusions.

The three groups were asked to complete six tasks in timed conditions. The findings and ensuing discussion are detailed in the paper Lateral Reading: Reading Less and Learning More When Evaluating Digital Information.

In this earlier post (Not just CRAAP – 1), I invited readers to try one of the tasks for themselves. If you haven’t already done this, it might be a good idea to try before reading on here.

The task asked participants to imagine they were looking for information on bullying, and to describe their thought processes as they considered two particular articles on two different websites. The articles were Bullying at School: Never Acceptable on the site of the American College of Pediatricians (ACPeds – the College) and Stigma: At the Root of Ostracism and Bullying on the site of the American Academy of Pediatrics (AAP – the Academy).

Participants were allowed to look elsewhere on the sites and anywhere else online that they wished.  They had to decide which website was the more reliable and trustworthy.

What the researchers found was that Continue reading

Not just CRAAP – 1


Over the weekend, a newsletter item in the Chronicle of Higher Education caught my attention: One way to fight fake news, by Dan Berrett and Beckie Supiano. It was originally published in November 2017; I’ve fallen behind in my reading.

The item reports on a study by Sam Wineburg and Sarah McGrew. Wineburg and McGrew compared the search habits and evaluation techniques of three different groups: professional historians, professional fact-checkers, and students at Stanford University. They found that:

  • the historians and the students mostly used very different techniques of search and evaluation from those used by the fact-checkers;
  • the historians and the students could not always find the information they were asked to search for;
  • the historians and the students took longer to decide on the validity and reliability of the sites they were asked to look at;
  • most disturbingly, the historians and the students came, by and large, to conclusions diametrically opposed to those of the fact-checkers as to the validity and reliability of the various sites; the two groups could not both be right.

Before reading further, you might want to try an approximation of one of the tasks undertaken by the participants (there were six tasks in all, in timed conditions). Continue reading

Memory hole


Yesterday, halfway through writing my next post, I needed a quotation I had used in an earlier post. I quickly found the quotation and clicked on the link so that I could check and then cite the original source – and, horror, although part of the passage I wanted to use was still there, the words of the vital sentence were not. They had been replaced; the evidence I wanted to support my claim was no longer there.

The quotation in question was from the post Flattering flaws. I was commenting on a press release put out by iThenticate.com, promoting their then-recently published study Survey Shows Plagiarism a Regular Problem in Scholarly Research, Publishing, But Action to Prevent Falls Short. I pointed to several questionable statements in the press release, statements which were not always reflected in the actual study.

The paragraph in question reads:

Editors at scholarly publications were the exact opposite, with a majority reporting routinely checking submitting authors’ work for plagiarism. The web site Retraction Watch estimates that the number of retractions in scholarly publications doubled between 2010 and 2011 (iThenticate Press Release, 2012 December 5).

and, amongst other things, I questioned the second statement. There was no evidence in the study to indicate that “the number of retractions in scholarly publications (had) doubled between 2010 and 2011” – and there was nothing on the Retraction Watch website to suggest this either. Where, I asked, had iThenticate found this statement?

I still don’t have an answer to this question. It might not even be a valid question any more, because the statement is no longer there. Instead, what I see now Continue reading

Wrong to be forgotten?


The ECJ ruling that individuals be allowed to request that search engines remove links to web pages which mention them, the so-called “right to be forgotten,” has come in for a great deal of support and a great deal of criticism. It also raises questions as to whether the law is enforceable.

Some of the strongest criticisms invoke notions of censorship and attempts to change history and the historical record. One of my biggest concerns is that the search engine company is judge and jury, and the “defendant” – the person or organisation behind the “offending” page – is not informed of the request unless and until the request to remove Continue reading

By any other name…


An interesting way of putting it: “extensive text overlap.”

The full Retraction reads,

This article [1] has been retracted by the author due to extensive text overlap with a previous publication by Roberts et al. [2]. The author apologises for any inconvenience caused.

The offending paper, now retracted, is “Infantile colic, facts and fiction” by Abdelmoneim E.M. Kheir. It was published in the Italian Journal of Pediatrics (IJP) in 2012. There is a note on the page that this paper is “Highly accessed.”

The text overlap which has been identified is with Continue reading

Texas sharp-shooting?


Congratulations, Ben Goldacre!  Damning Report From The Public Accounts Committee On Clinical Trial Results Being Withheld tells it all.

On 3 January, the Public Accounts Committee of the House of Commons issued a report expressing concern that pharmaceutical companies tend to publish the results of clinical trials which make them look good, but withhold publication of trials whose results are less favourable. This affects doctors’ knowledge and perceptions Continue reading

Thirty percent


I use Google Alerts to be notified of new web pages which include terms I regularly search for. The feature saves me having to remember to repeat my favourite searches, and it pinpoints new or changed pages.

I thought the feature had gone berserk the other day. My alert for “every written assignment they complete” usually gives me just one or two hits a week.  This week’s digest gave me forty hits. Continue reading

Getting it wrong…


The strange story of Hamilton Naki

A strange story, and a strange journey too. This post is not just Naki’s story, strange as that is.

We visit Wikipedia (and wonder if teachers who forbid its use might want to think again), touch on journalistic ethics, have a quick look at the online citation generator EasyBib, and finish at the gates of Turnitin, the software which will “check students’ work for improper citation or potential plagiarism” (Turnitin OriginalityCheck). Continue reading