…we need some candor about just how easy it is to falsify social science data.
From “Fake Data, Fake Science,” an article by David Malamed in the March 2016 issue of CPA Magazine concerning academic dishonesty, in relation to “When contact changes minds: An experiment on transmission of support for gay equality,” originally published (and later retracted) in the 12 December 2014 issue of Science. The paper was co-authored by Michael J. LaCour, then a political science graduate student at UCLA, and Donald P. Green, then a professor of political science at Columbia University. The paper was retracted in June 2015, for the following reasons:
Science 05 Jun 2015:
Vol. 348, Issue 6239, p. 1100
Science, with the concurrence of author Donald P. Green, is retracting the 12 December 2014 Report “When contact changes minds: An experiment on transmission of support for gay equality” by LaCour and Green (1).
The reasons for retracting the paper are as follows: (i) Survey incentives were misrepresented. To encourage participation in the survey, respondents were claimed to have been given cash payments to enroll, to refer family and friends, and to complete multiple surveys. In correspondence received from Michael J. LaCour’s attorney, he confirmed that no such payments were made. (ii) The statement on sponsorship was false. In the Report, LaCour acknowledged funding from the Williams Institute, the Ford Foundation, and the Evelyn and Walter Haas Jr. Fund. Per correspondence from LaCour’s attorney, this statement was not true.
In addition to these known problems, independent researchers have noted certain statistical irregularities in the responses (2). LaCour has not produced the original survey data from which someone else could independently confirm the validity of the reported findings. Michael J. LaCour does not agree to this Retraction.
1. M. J. LaCour, D. P. Green, Science 346, 1366 (2014).
2. D. Broockman, J. Kalla, P. Aronow, Irregularities in LaCour (2014) (2015).
As Mr. Malamed writes:
It wasn’t clear if the paper was a deliberate fraud or wishful thinking. The New Yorker posited that confirmation bias — the people involved in the study wanted to believe the findings — could have played a role in the paper’s positive reception, from inception to publication. “We know that studies confirming liberal thinking sometimes get a pass where ones challenging those ideas might get killed in review,” the magazine said. “The same effect may have made journalists more excited about covering the results.”
No matter the motivation, the discredited paper was a black eye for Science and for the media outlets that lapped it up. It was not, however, an isolated occurrence. Science fraud is, in fact, more common than most people likely suspect.
In November 2015, the website Quartz noted that “the number of published science papers that have been retracted due to misconduct or fraud has ballooned in the last decade.”
A month later, The Scientist echoed that conclusion. “Recent years have seen a spate of scientific scandals,” it said. “Whether this is due to an increase in dishonesty or foul play in the lab or simply closer attention to the issue, research misconduct is now squarely in the public eye.”
In November 2015, Stanford University reported that two of its academics had published a paper in the Journal of Language and Social Psychology that could help identify false research before it is published. “Even the best poker players have ‘tells’ that give away when they’re bluffing with a weak hand. Scientists who commit fraud have similar, but even more subtle, tells,” the university said, explaining that researchers had cracked the writing patterns of scientists who pass along falsified data.
The study expanded on “studies [that] have shown that liars generally tend to express more negative emotion terms and use fewer first-person pronouns. Fraudulent financial reports typically display higher levels of linguistic obfuscation — phrasing that is meant to distract from or conceal the fake data — than accurate reports.”
The researchers compared 253 retracted papers, mostly from biomedical journals, to unretracted papers on the same topics from the same journals and in the same publication years.
“Scientists faking data know that they are committing a misconduct and do not want to get caught,” David Markowitz, one of the researchers, said. “Therefore, one strategy to evade this may be to obscure parts of the paper. We suggest that language can be one of many variables to differentiate between fraudulent and genuine science. Fraudulent papers had about 60 more jargon-like words per paper compared to unretracted papers. This is a non-trivial amount.”
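The core idea the researchers describe — that fraudulent papers lean on denser, more obscure vocabulary — can be sketched with a toy metric. The following is an illustrative approximation only, not the Stanford team's actual method: it scores a text by the fraction of words absent from a small common-word list. The word list, cleanup rules, and sample sentences are all hypothetical.

```python
# Illustrative sketch of a "jargon rate" metric (not the researchers'
# actual obfuscation index). A text full of specialized vocabulary will
# score higher than one written in plain language.

# Hypothetical common-word list; a real analysis would use a large
# frequency lexicon rather than this tiny hand-picked set.
COMMON_WORDS = {
    "the", "of", "and", "a", "to", "in", "is", "was", "that", "for",
    "we", "with", "as", "were", "on", "by", "this", "are", "be", "from",
    "if", "what", "they", "each",
}

def jargon_rate(text: str) -> float:
    """Return the fraction of words not found in the common-word list."""
    words = [w.strip(".,;:()").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    jargon = [w for w in words if w not in COMMON_WORDS]
    return len(jargon) / len(words)

# Two made-up sample sentences describing the same finding.
plain = "We asked each person in the study if the talk changed what they thought."
obfuscated = ("Canvasser-mediated interpersonal persuasion paradigms exhibited "
              "longitudinal attitudinal malleability across heteroskedastic cohorts.")

# The deliberately obfuscated phrasing scores higher on this toy metric.
assert jargon_rate(obfuscated) > jargon_rate(plain)
```

This is only a caricature of the technique: the actual study compared retracted and unretracted papers across many linguistic variables, of which jargon density was one.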
As a consequence, LaCour's offer of an assistant professorship at Princeton University was rescinded, and Green lost $200,000 in grant money. Green still has a job at Columbia, which he had joined three years earlier after at least fifteen years at Yale.
The scandal became known as the “Michael LaCour Scandal.”