February 2016

Self-Described Experts Overestimate Expertise

A 2015 study in the journal Psychological Science, “When Knowledge Knows No Bounds: Self-Perceived Expertise Predicts Claims of Impossible Knowledge,” by Cornell University researchers Stav Atir, Emily Rosenzweig, and David Dunning, examined the phenomenon of “overclaiming.”  The abstract describes overclaiming as the circumstance in which people “overestimate their knowledge, at times claiming knowledge of concepts, events, and people that do not exist and cannot be known.”  The paper reports four individual studies.  The first showed that self-perceived financial knowledge predicted claims of knowledge of nonexistent financial concepts.  The second extended the finding to other specific domains, e.g., biology.  In the third, participants were warned that some of the concepts were fictitious, but the warnings had no effect in reducing overclaiming.  The fourth dealt with geographic expertise and found participants claiming familiarity with nonexistent places.

Sources:  Jessica Schmerler, “You Don’t Know as Much as You Think: False Expertise,” scientificamerican.com, January 1, 2016: http://www.scientificamerican.com/article/you-don-t-know-as-much-as-you-think-false-expertise/?print=true.  Abstract: http://pss.sagepub.com/content/early/2015/07/14/0956797615588195.abstract

Stanford Research Finds How Scientists Lie

A study by Stanford researchers David M. Markowitz and Jeffrey T. Hancock, “Linguistic Obfuscation in Fraudulent Science,” published in the Journal of Language and Social Psychology, presented a first-of-its-kind large-scale analysis of fraudulent papers across scientific disciplines, examining how changes in writing style relate to fraudulent writing.

The researchers examined 253 publications from 1973 to 2013 that had been retracted for the use of fraudulent data, then compared the linguistic style of those papers to 253 unretracted publications and 62 publications that had been retracted for non-fraudulent reasons.  Fraudulent papers were found to contain “significantly higher levels of linguistic obfuscation, including lower readability and higher rates of jargon than unretracted and nonfraudulent papers.”  The fraudulent papers “had about 60 more jargon-like words [about 1.5 percent more] per paper compared to unretracted papers … a non-trivial amount,” Markowitz said.  The researchers also found that the fraudulent authors tended to obfuscate their reports and references to “mask their deception by making [the papers] more costly to analyze and evaluate.”  The changes in writing style, the authors found, were “directly related to the author’s goal of covering up lies through the manipulation of language.”
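To make the quoted measures concrete, here is a minimal sketch (in Python) of how a jargon rate and a readability score can be computed for a paper’s text.  It is illustrative only: the jargon word list is hypothetical, and the readability formula shown is the standard Flesch Reading Ease, which may differ from the exact measures used in the study’s obfuscation index.

    import re

    # Illustrative jargon list (hypothetical; the study's actual dictionary
    # of jargon-like terms is far larger).
    JARGON = {"methodological", "paradigm", "heterogeneity", "stochasticity",
              "epistemic", "operationalize"}

    def tokens(text):
        return re.findall(r"[a-z']+", text.lower())

    def jargon_rate(text):
        # Fraction of words appearing in the jargon list.
        words = tokens(text)
        return sum(w in JARGON for w in words) / len(words) if words else 0.0

    def count_syllables(word):
        # Crude vowel-group heuristic; adequate for a rough readability estimate.
        groups = re.findall(r"[aeiouy]+", word)
        n = len(groups)
        if word.endswith("e") and n > 1:
            n -= 1
        return max(n, 1)

    def flesch_reading_ease(text):
        # Standard Flesch Reading Ease: higher scores mean easier reading.
        sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        words = tokens(text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / sentences) \
               - 84.6 * (syllables / len(words))

    sample = ("We operationalize the methodological paradigm to address "
              "heterogeneity and stochasticity in the epistemic framework.")
    print(f"jargon rate: {jargon_rate(sample):.1%}")
    print(f"reading ease: {flesch_reading_ease(sample):.1f}")

On the sample sentence, the jargon rate is high and the reading-ease score is low (harder to read), the combination the study associates with fraudulent papers.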

Sources:  Bjorn Carey, “Stanford researchers uncover patterns in how scientists lie about their data,” news.stanford.edu, November 16, 2015: http://news.stanford.edu/news/2015/november/fraud-science-papers-111615.html.  Abstract: http://jls.sagepub.com/content/early/2015/11/05/0261927X15614605?papetoc

by Neil Leithauser
Associate Editor