May 2020

Study Finds Courts Failing to Filter out ‘Junk Science’

A recent two-part study examined the use of psychological assessments by psychologists in legal matters, reviewing 364 assessment tools used by mental health practitioners in legal cases and examining legal challenges to expert psychological testimony. The researchers found that nearly all (about 90%) of the assessment tools used by experts in legal matters had been subjected to empirical testing; however, only about 67% of the tools were generally accepted in the field, and only about 40% were viewed favorably by leading authorities. The researchers also found that legal challenges, usually challenges by attorneys to the validity of the assessment tools, were raised in only about 5.1% of the cases examined. Where challenges were made, they succeeded about one-third of the time.

Judges have a threshold duty to determine the admissibility of expert evidence; the requirements may be found in Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993) (and see General Electric Co. v. Joiner, 522 U.S. 136 (1997) and Kumho Tire Co. v. Carmichael, 526 U.S. 137 (1999)) and in the Rules of Evidence, e.g., Fed. R. Evid. 702. Courts must determine that proposed expert testimony is the product of reliable principles and methods. However, the researchers found that judges often do not sufficiently evaluate the merits of expert methodology and fail to recognize unsound scientific foundations.

The fundamental psychometric concepts are validity (accuracy) and reliability (repeatability). Both are context specific; that is, both practitioners and courts must determine the scientific acceptability of a psychological assessment tool, for a specific intended use, on a case-by-case basis. In the Supreme Court cases, the researchers noted, the term “reliable” encompasses both reliability and validity, and judges must determine that the methods used by a proposed expert have a valid link to the facts of the case. That is, “forensic health experts must use methods that are appropriate for the specific population and circumstances of the case at hand …”

Forensic psychology is a growing field, and psychological testing is a “big-business industry,” the researchers found. New psychological assessment tools are continually being developed, marketed, and used in forensic settings. Surveys show that about 74.2% of mental health professionals use one or more psychological assessment tools in each examination conducted. The researchers examined 364 assessment tools, used in areas such as youth and adult evaluations, pretrial risk assessment, competency and criminal responsibility, and sentencing, to ascertain whether the tools had been subjected to testing and peer review; they found that about 90% of the tools had been tested. Data bearing on general acceptance was available for only about half of the tools examined. Of those, about two-thirds were generally accepted in the field; for 16.8% the evidence was conflicting, and the remaining 16.8% were clearly not generally accepted.

In the second part of their study, the researchers focused on 30 psychological assessment tools used in 372 court cases, both at the trial level and in appellate decisions. Legal challenges to the admissibility of an assessment tool, or to testimony relying on one, were found in only 19 of the cases (about 5.1%); the challenge succeeded in excluding the evidence in only 6 of those 19. One main reason challenges failed in the appellate courts was deference to the trial court; a main reason they failed in the trial courts was a determination that the issue went to the weight of the evidence for the factfinder rather than to admissibility.

Sources: Christina Larson, Associated Press, “Study: Courts not filtering out 'junk science'-based psychology tests,” detnews.com, February 17, 2020:
https://www.detroitnews.com/story/news/nation/2020/02/16/court-psychology-tests-junk-science-study/111328956/
Tess M. S. Neal, Christopher Slobogin, Michael J. Saks, David L. Faigman, and Kurt F. Geisinger, “Psychological Assessments in Legal Contexts: Are Courts Keeping ‘Junk Science’ Out of the Courtroom?” Psychological Science in the Public Interest, Association for Psychological Science, Volume 20, Number 3, February 2020:
https://journals.sagepub.com/stoken/default+domain/10.1177%2F1529100619888860+-+FREE/pdf
Association for Psychological Science webpage:
https://www.psychologicalscience.org/publications/psychological-assessment-in-legal-contexts-are-courts-keeping-junk-science-out-of-the-courtroom.html

by Neil Leithauser
Associate Editor