Overhype and 'research laundering' are a self-inflicted wound for social science
- Written by Christopher J. Ferguson, Professor of Psychology, Stetson University
Earlier this fall, Dartmouth College researchers released a study claiming to link[1] violent video games to aggression in kids. The logic of a meta-analytic study like this one is that by combining many individual studies, scientists can look for common trends or effects identified in earlier work. Yet as a psychology researcher who has long focused on this area, I contend this meta-analysis did nothing of the sort. In fact, the magnitude of the effect it found is about the same as that of eating potatoes on teen suicide[2]. If anything, it suggests video games do not predict youth aggression.
This study, and others like it, are symptomatic of a big problem within social science: the overhyping of dodgy, unreliable research findings that have little real-world application. Often such findings shape public perceptions of the human condition and guide public policy[3] – despite largely being rubbish. Here’s how it happens.
The last few years have seen psychology, in particular, embroiled in what some call a reproducibility crisis[4]. Many long-cherished findings in social science more broadly have proven difficult[5] to replicate under rigorous conditions. When a study is run again, it doesn’t turn up the same results as originally published. The pressure to publish positive findings[6] and the tendency for researchers to inject their own biases[7] into analyses intensify the issue. Much of this failure to replicate can be addressed with more transparent and rigorous methods in social science.
But the overhyping of weak results is different. It can’t be fixed methodologically; a solution would need to come from a cultural change within the field. Yet incentives to be upfront about shortcomings are few, particularly in a field such as psychology, which worries[8] about its public perception[9].
One example is the Implicit Association Test (IAT). This technique is most famous for probing for unconscious racial biases. Given the attention it and the theories based upon it have received, something of a cottage industry has developed to train employees about their implicit biases[10] and how to overcome them. Unfortunately, a number of studies suggest the IAT is unreliable and doesn’t predict real-world behavior[11]. Combating racial bias is laudable, but the considerable public investment in the IAT and the concept of implicit bias is likely less productive than advertised.
Part of the problem is something I call “death by press release.” This phenomenon occurs when researchers, their universities or a journal-publishing organization such as the American Psychological Association issue a press release that hypes a study’s findings without detailing its limitations. Sensationalistic claims tend to get more news attention[12].
For instance, one now notorious food lab at Cornell experienced multiple retractions[14] after it came out that its researchers had tortured their data to get headline-friendly conclusions. Their research suggested that people ate more when served larger portions, that action television shows increased food consumption and that kids’ vegetable consumption would go up if produce were rebranded with kid-friendly themes such as “X-ray vision carrots.” Above all, lab leader Brian Wansink appears to have become an expert in marketing social science[15], even though most of the conclusions were flimsy.
Another concern is a process I call “science laundering” – the cleaning up of dirty, messy, inconclusive science for public consumption. In my own area of expertise, the Dartmouth meta-analysis on video games is a good example. Similar evidence[16] to what was fed into the meta-analysis had been available for years, and it forms part of the basis for why most scholars[17] no longer link violent games to youth assaults.
Science magazine[18] recently discussed how meta-analyses can be misused to prematurely end scientific debates. Meta-analyses can be helpful when they illuminate scientific practices that may cause spurious effects, guiding future research. But they can also artificially smooth over important disagreements between studies.
Let’s say we hypothesize that eating blueberries cures depression. We run 100 studies to test this hypothesis. Imagine about 25 percent of our experiments find small links between blueberries and reduced depression, whereas the other 75 percent show nothing. Most people would agree this is a pretty poor showing for the blueberry hypothesis: the bulk of our evidence didn’t find any improvement in depression after eating the berries. But, due to a quirk of meta-analysis, combining all 100 studies would show what scientists call a “statistically significant” effect – meaning something unlikely to happen just by chance – even though most of the individual studies, on their own, were not statistically significant.
Merging even a few studies that show an effect into a larger group of studies that don’t can yield a meta-analysis result that looks statistically significant – even if the individual studies varied quite a bit. These types of results constitute what some psychologists have called the “crud factor[19]” of psychological research – statistically significant findings that are noise, not real effects that reflect anything in the real world. Or, put bluntly, meta-analyses are a great tool for scholars to fool themselves with.
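To make the blueberry arithmetic concrete, here is a minimal Python sketch under assumed parameters: 100 studies of a small true effect (0.2 standard deviations, 80 participants per group), chosen so that only about a quarter of the studies reach significance on their own. It illustrates standard fixed-effect, inverse-variance pooling; it is not a reproduction of the Dartmouth meta-analysis or of any study cited here.

```python
# An illustrative simulation of how pooling many weak studies manufactures
# "statistical significance." All parameters are assumptions chosen to mimic
# the blueberry example, not values from any real study.
import math
import numpy as np

rng = np.random.default_rng(42)
n_studies, n_per_group, true_d = 100, 80, 0.2  # small true effect, in SD units

estimates, ses = [], []
for _ in range(n_studies):
    treated = rng.normal(true_d, 1.0, n_per_group)  # "blueberry" group
    control = rng.normal(0.0, 1.0, n_per_group)
    estimates.append(treated.mean() - control.mean())  # per-study effect estimate
    ses.append(math.sqrt(2.0 / n_per_group))           # approximate standard error

def two_sided_p(z: float) -> float:
    """Two-sided p-value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2.0))

# Count studies that are significant on their own (roughly 25 of 100 here).
solo_hits = sum(two_sided_p(d / se) < 0.05 for d, se in zip(estimates, ses))

# Fixed-effect (inverse-variance) pooling: with 100 equal-sized studies, the
# pooled standard error shrinks by a factor of sqrt(100), so even a tiny
# effect produces a huge z-score and a vanishingly small p-value.
weights = np.array([1.0 / se**2 for se in ses])
pooled_d = float(np.dot(weights, estimates) / weights.sum())
pooled_se = math.sqrt(1.0 / weights.sum())

print(f"individually significant studies: {solo_hits}/{n_studies}")
print(f"pooled effect d = {pooled_d:.3f}, p = {two_sided_p(pooled_d / pooled_se):.1e}")
```

Because pooling shrinks the standard error by roughly the square root of the number of studies, the combined result comes out overwhelmingly significant even though about three-quarters of the simulated studies found nothing on their own. That is the quirk described above, and it is why a pooled p-value alone says little about real-world importance.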
Professional guild organizations for fields such as psychology and pediatrics should shoulder much of the blame for the spread of research overhyping. Such organizations release numerous, often deeply flawed[22], policy statements trumpeting research findings in their fields. The public often does not realize that such organizations function to market and promote a profession[23]; they’re not neutral, objective observers of scientific research – which is often published, for income[24], in the organizations’ own journals.
Unfortunately, such science laundering can come back to haunt a field when overhyped claims turn out to be misleading. Dishonest overpromotion of social science can cause the public and the courts[25] to grow more skeptical of it. Why should taxpayers fund research that is oversold rubbish? Why should media consumers trust what research says today if they were burned by what it said yesterday?
Individual scholars and the professional guilds that represent them can do much to fix these issues by reconsidering lax standards of evidence, the overselling of weak effects and the current lack of upfront honesty about methodological limitations. In the meantime, the public would do well to keep applying a healthy dose of critical thinking to lofty claims coming from press releases in the social sciences. Ask whether the magnitude of the effect is meaningfully greater than that of potatoes on teen suicide. If the answer is no, it’s time to move on.
References
- ^ claiming to link (doi.org)
- ^ eating potatoes on teen suicide (www.wired.com)
- ^ guide public policy (www.law.cornell.edu)
- ^ reproducibility crisis (www.theatlantic.com)
- ^ proven difficult (www.washingtonpost.com)
- ^ pressure to publish positive findings (doi.org)
- ^ inject their own biases (doi.org)
- ^ which worries (doi.org)
- ^ public perception (dx.doi.org)
- ^ train employees about their implicit biases (thinkprogress.org)
- ^ unreliable and doesn’t predict real-world behavior (www.thecut.com)
- ^ get more news attention (doi.org)
- ^ multiple retractions (www.washingtonpost.com)
- ^ marketing social science (slate.com)
- ^ Similar evidence (doi.org)
- ^ why most scholars (doi.org)
- ^ Science magazine (www.sciencemag.org)
- ^ crud factor (goodsciencebadscience.nl)
- ^ often deeply flawed (doi.org)
- ^ promote a profession (psychcentral.com)
- ^ for income (ar2016.apa.org)
- ^ the courts (doi.org)