I get frustrated when I see exaggerated or incorrect conclusions drawn from research in the media. We’ve all seen it in newspapers and on social media, with friends posting, “see! I knew garcinia cambogia was the cure to obesity!” Just earlier this year, the Toronto Star retracted an article that made serious allegations against the HPV vaccine, based mostly on anecdotes rather than published research.
Today, I came across “How Not to Be Misled by Data” in the Wall Street Journal, a simple guide to navigating research findings so we don’t misinterpret the data! The following points may be helpful whether you’re reading a study yourself or someone else (e.g., the media) is interpreting it for you:
1. Was there a comparison group? For example, if the cancer rate was 15% in the smoking group, what was the cancer rate in the non-smoking group? Without a comparison group, we lack context to draw accurate conclusions.
2. Is the data representative? For example, a convenience sample of 25 undergraduate students in a first-year Psychology class is not representative of all post-secondary students.
3. Is the reported finding the only finding? The media will often report select findings from studies (the “needle in the haystack” effect) and may fail to mention less sensational ones. For example, in the well-known study by Freedman et al. (2012; NEJM), if only one finding were pulled out, it could read: “coffee drinkers are at increased risk of death.” However, upon looking at the other findings, it becomes clear that smoking is a confounding variable: coffee drinkers are more likely to smoke, and smoking carries an increased risk of death.
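To make the confounding idea concrete, here is a small sketch with made-up numbers (not the actual NEJM data). It builds a scenario where coffee has no effect on death rate within either smoking group, yet the crude, pooled comparison makes coffee look harmful, simply because coffee drinkers are more likely to smoke:

```python
# Hypothetical counts chosen to illustrate confounding.
# (coffee status, smoking status): (deaths, total people)
groups = {
    ("coffee", "smoker"):     (160, 800),  # 20% death rate
    ("coffee", "non-smoker"): (10, 200),   # 5% death rate
    ("none",   "smoker"):     (40, 200),   # 20% death rate
    ("none",   "non-smoker"): (40, 800),   # 5% death rate
}

def rate(deaths, total):
    return deaths / total

# Crude comparison: pool everyone, ignoring smoking status.
coffee_deaths = sum(d for (c, _), (d, t) in groups.items() if c == "coffee")
coffee_total = sum(t for (c, _), (d, t) in groups.items() if c == "coffee")
none_deaths = sum(d for (c, _), (d, t) in groups.items() if c == "none")
none_total = sum(t for (c, _), (d, t) in groups.items() if c == "none")

print(f"Crude: coffee {rate(coffee_deaths, coffee_total):.0%} "
      f"vs. no coffee {rate(none_deaths, none_total):.0%}")
# → Crude: coffee 17% vs. no coffee 8% (coffee looks harmful!)

# Stratified comparison: compare within each smoking group.
for s in ("smoker", "non-smoker"):
    c = rate(*groups[("coffee", s)])
    n = rate(*groups[("none", s)])
    print(f"{s}: coffee {c:.0%} vs. no coffee {n:.0%}")
# → smoker: coffee 20% vs. no coffee 20%
# → non-smoker: coffee 5% vs. no coffee 5%
# Within each stratum, coffee makes no difference.
```

The crude numbers differ only because 80% of the coffee drinkers here are smokers versus 20% of the non-drinkers, which is exactly why researchers adjust for (or stratify by) a confounder before drawing conclusions.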