Eileen GAMBRILL *
School of Social Welfare, University of California at Berkeley, USA
Coyne and Kok describe many of the shortcomings of research design and reporting. Many of these have been described for decades yet persist, revealing the degree of avoidable ignorance in this area, encouraged by the pursuit of status and money and by related cognitive biases. Such shortcomings are vital to highlight (and, where possible, to avoid) given the hoped-for guidance of practice and policy decisions by empirical research: if the research is flawed, drawing on it may do more harm than good; policy makers, clinicians, and clients all may be misled. As the authors note, many publications are characterized by inflated claims of effectiveness and by the hiding of harms. These are classic propaganda/marketing strategies (Gambrill, 2012). Reforms such as trial registration, which arose because of dubious practices in pharmaceutical research (e.g., hiding negative results), are, as the authors note, important for all research.
Ioannidis (2005) has led a growing critique of research in the biomedical area, which has encouraged critiques in fields such as psychology, including efforts to replicate prior research, often with disappointing results. This thrust has been encouraged by the rigorous guidelines for the conduct of systematic reviews described by the Cochrane and Campbell Collaborations, the former initiated in 1992. Conducting research that cannot answer the questions posed is a waste of money, one that will, it is hoped, be discouraged by enterprises such as the Meta-Research Innovation Center (METRICS), recently established at Stanford to strengthen the quality of scientific evidence and combat research waste. Original publications describing the philosophy and process of evidence-informed practice call for the integration of data from well-designed research concerning clinical questions with client circumstances and characteristics, including their preferences and values, drawing on clinical expertise (Straus, Richardson, Glasziou, & Haynes, 2010). This process is a way to handle the inevitable uncertainty of decision making in an informed, ethical manner. Bogus claims based on flawed research impede such efforts.
Improving the quality of research concerning psychotherapy will require attention to professional education venues awarding degrees as well as to continuing education programs (e.g., Baker, McFall, & Shoham, 2008; McFall, 1991). It will require attention to how problems are framed as well as to methodological quality (Gambrill, 2014). Problem framing receives no attention in quality filters such as CONSORT. A psychiatric framing of life's travails dominates psychology, psychiatry, and social work (e.g., Kirk, Gomory, & Cohen, 2013). Well-argued alternative views (e.g., social learning theory) are often ignored in descriptions of randomized controlled trials (e.g., Gambrill & Reiman, 2011). Fortunately, many sources are now available for critically appraising and/or designing research, including user-friendly websites such as Testing Treatments interactive (TTi) and books such as How to read a paper (Greenhalgh, 2010) and Randomized controlled trials (Jadad & Enkin, 2007). The greater attention to involving clients in the conduct, appraisal, and use of research is also an important development.
Is it wise to use the term "well established" in view of the history of science and medicine, which shows that most claims we thought were "well established" were wrong? Why not clearly describe the evidentiary status of an intervention (e.g., "This intervention has been tested in three well-designed and ….")? "Salvaging psychotherapy research" will require attending to the role of nonspecific factors as well as to specific interventions (see the work of Bruce Wampold and his colleagues). Placebo effects must be considered. It will require asking: "Is this intervention bona fide?" (e.g., Benish, Imel, & Wampold, 2008).
The increasing call to integrate research evidence into practice and policy contexts highlights the importance of conducting high-quality research that accurately describes the evidentiary status of interventions. It highlights the importance of taking a close look at how problems are framed. Are they framed as individual problems when indeed environmental stress is predominant? Coyne and Kok contribute to this important endeavor.
Baker, T., McFall, R., & Shoham, V. (2008). Current status and future prospects of clinical psychology: Toward a scientifically principled approach to mental and behavioral health care. Psychological Science in the Public Interest, 9, 67-103.
Benish, S. G., Imel, Z., & Wampold, B. E. (2008). The relative efficacy of bona fide psychotherapies for treating post-traumatic stress disorder: A meta-analysis of direct comparisons. Clinical Psychology Review, 28, 746-758.
Gambrill, E. (2012). Propaganda in the helping professions. New York: Oxford University Press.
Gambrill, E. (2014). The DSM as a major source of dehumanization in the modern world. Research on Social Work Practice, 24, 13-36.
Gambrill, E., & Reiman, A. (2011). A propaganda index for reviewing articles and manuscripts: An exploratory study. PLoS ONE, 6, e19516.
Greenhalgh, T. (2010). How to read a paper: The basics of evidence-based medicine (4th Ed.). Hoboken, NJ: Wiley-Blackwell.
Ioannidis, J. P. A. (2005). Why most published research findings are false. PLoS Medicine, 2, e124.
Jadad, A. R., & Enkin, M. W. (2007). Randomized controlled trials: Questions, answers & musings (2nd Ed.). Malden, MA: Blackwell Pub.
Kirk, S. A., Gomory, T., & Cohen, D. (2013). Mad science: Psychiatric coercion, diagnosis and drugs. New Brunswick, NJ: Transaction.
McFall, R. M. (1991). Manifesto for a science of clinical psychology. Clinical Psychologist, 44, 75-88.
Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B. (2010). Evidence-based medicine: How to practice and teach it (4th Ed.). New York: Churchill Livingstone.