Bruce A. THYER*
College of Social Work, Florida State University, USA
Keywords: psychotherapy research, publication standards, evidence-based practice.
The manifesto on salvaging psychotherapy research has much to commend it, and I compliment Coyne and Kok for their unflinching criticism and proposed stringent solutions. Almost all of the methodological, peer-review, and publication shortcomings they mention are legitimate concerns. In responding, let me state that I am part of the problem. For the past 25 years I have served as the Editor of a peer-reviewed journal titled Research on Social Work Practice (RSWP), produced by Sage Publications. I founded RSWP to serve as an outlet primarily for outcome studies on psychosocial interventions used by social workers, and attempted to model it after the prestigious Journal of Consulting and Clinical Psychology, with which I was very familiar. Clinical social workers are the largest group of mental health service providers in the United States, surpassing the combined numbers of psychologists and psychiatrists (Hamp, Stamm, Christidis & Nigrinis, 2014).
RSWP has been a success. For two of the past three years it has had the highest impact factor of all social work journals tabulated by the Journal Citation Reports, it is received by over 8000 subscribers, and over 150,000 articles are downloaded from the journal's website each year. Yet RSWP regularly publishes the type of underpowered and methodologically weak outcome studies Coyne and Kok fulminate against! RSWP has the most rigorous submission requirements of all social work research journals, having adopted the APA's Journal Article Reporting Standards shortly after they first appeared some years ago. The journal has already published an article describing the importance of registering randomized clinical trials in an appropriate registry (Harrison & Mayo-Wilson, 2014), and years ago we began using structured abstracts containing the subheadings Purpose, Method, Results, and Discussion. Here are some additional specific editorial policies of RSWP which I have been planning for some time and which will take effect in 2015, steps that will move the field of psychotherapy research forward along the lines suggested by Coyne and Kok.
The journal will clearly state that the authors of RCTs should have pre-registered their study and must provide a citation indicating where the registration can be located. RCT submissions must be accompanied by a completed CONSORT checklist. Articles reporting the results of a quasi-experimental outcome study must follow the standards found in the Transparent Reporting of Evaluations with Nonrandomized Designs (TREND) checklist. Analogous to CONSORT, the TREND checklist will greatly facilitate the timely peer review of submissions and their transparency upon publication (see http://www.cdc.gov/trendstatement/). All nomothetic outcome reports must include a participant flow chart depicting attrition at each stage of the study. All reports of statistically significant differences or changes must be accompanied by an appropriate effect size (ES), and the meaning of this ES must be discussed (an illustrative example follows this paragraph). Causal inferences, if any, should be made conservatively and should not go beyond the limits imposed by the presented methods and data. The authors of outcome studies evaluating nonpharmacological interventions (e.g., psychotherapies and other psychosocial treatments) are urged to familiarize themselves with the relevant reporting guidelines for such studies. Grant et al. (2013) will be a recommended resource for authors to consult, as will Boutron, Ravaud, and Moher (2012).
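To make the effect-size requirement concrete, consider a purely hypothetical illustration (the numbers are invented for this example and are not drawn from any submitted study). Suppose a treated group ends a trial with a mean of 20 on some outcome measure, a control group ends with a mean of 15, and the pooled standard deviation is 5. Cohen's d = (20 - 15) / 5 = 1.0, conventionally interpreted as a large effect. Reporting d (or an analogous index such as an odds ratio, or a confidence interval around the mean difference) alongside the p value lets readers judge whether a statistically significant result is also clinically meaningful, which is precisely the discussion the new policy requires.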
Also effective for RSWP in 2015, articles claiming to be a Systematic Review must adhere to the guidelines for preparing systematic reviews developed by the Cochrane Collaboration (Higgins & Green, 2009) or the Campbell Collaboration (2014). In addition, the authors of systematic reviews and meta-analyses must follow the guidelines found in the PRISMA statement (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), available at http://www.prisma-statement.org/, and include a copy of a completed PRISMA checklist with the submitted manuscript. If an article does not follow these standards, the paper should be subtitled A Narrative Review or A Review, and the specific term Systematic Review should be avoided. Authors submitting a systematic review for possible publication will be strongly encouraged to have pre-registered the review protocol in a suitable registry, such as PROSPERO (www.crd.york.ac.uk/PROSPERO). The article by Stewart, Moher, and Shekelle (2012) will be recommended to authors for guidance regarding the rationale for, and process of, pre-registering systematic review protocols, and submissions should include a statement citing the registry in which the protocol is published. The EQUATOR Network (Enhancing the QUAlity and Transparency of Health Research) will be a recommended resource for authors preparing studies for submission to RSWP that deal with the general topic of health care, including psychotherapy (see http://www.equator-network.org/).
There are risks in adopting such stringent guidelines. Submissions may drop precipitously as authors opt for less demanding journal outlets. However, the more forward-thinking members of the profession will likely applaud the promulgation of such standards, and the journal may begin attracting the highest-quality studies, which all too often social work authors submit to non-social work journals. A bibliography I am currently preparing has identified over 630 published randomized experiments authored by social workers, many of which evaluate psychotherapy (Thyer & Massie, 2014). The earliest of these appeared in 1949, and most have been published in non-social work journals.
Adherence to the peer-review process itself can militate against rejecting low-quality and underpowered psychotherapy studies. I always share my decision letter to authors with the reviewers, along with all of the reviewers' comments. If reviewers recommended acceptance and I consistently rejected poorer-quality work against their recommendations, I would soon have no reviewers! Journals can only publish what they receive in the form of submissions, and, like other entities, their first rule is survival! Moreover, "underpowered" and "poor quality" are partly in the eye of the beholder, and the peer-review process itself is flawed. A treatment with a very strong effect can be appropriately evaluated with an RCT involving relatively small numbers of participants; it is only comparatively weak and ineffectual treatments that require very large Ns so that inferential statistics can detect reliable (but clinically meaningless) differences, a point illustrated by the simple power calculation below. Psychotherapy evaluation research is an evolving and imperfect process. The many laudable initiatives outlined by Coyne and Kok, which are slowly being adopted by the higher-quality research journals (Thyer, 2014), are a positive sign that additional progress is being made.
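As a rough illustration of that point (a standard two-group power calculation using conventional values, not data from any particular trial): with a two-tailed alpha of .05 and 80% power, the required sample size per group is approximately 2 × (1.96 + 0.84)² / d², or about 16/d². A strong treatment yielding a standardized mean difference of d = 1.0 therefore needs only about 16-17 participants per arm to be detected reliably, whereas a weak treatment yielding d = 0.2 needs roughly 390-400 per arm. Sample size requirements, in other words, cannot be judged apart from the size of the effect an intervention can plausibly produce.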
REFERENCES
Boutron, I., Ravaud, P., & Moher, D. (2012). Randomized clinical trials of nonpharmacological treatments. New York: CRC Press.
Campbell Collaboration. (2014). Campbell Collaboration systematic review: Policies and guidelines. The Campbell Collaboration. Available from www.campbellcollaboration.org
Grant, S., Montgomery, P., Hopewell, S., Macdonald, G., Moher, D., & Mayo-Wilson, E. (2013). Developing a reporting guideline for social and psychological intervention trials. Research on Social Work Practice, 23, 595-602.
Hamp, A., Stamm, K., Christidis, P., & Nigrinis, A. (2014, September). What proportion of the nation's behavioral health providers are psychologists? Monitor on Psychology, p. 18.
Harrison, B. A., & Mayo-Wilson, E. (2014). Trial registration: Understanding and preventing bias in social work research. Research on Social Work Practice, 24, 372-376.
Higgins, J. P. T., & Green, S. (2009). Cochrane handbook for systematic reviews of interventions, Version 5.0.2. The Cochrane Collaboration. Available from www.cochrane-handbook.org
Stewart, L., Moher, D., & Shekelle, P. (2012). Why prospective registration of systematic reviews makes sense. Systematic Reviews, 1, 7. doi:10.1186/2046-4053-1-7
Thyer, B. A. (2014). Evolving reporting guidelines for social work research. Nordic Social Work Research, 4(1), 1-4. doi:10.1080/2156857X.2014.916887
Thyer, B. A., & Massie, K. (2014). A bibliography of randomized controlled experiments in social work (1949-2013): Solvitur ambulando. Unpublished manuscript.