Challenges in synthesising evidence from implementation and dissemination studies: experience from two systematic scoping reviews

ID: 

101

Session: 

Poster session 2

Date: 

Monday 24 October 2016 - 15:30 to 16:00

All authors in correct order:

Thompson Coon J1, Abbott R1, Rogers M1, Lourida I1, Whear R1, Lang I1, Pearson M1, Day J1, Stein K1
1 University of Exeter Medical School, UK
Presenting author and contact person

Presenting author:

Jo Thompson Coon

Abstract text
Background: Interest in implementation science is burgeoning. Alongside this, there has been a proliferation of evidence syntheses of implementation and dissemination studies. A systematic scoping review of the methods used in implementation reviews, conducted by our team in 2013, identified 166 eligible publications. Updating the searches for this review in 2015 resulted in the inclusion of a further 208 publications. We have since conducted systematic scoping reviews to examine the extent, range, and nature of research on different ways of disseminating and implementing research findings in two topic areas: dementia care and care homes.

Objectives: To use our experience to highlight and explore the challenges involved in synthesising evidence from implementation and dissemination studies.

Methods: We conducted each review according to established methods for scoping reviews; protocols are available from the authors. Frequent face-to-face meetings were necessary at all stages of the project, particularly during the screening phase. The issues and challenges encountered were captured through note-taking and email dialogue during the review process, and further reflective discussion took place during the preparation of this abstract.

Results: Challenges encountered included:
1. maintaining confidence that all relevant papers had been identified for inclusion, despite an extensive search strategy informed by previous reviews and expert advice;
2. applying inclusion and exclusion criteria consistently across the wide variety of study designs used to study implementation and dissemination;
3. achieving team-wide consensus on a robust definition of implementation; and
4. the lack of distinction in study reports between implementation effectiveness and intervention effectiveness.

Conclusions: Implementation science is an emerging field whose parameters and boundaries are still being (socially) constructed. As a result, a common language is lacking and reporting is often poor, making findings hard to interpret. Reflection on our experiences from these reviews will provide a basis for future methodological guidance.