Reporting results of systematic reviews
Karin Hannes, Centre for Methodology of Educational Research

Page 1

Reporting results of systematic reviews

Karin Hannes, Centre for Methodology of Educational Research

Page 2

Overview

• Anatomy of a Systematic Review
• Dissemination Channels
• Inclusion of Process and Implementation Aspects in a Systematic Review

Page 3

Anatomy of a Systematic Review

• Background/Introduction
– Establish need
– Distinguish from previous review efforts
– State objectives and review questions

• Methods
– Criteria for inclusion and exclusion
• Type of population
• Type of studies
• Type of intervention (+ comparison)
• Type of outcomes

Page 4

Anatomy of a Systematic Review (cont.)

• Methods
– Locating studies: consulted data sources (databases, grey literature, reference searches, expert consultation, etc.)
– Search strategy (final)

Page 5

Anatomy of a Systematic Review (cont.)

• Methods
– Process of selecting studies
• Include the screening instrument in an annex

Possible screening criteria:
• Timespan
• Language restrictions
• Discipline / scientific field
• Inclusion and exclusion criteria

Example: inclusion criteria for a review on QES in the literature (update of Dixon-Woods):
1. Published between January 2005 and December 2008 (already filtered out)
2. Conducted within health care or a health care context
– Include: syntheses of qualitative (with quantitative) research by synthesis methods other than informal review
– Exclude: papers commenting on methodological issues without details of the outcomes of the synthesis; papers that do not explicitly describe or name a method for synthesis; reviews on concepts/definitions used within health care or research issues
3. Published in the English language
4. Published in a peer-reviewed journal

(Note: some of these restrictions are not acceptable in a Cochrane or Campbell review.)
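Criteria like the four in the example can be applied mechanically once each record is coded. A minimal Python sketch, where the record fields and the hard-coded criteria are illustrative only, not part of any actual review protocol:

```python
# Illustrative screening filter; fields and criteria are invented
# to mirror the example inclusion criteria on this slide.

def passes_screening(record):
    """Apply the four example inclusion criteria in order."""
    return (
        2005 <= record["year"] <= 2008          # 1. publication window
        and record["field"] == "health care"    # 2. discipline restriction
        and record["language"] == "English"     # 3. language restriction
        and record["peer_reviewed"]             # 4. peer-reviewed journal
    )

records = [
    {"year": 2006, "field": "health care", "language": "English", "peer_reviewed": True},
    {"year": 2010, "field": "health care", "language": "English", "peer_reviewed": True},
    {"year": 2007, "field": "education", "language": "English", "peer_reviewed": True},
]

included = [r for r in records if passes_screening(r)]
print(len(included))  # only the first record meets all four criteria
```

In practice each criterion would be a line on the screening instrument, applied by reviewers rather than code, but the same all-or-nothing logic holds.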

Page 6

Anatomy of a Systematic Review (cont.)

• Methods
– Data extraction
• Introduce the coding form
• Describe and define coding categories
• Describe the process of data extraction ('at least two independent reviewers')

Example of a data-extraction sheet (based on EPOC review group documents):
• Screening form
• Critical appraisal checklist
• Data-extraction sheet: 'We used the EPOC guidance on data extraction (reference)'
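The 'at least two independent reviewers' step can be sketched as a field-by-field comparison of two extractions, with disagreements flagged for consensus discussion. The coding fields and values below are hypothetical, not taken from any EPOC form:

```python
# Hypothetical dual-extraction check: compare two reviewers'
# coding forms field by field and flag disagreements.

CODING_FIELDS = ["population", "intervention", "comparison", "outcome"]

def disagreements(extraction_a, extraction_b):
    """Return the coding fields on which two reviewers disagree."""
    return [f for f in CODING_FIELDS if extraction_a.get(f) != extraction_b.get(f)]

reviewer_1 = {"population": "adults", "intervention": "audit",
              "comparison": "none", "outcome": "adherence"}
reviewer_2 = {"population": "adults", "intervention": "audit",
              "comparison": "usual care", "outcome": "adherence"}

print(disagreements(reviewer_1, reviewer_2))  # ['comparison'] goes to consensus
```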

Page 7

http://epoc.cochrane.org/data-extraction

Page 8
Page 9

Descriptive

Page 10
Page 11

Statistical part

Page 12

Anatomy of a Systematic Review (cont.)

• Results
– Descriptive results
– Inferential results (if applicable)

• Discussion
• Conclusions
– Implications for practice
– Implications for research

• References
• Appendix: search strings, critical appraisal checklist, list of excluded studies (usually a flow chart), coding/extraction sheets, outcomes of the meta-synthesis exercise, etc.
• The Campbell Collaboration would ask for a user sheet (a short summary avoiding scientific jargon)
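The flow chart mentioned in the appendix item typically reports how many records fell away at each screening stage. A minimal tally, with counts invented purely for illustration:

```python
# Invented stage counts for a study flow chart; the real numbers
# come from the review's own screening log.
stages = [
    ("records identified", 480),
    ("after duplicate removal", 410),
    ("after title/abstract screening", 62),
    ("after full-text screening", 14),
]

# Number excluded between each consecutive pair of stages.
excluded_per_stage = [a[1] - b[1] for a, b in zip(stages, stages[1:])]
print(excluded_per_stage)  # [70, 348, 48]
```

Reporting these per-stage exclusion counts (with reasons at the full-text stage) is what lets readers reconstruct the selection process.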

Page 13

Disseminating Systematic Reviews: Organizations

• Campbell Collaboration
– www.campbellcollaboration.org

• Cochrane Collaboration
– www.cochrane.org

• Joanna Briggs Institute
– http://connect.jbiconnectplus.org/JBIReviewsLibrary.aspx

• EPPI-Centre, and many more organizations that produce and publish their own...

Page 14

Disseminating Systematic Reviews: Journals

• Most journals welcome article forms of full Cochrane or Campbell reviews

• Some do not wish to publish them if they are publicly available in a database

• Some are sensitive to the argument that Cochrane- and Campbell-type reviews require too much time and effort:
– Rapid reviews
– Narrow inclusion criteria
– Best practice sheets or critical appraisals of reviews

Check potential copyright issues!

Page 15

Inclusion of Process and Implementation Aspects

• Is there a need?
– It is important to know what works, but it is equally important to know what sort of programmes to put limited resources into
• The school feeding program review....
• Scared Straight programs (Petrosino review) have proven to cause more harm than benefit. There are some good theories that could potentially explain this, but too little empirical data in the trials to test them....

Page 16

Inclusion of Process and Implementation Aspects

• What is the problem?

– We do address variation in the effects of interventions:
• Factors related to patient or client groups
• Timing and intensity of programs
• The potential impact of co-interventions
• .... (Glasziou 2002; Higgins et al. 2002)
Approaches: meta-regression or sensitivity analysis, or use of individual patient/client data

– Reviewers evaluating complex interventions often experience difficulties in determining:
• what exactly the intervention entailed
• whether it was implemented as intended
• whether there were confounding factors in the wider social context that would affect the outcomes
• ... (Egan et al., 2009)
These are issues beyond those related to program design/logic
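Of the approaches mentioned above, a sensitivity analysis is the simplest to sketch: recompute the pooled effect with each study left out and see whether any single study drives the result. The effect sizes below are invented, and an unweighted mean stands in for proper meta-analytic pooling:

```python
# Toy leave-one-out sensitivity analysis with made-up effect sizes.
# A real analysis would use inverse-variance weighted pooling.
effects = {"Study A": 0.30, "Study B": 0.25, "Study C": 0.90, "Study D": 0.28}

def pooled(vals):
    """Unweighted mean as a stand-in for a pooled effect estimate."""
    vals = list(vals)
    return sum(vals) / len(vals)

overall = pooled(effects.values())
for left_out in effects:
    rest = [v for k, v in effects.items() if k != left_out]
    print(left_out, round(pooled(rest), 3))
# Dropping the study with the outlying effect shifts the pooled
# estimate the most, flagging it for closer scrutiny.
```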

Page 17

Inclusion of Process and Implementation Aspects

• Cargo et al. (2011) developed a new instrument, the 'Extraction tool for Campbell Collaboration Review of childhood and youth reviews', to assist with the extraction of process and implementation variables in systematic reviews.

We explored:
• to what degree process and implementation variables are present in published educational reviews (N=10)
• which aspects are articulated most
• whether consideration of these items in reviews is possible, given the data provided by their primary studies

Page 18

[email protected]

• Process and Implementation Aspects
– Theory or change models shaping the intervention
– Characteristics of:
• Implementing organisation
• Partnering organisation
• Implementers
• Participants, clients, patients
– Protocols for the intervention
– Context:
• Ecological
• External
– Process and implementation factors
– Design and methodological issues
– Sensitivity analysis
– Quality assessment and risk of bias

Page 19

Presence of Process and Implementation Variables in Systematic Reviews by the Campbell Collaboration

• Could be extracted in most reviews: age, gender, grade or grade level, ethnicity of the participants, who the implementers were, implementer training, the intervention protocol, the intervention setting, attrition, dose delivered

• Could be extracted in some reviews: information regarding the organisation providing the intervention or service, the presence or absence of partnering organizations, role of implementers, SES of the participants, the engagement of the implementer, the presence and content of co-intervention

– Many reviews performed a sensitivity analysis.

• Could not be extracted (or only to a limited extent): information on the service delivery protocol, ethnicity, SES, age and gender of the implementer, minimum dose, reach of the intervention

– From the process and implementation section, recruitment, minimum attrition, minimum dose received, minimum fidelity, and participant engagement were not mentioned in any of the reviews.

Page 20

Presence of process and implementation variables in the primary studies used by systematic reviews.

• Could be extracted in most studies: the use of available resources such as staff, buildings or materials; the implementer's occupation, previous training or experience; the intervention protocol as well as the service delivery protocol; the characteristics of the participants or students enrolled in an intervention; the place and/or setting; the country and its degree of urbanization; the length of the program; the frequency and type as an aspect of dose delivered

• Could be extracted in some studies: the quality of the intervention materials (e.g. curricula), the funding sources, the use of the joined forces and expertise of other parties, a clear explication of the change process envisioned

• Could not be extracted (or only to a limited extent): a diagram of the change model, leadership and technical support, alliances between the intervention program and other parties involved

Page 21

To what extent were process and implementation variables from the primary studies included in the systematic review?

• Goerlich et al. (2006):
– Some aspects of the implementing organization, such as adequacy of resources (e.g. staff) and quality of intervention materials, were present in all three primary studies but not in the systematic review.

– Attrition, reach and minimum dose delivered, from the process and implementation section, were not considered in the review, although they were mentioned in all studies.

• Zwi et al. (2007):
– Reporting of items from the process and implementation section was mostly in accordance with the presence of these items in the primary studies. Only dose delivered was not considered in the systematic review, although it was present in all primary studies.

– Twelve out of thirteen studies reported a change model. This was, however, not considered in the review. The implementer was discussed, but no clear provider type was specified (considered in ten studies out of thirteen).

– Age, gender and grade or grade level of the participant were considered most in the original studies and were also present in the review.

– Ethnicity and SES were reported in nine studies but not mentioned in the review.

Page 22

margaret.cargo@unisa.edu.au

Pages 23-29: image-only slides, no transcript text.