Mallidou, A.A.1, Young, L.2, Paul, K.3, University of Victoria
1Assistant Professor, School of Nursing; 2Professor, School of Nursing; 3Scientist Librarian
Background
• Systematic review (SR): An explicit and rigorous research method to identify, critically appraise, analyze, and synthesize the findings of independent, relevant primary/original studies on a specific topic.
• SRs should identify the gap between what we know and what we need to know, be concise and transparent, synthesize contradictory findings, provide a narrative summary or pooled statistical analysis, and set a research agenda based on identified gaps.
• Goal: To translate and transfer existing evidence for utilization by target audiences (e.g., clinicians, managers, policy/decision-makers, consumers).
• Quantitative evidence (meta-analysis), qualitative evidence (meta-synthesis), or both types of evidence (comprehensive/mixed-method) are synthesized in a SR.
• FAME: The Feasibility, Appropriateness, Meaningfulness or Effectiveness of healthcare interventions and practice are examined in SRs.
• The Joanna Briggs Institute (JBI) (http://www.joannabriggs.edu.au): An international, not-for-profit organization for research, development, and knowledge synthesis, established in 1996, involving nursing, medical and allied health researchers, clinicians, academics and policy/decision-makers in more than 70 Centres and Groups, with more than 8,000 members in over 47 countries and users in more than 90 countries on every continent.
• First North American JBI Centre: the Queen's Joanna Briggs Collaboration, located in Canada at Queen's University (http://meds.queensu.ca/qjbc).
Purpose
To describe the JBI model of systematic reviews using the JBI software, the System for the Unified Management of the Assessment and Review of Information (SUMARI), which consists of the:
• Comprehensive Review Management Software (CReMS)
• Qualitative Assessment and Review Instrument (QARI)
• Meta-Analysis of Statistics Assessment and Review Instrument (MAStARI)
• Narrative, Opinion and Text Assessment and Review Instrument (NOTARI)
• Analysis of Cost, Technology and Utilization Assessment and Review Instrument (ACTUARI)
Evidence-Based Health Care (EBHC): Re-conceptualizing Evidence
• Evidence = knowledge arising from experience, expertise, learned/official bodies, experimental research (RCTs), and any rigorous research studies (including results from both quantitative and qualitative inquiries).
• EBHC integrates the best available research evidence, professional expertise, skills and judgment, and the needs and preferences of patients and clients.
Essential Steps in EBP (Sackett & Haynes, 1995)
Five steps to evidence-based practice are recognized internationally:
I. Convert information needs into answerable questions (formulate the problem);
II. Track down the best evidence with which to answer these questions;
III. Critically appraise the evidence for its validity (closeness to the truth) and usefulness (clinical applicability);
IV. Implement the results in practice;
V. Evaluate performance.
Methods - Steps in a SR
• Develop a research/review question that includes the PICO core elements: Population; Intervention or phenomenon of Interest (qualitative evidence); Comparator or Context (qualitative); and Outcomes (an illustrative sketch follows this list)
• Define and describe inclusion and exclusion criteria for the primary studies
• Locate studies
• Select studies
• Assess study quality following Grant & Booth's (2009) analytic framework (SALSA): Search, AppraisaL, Synthesis, Analysis
• Extract data
• Use CReMS
• Analyze/summarize and synthesize relevant studies
• Present results
• Interpret results and determine the applicability of results
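To make the PICO elements and a first-pass inclusion screen concrete, the minimal Python sketch below shows one possible way to record a review question and check a candidate study against it. It is purely illustrative: the class names, fields, and example values are hypothetical and are not part of CReMS/SUMARI.

    # Hypothetical illustration of PICO elements and a crude inclusion screen;
    # not part of CReMS/SUMARI (requires Python 3.9+ for list[str]/set[str]).
    from dataclasses import dataclass

    @dataclass
    class ReviewQuestion:
        population: str
        intervention: str      # or phenomenon of Interest (qualitative evidence)
        comparator: str        # or Context (qualitative evidence)
        outcomes: list[str]

    @dataclass
    class Study:
        title: str
        design: str
        population: str

    def meets_inclusion(study: Study, question: ReviewQuestion,
                        eligible_designs: set[str]) -> bool:
        """First-pass screen: study design and population must match the protocol."""
        return (study.design in eligible_designs
                and question.population.lower() in study.population.lower())

    question = ReviewQuestion(
        population="adults with type 2 diabetes",
        intervention="nurse-led education",
        comparator="usual care",
        outcomes=["HbA1c", "self-management behaviour"],
    )
    study = Study("Example trial", "RCT", "Adults with type 2 diabetes in primary care")
    print(meets_inclusion(study, question, {"RCT", "quasi-experimental"}))  # True

In practice, such decisions are made in duplicate and blinded (see the rigour table below); the sketch only shows how explicit, pre-specified criteria make those decisions reproducible.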
Meta-Analysis, Meta-Synthesis, Mixed
1. Meta-Analysis (statistical synthesis): The use of statistical methods to combine the results of independent, similar, quantitative studies to calculate one estimate of treatment effect more precisely than any of the individual contributing studies could (a worked pooling sketch follows this list). Questions of Feasibility, Appropriateness and/or Effectiveness (FAME) can be answered.
2. Meta-Synthesis (meta-aggregation, meta-ethnography): The use of qualitative methods to combine the findings of individual, independent qualitative research studies and text. Questions of Feasibility, Appropriateness and/or Meaningfulness (FAME) can be answered.
3. Comprehensive/Mixed Method: Combines both quantitative and qualitative findings to address multiple forms of evidence; separate analyses and syntheses are performed on the corresponding data. Questions of Feasibility, Appropriateness, Meaningfulness and/or Effectiveness (FAME) can be answered.
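As an illustration of the statistical pooling described in item 1, the short Python sketch below applies standard fixed-effect inverse-variance weighting. It is not JBI/MAStARI code, and the three study effect sizes and standard errors are hypothetical values chosen only to show that the pooled estimate is more precise than any single study.

    # Minimal illustration (not JBI/MAStARI code): fixed-effect inverse-variance
    # pooling of per-study effect estimates, the basic idea behind meta-analysis.
    from math import sqrt

    def pooled_effect(effects, standard_errors):
        """Combine per-study effect sizes into one weighted estimate.

        effects         : list of study effect sizes (e.g., mean differences)
        standard_errors : list of their standard errors
        Returns (pooled estimate, pooled standard error, 95% confidence interval).
        """
        weights = [1.0 / se**2 for se in standard_errors]  # inverse-variance weights
        total_w = sum(weights)
        pooled = sum(w * e for w, e in zip(weights, effects)) / total_w
        pooled_se = sqrt(1.0 / total_w)
        ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
        return pooled, pooled_se, ci

    # Three hypothetical studies: the pooled standard error (~0.08) is smaller than
    # the smallest single-study standard error (0.10), i.e., a more precise estimate.
    print(pooled_effect([0.40, 0.25, 0.35], [0.15, 0.20, 0.10]))

A random-effects model would add a between-study variance term to these weights; the fixed-effect version is shown here only because it is the simplest instance of the pooling idea.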
Rigour in Systematic Reviews (step: how rigour is achieved)
• Protocol: peer review
• 3 phases of search: exhaustive search
• Retrieve studies? Blinded, dual decision
• Appraise: standardized tools
• Include? Blinded, dual decision
• Extract data: blinded, dual process
• Pool data: standardized techniques
• Synthesize and report: peer review
References
Canadian Institutes of Health Research. (2008). Knowledge translation. Canadian Institutes of Health Research. Available from: http://www.cihr.ca/e/29418.html
Centre for Reviews and Dissemination. (2009). Systematic reviews: CRD's guidance for undertaking reviews in health care. York: Centre for Reviews and Dissemination, University of York.
Dixon-Woods, M., Shaw, R. L., Agarwal, S., & Smith, J. A. (2004). The problem of appraising qualitative research. Quality and Safety in Health Care, 13(3), 223-225. doi: 10.1136/qshc.2003.008714
Dixon-Woods, M., Agarwal, S., Jones, D., Young, B., & Sutton, A. (2005). Synthesising qualitative and quantitative evidence: A review of possible methods. Journal of Health Services Research and Policy, 10(1), 45-53.
Dixon-Woods, M., Sutton, A., Shaw, R., Miller, T., Smith, J., Young, B., . . . Jones, D. (2007). Appraising qualitative research for inclusion in systematic reviews: A quantitative and qualitative comparison of three methods. Journal of Health Services Research and Policy, 12(1), 42-47. doi: 10.1258/135581907779497486
Evans, D. (2003). Hierarchy of evidence: A framework for ranking evidence evaluating healthcare interventions. Journal of Clinical Nursing, 12, 77-84.
Grant, M. J., & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information & Libraries Journal, 26, 91-108. doi: 10.1111/j.1471-1842.2009.00848.x
Grimshaw, J., McAuley, L. M., Bero, L. A., Grilli, R., Oxman, A. D., Ramsay, C., & Vale, L. (2003). Systematic reviews of the effectiveness of quality improvement strategies and programmes. Quality & Safety in Health Care, 12, 298-303.
Higgins, J. P. T., & Green, S. (2008). Cochrane handbook for systematic reviews of interventions. Chichester: Wiley-Blackwell.
Sackett, D. L., & Haynes, R. B. (1995). On the need for evidence-based medicine. Evidence-Based Medicine, 1, 5-6.
Acknowledgements We thank Dr. Noreen Frisch, Director and Professor, School of Nursing, University of Victoria, for supporting our attendance at a 5-day workshop organized by the Queen's Joanna Briggs Collaboration at Queen's University, Kingston, Ontario. We also acknowledge InspireNet's support of this project.
Further Information Anastasia Mallidou: mallidou@uvic.ca