Taking Stock: What is the Quantity and Quality of our Evidence Base? Andrew Booth Reader in Evidence Based Information Practice
Outline • What is Evidence Based Library and Information Practice (EBLIP)? • What Are the Challenges and Barriers? • How Good is the Evidence? • How might the Challenges and Barriers be Overcome?
What is Evidence Based Library and Information Practice (EBLIP)?
Evidence based library and information practice is …
• “EBLIP involves asking answerable questions, finding, critically appraising and then utilising research evidence from relevant disciplines in daily practice. It thus attempts to integrate user-reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making”. (Booth, 2006)
Why librarians? • “As a profession which has the ability to manage the literature of research, librarianship is uniquely placed to model the principles of evidence-based practice, not only as they apply to other disciplines which we serve, but also as they apply to our own professional practice” (Ritchie, 1999)
What Are the Challenges and Barriers?
Challenges for EBLIP • Quality of the evidence • Dispersion of evidence sources (e.g. education, management, computer science) • Skills in conducting research • Skills in disseminating research • Skills in interpreting research • Time!
The Evidence Based Professional… • Struggles heroically to keep up with best practice. Subscribes to key journals from their own pocket. Accesses an email alerting service and moves items into a Pending folder until they have time to read them. • Occasionally they spot a gap in their current practice or an opportunity to develop their services. They stay after hours to conduct a literature search on the topic they have identified. After they have handled all user requests and enquiries, they "slip" a couple of photocopy requests into the system for their own professional development. • Having read and appraised the evidence, they summarise their findings in a report to the library committee. The report is discussed at a monthly team meeting and a task group is set up to explore feasibility. After six quarterly meetings the task group eventually agrees that the issue needs to be tackled. An implementation group gets to work on draft local guidance.
A Long Way to Go!
Challenges for EBLIP • Quality of the evidence • Dispersion of evidence sources (e.g. education, management, computer science) • Skills in conducting research • Skills in disseminating research • Skills in interpreting research • Time!
How Good is the Evidence Base?
What is an Evidence Base? • An “Evidence Base” is “the best available external evidence from systematic research” • Evidence may be superseded as better evidence becomes available • An “Evidence Base” is a cross-sectional snapshot of the evidence available on a particular topic at a particular point in time
Two Important Points 1. The Evidence Base is always changing. We are interested in both: • Prevalent Evidence – the available evidence from existing research • Incident Evidence – new evidence from recent research
2. Evidence Base for a particular topic may lie within a specific discipline or it may be derived from associated disciplines
For Example • As a library manager you are not only interested in directly applicable evidence from research conducted in other libraries. • You may also be interested in evidence from research in human resource management (organisation of teams), marketing (promoting services), general management (morale, motivation, leadership), financial management, and so on.
Summary – So Far • An evidence base, or body of evidence, is difficult to define. It is commonly thought of as “a collection of research that informs practice”. • The evidence base for information practice is located within three main search domains: (1) the library and informatics literature; (2) so-called “grey literature”; and (3) literatures outside our field with functional relevance to the question, such as those of the social, behavioural, education or management sciences (Eldredge, 2004, p. 36). • The quality and quantity of research vary according to the subject being studied and the discipline.
Our Evidence Base - 1 • Health Research e.g. Information needs/ Information Seeking of cancer patients • Information Systems e.g. Evaluation of System performance • Library and Information Practice e.g. Question asking, Information Sources
Our Evidence Base - 2 • Health Research e.g. Qualitative Research Designs, RCTs • Information Systems e.g. Mixed Methods Evaluations, Small-Scale Experiments • Library and Information Practice e.g. Surveys, Questionnaires, Reviews
By Domain - 1 E.g. What proportion of research studies are on Information Retrieval? • 217 LIS journals reviewed; 91 journals provided data. 2,664 journal articles examined, with 807 (30.3%) classified as research. • Top 10 journals for research (2001): 1) JASIST; 2) Scientometrics; 3) Info Proc & Man; 4) Coll & Res Lib; 5) Tie: J Lib Adm / Bull Med Lib Assn; 6) Libs & Culture; 7) J Doc; 8) Tie: J Info Sci / J Acad Libr.
By Domain - 2 • Descriptive research (329/807 articles) was published most frequently. • Information Access & Retrieval had the highest number of research articles (314/807), followed by Collections (193/807), Management (135/807), Education (95/807) and Reference (77/807). • Two new domains identified: Library History and Professional Issues. No evidence to support a Marketing & Promotion domain. LISA provides the best coverage of the top 10 LIS research journals.
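These proportions are easy to re-derive from the reported counts. A minimal sketch (counts taken from the slides above; the script itself is purely illustrative):

```python
# Re-deriving the proportions reported by Koufogiannakis, Slater & Crumley (2004).
# The counts come from the slides above; everything else is illustrative.
total_articles = 2664      # journal articles examined
research_articles = 807    # classified as research

domain_counts = {          # research articles per domain
    "Information Access & Retrieval": 314,
    "Collections": 193,
    "Management": 135,
    "Education": 95,
    "Reference": 77,
}

print(f"Research share: {research_articles / total_articles:.1%}")  # ~30.3%
for domain, n in domain_counts.items():
    print(f"{domain}: {n}/{research_articles} = {n / research_articles:.1%}")
```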
Hierarchy of Evidence [Figure: pyramid of study designs, ranked by whether they are comparative and prospective]
• For critique see: Booth A (2010) On hierarchies, malarkeys and anarchies of evidence. Health Information and Libraries Journal, 27(1): 84-88.
By Study Type: The Three-Legged Chair! • “Studies reveal that library research relies primarily upon three levels of evidence; descriptive surveys, case studies, and qualitative methods”. (p. 294) Eldredge (2000)
But which do we mean?
OR
How Low Can You Go?
Literature of Library & Information Practice
An Analysis of the Literature on Instruction in Academic Libraries
1. Descriptive Surveys
Survey comes first! • “the survey approach remains the predominant research strategy in both library science and information science” (p. 108). • In 2005, survey research accounted for 30.5% (n=173) of studies analysed to determine research methods. This was the highest percentage, with the next highest being experimentation at 20.8% (n=118). • However, the 30.5% figure was a substantial decrease from earlier articles reporting survey research at 40.1% and 41% in 1975 and 1985 respectively. Hider and Pymm (2008)
What are surveys good for? • “the basic purposes of descriptive surveys are to describe characteristics of the population of interest, estimate proportions in the population, make specific predictions, and test associational relationships. (They can be used to explore causal relationships)” (Powell & Connaway, 2004 p. 87)
Better a good survey than a poor trial? • “Some lower levels of EBL evidence may contain studies with higher-quality design methodological rigor than study designs ranked at the higher levels of EBL evidence. In this connection, a well-designed descriptive survey could have greater validity than a poorly designed or procedurally compromised randomized controlled trial”. (p. 294) Eldredge (2000)
But are they good surveys?
$64,000 Question! • “Why do we insist on believing that any librarian can successfully design a questionnaire?” (Booth, 2005, p. 228)
Solution: Develop Evidence Based Questionnaires • “How often, though, do those researchers use the same questionnaire or at least the same or similar (enough) questions (after getting proper permissions and giving proper attributions, of course)? • How often, when selecting survey participants, do they try to control for the same factors as the studies they are using as examples? • How often, in other words, do they approach their project from the standpoint of gathering results that will be directly comparable to the work they are using as models? Not having studied this systematically myself, I cannot say for sure, but my impression is that the answer would have to be: not very often”. (Plutchak, 2005)
Mind your Ps and Qs (pitfalls of questionnaires). Booth A. Health Info Libr J. 2005 Sep;22(3):228-31. • A quest for questionnaires. Booth A. Health Info Libr J. 2003 Mar;20(1):53-6.
Demystifying Survey Research: Practical Suggestions for Effective Question Design. Charbonneau, D. Evidence Based Library and Information Practice, 2007;2:7-12.
The Compound (Fractured) Question! • “We encounter questions such as ‘do you require information or training on the MEDLINE database?’ Where a respondent completes the answer ‘Yes’ we are not able to discern if they are saying ‘Yes’ to information, ‘Yes’ to training or ‘Yes’ to both”. (Booth, 2005 p. 230)
2. Case Studies
By Application • E.g. What proportion of studies are “Directly Applicable” to my question/practice? • Ideally we would like Evidence that is Directly Applicable. • More commonly we encounter Evidence that needs to be Locally Validated, perhaps through a survey or audit of local services. • In our general reading we encounter Evidence that Improves Understanding (Koufogiannakis and Crumley, 2004). • The final category is Evidence to inform our Choice of Methodologies, Tools or Instruments (Booth, 2004).
On the Case! We Want: • Before-After Evaluations using a clearly focused question with objective outcome measures
We Get: • This is how we did it in our Library… • ….. And the users liked it and it was a great success
3. Qualitative Research
The Neglected Voice (Booth & Brice, 2007) “EBLIP…attempts to integrate user-reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making”. Booth, 2006a
[Diagram: EBLIP at the intersection of Effectiveness (research-derived evidence), User views and Professional judgement]
We therefore need to: • Know how much Qualitative Research is present in the Library Literature • Identify Qualitative Research from the Library Literature • Characterise the different types of Qualitative Research in the Library Literature
What I Did • Attempted to replicate and particularise the methods used by Koufogiannakis, Slater & Crumley (2004). • Examined the library and information studies literature (2007-2008) using content analysis; the chosen years represented the most recent full years of content. • Compiled a comprehensive list of potential journals for inclusion from the 2008 Journal Citation Reports subject list (Information Science & Library Science). • Coded abstracts using Keywords in Qualitative Methods: a vocabulary of research concepts, supplemented by library-favoured terms; a sketch of this kind of coding follows below.
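To give a flavour of what coding abstracts against a keyword vocabulary involves, here is a minimal, hypothetical sketch. The term list is a placeholder, not the vocabulary actually used in the study:

```python
# Hypothetical sketch of keyword-based coding of abstracts. The real study used
# Keywords in Qualitative Methods plus library-favoured terms; this short list
# is a placeholder for illustration only.
QUALITATIVE_TERMS = [
    "interview", "focus group", "ethnograph", "grounded theory",
    "case study", "observation", "content analysis", "delphi",
]

def code_abstract(abstract: str) -> list[str]:
    """Return the qualitative-method terms found in an abstract."""
    text = abstract.lower()
    return [term for term in QUALITATIVE_TERMS if term in text]

abstract = ("Semi-structured interviews and a focus group were used to "
            "explore undergraduates' information-seeking behaviour.")
print(code_abstract(abstract))  # ['interview', 'focus group']
```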
How much QR is in the Library Literature? • 11,901 abstracts identified (2007-2008). • 9,645 (81%) excluded as “not research”. • Of the remainder (2,256; 19%), over half (1,287; 11% of the total) were considered to report qualitative research. Take Home Message: Based only on ISI-eligible LIS journals (i.e. those with an impact factor), about 20% of articles constituted research, of which about half appeared to be qualitative research.
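The take-home figures follow directly from the counts above; a quick arithmetic check:

```python
# Arithmetic behind the take-home message (figures from the slide above).
total = 11_901         # abstracts identified, 2007-2008
not_research = 9_645   # excluded as "not research"
research = total - not_research   # 2,256 remaining
qualitative = 1_287

print(f"Research: {research / total:.0%} of all abstracts")        # ~19%
print(f"Qualitative: {qualitative / total:.0%} of all abstracts")  # ~11%
print(f"Qualitative: {qualitative / research:.0%} of research")    # ~57%, i.e. over half
```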
What types of QR in the Library Literature? - 1
• Interviews, Questionnaires and Observation are the most common qualitative research methods in LIS. • Delphi techniques and focus groups, although less prevalent, are also well represented in the LIS literature. • About 3% of qualitative research occurs in the context of mixed methods studies. • Critical incident technique (11 studies, 1%) and network analysis (31 studies, 3%) are disproportionately represented in the LIS qualitative research literature.
What types of QR in the Library Literature? - 2
Conclusion: • Based on ISI-eligible LIS journals, about 20% of articles constituted research, of which about half (11% of all articles) appeared to be qualitative research. • In contrast to well-indexed and well-abstracted topics (e.g. health care), a more exhaustive list of methodological terms is needed to retrieve appropriate qualitative LIS studies. • Additional adjectival expressions (e.g. “in depth” for case studies and interviews, and “semistructured” for interviews) can enhance the precision of retrieval strategies, as the sketch below illustrates.
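A minimal illustration of the last point; the patterns and records are invented examples, not the retrieval strategies used in the study:

```python
# Illustrative only: an adjectival qualifier tightens a retrieval pattern.
import re

# Broad pattern: any mention of interviews (high recall, lower precision).
broad = re.compile(r"\binterviews?\b", re.IGNORECASE)

# Qualified pattern: requires a methodological adjective (higher precision).
precise = re.compile(r"\b(semi-?structured|in-?depth)\s+interviews?\b", re.IGNORECASE)

records = [
    "Librarians' attitudes to interviews for staff recruitment were surveyed.",
    "Semi-structured interviews were conducted with 12 reference librarians.",
]
for r in records:
    print(bool(broad.search(r)), bool(precise.search(r)), "-", r)
# The broad pattern matches both records; only the second reports a qualitative study.
```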
As Koufogiannakis and Crumley state:
• “When using research to help with a question, look for high quality studies, but do not be too quick to dismiss everything as irrelevant. Try to take what does apply from the research and use it to resolve the problem at hand” (Koufogiannakis and Crumley, 2004)
Conclusions • Ultimately, in assessing the Evidence Base, we are interested in both Quantity and Quality • The Evidence Base is composed of contributions from different disciplines with different research traditions • If there are no robust study designs we may have to settle for inferior designs • We still need to assess each study for quality • Poor quality studies mean that uncertainty still exists
Alternatives? Action Research • Combining action research and EBL would add a high dose of active commitment to the latter, and good tools for the refinement of theoretical material to the former. • In the case of EBL, it would allow the focus to shift from information search to final action, and would provide closer contact with the action's addressees, including them in the decision-making process. Civallero (2007) http://archive.ifla.org/IV/ifla73/papers/154Civallero-en.pdf
Four Point Plan 1. Demand evidence – the best way to get an organization to become evidence-based is for leaders to ask for evidence supporting decisions and recommendations. 2. Examine logic and critically evaluate any evidence presented. 3. Treat the organization as an unfinished prototype – try something new in a limited way, gather evidence, and then adapt, revise and retry as needed. 4. Cultivate an attitude of wisdom throughout the organization – act on the best available evidence at the time, keep questioning what we know, see if new evidence comes to light, and be open to any new evidence. Fisher & Richardson, EBLIP4 (2007) (after Pfeffer and Sutton)
How might the Barriers and Challenges be Overcome?
Challenges for EBLIP • Quality of the evidence • Dispersion of evidence sources (e.g. education, management, computer science) • Skills in conducting research • Skills in disseminating research • Skills in interpreting research • Time!
• Better studies • Better application of evidence from other disciplines
• Library curricula • CPD • Make EBLIP more important • Integrate with existing activities
Can EBLIP be fitted within the LIS Curriculum? • Two alternatives: • Add an EBLIP module to existing overloaded curricula • Integrate EBLIP into every module: What is the Evidence Base? What are the Burning Questions? Where are the Gaps? …Reviews of the Literature
EBLIP Skills [Figure: generic capabilities framework, Hallam & Partridge (2003)]
Above All, Start Small! • Therese Skagen (University of Bergen Library, Norway) identified some challenges to EBP, including time allocation, dissemination (within and outside the library), competences (in research and planning), and resources. Management needs to be supportive, and you need to believe that research can be of value to the organisation. • She suggests trying out EBP in a small area of the library service to begin with. Therese saw gains from EBP in terms of, for example, your own learning, strategic understanding or (organisationally) increased quality of service and better morale. (Information Literacy Weblog)
Implementing Research in Practice - The Challenge • "The key to evidence-based information practice is the ongoing development and application of…information science research."
• "Individual……librarians must apply the results of research routinely to library and information service practice, to the development of information policy, and to other information issues important to…..institutions." Using Scientific Evidence to Improve Information Practice: The Research Policy Statement of the Medical Library Association
Ten Steps for Practical EBLIP 1. Integrate EBLIP into recruitment and development 2. Practice Evidence Based Project Management 3. Incorporate Evidence Review into Existing Meetings 4. Utilise Evidence Based Standards and Guidelines 5. Implement Evidence Based Webpages HILJ, Mar 2009
Ten Steps for Practical EBLIP 6. Develop Evidence Based Questionnaires 7. Practise Evidence Based Collection Management 8. Evaluate Information Literacy Instruction 9. Manage Change using Evidence Based Methods 10. Evaluate Evidence Based Strategies HILJ, Mar 2009
Are you ready to meet the challenge? • To identify important answerable questions from your practice • To rapidly review the evidence for answers to these questions • To make changes to your practice • To evaluate those changes • To share the lessons learnt
Look At Things from a Different Perspective!
Or Do Things As Evidence Based Teams… More of that Later!
Some Useful Resources • EBLIP Journal https://ejournals.library.ualberta.ca/index.php/EBLIP • Libraries Using Evidence Toolkit http://www.newcastle.edu.au/service/library/gosford/ebl/ • EBLIP5 Conference http://blogs.kib.ki.se/eblip5/
References - 1 • Booth, A. (2003). A quest for questionnaires. Health Information and Libraries Journal, 20(1), 53-56. • Booth, A. (2004). What research studies do practitioners actually find useful? Health Information and Libraries Journal, 21(3), 197-200. • Booth, A. (2005). Mind your Ps and Qs (pitfalls of questionnaires). Health Information and Libraries Journal, 22(3), 228-231. • Booth, A. (2004). Evaluating your performance. In A. Booth & A. Brice (Eds.), Evidence-Based Practice for Information Professionals: a Handbook (pp. 127-137). London: Facet Publishing. • Booth, A. (2009). Eleven steps to EBLIP service. Health Information and Libraries Journal, 26(1), 81-84.
References - 2 • Eldredge, J. (2000). Evidence-based librarianship: an overview. Bulletin of the Medical Library Association, 88(4), 289-302. • Eldredge, J. (2004). How good is the evidence base? In A. Booth & A. Brice (Eds.), Evidence-Based Practice for Information Professionals: a Handbook. London: Facet Publishing. • Ellis, J., Mulligan, I., Rowe, J., & Sackett, D.L. (1995). Inpatient general medicine is evidence based. A-Team, Nuffield Department of Clinical Medicine. Lancet, 346(8972), 407-410. • Hallam, G., & Partridge, H. (2003). Generic capabilities for the library and information professional: report on a QUT teaching and learning research project prepared for the ALIA LISEKA working group. Unpublished report. • Jerome, R.N. (2008). Further developing the profession's research mentality. Journal of the Medical Library Association, 96(4), 287-289.
References - 3 • Hider, P., & Pymm, B. (2008). Empirical research methods reported in high-profile LIS journal literature. Library & Information Science Research, 30(2), 108-114. • Koufogiannakis, D., & Crumley, E. (2004). Applying evidence to your everyday practice. In A. Booth & A. Brice (Eds.), Evidence-Based Practice for Information Professionals: a Handbook (Chapter 10, pp. 119-126). London: Facet Publishing. • Koufogiannakis, D., Slater, L., & Crumley, E. (2004). A content analysis of librarianship research. Journal of Information Science, 30(3), 227-239. • Plutchak, T.S. (2005). Building a body of evidence. Journal of the Medical Library Association, 93(2), 193-195. • Sampson, M., Daniel, R., Cogo, E., & Dingwall, O. (2008). Sources of evidence to support systematic reviews in librarianship. Journal of the Medical Library Association, 96(1), 66-69.