Online supplementation for teaching evidence-based medicine: feasibility of a randomised-controlled trial ========================================================================================================= * Marcy C McCall * Thomas R Fanshawe * David McCartney * Damion Young * David Nunan * Carl Heneghan ## Abstract **Background and Objectives** As teaching technology advances, medical education is increasingly using digital media and exploring instructional models such as the flipped classroom and blended learning courses, where in-class taught sessions focus on group work that builds on content delivered before class. Early evidence suggests lectures and foundational material can be provided equally well online, but the supporting research is of low quality. We aim to test and develop an online evidence-based teaching resource that seeks to improve the availability and scalability of evidence-based medicine (EBM) learning tools. We evaluate the feasibility of a study design that could test for changes in academic performance in EBM skills using an online supplement. **Methods** Mixed-methods feasibility study of a randomised controlled trial (RCT) in an undergraduate medical student cohort. **Results** Of a small cohort (n=34), eight participants agreed to randomisation and completed the study. No study participant completed the EBM supplementary course in full. Students reported time management as a significant barrier to participation, and all aspects of the study and communications should be delivered with efficiency as a key consideration. **Conclusion** Randomising students to an online EBM supplement within a medical school programme presents challenges of recruitment and student motivation, but the study design is potentially feasible. * medical education & training * evidence-based practice ### Key message #### What is already known about this subject? * Essential training in evidence-based medicine improves critical thinking and statistical reasoning. 
* Medical school programmes are constrained by the availability of teachers, resources, and time to provide evidence-based medicine (EBM) training. * Foundational material in EBM could be delivered online to improve scalability and uptake. #### What are the new findings? * Randomising students to an online EBM supplement within a medical school programme presents challenges of recruitment and student motivation, but is feasible. * Qualitative and quantitative indicators suggest that students will access online material for revision before an examination. * Baseline knowledge assessment of the student cohort could guide more appropriate content of an online learning supplement in EBM. #### How might it impact on clinical practice in the foreseeable future? * This article provides insight into how to develop, implement and pursue evidence-based education of EBM. ## Background Established in the mid-1990s as an approach to achieve better healthcare outcomes, evidence-based medicine (EBM) is the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients.1 EBM has become essential for the training of young clinicians by stressing critical thinking and the importance of statistical reasoning and continuous evaluation of medical practice.2 Current research explores efficacy measures of EBM teaching and how to improve teaching with different types of educational interventions.3–7 Review authors suggest ‘teaching methods for optimising EBP among health professionals could become a robust standardised procedure of the medical education curricula and lifelong learning of healthcare professionals’.8 Still, some medical school curricula are constrained by the availability of teachers and supporting materials to provide adequate EBM training.9 As teaching technology advances, medical education is increasingly using digital media and exploring instructional models such as the flipped classroom and blended 
learning courses, where in-class taught sessions focus on group work that builds on content delivered before class.10 Evidence suggests lectures and foundational material can be provided equally well online.11 12 A recent systematic review shows blended learning (lectures with an online supplement) is helpful for examination preparation and concept clarification, and is one strategy to reduce problems in medical student performance.13 However, one review cites insufficient evidence regarding the effectiveness of e-learning on healthcare professional behaviour or patient outcomes and calls for more research in this area.14 Here, we aim to test and develop an online evidence-based teaching resource that seeks to improve the availability and scalability of EBM learning tools. We evaluate the feasibility of a study design that could test for changes in academic performance in EBM skills using an online supplement. ## Research objectives The purpose of this research and analysis is to explore two related questions: first, could an introductory online course in EBM be administered and evaluated for change in learning outcomes in the context of a prospective randomised-controlled trial (RCT) for medical school training; and second, could delivering a 100% online short-course as a learning supplement to standard in-person teaching increase EBM knowledge and skill acquisition in medical students? ## Methods ### Type of study A mixed-methods feasibility study of an RCT in an undergraduate medical student cohort. ### Participants Undergraduate medical students (n=34) in the first year of the graduate entry programme at the University of Oxford in academic year 2018–2019. ### Recruitment and blinding Students were recruited via e-mail and with two in-class oral reminders from the course programme director (DMcC) over a 21-day period. 
Consenting participants were assigned a study ID number (DY) and randomly allocated in equal numbers, using [www.graphpad.com](http://www.graphpad.com), to the intervention or control group (TRF). The outcome assessor (TRF) was blinded to participant identity. Other researchers (MCMcC, DN and CH) were additionally blinded to group allocation and data collection. ### Intervention The intervention group received the standard curriculum plus automated registration and delivery of the online EBM Primer. Students signed into the course on the university’s password-protected virtual learning environment (Canvas). These students were provided unlimited access to the EBM Primer from the point of randomisation until their final examination, 12 weeks later. #### Characteristics of the intervention The evidence-based healthcare (EBHC) programme at the University of Oxford sponsored the development of a 100% online EBM primer. The intervention was designed to meet the flexible learning needs of students and piloted in a healthcare education programme (Oxford’s MSc in Evidence-Based Healthcare). The online course is fully self-paced, untutored and requires 10–15 hours to complete. For a course sample and syllabus, please see the content overview (online supplementary appendix A) and view [https://www.youtube.com/watch?v=Eg_a3twU0cU](https://www.youtube.com/watch?v=Eg_a3twU0cU). ### Supplementary data [[bmjebm-2020-111372supp001.pdf]](pending:yes) ### Control The control group received Oxford’s Year 4 Graduate Entry standard teaching curriculum until after the initial assessment (the practice examination). The control group then received access to the EBM Primer 8 weeks before June’s final examination. ### Outcome measures #### Primary outcome measures Numerical data were collected on the participant recruitment rate, protocol adherence after the point of randomisation, and individual and group metrics of course activity to ascertain the feasibility of the study design. 
In addition, we asked students about barriers and enabling factors for participating in an RCT and completing an online course supplement. For the set of qualitative questions, see online supplementary appendix B. ### Supplementary data [[bmjebm-2020-111372supp002.pdf]](pending:yes) #### Secondary outcome measures The standard formative assessment (approximately 2 hours, computer-based examination) was completed by all study and non-study participants. Study participants completed an integrated online assessment tool of approximately 20 min in addition to the formative assessment (see online supplementary appendix C for the Assessing Competency in EBM (ACE) tool).15 The intervention group also completed a customised precourse and postcourse multiple choice questionnaire (MCQ) within the course. Then, all study participants completed the standard final examination (approximately 2 hours, computer-based examination). ### Supplementary data [[bmjebm-2020-111372supp003.pdf]](pending:yes) ### Data analysis Quantitative outcomes are reported as absolute values, percentages, means and SDs; differences between groups were analysed using a t-test and are reported as the mean difference with 95% CI. Usage metrics and statistical analyses are limited due to the small sample size. Questionnaire responses were collected and analysed by frequency of comments and by recurring themes (TRF and MCMcC). ## Results ### Feasibility data The study was conducted over a 24-week period from recruitment to data analysis. For participant flow and study timeline, please refer to figure 1. ![Figure 1](https://ebm.bmj.com/content/ebmed/26/5/254/F1.medium.gif) [Figure 1](http://ebm.bmj.com/content/26/5/254/F1) Figure 1 Study flow. #### Recruitment and student participation Of 34 eligible students, eight (24%) chose to participate and were randomly allocated to the intervention and control conditions (four in each group). 
Five participants were recruited in week 1, two additional participants in week 2 and one participant in week 3. #### Adherence to protocol No participant withdrew from the study. Two of the four participants allocated to the intervention group completed the MCQ at the start of the study; two intervention group participants did not access the EBM primer at any point in the study period. No participants completed the MCQ at the end of the intervention phase. All eight participants completed the ACE tool and the summative course assessments. Three of eight study participants completed the qualitative data portion of the study, and three of 26 non-study participants responded to our inquiry into reasons for not participating. #### Changes to protocol We amended our protocol after the randomisation of participants to include qualitative data collection on students’ preferences in learning about EBM, and also to collect information on barriers and motivations for participating in the study. The short questionnaires were emailed to the class cohort (n=34) following their final examination in June 2019. #### Accessing the EBM course Figure 2 shows the frequency of access to the EBM Primer throughout the study period. Between 1 April and 30 April 2019, the EBM Primer was available to intervention group participants only. One participant viewed the majority of pages during this period. Ahead of the final examination, the EBM Primer was made available to all study participants. Four control participants viewed most pages. Patterns of behaviour suggested that access to the online course occurred on a single day rather than via repeat visits. Three participants accessed the course in the days just before the final examination. ![Figure 2](https://ebm.bmj.com/content/ebmed/26/5/254/F2.medium.gif) [Figure 2](http://ebm.bmj.com/content/26/5/254/F2) Figure 2 EBM Primer daily unique page views by participant. 
Colours indicate different participants, with solid lines showing those in the intervention group and dotted lines those in the control group. Between 1 April and 30 April 2019, the EBM Primer was only available to intervention group participants. EBM, evidence-based medicine. #### Barriers and enabling factors Three of the eight study participants and three of the 26 non-participants returned questionnaire responses. Of the study participants, two highlighted the advantage of having access to an extra resource as a reason to participate, and one suggested that it could be a more enjoyable way of learning EBM than lectures. Responses to the teaching materials in the EBM primer were mixed: one participant found it more engaging than lectures, and one reported it to be ‘a good revision aide, but not so good as a primer’. The third found the information clear but felt it provided nothing new beyond the lectures. Two participants commented that the exercises or videos were long or slow. Two participants expressed concerns over how the objectives of the course or key EBM concepts were tested, and one participant highlighted unwelcome differences between EBM primer-based evaluations and the students’ final examination. One participant said having access to an online supplement might give those students an advantage ahead of the final examination and ‘felt that this was unfair’. One respondent felt the EBM course material was better positioned as a revision tool. Another participant felt the course was too long. Of three non-participant respondents, two gave insufficient time as a reason for not participating, and the other mentioned ‘no desire to engage’ in supplementary EBM teaching. #### Differences in EBM skills and knowledge Quantitative outcome results for each participant were measured using the ACE tool (online supplementary appendix C). Data are provided for preliminary analysis only. 
The two intervention group participants who completed the MCQ at the start of the study scored 6.8 and 7.7 out of 10, respectively, but as neither of these completed it at the end of the intervention phase, this measure could not be analysed. The mean (SD) of the ACE measure was 10.0 (2.4) in the intervention group and 10.5 (1.9) in the control group (mean difference −0.5, 95% CI −4.4 to 3.4) (figure 3). ![Figure 3](https://ebm.bmj.com/content/ebmed/26/5/254/F3.medium.gif) [Figure 3](http://ebm.bmj.com/content/26/5/254/F3) Figure 3 ACE and examination marks by group. Horizontal lines show means per group. For the final performance metric (examination paper 3), the mean (SD) was 66.0 (16.3) in the intervention group and 77.0 (9.8) in the control group (mean difference −11.0, 95% CI −35.5 to 13.5). ## Discussion As a feasibility RCT, this study indicates implementation challenges for medical student recruitment and adherence to an online training supplement of EBM. Our preliminary data suggest that students access online learning material in advance of examinations, and it is unclear whether supplementary training in EBM would otherwise be a priority for students. ### Threats of bias Student participants who volunteered for this study may have systematic differences in their baseline knowledge of, and interest in, EBM. A potential selection bias was further indicated in the qualitative data, where one student described feeling unmotivated to participate in the study due to a lack of interest in EBM. Our study design and its intervention were developed by members of the same research and teaching team (MCMcC, DY, CH and DN). To control for researcher or interpreter bias, the EBM Primer course director (MCMcC) and contributors (CH and DN) were not involved in recruitment, data collection or preliminary analysis, and were blinded from the point of recruitment to the end of the study. DY was not involved in analysis of the data. 
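The between-group comparisons reported in the Results give a mean difference with a 95% CI from a t-test. As a minimal sketch (assuming a pooled two-sample t interval, since the paper does not state the exact t-test variant used), the ACE comparison can be approximately reproduced from the rounded summary statistics:

```python
import math

# Rounded summary statistics from the Results (ACE tool scores)
n1, m1, s1 = 4, 10.0, 2.4   # intervention group: n, mean, SD
n2, m2, s2 = 4, 10.5, 1.9   # control group: n, mean, SD

diff = m1 - m2  # mean difference between groups

# Pooled two-sample t interval, df = n1 + n2 - 2 = 6
sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
t_crit = 2.447  # 97.5th percentile of the t distribution with 6 df
lo, hi = diff - t_crit * se, diff + t_crit * se
print(f"mean difference {diff:.1f}, 95% CI {lo:.1f} to {hi:.1f}")
```

Because only rounded means and SDs are available here, small discrepancies from the published interval (−4.4 to 3.4) are to be expected; the point of the sketch is that, with four participants per arm, the interval is far wider than any plausible effect, which is consistent with the feasibility (rather than efficacy) framing of the study.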
DMcC, as director of the programme, was involved in developing the study design and conducting recruitment. ### Recommendations for improved study design #### Recruitment, participation and retention In future iterations, we recommend a longer and more aggressive (in-person) recruitment phase to encourage students to participate. Our retention rate was 100%, which we suspect was attributable to the short study duration and motivated study participants. No study participant completed the EBM supplementary course in full. Students reported time management as a significant barrier to participation, and all aspects of the study and communications should be delivered with efficiency as a key consideration. As an alternative to individual randomisation within a medical school cohort, different class groups could be randomly allocated to receive the online EBM course versus the standard curriculum. In addition, increasing the required amount of EBM content on the medical school curriculum could increase student motivation and participation.16 #### Characteristics of the intervention Implementation of an online supplement requires minimal input from a teaching or administrative standpoint. An online intervention carries the advantage of standardised content delivery, which is amenable to measurement in the RCT framework. Additional information is required to explore how this online course should be modified to meet students’ preferences in terms of content delivery and duration. Patterns of behaviour indicate that the content was most important to study participants in advance of an examination. #### Pre-existing knowledge of EBM Earlier validation studies for the ACE tool report that ‘EBM-novice’ participants averaged 8.6 (SD 2.4), ‘EBM-intermediate’ participants averaged 9.5 (SD 1.8) and ‘EBM-advanced’ participants averaged 10.4 (SD 2.2).15 Our test results indicate that seven of eight study participants from Oxford’s graduate-entry year 4 medical students had an existing intermediate/advanced knowledge base of EBM. 
Other medical schools, or students earlier in their training, may present with lower baseline knowledge, and would therefore present an opportunity for a greater increase in knowledge acquisition. ## Conclusion The conduct of an RCT examining the effect of an online supplement is potentially feasible. Adequate recruitment and adherence of study participants will need consideration in each context, focusing on strategies that will increase perceived benefits and encourage student interest but not threaten the validity or reproducibility of results. We support the development of evidence-based teaching for EBM and online education interventions. Medical educators should pursue this research agenda and seek data to understand what works in terms of quantitative metrics for effective EBM knowledge acquisition, as well as in terms of qualitative data for student preferences. ### Supplementary data [[bmjebm-2020-111372supp004.pdf]](pending:yes) ## Data availability statement Data are available upon reasonable request. Contact medical statistician Thomas Fanshawe for deidentified participant data at: thomas.fanshawe@phc.ox.ac.uk ## Ethics statements ### Patient consent for publication Not required. ### Ethics approval The study was approved by Oxford’s Central University Research Ethics Committee (CUREC) as lower-risk research involving human participants and/or their data (R62369). ## Acknowledgments We are grateful to the student volunteers and faculty of the University of Oxford’s EBHC programme, whose support and interest in evidence-based practice made this study possible. ## Footnotes * Twitter @dnunan79, @carlheneghan * Contributors Authors made the following valuable contributions to the research and manuscript: MCMcC: protocol development, intervention design plus implementation, data analysis, draft and submit manuscript. TRF: protocol development, data analysis, draft manuscript. 
DMcC: protocol development, recruitment, communications with study participants, manuscript review. DY: protocol development, communications with study participants, data collection, manuscript review. DN: protocol advice, manuscript review. CH: protocol advice, ethics application and manuscript review. * Funding The intervention was developed using University of Oxford’s EBHC programme funds and volunteer efforts. Research was self-funded. Authors have not received funding or compensation beyond their usual University roles. No competing industry inputs or commercial interests are known. * Competing interests The online course intervention design and its implementation were a joint initiative of MCMcC and CH. DMcC and DY work within the medical sciences division to administer teaching in the medical school programme. DN and MCMcC have presented this course as part of a sample curriculum in teaching evidence-based medicine. CH, TRF and DN are paid faculty who teach on the evidence-based health care (EBHC) programme, Oxford University. MCMcC is also a part-time lecturer on the EBHC programme. * Provenance and peer review Not commissioned; externally peer reviewed. This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: [http://creativecommons.org/licenses/by-nc/4.0/](http://creativecommons.org/licenses/by-nc/4.0/). ## References 1. Sackett DL, Rosenberg WM, Gray JA, et al. Evidence based medicine: what it is and what it isn't. 
BMJ 1996;312:71–2. doi:10.1136/bmj.312.7023.71
2. Djulbegovic B, Guyatt GH. Progress in evidence-based medicine: a quarter century on. Lancet 2017;390:415–23. doi:10.1016/S0140-6736(16)31592-6
3. Shaneyfelt T, Baum KD, Bell D, et al. Instruments for evaluating education in evidence-based practice: a systematic review. Available: [https://jamanetwork.com/](https://jamanetwork.com/)
4. Dorsch JL, Aiyer MK, Meyer LE. Impact of an evidence-based medicine curriculum on medical students' attitudes and skills. J Med Libr Assoc 2004;92:397–406.
5. Larsen CM, Terkelsen AS, Carlsen A-MF, et al. Methods for teaching evidence-based practice: a scoping review. BMC Med Educ 2019;19. doi:10.1186/s12909-019-1681-0
6. Albarqouni L, Hoffmann T, Glasziou P. Evidence-based practice educational intervention studies: a systematic review of what is taught and how it is measured. BMC Med Educ 2018;18:177. doi:10.1186/s12909-018-1284-1
7. Hecht L, Buhse S, Meyer G. Effectiveness of training in evidence-based medicine skills for healthcare professionals: a systematic review. BMC Med Educ 2016;16. doi:10.1186/s12909-016-0616-2
8. Patelarou AE, Kyriakoulis KG, Stamou AA, et al. Approaches to teach evidence-based practice among health professionals: an overview of the existing evidence. Adv Med Educ Pract 2017;8:455–64. doi:10.2147/AMEP.S134475
9. Meats E, Heneghan C, Crilly M, et al. Evidence-based medicine teaching in UK medical schools. Med Teach 2009;31:332–7. doi:10.1080/01421590802572791
10. Kyriakoulis K, Patelarou A, Laliotis A, et al. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review. J Educ Eval Health Prof 2016;13:34. doi:10.3352/jeehp.2016.13.34
11. Tang B, Coret A, Qureshi A, et al. Online lectures in undergraduate medical education: scoping review. JMIR Med Educ 2018;4:e11. doi:10.2196/mededu.9091
12. McCall M, Spencer E, Owen H, et al. Characteristics and efficacy of digital health education: an overview of systematic reviews. Health Educ J 2018;77:497–514. doi:10.1177/0017896918762013
13. Ahmady S, Khajeali N, Sharifi F, et al. Factors related to academic failure in preclinical medical education: a systematic review. J Adv Med Educ Prof 2019;7:74–85.
14. Sinclair PM, Kable A, Levett-Jones T, et al. The effectiveness of internet-based e-learning on clinician behaviour and patient outcomes: a systematic review. Int J Nurs Stud 2016;57:70–81. doi:10.1016/j.ijnurstu.2016.01.011
15. Ilic D, bin Nordin R, Glasziou P, et al. Development and validation of the ACE tool: assessing medical trainees' competency in evidence based medicine. BMC Med Educ 2014;14:114. Available: [http://www.biomedcentral.com/1472-6920/14/114](http://www.biomedcentral.com/1472-6920/14/114)
16. West CP, Jaeger TM, McDonald FS. Extended evaluation of a longitudinal medical school evidence-based medicine curriculum. J Gen Intern Med 2011;26:611–5. doi:10.1007/s11606-011-1642-8