Keywords: Clinical Decision-Making; Evidence-Based Practice; Health; Methods

Source: DOI:10.1136/bmjebm-2023-112647

Abstract:
OBJECTIVE: The objectives of this study are to assess reporting of evidence-based healthcare (EBHC) e-learning interventions using the Guideline for Reporting Evidence-based practice Educational interventions and Teaching (GREET) checklist and explore factors associated with compliant reporting.
DESIGN: Methodological cross-sectional study.
METHODS: Based on the criteria used in an earlier systematic review, we included studies comparing EBHC e-learning with any other form of EBHC training or no EBHC training. We searched Medline, Embase, ERIC, CINAHL, CENTRAL, SCOPUS, Web of Knowledge, PsycInfo, ProQuest and Best Evidence Medical Education up to 4 January 2023. Screening of titles, abstracts and full-text articles, as well as data extraction, was done independently by two authors. For each study, we assessed adherence to each of the 17 GREET items and extracted information on possible predictors. Adequacy of reporting for each item of the GREET checklist was judged as yes (complete information provided), no (no information provided), unclear (insufficient information provided) or not applicable, when the item was clearly of no relevance to the intervention described (such as item 8, details about the instructors, in studies that used an electronic, self-paced intervention without any tutoring). Studies' adherence to the GREET checklist was presented as percentages and absolute numbers. We performed univariate analysis to assess the association of potential predictors with adherence to the GREET checklist. We summarised results descriptively.
RESULTS: We included 40 studies, the majority of which assessed e-learning or blended learning and mostly involved medical and other healthcare students. None of the studies fully reported all the GREET items. Overall, the median number of GREET items met (rated yes) per study was 8 and the third quartile (Q3) was 9 (min 4, max 14). When we used Q3 of the number of items met as the cut-off point, adherence to the GREET reporting checklist was poor, with 7 out of 40 studies (17.5%) reporting checklist items at an acceptable level (adhering to at least 10 of the 17 items). None of the studies reported all 17 GREET items. For three items, 80% of included studies reported complete information (rated yes): item 1 (brief description of intervention), item 4 (evidence-based practice content) and item 6 (educational strategies). Items for which 50% of included studies reported complete information were item 9 (modes of delivery), item 11 (schedule) and item 12 (time spent on learning). Items for which 70% or more of included studies provided no information (rated no) were item 7 (incentives) and item 13 (adaptations), each unreported in 70% of studies; item 14 (modifications of the educational intervention; unreported in 95% of studies); item 16 (any processes to determine whether the materials and educational strategies were delivered as originally planned; 93%); and item 17 (intervention delivery according to schedule; 100%). Studies published after September 2016 showed slight improvements in nine reporting items.
In the logistic regression models, using the Q3 cut-off point (10 items or above), the odds of acceptable adherence to the GREET guideline were 7.5 times higher when adherence to another reporting guideline (Consolidated Standards of Reporting Trials, Strengthening the Reporting of Observational Studies in Epidemiology, etc) was reported for a given study type (p=0.039); a higher number of study authors also increased the odds of adherence to the GREET guidance by 18% per additional author (p=0.037).
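On the odds-ratio scale these two effects multiply, since a logistic model adds coefficients on the log-odds scale. A minimal sketch of that arithmetic, back-deriving illustrative coefficients from the odds ratios reported above (these are not the study's fitted coefficients):

```python
import math

# Illustrative coefficients recovered from the reported odds ratios.
beta_guideline = math.log(7.5)    # OR 7.5: reported use of another guideline
beta_per_author = math.log(1.18)  # +18% odds per additional study author

# Combined odds ratio for a study that reports another guideline and
# has three more authors than a reference study: effects multiply.
combined_or = math.exp(beta_guideline + 3 * beta_per_author)
print(round(combined_or, 2))
```

The combined value equals 7.5 × 1.18³, illustrating why log-odds coefficients are additive while odds ratios are multiplicative.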
CONCLUSIONS: Studies assessing EBHC e-learning educational interventions still adhere poorly to the GREET checklist. Using other reporting guidelines increased the odds of better GREET reporting. Journals should call for the appropriate use of reporting guidelines in future studies on teaching EBHC to increase transparency of reporting, decrease unnecessary research duplication and facilitate uptake of research evidence.
STUDY REGISTRATION: Open Science Framework (https://doi.org/10.17605/OSF.IO/V86FR).