Keywords: ChatGPT, academia, artificial intelligence, medical writing

Source: DOI:10.1177/15533506241259916

Abstract:
Background: When properly utilized, artificial intelligence-generated content (AIGC) may improve virtually every aspect of research, from data gathering to synthesis. Nevertheless, when used inappropriately, AIGC may lead to the dissemination of inaccurate information and introduce potential ethical concerns.
Research Design: Cross-sectional.
Study Sample: 65 top surgical journals.
Data Collection: Each journal's submission guidelines and portal were queried for guidelines regarding AIGC use.
Results: We found that, in July 2023, 60% of the top 65 surgical journals had introduced guidelines for AIGC use, with more surgical journals (68%) introducing guidelines than surgical subspecialty journals (52.5%), including otolaryngology (40%). Furthermore, of the 39 journals with guidelines, only 69.2% gave specific use guidelines. No included journal, at the time of analysis, explicitly disallowed AIGC use.
Conclusions: Altogether, these data suggest that while many journals have reacted quickly to AIGC usage, the quality of such guidelines remains variable. This should be pre-emptively addressed within academia.