Keywords: artificial intelligence; autosegmentation; contouring; crowdsourcing; radiation oncology; segmentation

Source: DOI: 10.1117/1.JMI.10.S1.S11903 (PubMed)

Abstract:
Purpose: Contouring Collaborative for Consensus in Radiation Oncology (C3RO) is a crowdsourced challenge engaging radiation oncologists across various expertise levels in segmentation. An obstacle to artificial intelligence (AI) development is the paucity of multiexpert datasets; consequently, we sought to characterize whether aggregate segmentations generated from multiple nonexperts could meet or exceed recognized expert agreement.
Approach: Participants who contoured ≥ 1 region of interest (ROI) for the breast, sarcoma, head and neck (H&N), gynecologic (GYN), or gastrointestinal (GI) cases were identified as nonexperts or recognized experts. Cohort-specific ROIs were combined into single simultaneous truth and performance level estimation (STAPLE) consensus segmentations. STAPLE_nonexpert ROIs were evaluated against STAPLE_expert contours using the Dice similarity coefficient (DSC). The expert interobserver DSC (IODSC_expert) was calculated as an acceptability threshold between STAPLE_nonexpert and STAPLE_expert. To determine the number of nonexperts required to match the IODSC_expert for each ROI, a single consensus contour was generated using variable numbers of nonexperts and then compared to the IODSC_expert.
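The approach combines two standard components: STAPLE consensus fusion and the Dice similarity coefficient. Below is a minimal Python sketch, assuming binary masks stored as NumPy arrays. `staple_binary` implements a simplified binary STAPLE (the expectation-maximization scheme of Warfield et al., iterating per-rater sensitivity and specificity), and `expert_iodsc` assumes IODSC_expert is the mean pairwise Dice among expert contours; both are illustrative stand-ins, not the exact implementation used in C3RO.

```python
import itertools
import numpy as np

def staple_binary(masks, n_iter=50, tol=1e-6):
    """Fuse rater masks (equally shaped boolean arrays) into one consensus
    mask via EM over per-rater sensitivity (p) and specificity (q)."""
    D = np.stack([np.asarray(m, float).ravel() for m in masks])  # R x N
    prior = D.mean()                        # prior P(voxel is in the ROI)
    p = np.full(D.shape[0], 0.9)            # sensitivity, per rater
    q = np.full(D.shape[0], 0.9)            # specificity, per rater
    W = np.full(D.shape[1], prior)          # posterior P(true label = 1)
    for _ in range(n_iter):
        # E-step: posterior that each voxel truly belongs to the ROI
        a = prior * np.prod(np.where(D == 1, p[:, None], 1 - p[:, None]), axis=0)
        b = (1 - prior) * np.prod(np.where(D == 0, q[:, None], 1 - q[:, None]), axis=0)
        W_new = a / (a + b + 1e-12)
        # M-step: re-estimate each rater's sensitivity and specificity
        p = (D @ W_new) / (W_new.sum() + 1e-12)
        q = ((1 - D) @ (1 - W_new)) / ((1 - W_new).sum() + 1e-12)
        converged = np.abs(W_new - W).max() < tol
        W = W_new
        if converged:
            break
    return (W >= 0.5).reshape(np.asarray(masks[0]).shape)

def dice(a, b):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def expert_iodsc(expert_masks):
    """Mean pairwise Dice among expert contours (assumed IODSC definition)."""
    pairs = itertools.combinations(range(len(expert_masks)), 2)
    return float(np.mean([dice(expert_masks[i], expert_masks[j])
                          for i, j in pairs]))
```

With these helpers, the acceptability test described above reduces to checking `dice(staple_binary(nonexpert_masks), staple_binary(expert_masks)) >= expert_iodsc(expert_masks)`.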
Results: For all cases, the DSC values for STAPLE_nonexpert versus STAPLE_expert were higher than the comparator IODSC_expert for most ROIs. The minimum number of nonexpert segmentations needed for a consensus ROI to achieve the IODSC_expert acceptability criterion ranged between 2 and 4 for breast, 3 and 5 for sarcoma, 3 and 5 for H&N, 3 and 5 for GYN, and was 3 for GI.
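The minimum-nonexpert analysis can be sketched as a subset sweep, reusing `staple_binary`, `dice`, and `expert_iodsc` from the block above. The random-subsampling scheme (`n_samples` subsets per size) is a hypothetical choice, since the abstract does not specify how subsets of nonexperts were drawn:

```python
import numpy as np

def min_nonexperts_needed(nonexpert_masks, expert_masks, max_n=10,
                          n_samples=20, seed=0):
    """Smallest number of nonexpert contours whose STAPLE consensus
    reaches the expert interobserver DSC threshold (IODSC_expert)."""
    rng = np.random.default_rng(seed)
    threshold = expert_iodsc(expert_masks)           # acceptability bar
    expert_consensus = staple_binary(expert_masks)   # STAPLE_expert
    R = len(nonexpert_masks)
    for n in range(2, min(max_n, R) + 1):
        scores = []
        for _ in range(n_samples):
            idx = rng.choice(R, size=n, replace=False)
            consensus = staple_binary([nonexpert_masks[i] for i in idx])
            scores.append(dice(consensus, expert_consensus))
        if np.mean(scores) >= threshold:             # meets IODSC_expert
            return n
    return None  # threshold not reached with the available raters
```

For the C3RO cohorts, a sweep of this kind yielded minima between 2 and 5 nonexperts depending on the disease site.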
Conclusions: Multiple nonexpert-generated consensus ROIs met or exceeded expert-derived acceptability thresholds. Five nonexperts could potentially generate consensus segmentations for most ROIs with performance approximating that of experts, suggesting that nonexpert segmentations are a feasible, cost-effective input for AI development.