{Reference Type}: Journal Article {Title}: Multimodal learning system integrating electronic medical records and hysteroscopic images for reproductive outcome prediction and risk stratification of endometrial injury: a multicenter diagnostic study. {Author}: Li B;Chen H;Lin X;Duan H; {Journal}: Int J Surg {Volume}: 110 {Issue}: 6 {Year}: 2024 Jun 1 {Factor}: 13.4 {DOI}: 10.1097/JS9.0000000000001241 {Abstract}: OBJECTIVE: To develop a multimodal learning application system that integrates electronic medical records (EMR) and hysteroscopic images for reproductive outcome prediction and risk stratification of patients with intrauterine adhesions (IUAs) resulting from endometrial injuries.
METHODS: EMR data and 5014 revisit hysteroscopic images of 753 patients who underwent hysteroscopic adhesiolysis, drawn from the multicenter IUA database we established, were randomly allocated to training, validation, and test datasets. The respective datasets were used for model development, tuning, and testing of the multimodal learning application. MobileNetV3 was employed for image feature extraction, and XGBoost for ensemble learning over EMR and image features. The performance of the application was compared with single-modal approaches (EMR or hysteroscopic images alone), the DeepSurv and ElasticNet models, and clinical scoring systems. The primary outcome was the 1-year conception prediction accuracy, and the secondary outcome was the assisted reproductive technology (ART) benefit ratio after risk stratification.
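The fusion strategy described above (CNN image embeddings concatenated with tabular EMR variables, fed to a gradient-boosted model) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the random arrays stand in for MobileNetV3 embeddings and EMR variables, scikit-learn's GradientBoostingClassifier stands in for XGBoost, and all shapes, names, and labels are hypothetical.

```python
# Sketch of early fusion for multimodal prediction: per-patient image
# features (random stand-ins for MobileNetV3 embeddings) are concatenated
# with tabular EMR features, and a gradient-boosted classifier
# (scikit-learn stand-in for XGBoost) outputs a conception probability.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients = 200
img_feats = rng.normal(size=(n_patients, 64))   # stand-in CNN embeddings
emr_feats = rng.normal(size=(n_patients, 10))   # stand-in EMR variables
X = np.hstack([img_feats, emr_feats])           # early fusion by concatenation
y = rng.integers(0, 2, size=n_patients)         # synthetic 1-year conception label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]           # per-patient conception probability
```

The per-patient probability from such a model is what a conception probability-based risk stratification (as in the Results) would threshold into low- and mid-high-risk groups.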
RESULTS: The multimodal learning system exhibited superior performance in predicting conception within 1 year, achieving areas under the curve of 0.967 (95% CI: 0.950-0.985), 0.936 (95% CI: 0.883-0.989), and 0.965 (95% CI: 0.935-0.994) in the training, validation, and test datasets, respectively, surpassing the single-modal approaches, the other models, and the clinical scoring systems (all P<0.05). The application ran seamlessly on the hysteroscopy platform, with an average analysis time of 3.7±0.8 s per patient. Under the application's conception probability-based risk stratification, mid-high-risk patients demonstrated a significant ART benefit (odds ratio=6, 95% CI: 1.27-27.8, P=0.02), while low-risk patients exhibited good natural conception potential, with no significant increase in conception rates from ART treatment (P=1).
CONCLUSIONS: The multimodal learning system integrating hysteroscopic images and EMR shows promise for accurately predicting natural conception in patients with IUAs and for effective postoperative risk stratification, potentially aiding ART triage after IUA procedures.