Keywords: Decision tree; LIME; SHAP; SVM; XGBoost; eXplainable AI

MeSH: Humans; Support Vector Machine; Skin Diseases / diagnosis; Machine Learning; Diagnosis, Computer-Assisted / methods; Female; Male

Source: DOI: 10.1016/j.compbiomed.2024.108919

Abstract:
Research on disease detection using machine learning techniques has attracted significant attention. Machine learning is important for detecting critical diseases promptly and providing appropriate treatment. Disease detection is a vital and sensitive task, and while machine learning models can provide a robust solution, they often appear complex and unintuitive. It is therefore important to understand their predictions and to trust their results. This paper addresses the task of skin disease detection and introduces a hybrid machine learning model combining SVM and XGBoost. The proposed model outperformed the existing machine learning models (Support Vector Machine (SVM), decision tree, and XGBoost) with an accuracy of 99.26%. This increased accuracy is essential for skin disease detection because the conditions share similar symptoms, which makes them challenging to differentiate. To foster trust and gain insight into the results, we turn to the field of Explainable Artificial Intelligence (XAI). We explore two frameworks that provide local as well as global explanations for these machine learning models: SHapley Additive exPlanations (SHAP) and Local Interpretable Model-agnostic Explanations (LIME).
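Below is a minimal Python sketch of the kind of pipeline the abstract describes. The paper does not state how SVM and XGBoost are combined, so the soft-voting ensemble, the synthetic data, and all hyperparameters here are illustrative assumptions rather than the authors' implementation; the SHAP and LIME calls likewise use generic model-agnostic defaults.

```python
# Illustrative sketch only: synthetic data stands in for skin-disease features,
# and the soft-voting combination of SVM and XGBoost is an assumption, since
# the abstract does not state how the hybrid model is constructed.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from xgboost import XGBClassifier
import shap
from lime.lime_tabular import LimeTabularExplainer

# Synthetic stand-in for tabular skin-disease features (e.g. clinical scores).
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)
feature_names = [f"f{i}" for i in range(X.shape[1])]

# Hybrid model: average the class probabilities of an RBF-kernel SVM and XGBoost.
svm = SVC(kernel="rbf", probability=True, random_state=0)
xgb = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="mlogloss",
                    random_state=0)
hybrid = VotingClassifier(estimators=[("svm", svm), ("xgb", xgb)], voting="soft")
hybrid.fit(X_train, y_train)
print("hybrid accuracy:", accuracy_score(y_test, hybrid.predict(X_test)))

# SHAP: model-agnostic Shapley values over a small background sample; averaging
# their magnitudes across many rows gives a global view of feature importance
# (typically visualised with shap.summary_plot).
background = shap.sample(X_train, 50)
shap_explainer = shap.KernelExplainer(hybrid.predict_proba, background)
shap_values = shap_explainer.shap_values(X_test[:20])
print("SHAP values shape:", np.shape(shap_values))  # per-class, per-feature attributions

# LIME: local explanation of a single prediction made by the hybrid model.
lime_explainer = LimeTabularExplainer(X_train, feature_names=feature_names,
                                      mode="classification")
exp = lime_explainer.explain_instance(X_test[0], hybrid.predict_proba,
                                      num_features=5)
print(exp.as_list())
```

Soft voting is only one common way to hybridize a margin-based classifier with a boosted tree model; stacking or a cascaded two-stage scheme would be equally plausible readings of "hybrid" given the information in the abstract.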