A cross-sectional study of explainable machine learning in Alzheimer’s disease: diagnostic classification using MR radiomic features

Leandrou, Stephanos and Lamnisos, Demetris and Bougias, Haralabos and Stogiannos, Nikolaos and Georgiadou, Eleni and Achilleos, K. G. and Pattichis, Constantinos S. (2023) A cross-sectional study of explainable machine learning in Alzheimer’s disease: diagnostic classification using MR radiomic features. Frontiers in Aging Neuroscience, 15. ISSN 1663-4365

pubmed-zip/versions/1/package-entries/fnagi-15-1149871/fnagi-15-1149871.pdf - Published Version (PDF, 1MB)

Abstract

Introduction: Alzheimer’s disease (AD) remains a complex neurodegenerative disease, and its diagnosis still relies mainly on cognitive tests, which have many limitations. Qualitative imaging, on the other hand, cannot provide an early diagnosis, because a radiologist perceives brain atrophy only at a late disease stage. The main objective of this study was therefore to investigate the value of quantitative imaging in the assessment of AD using machine learning (ML) methods. ML methods are now used to handle high-dimensional data, integrate data from different sources, model etiological and clinical heterogeneity, and discover new biomarkers in the assessment of AD.

Methods: In this study, radiomic features were extracted from the entorhinal cortex and hippocampus of 194 normal control (NC), 284 mild cognitive impairment (MCI), and 130 AD subjects. Texture analysis evaluates statistical properties of image intensities, which may reflect changes in MRI pixel intensity caused by the pathophysiology of a disease; this quantitative method can therefore detect smaller-scale neurodegenerative changes. The radiomic signatures extracted by texture analysis, together with baseline neuropsychological scales, were then used to train an integrated XGBoost model.
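The texture analysis described above evaluates statistical properties of the intensity distribution inside a region of interest. As a minimal illustration only (the study used a dedicated radiomics pipeline, not this code), first-order texture statistics such as mean, variance, and intensity entropy can be computed from a flat list of ROI pixel values:

```python
import math

def first_order_texture(intensities):
    """Compute simple first-order texture statistics (mean, variance,
    Shannon entropy over discrete intensity levels) from a flat list
    of ROI pixel intensities. Hypothetical sketch for illustration."""
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((x - mean) ** 2 for x in intensities) / n
    # Build an intensity histogram for the entropy term
    counts = {}
    for x in intensities:
        counts[x] = counts.get(x, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return {"mean": mean, "variance": var, "entropy": entropy}

# Example: a toy ROI with four equally frequent intensity levels
stats = first_order_texture([1, 1, 2, 2, 3, 3, 4, 4])
```

Real radiomic signatures also include second-order (e.g., gray-level co-occurrence) and higher-order features, which capture spatial relationships between intensities that these first-order summaries ignore.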

Results: The model was explained using the Shapley values produced by the SHAP (SHapley Additive exPlanations) method. XGBoost achieved F1-scores of 0.949, 0.818, and 0.810 for NC vs. AD, NC vs. MCI, and MCI vs. AD, respectively.
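The F1-score reported above is the harmonic mean of precision and recall for a binary classification task (here, each pairwise group comparison). A minimal sketch of the computation, using hypothetical labels rather than the study's data:

```python
def f1_score(y_true, y_pred, positive=1):
    """F1 = 2 * precision * recall / (precision + recall),
    computed from true positives, false positives, and false negatives."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Toy example: 3 positives, 2 negatives; 2 true positives, 1 miss, 1 false alarm
score = f1_score([1, 1, 1, 0, 0], [1, 1, 0, 1, 0])  # precision = recall = 2/3
```

Unlike plain accuracy, F1 is informative under the class imbalance present here (194 NC vs. 284 MCI vs. 130 AD), since it penalizes both missed cases and false alarms for the positive class.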

Discussion: These directions have the potential to contribute to earlier diagnosis and better management of disease progression, and thereby to the development of novel treatment strategies. This study clearly demonstrated the importance of an explainable ML approach in the assessment of AD.

Item Type: Article
Subjects: Digital Academic Press > Medical Science
Depositing User: Unnamed user with email support@digiacademicpress.org
Date Deposited: 11 May 2024 10:00
Last Modified: 11 May 2024 10:00
URI: http://science.researchersasian.com/id/eprint/1734
