
Design+ | ML for predicting Alzheimer's progression



figures. Alzheimers Dement. 2023;19(4):1598-1695.
   doi: 10.1002/alz.13016

2. Yang Q, Li X, Ding X, Xu F, Ling Z. Deep learning-based speech analysis for Alzheimer's disease detection: A literature review. Alzheimers Res Ther. 2022;14(1):186.
   doi: 10.1186/s13195-022-01131-3

3. Shahbaz M, Ali S, Guergachi A, Niazi A, Umer A. Classification of Alzheimer's Disease Using Machine Learning Techniques. In: Proceedings of the 12th International Joint Conference on Biomedical Engineering Systems and Technologies. Prague, Czech Republic; 2019. p. 296-303.
   doi: 10.5220/0007949902960303

4. AIBL Study ADNI Non-imaging Data. aibl.csiro.au. Available from: https://aibl.csiro.au/adni/nonimaging.php [Last accessed on 2024 Apr 30].

5. ADNI. About. Available from: https://adni.loni.usc.edu/about [Last accessed on 2024 Apr 30].

6. Rahman M, Prasad G. Comprehensive study on machine learning methods to increase the prediction accuracy of classifiers and reduce the number of medical tests required to diagnose Alzheimer's disease. arXiv (Machine Learning). 2022;1-10.
   doi: 10.48550/arXiv.2212.00414

7. Wirth R, Hipp J. CRISP-DM: Towards a Standard Process Model for Data Mining. Available from: https://cs.unibo.it/~danilo.montesi/CBD/Beatriz/10.1.1.198.5133.pdf [Last accessed on 2024 Apr 30].

8. Harrikrishna NB. Confusion Matrix, Accuracy, Precision, Recall, F1 Score. Medium; 2020. Available from: https://medium.com/analytics-vidhya/confusion-matrix-accuracy-precision-recall-f1-score-ade299cf63cd [Last accessed on 2024 Apr 30].

9. Scikit-Learn. Scikit-learn: Machine Learning in Python. Scikit-Learn; 2019. Available from: https://scikit-learn.org/stable [Last accessed on 2024 Apr 30].

10. Nabel R. PyDrive: Google Drive API Made Easy. PyPI. Available from: https://pypi.org/project/pydrive [Last accessed on 2024 Apr 30].

11. Bhandari A. Multicollinearity: Causes, Effects and Detection Using VIF. Analytics Vidhya; 2023. Available from: https://www.analyticsvidhya.com/blog/2020/03/what-is-multicollinearity/#:~:text=multicollinearity%20is%20a%20statistical%20phenomenon [Last accessed on 2024 Apr 30].

12. Turney S. Chi-Square (χ²) Tests: Types, Formula and Examples. Scribbr; 2022. Available from: https://www.scribbr.com/statistics/chi-square-tests [Last accessed on 2024 Apr 30].

13. Pandas.factorize - Pandas 1.5.3 Documentation. Available from: https://pandas.pydata.org/docs/reference/api/pandas.factorize.html [Last accessed on 2024 Apr 30].

14. Goyal C. Data Leakage and Its Effect on the Performance of an ML Model. Analytics Vidhya; 2021. Available from: https://www.analyticsvidhya.com/blog/2021/07/data-leakage-and-its-effect-on-the-performance-of-an-ml-model [Last accessed on 2024 Apr 30].

15. Stekhoven D, Bühlmann P. MissForest - non-parametric missing value imputation for mixed-type data. Bioinformatics. 2012;28(1):112-118.
   doi: 10.1093/bioinformatics/btr597

16. Saeys Y, Inza I, Larrañaga P. A review of feature selection techniques in bioinformatics. Bioinformatics. 2007;23(19):2507-2517.
   doi: 10.1093/bioinformatics/btm344

17. Malato G. Feature Selection with Random Forest. Your Data Teacher; 2021. Available from: https://www.yourdatateacher.com/2021/10/11/feature-selection-with-random-forest [Last accessed on 2024 Apr 30].

18. Tanuja D, Goutam S. Classification of imbalanced big data using SMOTE with rough random forest. Int J Eng Adv Technol. 2019;9:5174.
   doi: 10.35940/ijeat.B4096.129219

19. Brownlee J. Parametric and Nonparametric Machine Learning Algorithms. Machine Learning Mastery; 2016. Available from: https://machinelearningmastery.com/parametric-and-nonparametric-machine-learning-algorithms [Last accessed on 2024 Apr 30].

20. Breiman L. Random forests. Mach Learn. 2001;45(1):5-32.
   doi: 10.1023/a:1010933404324

21. N. Room. How to Use XGBoost for Time-Series Forecasting? Datadance; 2024. Available from: https://datadance.ai/machine-learning/how-to-use-xgboost-for-time-series-forecasting/#step-3-handling-missing-values-and-outliers [Last accessed on 2024 Apr 30].

22. Chen T, Guestrin C. XGBoost: A Scalable Tree Boosting System. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '16. 2016. p. 785-794.
   doi: 10.1145/2939672.2939785

23. Scikit-Learn. Sklearn.Model_Selection.RandomizedSearchCV - Scikit-Learn 0.21.3 Documentation. Scikit-Learn; 2019. Available from: https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.RandomizedSearchCV.html [Last accessed on 2024 Apr 30].

24. Scikit. 3.3. Metrics and Scoring: Quantifying the Quality of Predictions - Scikit-Learn 0.23.2 Documentation. Scikit-Learn. Available from: https://scikit-learn.org/stable/modules/model_evaluation.html#classification-report [Last accessed on 2024 Apr 30].

25. Lanier ST. Choosing Performance Metrics. Medium; 2020. Available from: https://towardsdatascience.com/choosing-


Volume 2 Issue 3 (2025) | 11 | doi: 10.36922/DP025270031