Abstract
Federated Learning (FL) enables machine learning models to be trained on decentralized data while preserving privacy, making it particularly valuable for sensitive domains such as healthcare. This study applies FL to the classification of Alzheimer's disease from MRI images, addressing two critical challenges: data heterogeneity and class imbalance. It evaluates the FedAdagrad optimization algorithm against the standard FedAvg baseline under varying data-distribution scenarios. A convolutional neural network (CNN) was trained on a dataset of 6,400 MRI images spanning four severity classes, partitioned non-IID across clients using Dirichlet distributions (α = 0.1, 0.5, 0.9) to simulate real-world heterogeneity. Experiments were conducted in the Flower framework with four clients over ten communication rounds. FedAdagrad achieved a higher F1-score (50.33%) than FedAvg (48.14%), though both fell short of the centralized CNN baseline (55%). High data heterogeneity (α = 0.1) caused a 13.35% decline in accuracy, underscoring FL's sensitivity to uneven data distributions, while class imbalance emerged as the primary bottleneck affecting all models. These findings add to the growing body of research on adaptive optimization in federated settings and offer insights for improving decentralized healthcare AI.
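The two methodological ingredients named above can be sketched briefly. The first function partitions a labeled dataset across clients by drawing each class's per-client shares from a Dirichlet(α) prior, so smaller α yields more skewed (non-IID) client label distributions; the second implements the FedAdagrad server-side update, in which the server accumulates the squared aggregated client delta and scales the learning rate adaptively. This is a minimal illustrative sketch, not the study's actual code: the function names, default hyperparameters, and the adaptivity constant τ are assumptions.

```python
import numpy as np

def dirichlet_partition(labels, num_clients=4, alpha=0.5, seed=0):
    """Split sample indices across clients with a class-wise Dirichlet prior.

    Smaller alpha -> more heterogeneous client label distributions
    (e.g. alpha = 0.1 is highly non-IID, alpha = 0.9 is closer to uniform).
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        idx = rng.permutation(np.flatnonzero(labels == cls))
        # Draw this class's share for each client from Dirichlet(alpha).
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, part in enumerate(np.split(idx, cuts)):
            client_indices[client].extend(part.tolist())
    return [np.array(ci) for ci in client_indices]

def fedadagrad_update(x, delta, v, lr=0.1, tau=1e-3):
    """One FedAdagrad server step (Reddi et al., "Adaptive Federated
    Optimization"): accumulate squared aggregate delta, then take an
    adaptively scaled step. Returns the new model and accumulator."""
    v = v + delta ** 2
    x = x + lr * delta / (np.sqrt(v) + tau)
    return x, v
```

Setting `v` to the aggregate delta itself (rather than accumulating its square) and dropping the square root recovers plain FedAvg with server learning rate `lr`, which is what makes the two strategies directly comparable in this setup.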

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
