ATHENA-FL: Avoiding Statistical Heterogeneity with One-versus-All in Federated Learning

Authors

de Souza, L. A. C., Camilo, G. F., Rebello, G. A. F., Sammarco, M., Campista, M. E. M., and Costa, L. H. M. K.

DOI:

https://doi.org/10.5753/jisa.2024.3826

Keywords:

Federated learning, non-IID data, privacy-preserving AI

Abstract

Federated learning (FL) is a distributed approach to training machine learning models without disclosing the private data of participating clients to a central server. Nevertheless, FL training struggles to converge when clients have distinct data distributions, which increases training time and model prediction error. We propose ATHENA-FL, a federated learning system that handles clients with heterogeneous data distributions and generates accurate models in fewer training epochs than state-of-the-art approaches. ATHENA-FL also reduces communication costs, an additional benefit in resource-constrained scenarios. ATHENA-FL mitigates data heterogeneity through a preliminary step before training that clusters clients with similar data distributions. To this end, each client trains a local neural network as a probe, and the probe's weights serve as input to the clustering algorithm. The proposed system then applies the one-versus-all approach, training one binary detector for each class in the cluster. Thus, clients can compose complex models by combining multiple detectors, which are shared with all participants through the system's database. We evaluate the clustering procedure using different layers of the neural network and verify that the last layer is sufficient to cluster the clients efficiently. The experiments show that using the last layer as input for the clustering algorithm transmits 99.68% fewer bytes to generate clusters compared to using all the neural network weights. Finally, our results show that ATHENA-FL correctly identifies samples, achieving up to 10.9% higher accuracy than traditional training. Furthermore, ATHENA-FL achieves lower training communication costs than the MobileNet architecture, reducing the number of transmitted bytes by between 25% and 97% across the evaluated scenarios.
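The two mechanisms described above can be illustrated with a minimal sketch: clustering clients by the similarity of their probe network's last-layer weights, and composing per-class binary detectors into a multi-class prediction by picking the most confident detector. The function names, the cosine-similarity threshold, and the greedy clustering strategy are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np


def cluster_clients(last_layer_weights, threshold=0.9):
    """Greedily cluster clients whose flattened last-layer probe weights
    have cosine similarity above `threshold` (illustrative sketch)."""
    clusters = []  # each cluster is a list of client indices
    reps = []      # representative weight vector of each cluster
    for i, w in enumerate(last_layer_weights):
        for cluster, rep in zip(clusters, reps):
            cos = float(w @ rep) / (np.linalg.norm(w) * np.linalg.norm(rep))
            if cos >= threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])  # no similar cluster found: start a new one
            reps.append(w)
    return clusters


def predict_one_vs_all(detector_scores):
    """Compose per-class binary detectors into a multi-class prediction:
    the predicted class is the one whose detector is most confident."""
    return int(np.argmax(detector_scores))
```

For example, two clients whose probe weights point in nearly the same direction end up in one cluster, while an orthogonal client forms its own, and a sample is assigned to the class whose binary detector yields the highest score.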


References

Beutel, D. J. et al. (2020). Flower: A Friendly Federated Learning Research Framework. arXiv preprint arXiv:2007.14390. DOI: 10.48550/arXiv.2007.14390.

Blondel, V. D. et al. (2008). Fast Unfolding of Communities in Large Networks. Journal of Statistical Mechanics: Theory and Experiment, pages 1-12. DOI: 10.1088/1742-5468/2008/10/P10008.

Chu, D., Jaafar, W., and Yanikomeroglu, H. (2022). On the Design of Communication-Efficient Federated Learning for Health Monitoring. IEEE GLOBECOM, pages 1-6. DOI: 10.1109/GLOBECOM48099.2022.10001077.

de Souza, L. A. C., Camilo, G. F., Campista, M. E. M., and Costa, L. H. M. K. (2023a). Hierarchical Clustering of Nodes for Accuracy Increase in Federated Learning. Technical Report, Electrical Engineering Program, COPPE/UFRJ.

de Souza, L. A. C., Camilo, G. F., Rebello, G. A. F., Sammarco, M., Campista, M. E. M., and Costa, L. H. M. (2023b). ATHENA-FL: Evitando a Heterogeneidade Estatística através do Um-contra-Todos no Aprendizado Federado [ATHENA-FL: Avoiding Statistical Heterogeneity with One-versus-All in Federated Learning]. In Anais do VII Workshop de Computação Urbana, pages 40-53. SBC. DOI: 10.5753/courb.2023.717.

de Souza, L. A. C. et al. (2020). DFedForest: Decentralized Federated Forest. In IEEE International Conference on Blockchain, pages 90-97. DOI: 10.1109/Blockchain50366.2020.00019.

Dennis, D. K., Li, T., and Smith, V. (2021). Heterogeneity for the Win: One-Shot Federated Clustering. arXiv preprint arXiv:2103.00697. DOI: 10.48550/arXiv.2103.00697.

Djenouri, Y., Michalak, T. P., and Lin, J. C.-W. (2023). Federated Deep Learning for Smart City Edge-based Applications. Future Generation Computer Systems, 147:350-359. DOI: 10.1016/j.future.2023.04.034.

Duan, M., Liu, D., Ji, X., Wu, Y., Liang, L., Chen, X., Tan, Y., and Ren, A. (2022). Flexible Clustered Federated Learning for Client-Level Data Distribution Shift. IEEE Transactions on Parallel and Distributed Systems, 33(11):2661-2674. DOI: 10.1109/TPDS.2021.3134263.

Ester, M., Kriegel, H.-P., Sander, J., Xu, X., et al. (1996). A Density-based Algorithm for Discovering Clusters in Large Spatial Databases with Noise. In KDD, pages 226-231.

Fraboni, Y., Vidal, R., Kameni, L., and Lorenzi, M. (2021). Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning. In International Conference on Machine Learning, pages 3407-3416. PMLR.

Fu, L. et al. (2022). Client Selection in Federated Learning: Principles, Challenges, and Opportunities. arXiv preprint arXiv:2211.01549, pages 1-8. DOI: 10.48550/arXiv.2211.01549.

Ghosh, A., Chung, J., Yin, D., and Ramchandran, K. (2020). An Efficient Framework for Clustered Federated Learning. arXiv preprint arXiv:2006.04088. DOI: 10.48550/arXiv.2006.04088.

Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., and Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv preprint arXiv:1704.04861. DOI: 10.48550/arXiv.1704.04861.

Krizhevsky, A., Nair, V., and Hinton, G. (2014). The CIFAR-10 Dataset. Online: http://www.cs.toronto.edu/kriz/cifar.html, 55(5).

Lai, F., Zhu, X., Madhyastha, H. V., and Chowdhury, M. (2021). Oort: Efficient Federated Learning via Guided Participant Selection. In USENIX OSDI, pages 19-35.

LeCun, Y., Cortes, C., and Burges, C. J. (2010). MNIST Handwritten Digit Database. http://yann.lecun.com/exdb/mnist/.

Li, D., Lai, J., Wang, R., Li, X., Vijayakumar, P., Gupta, B. B., and Alhalabi, W. (2023a). Ubiquitous Intelligent Federated Learning Privacy-preserving Scheme under Edge Computing. Future Generation Computer Systems, 144:205-218. DOI: 10.1016/j.future.2023.03.010.

Li, H., Cai, Z., Wang, J., Tang, J., Ding, W., Lin, C.-T., and Shi, Y. (2023b). FedTP: Federated Learning by Transformer Personalization. IEEE Transactions on Neural Networks and Learning Systems. DOI: 10.1109/TNNLS.2023.3269062.

Li, T., Hu, S., Beirami, A., and Smith, V. (2021). Ditto: Fair and Robust Federated Learning through Personalization. In International Conference on Machine Learning, pages 6357-6368. PMLR.

Li, T., Sahu, A. K., Zaheer, M., Sanjabi, M., Talwalkar, A., and Smith, V. (2020). Federated Optimization in Heterogeneous Networks. Proceedings of Machine Learning and Systems, 2:429-450.

Liu, B., Ding, M., Shaham, S., Rahayu, W., Farokhi, F., and Lin, Z. (2021). When Machine Learning Meets Privacy: A Survey and Outlook. ACM Computing Surveys (CSUR), 54(2):1-36. DOI: 10.1145/3436755.

Liu, L., Zhang, J., Song, S., and Letaief, K. B. (2020). Client-Edge-Cloud Hierarchical Federated Learning. In International Conference on Communications, pages 1-6. DOI: 10.1109/ICC40277.2020.9148862.

Luo, B. et al. (2022). Tackling System and Statistical Heterogeneity for Federated Learning with Adaptive Client Sampling. In IEEE INFOCOM, pages 1739-1748. DOI: 10.1109/INFOCOM48880.2022.9796935.

Ma, X., Zhu, J., Lin, Z., Chen, S., and Qin, Y. (2022). A State-of-the-Art Survey on Solving Non-IID Data in Federated Learning. Future Generation Computer Systems, 135:244-258. DOI: 10.1016/j.future.2022.05.003.

McMahan, B. et al. (2017). Communication-efficient Learning of Deep Networks from Decentralized Data. In Artificial Intelligence and Statistics, pages 1273-1282.

Neto, H. N. C., Dusparic, I., Mattos, D. M., and Fernandes, N. C. (2022). FedSA: Accelerating Intrusion Detection in Collaborative Environments with Federated Simulated Annealing. In International Conference on Network Softwarization (NetSoft), pages 420-428. IEEE. DOI: 10.1109/NetSoft54395.2022.9844024.

Nishio, T. and Yonetani, R. (2019). Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge. In International Conference on Communications, pages 1-7. DOI: 10.1109/ICC.2019.8761315.

Ouyang, X. et al. (2021). ClusterFL: a Similarity-Aware Federated Learning System for Human Activity Recognition. In Proceedings of the International Conference on Mobile Systems, Applications, and Services, pages 54-66. DOI: 10.1145/3458864.3467681.

Qin, T., Cheng, G., Wei, Y., and Yao, Z. (2023). Hier-SFL: Client-Edge-Cloud Collaborative Traffic Classification Framework based on Hierarchical Federated Split Learning. Future Generation Computer Systems. DOI: 10.1016/j.future.2023.07.001.

Rai, S., Kumari, A., and Prasad, D. K. (2022). Client Selection in Federated Learning under Imperfections in Environment. AI, 3(1):124-145. DOI: 10.3390/ai3010008.

Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., and Chen, L.-C. (2018). MobileNetV2: Inverted Residuals and Linear Bottlenecks. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 4510-4520. DOI: 10.1109/CVPR.2018.00474.

Sanguineti, M. (2021). Cats VS Dogs Convolutional Classifier. Towards Data Science.

Sattler, F., Müller, K.-R., and Samek, W. (2020). Clustered Federated Learning: Model-Agnostic Distributed Multitask Optimization under Privacy Constraints. IEEE Transactions on Neural Networks and Learning Systems. DOI: 10.1109/TNNLS.2020.3015958.

Singh, S., Rathore, S., Alfarraj, O., Tolba, A., and Yoon, B. (2022). A Framework for Privacy-preservation of IoT Healthcare Data using Federated Learning and Blockchain Technology. Future Generation Computer Systems, 129:380-388. DOI: 10.1016/j.future.2021.11.028.

Tan, A. Z., Yu, H., Cui, L., and Yang, Q. (2022). Towards Personalized Federated Learning. IEEE Transactions on Neural Networks and Learning Systems, pages 1-17. DOI: 10.1109/TNNLS.2022.3160699.

Tang, Z., Hu, Z., Shi, S., Cheung, Y.-m., Jin, Y., Ren, Z., and Chu, X. (2021). Data Resampling for Federated Learning with Non-IID Labels. In FTL-IJCAI'21.

Wang, H. et al. (2020a). Optimizing Federated Learning on Non-IID Data with Reinforcement Learning. In IEEE INFOCOM, pages 1698-1707. DOI: 10.1109/INFOCOM41043.2020.9155494.

Wang, J. et al. (2020b). Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization. NeurIPS, 33:7611-7623.

Xiao, H., Rasul, K., and Vollgraf, R. (2017). Fashion-MNIST: A Novel Image Dataset for Benchmarking Machine Learning Algorithms. arXiv preprint arXiv:1708.07747. DOI: 10.48550/arXiv.1708.07747.

Yang, Q., Liu, Y., Chen, T., and Tong, Y. (2019). Federated Machine Learning: Concept and Applications. Transactions on Intelligent Systems and Technology (TIST), 10(2):1-19. DOI: 10.1145/3298981.

Zeng, D., Hu, X., Liu, S., Yu, Y., Wang, Q., and Xu, Z. (2023). Stochastic Clustered Federated Learning. arXiv preprint arXiv:2303.00897. DOI: 10.48550/arXiv.2303.00897.

Zhao, Y., Li, M., Lai, L., Suda, N., Civin, D., et al. (2018). Federated Learning with Non-IID Data. arXiv preprint arXiv:1806.00582. DOI: 10.48550/arXiv.1806.00582.

Zhong, Z. et al. (2022). FLEE: A Hierarchical Federated Learning Framework for Distributed Deep Neural Network over Cloud, Edge and End Device. ACM TIST, pages 1-24. DOI: 10.1145/3514501.

Zhu, Y., Markos, C., Zhao, R., Zheng, Y., and James, J. (2021). FedOVA: One-vs-All Training Method for Federated Learning with Non-IID Data. In IEEE IJCNN, pages 1-7. DOI: 10.1109/IJCNN52387.2021.9533409.


Published

2024-08-14

How to Cite

de Souza, L. A. C., Camilo, G. F., Rebello, G. A. F., Sammarco, M., Campista, M. E. M., & Costa, L. H. M. K. (2024). ATHENA-FL: Avoiding Statistical Heterogeneity with One-versus-All in Federated Learning. Journal of Internet Services and Applications, 15(1), 273–288. https://doi.org/10.5753/jisa.2024.3826

Section

Research article