M.S. Final Oral Exam: Duy Phuong Nguyen
Speaker: Duy Phuong Nguyen
Improving Diverse Federated Learning through Knowledge Acquisition and Fusion of Multiple Models
This thesis presents a novel approach to federated learning (FL) that addresses the challenges of model heterogeneity, computational diversity, and high communication costs. Unlike traditional FL methods that aggregate model weights across edge devices, our method aggregates the local knowledge extracted from those models. Using knowledge distillation, we distill this local knowledge into robust global knowledge that forms the server model. The process is further refined through deep mutual learning, yielding a compact knowledge network. Our resource-aware FL approach enables edge clients to deploy efficient models and perform multi-model knowledge fusion, optimizing both communication efficiency and model diversity. Empirical evaluations demonstrate that our method significantly outperforms existing FL algorithms, reducing communication costs and improving generalization across heterogeneous data and model landscapes.
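As background, the knowledge-distillation step the abstract refers to is commonly implemented as a temperature-scaled KL-divergence objective between teacher and student outputs. The sketch below shows that standard formulation only; it is an illustrative assumption, not the thesis's actual loss, and the logit arrays and temperature value are hypothetical.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=3.0):
    # KL(teacher || student) on temperature-softened outputs, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures (standard KD).
    p = softmax(teacher_logits, temperature)   # soft targets: the "knowledge"
    q = softmax(student_logits, temperature)   # student's softened predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()

# Hypothetical logits for one sample with three classes.
teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[0.0, 1.0, 0.0]])
print(distillation_loss(student, teacher))  # positive: distributions differ
```

In a distillation-based FL setting, a loss of this shape would be minimized at the server to fuse client outputs into the global model, rather than averaging client weights directly.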
Committee: Ali Jannesari (major professor), Hongyang Gao, and Wensheng Zhang