PhD Final Oral Exam: Duy Phuong Nguyen
Communication-Efficient Personalization in Federated Learning for Edge Devices
This dissertation advances practical, privacy‑preserving federated learning under real‑world constraints of heterogeneity, limited resources, and diverse data modalities. It develops algorithms that make collaborative training more efficient, robust, and personalized without centralizing data.
First, we introduce a dataset‑aware dynamic pruning strategy coupled with gradient control to curb overfitting on heterogeneous clients, stabilize convergence, and lower both computation and communication during local updates.
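The idea above can be sketched in a few lines. This is a minimal illustration, not the dissertation's algorithm: the size-dependent keep ratio, the `base_keep` parameter, and the clip norm are assumptions standing in for the actual dataset-aware schedule and gradient-control rule.

```python
import numpy as np

def dynamic_prune_mask(weights, local_dataset_size, max_dataset_size, base_keep=0.5):
    """Keep a fraction of weights that grows with the client's dataset size.

    Hypothetical heuristic: clients with less data get sparser local models,
    which curbs overfitting and shrinks the update they transmit.
    """
    keep_ratio = base_keep + (1.0 - base_keep) * (local_dataset_size / max_dataset_size)
    k = max(1, int(keep_ratio * weights.size))
    threshold = np.sort(np.abs(weights), axis=None)[-k]
    return (np.abs(weights) >= threshold).astype(weights.dtype)

def controlled_local_step(weights, grad, mask, lr=0.1, clip_norm=1.0):
    """One pruned local SGD step with gradient-norm clipping as a simple
    stand-in for 'gradient control'."""
    norm = np.linalg.norm(grad)
    if norm > clip_norm:
        grad = grad * (clip_norm / norm)
    return (weights - lr * grad) * mask  # pruned entries stay zero

rng = np.random.default_rng(0)
w = rng.normal(size=100)
g = rng.normal(size=100)
mask = dynamic_prune_mask(w, local_dataset_size=200, max_dataset_size=1000)
w_new = controlled_local_step(w, g, mask)
```

Because the mask zeroes the same entries on every step, only the surviving coordinates need to be communicated, which is where the bandwidth saving comes from.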
Next, we propose a multimodal federated framework with dual adapters: one larger adapter that is private to each client for personalization and a compact, shared adapter for knowledge transfer, augmented with selective pruning to balance local adaptation and global generalization for vision and language tasks.
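A toy version of the dual-adapter layout can make the privacy/communication split concrete. The low-rank (LoRA-style) parameterization, the dimensions, and the rank split below are illustrative assumptions; only the structural point matters: the private adapter is larger and never leaves the device, while the compact shared adapter is the sole communicated payload.

```python
import numpy as np

class DualAdapterLayer:
    """Frozen base layer plus two low-rank adapters (illustrative sketch):
    a larger private adapter for personalization (never communicated) and a
    compact shared adapter that is the only part sent to the server."""

    def __init__(self, d_in, d_out, rank_private=8, rank_shared=2, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.02, size=(d_in, d_out))            # frozen base
        self.A_priv = rng.normal(scale=0.02, size=(d_in, rank_private))
        self.B_priv = np.zeros((rank_private, d_out))
        self.A_shared = rng.normal(scale=0.02, size=(d_in, rank_shared))
        self.B_shared = np.zeros((rank_shared, d_out))

    def forward(self, x):
        # Base output plus both adapter corrections.
        return (x @ self.W
                + x @ self.A_priv @ self.B_priv
                + x @ self.A_shared @ self.B_shared)

    def shared_payload(self):
        # Only the compact shared adapter leaves the device.
        return {"A": self.A_shared, "B": self.B_shared}

layer = DualAdapterLayer(d_in=16, d_out=4)
y = layer.forward(np.ones((1, 16)))
payload = layer.shared_payload()
shared_params = payload["A"].size + payload["B"].size
private_params = layer.A_priv.size + layer.B_priv.size
```

The asymmetry (`private_params` several times `shared_params`) is the point: personalization capacity stays local, while the server aggregates only a small tensor per client.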
Then, we present a lightweight, convolution‑based approach to time‑series forecasting that pairs learnable trend/seasonality decomposition with an efficient federated protocol, enabling accurate prediction across distributed, streaming signals on constrained devices.
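The decomposition step can be sketched with a single convolution. In the dissertation's framework the smoothing kernel is learnable; the fixed moving-average kernel below is an assumed stand-in, and the period-12 test signal is purely illustrative.

```python
import numpy as np

def decompose(series, kernel):
    """Split a 1-D series into trend and seasonality with a smoothing kernel.

    Edge-padding keeps the trend the same length as the input, so the
    residual (seasonality) is defined at every time step.
    """
    k = len(kernel)
    padded = np.pad(series, (k // 2, k - 1 - k // 2), mode="edge")
    trend = np.convolve(padded, kernel, mode="valid")
    seasonality = series - trend
    return trend, seasonality

t = np.arange(48, dtype=float)
series = 0.5 * t + np.sin(2 * np.pi * t / 12)  # linear trend + period-12 seasonality
kernel = np.ones(12) / 12.0                    # assumed fixed moving-average kernel
trend, season = decompose(series, kernel)
```

Forecasting the two components separately (a slow-moving trend and a bounded seasonal residual) is cheaper than modeling the raw signal, which is what makes the approach viable on constrained devices.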
Finally, we develop adaptive federated distillation with dual adapters and instance‑wise fusion, aligning shared knowledge at the server while preserving client‑specific representations to improve personalization under non‑IID data.
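Instance-wise fusion can be illustrated with a simple per-example weighting of the two prediction heads. The entropy-based confidence rule below is an assumption for the sketch; the dissertation learns the fusion weights rather than computing them from a fixed heuristic.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def instance_wise_fusion(local_logits, global_logits):
    """Fuse local (personalized) and global (distilled) predictions per instance.

    Illustrative rule: trust the local head more on inputs where it is
    confident (low predictive entropy), and fall back toward the global
    head elsewhere.
    """
    p_local = softmax(local_logits)
    p_global = softmax(global_logits)
    entropy = -(p_local * np.log(p_local + 1e-12)).sum(axis=-1, keepdims=True)
    alpha = 1.0 - entropy / np.log(p_local.shape[-1])  # per-instance weight in [0, 1]
    return alpha * p_local + (1.0 - alpha) * p_global

rng = np.random.default_rng(1)
local_logits = rng.normal(size=(5, 3)) * 4.0    # sharp, confident local head
global_logits = rng.normal(size=(5, 3)) * 0.5   # diffuse global head
fused = instance_wise_fusion(local_logits, global_logits)
```

Because each row mixes two probability distributions with weights summing to one, the fused output is itself a valid distribution, so no renormalization step is needed.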
Together, these contributions chart a cohesive path toward scalable, resource-aware, and personalization-friendly federated learning across data types and tasks, closing the gap between theoretical promise and deployment reality while maintaining user privacy.
Committee: Ali Jannesari (major professor), Samik Basu, Hongyang Gao, Wensheng Zhang, and Xiaoqiu Huang