MS Final Oral Exam: Fatema Siddika

Dec 1, 2025 - 11:00 AM

Dual Knowledge Distillation with Adaptive Class-wise Prototype Margins for Heterogeneous Federated Learning

Heterogeneous Federated Learning (HFL) has gained significant attention for its capacity to handle both model and data heterogeneity across clients. Prototype-based HFL methods have emerged as a promising way to address heterogeneous data distributions and privacy challenges, paving the way for new advances in HFL research. These methods share only class-representative prototypes among heterogeneous clients. However, the prototypes are typically aggregated on the server by weighted averaging, which yields sub-optimal global knowledge: the aggregated prototypes shrink toward one another, degrading model performance when models are heterogeneous and data distributions are extremely non-IID. We propose ProtoDistill, a method for the heterogeneous federated learning setting that uses an enhanced dual knowledge distillation mechanism over clients' logits and prototype feature representations to improve system performance. To resolve the prototype margin-shrinking problem, we learn trainable server prototypes with a contrastive objective that enforces a class-wise adaptive prototype margin. Furthermore, we assess the importance of each public sample by the closeness of its prototype to its class-representative prototype, which further enhances learning performance. Across various settings, ProtoDistill improves test accuracy by 1.13% on average and by up to 34.13%, significantly outperforming existing state-of-the-art HFL methods.
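To make the margin-shrinking issue concrete, below is a minimal Python sketch (illustrative only, not the authors' implementation; the function names, data shapes, and synthetic data are all assumptions) of the standard weighted-average prototype aggregation that the abstract critiques. Under extreme non-IID data, same-class prototypes computed by different clients disagree, and server-side averaging pulls the per-class global prototypes toward one another, reducing inter-class margins.

```python
import numpy as np

# Hypothetical sketch of standard prototype aggregation in prototype-based
# HFL: the server averages each class's client prototypes, weighted by the
# clients' local sample counts for that class.

def aggregate_prototypes(client_protos, client_counts):
    """Return one weighted-average server prototype per class.

    client_protos: dict client_id -> {class_id: feature vector (np.ndarray)}
    client_counts: dict client_id -> {class_id: local sample count}
    """
    agg = {}  # class_id -> [weighted sum, total weight]
    for cid, protos in client_protos.items():
        for cls, p in protos.items():
            w = client_counts[cid][cls]
            if cls not in agg:
                agg[cls] = [np.zeros_like(p), 0.0]
            agg[cls][0] += w * p
            agg[cls][1] += w
    return {cls: s / n for cls, (s, n) in agg.items()}

# Demo with synthetic data: when same-class prototypes point in very
# different directions across clients (an extreme non-IID caricature),
# averaging drags every class prototype toward the origin, so the
# aggregated class prototypes end up close to each other.
rng = np.random.default_rng(0)
protos = {c: {k: rng.normal(size=8) for k in range(3)} for c in range(4)}
counts = {c: {k: int(rng.integers(1, 50)) for k in range(3)} for c in range(4)}
global_protos = aggregate_prototypes(protos, counts)

# Pairwise inter-class distances of the aggregated prototypes; small
# values illustrate the shrinking-margin effect described in the abstract.
for a in range(3):
    for b in range(a + 1, 3):
        d = np.linalg.norm(global_protos[a] - global_protos[b])
        print(f"margin({a},{b}) = {d:.3f}")
```

ProtoDistill's remedy, as described above, is to make the server prototypes trainable and separate them with a contrastive, class-wise adaptive margin rather than fixing them by averaging.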

Committee: Ali Jannesari (major professor), Wensheng Zhang and Meisam Mohammady