Ph.D. Preliminary Oral Exam: Phuong Nguyen
Adaptive Dynamic Pruning for Non-IID Federated Learning
Federated Learning (FL) has emerged as a new paradigm for training machine learning models in a distributed manner without sacrificing data security and privacy. Learning models on edge devices such as mobile phones is one of the most common use cases for FL. However, non-independent and identically distributed (non-IID) data on edge devices easily lead to training failures. In particular, over-parameterized machine learning models can easily over-fit such data, resulting in inefficient federated learning and poor model performance. To overcome the over-fitting issue, we propose an adaptive dynamic pruning approach for FL, which can dynamically slim the model by dropping out unimportant parameters, thereby preventing over-fitting. Since a machine learning model's parameters react differently to different training samples, adaptive dynamic pruning evaluates the saliency of the model's parameters with respect to the input training sample and retains only the gradients of the salient parameters during back-propagation.
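To illustrate the general idea of saliency-based gradient retention described in the abstract, the following is a minimal sketch in PyTorch; it is not the presenter's implementation. The Taylor-style saliency score |w * dL/dw|, the `keep_ratio` parameter, and the helper `mask_gradients_by_saliency` are all assumptions made for illustration.

```python
# Minimal sketch (not the actual method): input-dependent gradient masking.
# Assumptions: saliency approximated as |w * dL/dw| for the current batch,
# and a hypothetical keep_ratio controlling how many gradients are retained.
import torch
import torch.nn as nn
import torch.nn.functional as F

def mask_gradients_by_saliency(model, keep_ratio=0.5):
    """Zero out gradients of the least salient parameters.

    Saliency is scored per parameter as |w * dL/dw| on the current
    mini-batch; only the top `keep_ratio` fraction of each tensor's
    gradients is kept, so unimportant parameters are not updated.
    """
    for param in model.parameters():
        if param.grad is None:
            continue
        saliency = (param.detach() * param.grad.detach()).abs()
        k = max(1, int(keep_ratio * saliency.numel()))
        # k-th largest value is the (n - k + 1)-th smallest
        threshold = saliency.flatten().kthvalue(saliency.numel() - k + 1).values
        mask = (saliency >= threshold).float()
        param.grad.mul_(mask)  # drop gradients of unimportant parameters

# Example: one local (client-side) training step
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))
optimizer.zero_grad()
loss = F.cross_entropy(model(x), y)
loss.backward()
mask_gradients_by_saliency(model, keep_ratio=0.5)  # prune before the update
optimizer.step()
```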
Committee: Ali Jannesari (major professor), Hongyang Gao, Samik Basu, Wensheng Zhang and Xiaoqiu Huang.
Join on Zoom: https://iastate.zoom.us/j/98136119417