Reprogramming of Neural Networks: A New and Improved Machine Learning Technique
Neural networks can be repurposed via reprogramming to perform new tasks different from the tasks they were originally trained for. We introduce a new and improved reprogramming technique that, compared to prior work, achieves better accuracy, scales to larger models, and can be successfully applied to more complex tasks. We show that our method yields significantly more accurate models than models trained from scratch with the same hyperparameters. In addition, our method performs well even on small training datasets. While prior literature focused on potential malicious uses of reprogramming, we argue that reprogramming can instead be viewed as an efficient training method. Our approach allows existing pre-trained models to be reused and easily reprogrammed to perform new tasks, requiring far less effort and hyperparameter tuning than training new models from scratch. We therefore believe that our improved and scalable reprogramming method has the potential to become a new standard approach for creating machine learning models.
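Reprogramming, as studied in the prior literature (often under the name adversarial reprogramming), typically freezes a pre-trained network and learns only a transformation of the inputs, together with a fixed mapping from the model's original output classes to the new task's labels. The following is a minimal NumPy sketch of that general idea, not of the specific method introduced here: the "pretrained model" is a toy frozen random linear classifier, the program is a single additive input perturbation, and the label mapping simply reuses the first two source classes. All names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pretrained" model: fixed random weights standing in for a large
# pre-trained network (in practice, e.g., an ImageNet classifier). It maps a
# 4-dim input to logits over 10 source classes and is never updated.
W = rng.normal(size=(4, 10))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy target task with 2 classes; the output mapping is hardcoded by reusing
# source classes 0 and 1 as the target labels.
X = rng.normal(size=(64, 4))
y = (X[:, 0] > 0).astype(int)
onehot = np.zeros((64, 10))
onehot[np.arange(64), y] = 1.0

# The "program": an additive input perturbation, the only trainable parameter.
theta = np.zeros(4)
lr = 0.05
loss_history = []
for _ in range(500):
    logits = (X + theta) @ W          # forward pass through the frozen model
    p = softmax(logits)
    loss = -np.mean(np.log(p[np.arange(64), y] + 1e-12))
    loss_history.append(loss)
    # cross-entropy gradient w.r.t. theta, chained through the frozen weights
    grad = ((p - onehot) @ W.T).mean(axis=0)
    theta -= lr * grad

print(f"loss: {loss_history[0]:.3f} -> {loss_history[-1]:.3f}")
```

Note that only `theta` receives gradient updates; the pre-trained weights `W` stay fixed throughout, which is what makes reprogramming cheap relative to training a new model from scratch.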
Committee: Dr. Jin Tian (Major Professor), Dr. Kris De Brabanter, and Dr. Zhengdao Wang