Ph.D. Final Oral Exam: Eliska Kloberdanz

Friday, March 31, 2023 - 10:30am

Understanding and Improving Numerical Stability of Deep Learning Algorithms

Deep learning (DL) has become an integral part of solutions to many important problems, which is why ensuring the quality of DL systems is essential. One challenge in achieving reliable and robust DL software is ensuring that algorithm implementations are numerically stable. Numerical stability is a property of numerical algorithms that governs how changes or errors introduced through inputs or during execution affect the accuracy of algorithm outputs. In numerically unstable algorithms, those errors are magnified and degrade the fidelity of the outputs, yielding incorrect or inaccurate results. In this thesis we analyze the numerical stability of DL algorithms in order to better understand and improve it. First, we identify and analyze unstable numerical methods and their solutions in DL. Second, we learn assertions on inputs to DL functions that ensure their numerical stability. Third, we focus on neural network quantization, which we found to cause numerical stability issues; specifically, we propose a new quantization algorithm that optimizes the trade-off between low-bit representation and loss of precision. Next, we analyze the numerical stability of residual networks by leveraging their dynamical systems interpretation; in particular, we propose that residual networks can behave as stiff, numerically unstable ordinary differential equations. Finally, we introduce a novel numerically stable solver for neural ordinary differential equations.
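The abstract does not name specific DL functions, but a classic illustration of the kind of instability it describes is the softmax function (chosen here purely as an example): the textbook formula overflows for large inputs, while an algebraically equivalent rewrite that subtracts the maximum stays stable.

```python
import math

def naive_softmax(xs):
    # Direct formula: math.exp overflows once inputs exceed ~709,
    # so large logits raise OverflowError even though the true
    # probabilities are perfectly well defined.
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def stable_softmax(xs):
    # Subtracting the maximum keeps every exponent <= 0, avoiding
    # overflow without changing the mathematical result.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

logits = [1000.0, 1000.5, 999.0]
# naive_softmax(logits) raises OverflowError: math.exp(1000.0) overflows.
probs = stable_softmax(logits)
print(probs)  # a valid probability distribution summing to 1
```

Both functions compute the same mathematical object; only the stable variant controls how rounding and overflow errors propagate, which is exactly the distinction the thesis studies at the scale of full DL algorithms.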

Committee: Wei Le (major professor), Hongyang Gao, Chris Quinn, Hailiang Liu, and Zhengdao Wang

Join on Webex. Meeting password: SweVr4twT22