2022 EDEE Thesis Award & Distinctions

© Angelos Katharopoulos

Angelos Katharopoulos has received the Electrical Engineering Doctoral Program (EDEE) Thesis Award for his outstanding research on the efficiency of deep learning models. EDEE Thesis Distinctions have also been awarded to Lorenzo Bertoni, Michaël Defferrard, and Thanh-An Pham.

Angelos Katharopoulos' prize-winning thesis, entitled Stop Wasting My FLOPS: Improving the Efficiency of Deep Learning Models, was completed under the supervision of Idiap adjunct professor François Fleuret and co-directed by Pascal Frossard, head of the Signal Processing Laboratory (LTS4) in the School of Engineering.

In his work, Katharopoulos proposes three methods for improving the efficiency of deep neural networks, which, despite having revolutionized machine learning, carry high computational and memory costs. His contributions focus in particular on an importance sampling algorithm that reduces the sample inefficiency of neural network training; a model that processes large input images with greatly reduced computational and memory requirements; and efficient approximations of the attention mechanism used in transformers, which provide a better trade-off between performance and computation than the original transformer architecture.
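To give a flavor of the last of these contributions, the sketch below illustrates the general idea behind linearized attention: the softmax is replaced by a positive feature map phi, so that a key-value summary phi(K)^T V can be computed once and the cost becomes linear rather than quadratic in the sequence length. This is a minimal, illustrative sketch, not code from the thesis; the function names and the elu-based feature map are assumptions chosen for illustration.

```python
import numpy as np

def elu_feature_map(x):
    # phi(x) = elu(x) + 1: a positive feature map often used for linearized attention
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V, eps=1e-6):
    """Linearized attention, O(N) in sequence length.

    Q, K: (N, d); V: (N, d_v). Replaces softmax(Q K^T) V with
    phi(Q) (phi(K)^T V), normalized by phi(Q) (phi(K)^T 1).
    """
    Qf = elu_feature_map(Q)           # (N, d)
    Kf = elu_feature_map(K)           # (N, d)
    KV = Kf.T @ V                     # (d, d_v), computed once for all queries
    Z = Qf @ Kf.sum(axis=0)           # (N,) normalization term
    return (Qf @ KV) / (Z[:, None] + eps)

def softmax_attention(Q, K, V):
    """Standard attention for comparison: O(N^2) time and memory in sequence length."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, d = 512, 64
    Q, K, V = rng.normal(size=(3, N, d))
    print(linear_attention(Q, K, V).shape)  # (512, 64)
```

Because the key-value summary has a fixed size that does not depend on the sequence length, memory no longer grows quadratically with the number of tokens, which is what yields the improved trade-off between performance and computation described above.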

The annual EDEE Thesis Award honors an “outstanding and remarkable” PhD thesis in the field of electrical engineering by an EDEE student. Each year, EDEE Thesis Distinctions are also granted to a selection of very high-quality theses, in order to highlight the doctoral candidates’ research work and its scientific merit. For each doctoral program, nominated graduates are selected on the basis of their oral examination; the program committee then evaluates the nominees and distinguishes the top 8%. The 2022 EDEE Thesis Distinctions were awarded to Lorenzo Bertoni, Michaël Defferrard, and Thanh-An Pham.

References

Katharopoulos, A. Stop Wasting My FLOPS: Improving the Efficiency of Deep Learning Models. PhD thesis, EPFL. DOI: 10.5075/epfl-thesis-8607