Evolving Intelligence: Overcoming Challenges for Evolutionary Deep Learning

Abstract: Deep Learning (DL) has achieved remarkable results in both academic and industrial fields over the last few years. However, DL models are often hard to design and require careful feature selection and hyper-parameter tuning to achieve high performance. These selections are tedious for human experts and demand substantial time and resources. This difficulty has encouraged a growing number of researchers to use Evolutionary Computation (EC) algorithms to optimize Deep Neural Networks (DNNs), a research branch called Evolutionary Deep Learning (EDL). This thesis is a two-fold exploration within the domains of EDL and, more broadly, Evolutionary Machine Learning (EML). The first goal is to make EDL/EML algorithms more practical by reducing the high computational cost associated with EC methods. In particular, we propose methods that alleviate the computational burden using approximate models. We show that surrogate models can speed up EC methods by a factor of three without compromising the quality of the final solutions. Our surrogate-assisted approach allows EC methods to scale better to both expensive learning algorithms and large datasets with over 100K instances. Our second objective is to leverage EC methods to advance our understanding of DNN design. We identify a knowledge gap in DL algorithms and introduce an EC algorithm designed specifically to optimize this uncharted aspect of DL design. Our analysis focuses on revealing new design concepts and acquiring novel insights. In our study of randomness techniques in DNNs, we offer insights into the design and training of more robust and generalizable neural networks. In another study, we propose a novel survival regression loss function discovered through evolutionary search.
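To make the surrogate-assisted idea concrete, the sketch below illustrates the general technique, not the specific algorithm developed in the thesis. A cheap regression model, fit on candidates that have already been evaluated, pre-screens new offspring so that only the most promising ones pay the cost of a full (expensive) fitness evaluation. All names here (`true_fitness`, `surrogate_assisted_search`) and the toy fitness function are hypothetical placeholders.

```python
# Minimal sketch of surrogate-assisted evolutionary search (illustrative only).
# The true fitness is assumed to be expensive (e.g., training a DNN and
# measuring validation performance); a cheap regressor screens candidates
# so that only the most promising offspring are fully evaluated.
import random
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def true_fitness(x):
    # Placeholder for an expensive evaluation, such as training a model
    # on a candidate configuration. Here: a toy quadratic with optimum at 0.5.
    return -np.sum((np.asarray(x) - 0.5) ** 2)

def mutate(x, sigma=0.1):
    # Gaussian perturbation, clipped to the unit hypercube.
    return np.clip(np.asarray(x) + np.random.normal(0.0, sigma, len(x)), 0.0, 1.0)

def surrogate_assisted_search(dim=5, pop_size=20, generations=30, top_k=5):
    pop = [np.random.rand(dim) for _ in range(pop_size)]
    # Archive of all truly (expensively) evaluated candidates.
    archive_x = [np.array(p) for p in pop]
    archive_y = [true_fitness(p) for p in pop]
    surrogate = RandomForestRegressor(n_estimators=50)
    for _ in range(generations):
        # Refit the cheap surrogate on everything evaluated so far.
        surrogate.fit(np.vstack(archive_x), archive_y)
        # Generate many offspring, but score them with the surrogate first.
        offspring = [mutate(random.choice(pop)) for _ in range(pop_size * 5)]
        preds = surrogate.predict(np.vstack(offspring))
        # Only the top-k surrogate-ranked offspring get the expensive evaluation.
        for i in np.argsort(preds)[-top_k:]:
            archive_x.append(offspring[i])
            archive_y.append(true_fitness(offspring[i]))
        # Survivor selection over truly evaluated points only.
        order = np.argsort(archive_y)[-pop_size:]
        pop = [archive_x[i] for i in order]
    best = int(np.argmax(archive_y))
    return archive_x[best], archive_y[best]

if __name__ == "__main__":
    x, y = surrogate_assisted_search()
    print("best candidate:", x, "fitness:", y)
```

The saving comes from the ratio of surrogate-scored to truly evaluated candidates: in this sketch, only `top_k` of the `pop_size * 5` offspring per generation incur the expensive evaluation, which is the kind of reduction behind the roughly three-fold speedup reported above.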