Deep-n-Cheap: An automated, efficient, and extensible search framework for cost-effective deep learning
S. Dey, S. Babakniya, S. C. Kanala, M. Paolieri, L. Golubchik, P. A. Beerel, K. M. Chugg
Abstract: Artificial neural networks (NNs) in deep learning systems are critical drivers of emerging technologies such as computer vision, text classification, and natural language processing. Fundamental to their success is the development of accurate and efficient NN models. In this article, we report our work on Deep-n-Cheap – an open-source Automated Machine Learning (AutoML) search framework for deep learning models. The search covers both architecture and training hyperparameters, and supports convolutional neural networks and multi-layer perceptrons, making it applicable to multiple domains. Our framework is targeted at deployment on both benchmark and custom datasets and, as a result, offers a greater degree of search-space customizability than a more limited search over only pre-existing models from the literature. We also introduce the technique of ‘search transfer’, which demonstrates the generalization capabilities of the models found by our framework to multiple datasets. Deep-n-Cheap includes a user-customizable complexity penalty which trades off performance with training time or number of parameters. Specifically, our framework can find models with performance comparable to the state of the art while taking 1-2 orders of magnitude less time to train than models from other AutoML and model search frameworks. Additionally, we investigate and develop insight into the search process that should aid future development of deep learning models.
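The user-customizable complexity penalty mentioned in the abstract can be illustrated with a minimal sketch. The objective form, the coefficient name `w_c`, and the choice of complexity metric (training time per epoch or parameter count) below are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of a complexity-penalized search objective (assumed form,
# not taken verbatim from the paper): a user-chosen coefficient w_c trades off
# validation performance against a complexity metric such as training time per
# epoch or number of parameters. Lower objective values are better.
import math

def penalized_objective(val_loss: float, complexity: float, w_c: float) -> float:
    """Validation loss combined with a weighted log-complexity penalty."""
    return math.log(val_loss) + w_c * math.log(complexity)

# Hypothetical comparison of two candidate configurations:
# a cheap model (3 s/epoch) vs. a more accurate but slower one (30 s/epoch).
cheap = penalized_objective(val_loss=0.42, complexity=3.0, w_c=0.1)
accurate = penalized_objective(val_loss=0.35, complexity=30.0, w_c=0.1)
print(cheap, accurate)  # increasing w_c shifts the preference toward the cheaper model
```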
SN Comput. Sci., 2(4):265:1-265:15, 2021
Topics: Performance Models, ML Systems, Edge AI, Applications