Continuously Constructive Deep Neural Networks

Detailed Bibliography
Published in: IEEE Transactions on Neural Networks and Learning Systems, Volume 31, Issue 4, pp. 1124-1133
Main authors: Irsoy, Ozan; Alpaydin, Ethem
Format: Journal Article
Language: English
Publication details: United States: IEEE, 1 April 2020
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 2162-237X, 2162-2388
Description
Summary: Traditionally, deep learning algorithms update the network weights, whereas the network architecture is chosen manually through trial and error. In this paper, we propose two novel approaches that automatically update the network structure while also learning its weights. The novelty of our approach lies in our parameterization, where depth, i.e., additional complexity, is encapsulated continuously in the parameter space through control parameters. In tunnel networks, this selection is made at the level of a hidden unit; in budding perceptrons, it is made at the level of a network layer; updating the control parameter introduces either another hidden unit or another layer. We show the effectiveness of our methods on the synthetic two-spiral data and on three real data sets, MNIST, MIRFLICKR, and CIFAR, where we see that our proposed methods, with the same set of hyperparameters, can correctly adjust the network complexity to the task complexity.
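
The gating mechanism the abstract describes can be sketched in a few lines. The code below is an illustrative reconstruction under stated assumptions, not the authors' published method: it assumes a PyTorch setting, and the names GatedDepthBlock, alpha, and hidden_dim are chosen here for illustration. A single learnable control scalar blends an identity path with a candidate layer, so the block moves continuously between a shallower and a deeper network as the gate opens.

import torch
import torch.nn as nn

class GatedDepthBlock(nn.Module):
    # Illustrative sketch of a continuously gated depth increment:
    # a learnable scalar blends the identity path with a candidate
    # layer, so "adding a layer" becomes a continuous parameter update.
    def __init__(self, dim, hidden_dim):
        super().__init__()
        self.candidate = nn.Sequential(
            nn.Linear(dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, dim),
        )
        # Control parameter: initialized so sigmoid(alpha) is near 0,
        # making the block start out close to the identity map.
        self.alpha = nn.Parameter(torch.tensor(-4.0))

    def forward(self, x):
        gate = torch.sigmoid(self.alpha)  # gate lies in (0, 1)
        return (1.0 - gate) * x + gate * self.candidate(x)

# Usage: initially y is close to x; if the task demands more depth,
# gradient descent can increase alpha and phase the candidate layer in.
x = torch.randn(8, 16)
block = GatedDepthBlock(dim=16, hidden_dim=32)
y = block(x)

As the abstract notes, the control parameter lives in the same continuous parameter space as the weights, so in a sketch like this one architecture growth and weight learning share a single gradient-based update rule.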
DOI: 10.1109/TNNLS.2019.2918225