Hyperparameter optimization of deep neural network using univariate dynamic encoding algorithm for searches


Detailed description

Bibliographic details
Published in: Knowledge-Based Systems, Vol. 178, pp. 74–83
Author: Yoo, YoungJun
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 15.08.2019
Elsevier Science Ltd
Subjects:
ISSN: 0950-7051, 1872-7409
Online access: Full text
Abstract This paper proposes a method for tuning the hyperparameters of a deep neural network using a univariate dynamic encoding algorithm for searches. Optimizing the hyperparameters of such a network is difficult because the network has several parameters to configure and its training is slow. The proposed method was tested on two neural network models, an autoencoder and a convolutional neural network, with the Modified National Institute of Standards and Technology (MNIST) dataset. To optimize the hyperparameters with the proposed method, the cost functions were chosen as the average difference between the decoded output and the original image for the autoencoder, and the inverse of the evaluation accuracy for the convolutional neural network. The proposed method optimized the hyperparameters with fast convergence and few computational resources, and the results were compared with those of the other optimization algorithms considered (simulated annealing, a genetic algorithm, and particle swarm optimization) to show the effectiveness of the proposed methodology.
•An optimization method for the hyperparameters of a deep neural network.
•Hyperparameter optimization of the network using a univariate dynamic encoding algorithm for searches.
•Validation of the proposed method with two neural network models on the MNIST dataset.
•Fast convergence and low computational cost in optimizing the hyperparameters of the network.
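The cost functions described in the abstract can be sketched as black-box objectives for a gradient-free optimizer. The Python sketch below is illustrative only and is not the paper's code: `train_autoencoder`, `train_cnn`, and the search bounds are hypothetical stand-ins for the actual training routines, and a plain random-search loop stands in for the uDEAS optimizer, which this record does not reproduce.

```python
import numpy as np

def autoencoder_cost(hyperparams, images, train_autoencoder):
    """Average difference between the decoded output and the original images."""
    decoded = train_autoencoder(hyperparams, images)   # hypothetical training routine
    return float(np.mean(np.abs(decoded - images)))

def cnn_cost(hyperparams, data, labels, train_cnn):
    """Inverse of the evaluation accuracy of the trained network."""
    accuracy = train_cnn(hyperparams, data, labels)    # hypothetical training routine
    return 1.0 / max(accuracy, 1e-12)                  # guard against zero accuracy

def random_search(cost_fn, bounds, n_trials=20, seed=0):
    """Placeholder gradient-free loop standing in for uDEAS."""
    rng = np.random.default_rng(seed)
    lows = np.array([b[0] for b in bounds], dtype=float)
    highs = np.array([b[1] for b in bounds], dtype=float)
    best_x, best_cost = None, np.inf
    for _ in range(n_trials):
        x = rng.uniform(lows, highs)        # sample a hyperparameter vector
        cost = cost_fn(x)                   # train/evaluate and score it
        if cost < best_cost:
            best_x, best_cost = x, cost
    return best_x, best_cost

if __name__ == "__main__":
    # Toy usage with a dummy "trainer" whose accuracy depends only on the hyperparameters.
    dummy_train_cnn = lambda hp, data, labels: 1.0 / (1.0 + np.sum((hp - 0.3) ** 2))
    best_hp, best_cost = random_search(
        lambda hp: cnn_cost(hp, None, None, dummy_train_cnn),
        bounds=[(0.0, 1.0), (0.0, 1.0)],    # e.g. learning-rate and dropout ranges
        n_trials=50,
    )
    print(best_hp, best_cost)
```

Any gradient-free method mentioned in the abstract (simulated annealing, a genetic algorithm, particle swarm optimization, or uDEAS itself) could replace the random-search placeholder, since each objective is evaluated simply by training and scoring the network for a given hyperparameter vector.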
Author Yoo, YoungJun
Author_xml – sequence: 1
  givenname: YoungJun
  surname: Yoo
  fullname: Yoo, YoungJun
  email: youdalj@postech.ac.kr
  organization: Department of Electronic Engineering, Pohang University of Science and Technology (POSTECH), San 31, Hyojadong, Namgu, Pohang, Gyungbuk, 790-784, Republic of Korea
CitedBy_id crossref_primary_10_1007_s42488_022_00075_5
crossref_primary_10_1038_s41598_021_89352_8
crossref_primary_10_1016_j_oceaneng_2022_112873
crossref_primary_10_1016_j_tust_2024_105733
crossref_primary_10_1016_j_apor_2024_104265
crossref_primary_10_3390_s21030972
crossref_primary_10_1007_s11042_020_10114_1
crossref_primary_10_3390_make3040047
crossref_primary_10_1016_j_bspc_2024_106855
crossref_primary_10_1007_s00500_022_07020_z
crossref_primary_10_1016_j_eswa_2021_115147
crossref_primary_10_1016_j_jocs_2025_102588
crossref_primary_10_1002_adts_202200656
crossref_primary_10_1155_2022_7452638
crossref_primary_10_1016_j_knosys_2021_106905
crossref_primary_10_1016_j_foodres_2023_113105
crossref_primary_10_1007_s10694_024_01593_x
crossref_primary_10_1016_j_buildenv_2022_108911
crossref_primary_10_3390_s22072429
crossref_primary_10_1038_s41467_022_33253_5
crossref_primary_10_1016_j_knosys_2022_108326
crossref_primary_10_3390_buildings13102508
crossref_primary_10_1016_j_eswa_2021_114803
crossref_primary_10_3390_s22103700
crossref_primary_10_1007_s10462_021_09992_0
crossref_primary_10_1016_j_mtcomm_2022_104900
crossref_primary_10_1016_j_ijhydene_2023_11_137
crossref_primary_10_1039_D4NR03081H
crossref_primary_10_1016_j_jisa_2021_102804
crossref_primary_10_1016_j_jksuci_2021_12_018
crossref_primary_10_1007_s10115_024_02202_7
crossref_primary_10_1007_s41870_022_00983_0
crossref_primary_10_3390_app11093802
crossref_primary_10_1016_j_knosys_2022_108797
crossref_primary_10_1016_j_eswa_2020_113309
crossref_primary_10_1007_s10439_023_03422_8
crossref_primary_10_1016_j_aei_2022_101525
crossref_primary_10_1007_s12613_020_2168_z
crossref_primary_10_3390_s23146613
crossref_primary_10_1109_ACCESS_2022_3212387
crossref_primary_10_1002_nme_6927
crossref_primary_10_1007_s00202_024_02281_3
crossref_primary_10_1016_j_agrformet_2024_110263
crossref_primary_10_1142_S0219467825500196
crossref_primary_10_1109_ACCESS_2020_2985717
crossref_primary_10_1109_ACCESS_2020_3010506
crossref_primary_10_3389_fpls_2020_00025
crossref_primary_10_3390_en16207094
crossref_primary_10_1109_JSTARS_2021_3110994
crossref_primary_10_32604_cmc_2023_043239
crossref_primary_10_3390_computation9100103
crossref_primary_10_1155_2022_5485284
crossref_primary_10_1016_j_asoc_2020_106742
crossref_primary_10_1038_s41598_024_54964_3
crossref_primary_10_3389_fimmu_2022_835760
crossref_primary_10_1007_s10143_021_01573_7
crossref_primary_10_1016_j_jksuci_2021_05_012
crossref_primary_10_1109_ACCESS_2020_3044949
crossref_primary_10_1016_j_jhydrol_2021_126794
crossref_primary_10_1155_2022_2123662
crossref_primary_10_1016_j_conbuildmat_2023_133534
crossref_primary_10_1016_j_knosys_2020_106602
crossref_primary_10_1093_bib_bbad202
crossref_primary_10_1016_j_energy_2023_127965
crossref_primary_10_3233_JIFS_232325
crossref_primary_10_1016_j_conbuildmat_2024_137240
crossref_primary_10_1016_j_csi_2025_104018
crossref_primary_10_3390_a16010046
crossref_primary_10_1002_cpe_6988
crossref_primary_10_1007_s10278_022_00617_8
crossref_primary_10_1007_s00779_021_01587_4
crossref_primary_10_3389_fenrg_2022_905155
Cites_doi 10.1142/S1469026818500086
10.1016/0167-8191(94)00078-O
10.1162/089976600300015187
10.1016/j.neucom.2008.04.027
10.1007/BF00941312
10.1109/TEC.2008.926068
10.1017/S0263574714000344
10.1016/j.apm.2018.11.035
10.5267/j.msl.2015.4.002
10.1609/aaai.v29i1.9375
10.1126/science.267.5198.664
10.1137/0906002
10.1093/ietfec/e90-a.8.1679
10.1080/00207543.2018.1436789
ContentType Journal Article
Copyright 2019 Elsevier B.V.
Copyright Elsevier Science Ltd. Aug 15, 2019
Copyright_xml – notice: 2019 Elsevier B.V.
– notice: Copyright Elsevier Science Ltd. Aug 15, 2019
DBID AAYXX
CITATION
7SC
8FD
E3H
F2A
JQ2
L7M
L~C
L~D
DOI 10.1016/j.knosys.2019.04.019
DatabaseName CrossRef
Computer and Information Systems Abstracts
Technology Research Database
Library & Information Sciences Abstracts (LISA)
Library & Information Science Abstracts (LISA)
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
DatabaseTitle CrossRef
Technology Research Database
Computer and Information Systems Abstracts – Academic
Library and Information Science Abstracts (LISA)
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts Professional
DatabaseTitleList
Technology Research Database
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
EISSN 1872-7409
EndPage 83
ExternalDocumentID 10_1016_j_knosys_2019_04_019
S0950705119301923
ISICitedReferencesCount 86
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000472687500007&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 0950-7051
IngestDate Fri Nov 14 19:19:47 EST 2025
Sat Nov 29 07:10:16 EST 2025
Tue Nov 18 22:38:35 EST 2025
Fri Feb 23 02:18:39 EST 2024
IsPeerReviewed true
IsScholarly true
Keywords Hyperparameter optimization
Deep neural network
Convolution neural network
Autoencoder
Gradient-free optimization
Language English
LinkModel OpenURL
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
PQID 2251702240
PQPubID 2035257
PageCount 10
ParticipantIDs proquest_journals_2251702240
crossref_citationtrail_10_1016_j_knosys_2019_04_019
crossref_primary_10_1016_j_knosys_2019_04_019
elsevier_sciencedirect_doi_10_1016_j_knosys_2019_04_019
PublicationCentury 2000
PublicationDate 2019-08-15
PublicationDateYYYYMMDD 2019-08-15
PublicationDate_xml – month: 08
  year: 2019
  text: 2019-08-15
  day: 15
PublicationDecade 2010
PublicationPlace Amsterdam
PublicationPlace_xml – name: Amsterdam
PublicationTitle Knowledge-based systems
PublicationYear 2019
Publisher Elsevier B.V
Elsevier Science Ltd
Publisher_xml – name: Elsevier B.V
– name: Elsevier Science Ltd
References Hinz, Navarro-Guerrero, Magg, Wermter (b3) 2018; 17
Ilievski, Akhtar, Feng, Shoemaker (b7) 2016
K. Eggensperger, M. Feurer, F. Hutter, J. Bergstra, J. Snoek, H. Hoos, K. Leyton-Brown, Towards an empirical foundation for assessing bayesian optimization of hyperparameters, in: NIPS workshop on Bayesian Optimization in Theory and Practice 2013, 2013.
Gharaei, Karimi, Shekarabi (b27) 2019; 69
Gharaei, Naderi, Mohammadi (b30) 2015; 5
Bergstra, Bardenet, Bengio, Kegl (b8) 2011; 24
Torn, Zilinskas (b25) 1989
Kim, Kim, Choi, Kim (b15) 2008; 6
Kan, Timmer (b21) 1985
Guo, Yang, Wu, Wang, Liang (b5) 2008; 71
Jang, Kim (b18) 2007; 5
Rao (b13) 1996
Yoo, Jung, Jang, Won (b16) 2015; 33
Aluffi-Pentini, Paris, Zirilli (b19) 1985; 47
Bengio (b4) 2000; 12
Bergstra, Bengio (b1) 2012; 13
Shekarab, Gharaei, Karimi (b28) 2018
K. Eggensperger, F. Hutter, H.H. Hoos, K. Leyton-Brown, Efficient benchmarking of hyperparameter optimizers via surrogates, in: AAAI, 2015, 2015, pp. 1114–1120.
Knuth (b31) 1973
Levy, Montalvo (b23) 1985; 6
Duan, Deng, Gharaei, Wu, Wang (b26) 2018; 56
Kim, Kim (b14) 2007; E90-A
Maron, Moore (b6) 1993; 6
Gharaei, Shekarab, Karimi (b29) 2019
Yong, Lishan, Evans (b20) 1995; 21
Snoek, Larochelle, Adams (b2) 2012; 25
Ratschek, Rokne (b22) 1988
Kim, Kim, Park, Kim (b17) 2008; 23
J. Bergstra, D. Yamins, D.D. Cox, Making a science of model search: Hyperparameter optimizationin hundreds of dimensions for vision architectures, Presented at the 30th International Conference on Machine Learning, ICML 2013, in: JMLR Workshop and Conference Proceedings Vol. 28 (1), 2013, pp. 115–123.
Snoek, Larochelle, Adams (b12) 2012; 25
Cvijovic, Klinowski (b24) 1995; 267
Torn (10.1016/j.knosys.2019.04.019_b25) 1989
Yoo (10.1016/j.knosys.2019.04.019_b16) 2015; 33
Jang (10.1016/j.knosys.2019.04.019_b18) 2007; 5
Kim (10.1016/j.knosys.2019.04.019_b15) 2008; 6
Kan (10.1016/j.knosys.2019.04.019_b21) 1985
Bergstra (10.1016/j.knosys.2019.04.019_b8) 2011; 24
Kim (10.1016/j.knosys.2019.04.019_b14) 2007; E90-A
10.1016/j.knosys.2019.04.019_b11
10.1016/j.knosys.2019.04.019_b10
Gharaei (10.1016/j.knosys.2019.04.019_b29) 2019
Gharaei (10.1016/j.knosys.2019.04.019_b27) 2019; 69
Maron (10.1016/j.knosys.2019.04.019_b6) 1993; 6
Shekarab (10.1016/j.knosys.2019.04.019_b28) 2018
Yong (10.1016/j.knosys.2019.04.019_b20) 1995; 21
Hinz (10.1016/j.knosys.2019.04.019_b3) 2018; 17
Bengio (10.1016/j.knosys.2019.04.019_b4) 2000; 12
Guo (10.1016/j.knosys.2019.04.019_b5) 2008; 71
Aluffi-Pentini (10.1016/j.knosys.2019.04.019_b19) 1985; 47
Bergstra (10.1016/j.knosys.2019.04.019_b1) 2012; 13
Levy (10.1016/j.knosys.2019.04.019_b23) 1985; 6
10.1016/j.knosys.2019.04.019_b9
Snoek (10.1016/j.knosys.2019.04.019_b2) 2012; 25
Duan (10.1016/j.knosys.2019.04.019_b26) 2018; 56
Gharaei (10.1016/j.knosys.2019.04.019_b30) 2015; 5
Cvijovic (10.1016/j.knosys.2019.04.019_b24) 1995; 267
Ilievski (10.1016/j.knosys.2019.04.019_b7) 2016
Ratschek (10.1016/j.knosys.2019.04.019_b22) 1988
Knuth (10.1016/j.knosys.2019.04.019_b31) 1973
Snoek (10.1016/j.knosys.2019.04.019_b12) 2012; 25
Kim (10.1016/j.knosys.2019.04.019_b17) 2008; 23
Rao (10.1016/j.knosys.2019.04.019_b13) 1996
References_xml – volume: 25
  year: 2012
  ident: b12
  article-title: Practical Bayesian optimization of machine learning algorithms
  publication-title: Adv. Neural Inf. Process. Syst.
– volume: 6
  start-page: 571
  year: 2008
  end-page: 582
  ident: b15
  article-title: On the global convergence of univariate dynamic encoding algorithm for searches (uDEAS)
  publication-title: Int. J. Control Autom. Syst.
– year: 1973
  ident: b31
  article-title: The Art of Computer Programming, 3: Sorting and Searching
– volume: 6
  start-page: 59
  year: 1993
  end-page: 66
  ident: b6
  article-title: Hoeffding races: accelerating model selection search for classification and function approximation
  publication-title: Adv. Neural Inf. Process. Syst.
– volume: 17
  year: 2018
  ident: b3
  article-title: Speeding up the hyperparameter optimization of deep convolutional neural networks
  publication-title: Int. J. Comput. Intell. Appl.
– volume: E90-A
  start-page: 1679
  year: 2007
  end-page: 1689
  ident: b14
  article-title: A fast computational optimization method: Univariate dynamic encoding algorithm for searches (uDEAS)
  publication-title: IEICE Trans. Fundam. Electron. Commun. Comput. Sci.
– year: 1988
  ident: b22
  article-title: New Computer Methods for Global Optimization
– volume: 23
  start-page: 804
  year: 2008
  end-page: 813
  ident: b17
  article-title: On-load motor parameter identification using univariate dynamic encoding algorithm for searches
  publication-title: IEEE Trans. Energy Convers.
– volume: 71
  start-page: 3211
  year: 2008
  end-page: 3215
  ident: b5
  article-title: A novel LSSVMs hyper-parameter selection based on particle swarm optimization
  publication-title: Neurocomput.
– volume: 6
  start-page: 15
  year: 1985
  end-page: 29
  ident: b23
  article-title: The tunneling algorithm for the global minimization of functions
  publication-title: SIAM J. Sci. Stat. Comput.
– volume: 13
  start-page: 281
  year: 2012
  end-page: 305
  ident: b1
  article-title: Random search for hyper-parameter optimization
  publication-title: J. Mach. Learn. Res.
– volume: 56
  start-page: 1
  year: 2018
  end-page: 19
  ident: b26
  article-title: Selective maintenance scheduling under stochastic maintenance quality with multiple maintenance actions
  publication-title: Int. J. Prod. Res.
– year: 1989
  ident: b25
  article-title: Global Optimization
– volume: 25
  start-page: 2951
  year: 2012
  end-page: 2959
  ident: b2
  article-title: Practical Bayesian optimization of machine learning algorithms
  publication-title: Adv. Neural Inf. Process. Syst.
– volume: 5
  start-page: 629
  year: 2015
  end-page: 638
  ident: b30
  article-title: Optimization of rewards in single machine scheduling in the rewards-driven systems
  publication-title: Manag. Sci. Lett.
– volume: 33
  start-page: 295
  year: 2015
  end-page: 313
  ident: b16
  article-title: Fuzzy weighted subtask controller for redundant manipulator
  publication-title: Robotica
– reference: J. Bergstra, D. Yamins, D.D. Cox, Making a science of model search: Hyperparameter optimizationin hundreds of dimensions for vision architectures, Presented at the 30th International Conference on Machine Learning, ICML 2013, in: JMLR Workshop and Conference Proceedings Vol. 28 (1), 2013, pp. 115–123.
– year: 2016
  ident: b7
  article-title: Efficient hyperparameter optimization of deep learning algorithms using deterministic RBF surrogates
  publication-title: Thirty-First AAAI Conference on Artificial Intelligence
– volume: 5
  start-page: 43
  year: 2007
  end-page: 50
  ident: b18
  article-title: An estimation of a billet temperature during reheating furnace operation
  publication-title: Int. J. Control Autom. Syst.
– start-page: 245
  year: 1985
  end-page: 262
  ident: b21
  article-title: A stochastic approach to global optimization
  publication-title: Numerical Optimization
– volume: 47
  start-page: 1
  year: 1985
  end-page: 15
  ident: b19
  article-title: Global optimization and stochastic differential equations
  publication-title: J. Optim. Theory Appl.
– start-page: 1
  year: 2018
  end-page: 21
  ident: b28
  article-title: Modelling and optimal lot-sizing of integrated multi-level multi-wholesaler supply chains under the shortage and limited warehouse space: generalised outer approximation
  publication-title: Int. J. Syst. Sci. Oper. Logist.
– year: 1996
  ident: b13
  article-title: Engineering Optimization
– volume: 12
  start-page: 1889
  year: 2000
  end-page: 1900
  ident: b4
  article-title: Gradient-based optimization of hyperparameters
  publication-title: Neural Comput.
– volume: 24
  year: 2011
  ident: b8
  article-title: Algorithms for hyper-parameter optimization
  publication-title: Adv. Neural Inf. Process. Syst.
– reference: K. Eggensperger, M. Feurer, F. Hutter, J. Bergstra, J. Snoek, H. Hoos, K. Leyton-Brown, Towards an empirical foundation for assessing bayesian optimization of hyperparameters, in: NIPS workshop on Bayesian Optimization in Theory and Practice 2013, 2013.
– volume: 69
  start-page: 223
  year: 2019
  end-page: 254
  ident: b27
  article-title: An integrated multi-product, multi-buyer supply chain under penalty, green, and quality control policies and a vendor managed inventory with consignment stock agreement: The outer approximation with equality relaxation and augmented penalty algorithm
  publication-title: Appl. Math. Model.
– volume: 21
  start-page: 389
  year: 1995
  end-page: 400
  ident: b20
  article-title: The annealing evolution algorithm as function optimizer
  publication-title: Parallel Comput.
– volume: 267
  start-page: 664
  year: 1995
  end-page: 666
  ident: b24
  article-title: Taboo search: An approach to the multiple minima problem
  publication-title: Science
– year: 2019
  ident: b29
  article-title: Modelling and optimal lot-sizing of the replenishments in constrained, multi-product and bi-objective EPQ models with defective products: Generalised cross decomposition
  publication-title: Int. J. Syst. Sci. Oper. Logist.
– reference: K. Eggensperger, F. Hutter, H.H. Hoos, K. Leyton-Brown, Efficient benchmarking of hyperparameter optimizers via surrogates, in: AAAI, 2015, 2015, pp. 1114–1120.
– ident: 10.1016/j.knosys.2019.04.019_b9
– volume: 17
  issue: 02
  year: 2018
  ident: 10.1016/j.knosys.2019.04.019_b3
  article-title: Speeding up the hyperparameter optimization of deep convolutional neural networks
  publication-title: Int. J. Comput. Intell. Appl.
  doi: 10.1142/S1469026818500086
– volume: 21
  start-page: 389
  issue: 3
  year: 1995
  ident: 10.1016/j.knosys.2019.04.019_b20
  article-title: The annealing evolution algorithm as function optimizer
  publication-title: Parallel Comput.
  doi: 10.1016/0167-8191(94)00078-O
– volume: 12
  start-page: 1889
  issue: 8
  year: 2000
  ident: 10.1016/j.knosys.2019.04.019_b4
  article-title: Gradient-based optimization of hyperparameters
  publication-title: Neural Comput.
  doi: 10.1162/089976600300015187
– volume: 25
  year: 2012
  ident: 10.1016/j.knosys.2019.04.019_b12
  article-title: Practical Bayesian optimization of machine learning algorithms
  publication-title: Adv. Neural Inf. Process. Syst.
– volume: 71
  start-page: 3211
  issue: 16
  year: 2008
  ident: 10.1016/j.knosys.2019.04.019_b5
  article-title: A novel LSSVMs hyper-parameter selection based on particle swarm optimization
  publication-title: Neurocomput.
  doi: 10.1016/j.neucom.2008.04.027
– volume: 6
  start-page: 571
  issue: 4
  year: 2008
  ident: 10.1016/j.knosys.2019.04.019_b15
  article-title: On the global convergence of univariate dynamic encoding algorithm for searches (uDEAS)
  publication-title: Int. J. Control Autom. Syst.
– volume: 6
  start-page: 59
  year: 1993
  ident: 10.1016/j.knosys.2019.04.019_b6
  article-title: Hoeffding races: accelerating model selection search for classification and function approximation
  publication-title: Adv. Neural Inf. Process. Syst.
– year: 2016
  ident: 10.1016/j.knosys.2019.04.019_b7
  article-title: Efficient hyperparameter optimization of deep learning algorithms using deterministic RBF surrogates
– volume: 47
  start-page: 1
  year: 1985
  ident: 10.1016/j.knosys.2019.04.019_b19
  article-title: Global optimization and stochastic differential equations
  publication-title: J. Optim. Theory Appl.
  doi: 10.1007/BF00941312
– volume: 23
  start-page: 804
  issue: 3
  year: 2008
  ident: 10.1016/j.knosys.2019.04.019_b17
  article-title: On-load motor parameter identification using univariate dynamic encoding algorithm for searches
  publication-title: IEEE Trans. Energy Convers.
  doi: 10.1109/TEC.2008.926068
– volume: 25
  start-page: 2951
  year: 2012
  ident: 10.1016/j.knosys.2019.04.019_b2
  article-title: Practical Bayesian optimization of machine learning algorithms
  publication-title: Adv. Neural Inf. Process. Syst.
– volume: 33
  start-page: 295
  issue: 2
  year: 2015
  ident: 10.1016/j.knosys.2019.04.019_b16
  article-title: Fuzzy weighted subtask controller for redundant manipulator
  publication-title: Robotica
  doi: 10.1017/S0263574714000344
– year: 1988
  ident: 10.1016/j.knosys.2019.04.019_b22
– volume: 69
  start-page: 223
  year: 2019
  ident: 10.1016/j.knosys.2019.04.019_b27
  article-title: An integrated multi-product, multi-buyer supply chain under penalty, green, and quality control policies and a vendor managed inventory with consignment stock agreement: The outer approximation with equality relaxation and augmented penalty algorithm
  publication-title: Appl. Math. Model.
  doi: 10.1016/j.apm.2018.11.035
– volume: 5
  start-page: 629
  year: 2015
  ident: 10.1016/j.knosys.2019.04.019_b30
  article-title: Optimization of rewards in single machine scheduling in the rewards-driven systems
  publication-title: Manag. Sci. Lett.
  doi: 10.5267/j.msl.2015.4.002
– year: 2019
  ident: 10.1016/j.knosys.2019.04.019_b29
  article-title: Modelling and optimal lot-sizing of the replenishments in constrained, multi-product and bi-objective EPQ models with defective products: Generalised cross decomposition
  publication-title: Int. J. Syst. Sci. Oper. Logist.
– ident: 10.1016/j.knosys.2019.04.019_b11
  doi: 10.1609/aaai.v29i1.9375
– volume: 267
  start-page: 664
  year: 1995
  ident: 10.1016/j.knosys.2019.04.019_b24
  article-title: Taboo search: An approach to the multiple minima problem
  publication-title: Science
  doi: 10.1126/science.267.5198.664
– start-page: 1
  year: 2018
  ident: 10.1016/j.knosys.2019.04.019_b28
  article-title: Modelling and optimal lot-sizing of integrated multi-level multi-wholesaler supply chains under the shortage and limited warehouse space: generalised outer approximation
  publication-title: Int. J. Syst. Sci. Oper. Logist.
– start-page: 245
  year: 1985
  ident: 10.1016/j.knosys.2019.04.019_b21
  article-title: A stochastic approach to global optimization
– year: 1989
  ident: 10.1016/j.knosys.2019.04.019_b25
– year: 1996
  ident: 10.1016/j.knosys.2019.04.019_b13
– volume: 24
  year: 2011
  ident: 10.1016/j.knosys.2019.04.019_b8
  article-title: Algorithms for hyper-parameter optimization
  publication-title: Adv. Neural Inf. Process. Syst.
– volume: 6
  start-page: 15
  year: 1985
  ident: 10.1016/j.knosys.2019.04.019_b23
  article-title: The tunneling algorithm for the global minimization of functions
  publication-title: SIAM J. Sci. Stat. Comput.
  doi: 10.1137/0906002
– year: 1973
  ident: 10.1016/j.knosys.2019.04.019_b31
– ident: 10.1016/j.knosys.2019.04.019_b10
– volume: 13
  start-page: 281
  year: 2012
  ident: 10.1016/j.knosys.2019.04.019_b1
  article-title: Random search for hyper-parameter optimization
  publication-title: J. Mach. Learn. Res.
– volume: 5
  start-page: 43
  issue: 1
  year: 2007
  ident: 10.1016/j.knosys.2019.04.019_b18
  article-title: An estimation of a billet temperature during reheating furnace operation
  publication-title: Int. J. Control Autom. Syst.
– volume: E90-A
  start-page: 1679
  issue: 8
  year: 2007
  ident: 10.1016/j.knosys.2019.04.019_b14
  article-title: A fast computational optimization method: Univariate dynamic encoding algorithm for searches (uDEAS)
  publication-title: IEICE Trans. Fundam. Electron. Commun. Comput. Sci.
  doi: 10.1093/ietfec/e90-a.8.1679
– volume: 56
  start-page: 1
  issue: 23
  year: 2018
  ident: 10.1016/j.knosys.2019.04.019_b26
  article-title: Selective maintenance scheduling under stochastic maintenance quality with multiple maintenance actions
  publication-title: Int. J. Prod. Res.
  doi: 10.1080/00207543.2018.1436789
SSID ssj0002218
Score 2.5630476
SourceID proquest
crossref
elsevier
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 74
SubjectTerms Algorithms
Artificial neural networks
Autoencoder
Averages
Computer simulation
Convergence
Convolution
Convolution neural network
Deep neural network
Encoding
Function words
Genetic algorithms
Genetics
Gradient-free optimization
Hyperparameter optimization
Networks
Neural networks
Optimization
Searching
Simulated annealing
Title Hyperparameter optimization of deep neural network using univariate dynamic encoding algorithm for searches
URI https://dx.doi.org/10.1016/j.knosys.2019.04.019
https://www.proquest.com/docview/2251702240
Volume 178
WOSCitedRecordID wos000472687500007&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVESC
  databaseName: Elsevier SD Freedom Collection Journals 2021
  customDbUrl:
  eissn: 1872-7409
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0002218
  issn: 0950-7051
  databaseCode: AIEXJ
  dateStart: 19950201
  isFulltext: true
  titleUrlDefault: https://www.sciencedirect.com
  providerName: Elsevier
link http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwtV1Lb9NAEF5ByoELb0ShoD0gLtEi17v22scKpSolChxSlNvKXq8hbWOHPKr-fGb24SRUUDhwcSK_Yu18_nZ2MvMNIW95VMioFprVVaqZEDlnhRCG8bgu0tQAZ1qd7q9DORplk0n-xafyLm07Adk02fV1Pv-vpoZ9YGwsnf0Hc3c3hR3wHYwOWzA7bP_K8CewslygovcMM136LXDCzBdbomdYGTPvo4ol2KZxOeD9tQ0YrJvpFaycwfnsV65PfR9VLm3VS3H5rV1MV99nNi3RS4Astz3bTyE4x3BirLxE9HLDKjYma8nldN1sBxuwviljrtzSRcBCFUwgnk3qkQspRkxGXkDWOELNJHjwIsp3GNd17fGc6br03KByF1U4f3_RtPDEmISXW1Faz7A7ytmjz-r4bDhU48Fk_G7-g2FTMfzz3XdYuUv2YpnkWY_sHX0cTE67qTqObQC4e_BQW2kTAG_-8O98l19mceuajB-RB35NQY8cFh6TO6Z5Qh6Gfh3Uj-JTcrELDboNDdrWFKFBHTSohwa10KAbaFAPDRqgQTtoUIAGDdB4Rs6OB-MPJ8w322AaaHzFirgoMm3qUoIXXEc80YnBte6hSHgtgcWTiqeVSXRZx7nOhTksRYqdy9O6qOBc_pz0mrYxLwhFCcNElqZIwXeMdJlVUZKVXGBnhIxzuU94GEalvRI9NkS5VCHl8Fy5wVc4-CoSCj72CeuumjslllvOl8FCynuTzktUgLBbrjwIBlX-xYbjqO1nHeCXfz78itzfvDkHpLdarM1rck9frabLxRuPwJ_sn6Sw
linkProvider Elsevier
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Hyperparameter+optimization+of+deep+neural+network+using+univariate+dynamic+encoding+algorithm+for+searches&rft.jtitle=Knowledge-based+systems&rft.au=Yoo%2C+YoungJun&rft.date=2019-08-15&rft.pub=Elsevier+Science+Ltd&rft.issn=0950-7051&rft.eissn=1872-7409&rft.volume=178&rft.spage=74&rft_id=info:doi/10.1016%2Fj.knosys.2019.04.019&rft.externalDBID=NO_FULL_TEXT
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=0950-7051&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=0950-7051&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=0950-7051&client=summon