A Survey of Deep Learning and Its Applications: A New Paradigm to Machine Learning

Published in: Archives of Computational Methods in Engineering, Vol. 27, No. 4, pp. 1071-1092
Main authors: Dargan, Shaveta; Kumar, Munish; Ayyagari, Maruthi Rohit; Kumar, Gulshan
Medium: Journal Article
Language: English
Publication details: Dordrecht: Springer Netherlands, 01.09.2020 (Springer Nature B.V.)
ISSN: 1134-3060, 1886-1784
Abstract Deep learning is currently an active and stimulating field of machine learning, and among the most effective supervised approaches in terms of accuracy, time and cost. It is not a restricted learning approach: it encompasses a variety of procedures and topologies that can be applied to an immense spectrum of complicated problems. The technique learns representative and discriminative features in a hierarchical manner. Deep learning methods have made significant breakthroughs, with appreciable performance, in a wide variety of applications, including useful security tools. Deep learning is considered the best choice for discovering complex structure in high-dimensional data by employing the backpropagation algorithm. Given these advances and its tremendous performance in numerous applications, deep learning is now widely used in business, science and government, in domains including adaptive testing, biological image classification, computer vision, cancer detection, natural language processing, object detection, face recognition, handwriting recognition, speech recognition, stock market analysis, smart cities and many more. This paper focuses on the concepts of deep learning, its basic and advanced architectures, techniques, motivations, characteristics and limitations. It also presents the major differences between deep learning, classical machine learning and conventional learning approaches, and the major challenges ahead. The main intention of this paper is to present, chronologically, a comprehensive survey of the major applications of deep learning across a variety of areas, the techniques and architectures used, and the contribution of each application in the real world. Finally, the paper ends with conclusions and future prospects.
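The abstract credits the backpropagation algorithm with deep learning's ability to discover complex structure in high-dimensional data. As a generic illustration (not code from the survey itself), the sketch below trains a one-hidden-layer sigmoid network on the XOR problem with plain gradient descent; the layer sizes, seed and learning rate are arbitrary choices for the example.

```python
import numpy as np

# Tiny XOR dataset: 4 samples, 2 features, 1 binary target.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass: compute hidden activations and network output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: propagate the squared-error gradient through each
    # layer (constant factors of the mean are folded into the learning rate).
    d_out = (out - y) * out * (1 - out)      # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)       # error signal at the hidden layer

    # Gradient-descent parameter updates.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)
```

After training, `losses` records the per-iteration mean squared error; the final value should be well below the initial one, showing that the same backward-pass recipe scales to the deep architectures the survey covers.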
Authors and affiliations:
1. Dargan, Shaveta (Department of Computational Sciences, Maharaja Ranjit Singh Punjab Technical University)
2. Kumar, Munish (Department of Computational Sciences, Maharaja Ranjit Singh Punjab Technical University; email: munishcse@gmail.com; ORCID 0000-0003-0115-1620)
3. Ayyagari, Maruthi Rohit (College of Business, University of Dallas)
4. Kumar, Gulshan (Department of Computer Applications, Shaheed Bhagat Singh State Technical Campus)
Copyright: CIMNE, Barcelona, Spain 2019
DOI: 10.1007/s11831-019-09344-w
GroupedDBID -5B
-5G
-BR
-EM
-Y2
-~C
-~X
.4S
.86
.DC
.VR
06D
0R~
0VY
1N0
1SB
203
23M
28-
29~
2J2
2JN
2JY
2KG
2KM
2LR
2VQ
2~H
30V
3V.
4.4
406
408
40D
40E
5GY
5VS
67Z
6NX
7WY
8FE
8FG
8FL
8TC
8UJ
95-
95.
95~
96X
AAAVM
AABHQ
AACDK
AAHNG
AAIAL
AAJBT
AAJKR
AANZL
AARHV
AARTL
AASML
AATNV
AATVU
AAUYE
AAWCG
AAYIU
AAYQN
AAYTO
AAYZH
ABAKF
ABDZT
ABECU
ABFTV
ABHQN
ABJCF
ABJNI
ABJOX
ABKCH
ABMNI
ABMQK
ABNWP
ABQBU
ABQSL
ABSXP
ABTEG
ABTHY
ABTKH
ABTMW
ABULA
ABUWG
ABWNU
ABXPI
ACAOD
ACBXY
ACDTI
ACGFS
ACHSB
ACHXU
ACIWK
ACKNC
ACMDZ
ACMLO
ACOKC
ACOMO
ACPIV
ACZOJ
ADHHG
ADHIR
ADINQ
ADKNI
ADKPE
ADMLS
ADRFC
ADTPH
ADURQ
ADYFF
ADZKW
AEBTG
AEFQL
AEGAL
AEGNC
AEJHL
AEJRE
AEKMD
AEMSY
AENEX
AEOHA
AEPYU
AESKC
AETLH
AEVLU
AEXYK
AFBBN
AFEXP
AFGCZ
AFKRA
AFLOW
AFQWF
AFWTZ
AFZKB
AGAYW
AGDGC
AGGDS
AGJBK
AGMZJ
AGQEE
AGQMX
AGRTI
AGWIL
AGWZB
AGYKE
AHAVH
AHBYD
AHKAY
AHSBF
AHYZX
AIAKS
AIGIU
AIIXL
AILAN
AITGF
AJBLW
AJRNO
AJZVZ
ALMA_UNASSIGNED_HOLDINGS
ALWAN
AMKLP
AMXSW
AMYLF
AMYQR
AOCGG
ARAPS
ARCEE
ARCSS
ARMRJ
ASPBG
AVWKF
AXYYD
AYJHY
AZFZN
AZQEC
B-.
BA0
BBWZM
BDATZ
BENPR
BEZIV
BGLVJ
BGNMA
BPHCQ
CAG
CCPQU
COF
CS3
CSCUP
DDRTE
DNIVK
DPUIP
DWQXO
EBLON
EBS
EDO
EIOEI
EJD
ESBYG
FEDTE
FERAY
FFXSO
FIGPU
FINBP
FNLPD
FRNLG
FRRFC
FSGXE
FWDCC
GGCAI
GGRSB
GJIRD
GNUQQ
GNWQR
GQ6
GQ7
GROUPED_ABI_INFORM_COMPLETE
H13
HCIFZ
HF~
HG5
HG6
HMJXF
HRMNR
HVGLF
HZ~
I-F
IJ-
IKXTQ
IWAJR
IXC
IXD
IXE
IZQ
I~X
I~Z
J-C
J0Z
JBSCW
K60
K6V
K6~
K7-
KDC
KOV
L6V
LLZTM
M0C
M0N
M4Y
M7S
MA-
MK~
N2Q
NB0
NDZJH
NF0
NPVJJ
NQJWS
NU0
O9-
O93
O9G
O9I
O9J
OAM
P19
P2P
P62
P9P
PF0
PQBIZ
PQBZA
PQQKQ
PROAC
PT4
PT5
PTHSS
QOK
QOS
R4E
R89
R9I
RHV
RNI
RNS
ROL
RPX
RSV
RZK
S16
S1Z
S26
S27
S28
S3B
SAP
SCLPG
SCV
SDH
SDM
SEG
SHX
SISQX
SJYHP
SNE
SNPRN
SNX
SOHCF
SOJ
SPISZ
SRMVM
SSLCW
STPWE
SZN
T13
T16
TSG
TSK
TSV
TUC
TUS
U2A
UG4
UOJIU
UTJUX
UZXMN
VC2
VFIZW
W48
WK8
YLTOR
Z45
Z5O
Z7R
Z7X
Z7Y
Z7Z
Z83
Z88
ZMTXR
_50
~EX
AAPKM
AAYXX
ABBRH
ABDBE
ABFSG
ABRTQ
ACSTC
ADHKG
AEZWR
AFDZB
AFFHD
AFHIU
AFOHR
AGQPQ
AHPBZ
AHWEU
AIXLP
ATHPR
AYFIA
CITATION
JZLTJ
PHGZM
PHGZT
PQGLB
JQ2
ID FETCH-LOGICAL-c385t-544929d8edbdaaf8cf4fefac43714f73f109835f5c08a0b7b9b4b5fc58a90fa33
IEDL.DBID RSV
ISICitedReferencesCount 735
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000565732400005&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 1134-3060
IngestDate Thu Sep 18 00:04:37 EDT 2025
Tue Nov 18 22:18:50 EST 2025
Sat Nov 29 06:21:45 EST 2025
Fri Feb 21 02:32:01 EST 2025
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c385t-544929d8edbdaaf8cf4fefac43714f73f109835f5c08a0b7b9b4b5fc58a90fa33
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0003-0115-1620
PQID 2436156874
PQPubID 1486352
PageCount 22
ParticipantIDs proquest_journals_2436156874
crossref_citationtrail_10_1007_s11831_019_09344_w
crossref_primary_10_1007_s11831_019_09344_w
springer_journals_10_1007_s11831_019_09344_w
PublicationCentury 2000
PublicationDate 2020-09-01
PublicationDateYYYYMMDD 2020-09-01
PublicationDate_xml – month: 09
  year: 2020
  text: 2020-09-01
  day: 01
PublicationDecade 2020
PublicationPlace Dordrecht
PublicationPlace_xml – name: Dordrecht
PublicationSubtitle State of the Art Reviews
PublicationTitle Archives of computational methods in engineering
PublicationTitleAbbrev Arch Computat Methods Eng
PublicationYear 2020
Publisher Springer Netherlands
Springer Nature B.V
Publisher_xml – name: Springer Netherlands
– name: Springer Nature B.V
References Falcini F, Lami G, Costanza AM (2017) Deep learning in automotive software. IEEE Softw 34(3):56–63. https://doi.org/10.1109/MS.2017.79
Nweke HF, Teh YW, Al-garadi MA, Alo UR (2018) Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: state of the art and research challenges. Expert Syst Appl: 1–87
Ripoll VJR, Wojdel A, Romero A, Ramos P, Brugada J (2016) ECG assessment based on neural networks with pretraining. Appl Soft Comput 49:399–406. https://doi.org/10.1016/j.asoc.2016.08.013
Gurjar N, Sudholt S, Fink GA (2018) Learning deep representations for word spotting under weak supervision. In: Proceedings of the 13th IAPR international workshop on document analysis systems (DAS), pp 7s–12s
Wicht B, Fischer A, Hennebert J (2016) Deep learning features for handwritten keyword spotting. In: Proceedings of the 23rd international conference on pattern recognition (ICPR). https://doi.org/10.1109/icpr.2016.7900165
Mohsen AM, El-Makky NM, Ghanem N (2017) Author identification using deep learning. In: Proceedings of the 15th IEEE international conference on machine learning and applications, pp 898–903
Sanakoyeu A, Bautista MA, Ommer B (2018) Deep unsupervised learning of visual similarities. Pattern Recogn 78:331–343. https://doi.org/10.1016/j.patcog.2018.01.036
Kannan RJ, Subramanian S (2015) An adaptive approach of Tamil character recognition using deep learning with big data: a survey. Adv Intell Syst Comput: 557–567
Gheisari M, Wang G, Bhuiyan MZA (2017) A survey on deep learning in big data. In: Proceedings of the IEEE international conference on embedded and ubiquitous computing (EUC), pp 1–8
Krishnan P, Dutta K, Jawahar CV (2018) Word spotting and recognition using deep embedding. In: Proceedings of 13th IAPR international workshop on document analysis systems (DAS). https://doi.org/10.1109/das.2018.70
Zhou X, Gong W, Fu W, Du F (2017) Application of deep learning in object detection. In: Proceedings of the IEEE/ACIS 16th international conference on computer and information science (ICIS), pp 631–634
Chen XW, Lin X (2014) Big data deep learning: challenges and perspectives. IEEE Access 2:514–525. https://doi.org/10.1109/ACCESS.2014.2325029
Salazar F, Moran R, Toledo MA, Oñate E (2017) Data-based models for the prediction of dam behaviour: a review and some methodological considerations. Arch Comput Methods Eng 24(1):1–21. https://doi.org/10.1007/s11831-015-9157-9
Zulkarneev M, Grigoryan R, Shamraev N (2013) Acoustic modeling with deep belief networks for Russian speech recognition. In: Proceedings of the international conference on speech and computer, pp 17–24
Lopez D, Rivas E, Gualdron O (2017) Primary user characterization for cognitive radio wireless networks using a neural system based on deep learning. Artif Intell Rev: 1–27
Zhong SH, Li Y, Le B (2015) Query oriented unsupervised multi document summarization via deep learning. Expert Syst Appl, pp 1–10
Amato G, Carrara F, Falchi F, Gennaro C, Meghini C, Vairo C (2017) Deep learning for decentralized parking lot occupancy detection. Expert Syst Appl 72:327–334. https://doi.org/10.1016/j.eswa.2016.10.055
Zhang L, Zhang L, Du B (2016) Deep learning for remote sensing data: a technical tutorial on the state of the art. IEEE Geosci Remote Sens Mag 4(2):22–40. https://doi.org/10.1109/MGRS.2016.2540798
Chu J, Srihari S (2014) Writer identification using a deep neural network. In: Proceedings of the 2014 Indian conference on computer vision graphics and image processing, pp 1–7
Chong E, Han C, Park FC (2017) Deep learning network for stock market analysis and prediction: methodology, data representations and case studies. Expert Syst Appl 83:187–205. https://doi.org/10.1016/j.eswa.2017.04.030
Xiao B, Xiong J, Shi Y (2016) Novel applications of deep learning hidden features for adaptive testing. In: Proceedings of the 21st Asia and South Pacific design automation conference, pp 743–748
Xue S, Hamid OA, Jiang H, Dai L, Liu Q (2014) Fast adaptation of deep neural network based on discriminant codes for speech recognition. IEEE/ACM Trans Audio Speech Lang Process 22(12):1713–1725. https://doi.org/10.1109/TASLP.2014.2346313
Poznanski A, Wolf L (2016) CNN-N-gram for handwriting word recognition. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 2305–2314
Rudin F, Li GJ, Wang K (2017) An algorithm for power system fault analysis based on convolutional deep learning neural networks. Int J Res Educ Sci Methods 5(9):11–18
Papakostas M, Giannakopoulos T (2018) Speech-music discrimination using deep visual feature extractors. Expert Syst Appl 114:334–344. https://doi.org/10.1016/j.eswa.2018.05.016
Serizel R, Giuliani D (2016) Deep-neural network approaches for speech recognition with heterogeneous groups of speakers including children. Nat Lang Eng 1(3):1–26
Mohamed A, Dahl G, Geoffrey H (2009) Deep belief networks for phone recognition. In: Proceedings of the nips workshop on deep learning for speech recognition and related applications, pp 1–9
Salazar F, Toledo MA, Morán R, Oñate E (2015) An empirical comparison of machine learning techniques for dam behaviour modelling. Struct Saf 56:9–17. https://doi.org/10.1016/j.strusafe.2015.05.001
Noda K, Yamaguchi Y, Nakadai K, Okuno HG, Ogata T (2015) Audio-visual speech recognition using deep learning. Appl Intell 42(4):722–737. https://doi.org/10.1007/s10489-014-0629-7
Thomas S, Chatelain C, Heutte L, Paquet T, Kessentini Y (2015) A deep HMM model for multiple keywords spotting in handwritten documents. Pattern Anal Appl 18(4):1003–1015. https://doi.org/10.1007/s10044-014-0433-3
Markovnikov N, Kipyatkova I, Karpov A, Filchenkov A (2018) Deep neural networks in Russian speech recognition. Artif Intell Nat Lang Commun Comput Inf Sci 789:54–67. https://doi.org/10.1007/978-3-319-71746-3_5
Zhang XL, Wu J (2013) Denoising deep neural networks based voice activity detection. In: Proceedings of the IEEE international conference on acoustics, speech and signal processing, pp 853–857
Chen CH, Lee CR, Lu WCH (2016) A mobile cloud framework for deep learning and its application to smart car camera. In: Proceedings of the international conference on internet of vehicles, pp 14–25. https://doi.org/10.1007/978-3-319-51969-22
Ignatov A (2018) Real-time human activity recognition from accelerometer data using convolutional neural networks. Appl Soft Comput 62:915–922. https://doi.org/10.1016/j.asoc.2017.09.027
Makhmudov AZ, Abdukarimov SS (2016) Speech recognition using deep learning algorithms. In: Proceedings of the international conference on informatics: problems, methodology, technologies, pp 10–15
Jia X (2017) Image recognition method based on deep learning. In: Proceedings of the 29th IEEE Chinese control and decision conference (CCDC), pp 4730–4735
Prabhanjan S, Dinesh R (2017) Deep learning approach for Devanagari script recognition. Int J Image Graph 17(3):1750016. https://doi.org/10.1142/S0219467817500164
Chen LC, Papandreou G, Kokkinos I, Murphy K, Yuille AL (2018) DeepLab: semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs. IEEE Trans Pattern Anal Mach Intell 40(4):834–848. https://doi.org/10.1109/TPAMI.2017.2699184
Zhao C, Chen K, Wei Z, Chen Y, Miao D, Wang W (2018) Multilevel triplet deep learning model for person reidentification. Pattern Recogn Lett. https://doi.org/10.1016/j.patrec.2018.04.029
Dai Y, Wang G (2018) A deep inference learning framework for healthcare. Pattern Recogn Lett. https://doi.org/10.1016/j.patrec.2018.02.009
Nguyen HD, Le AD, Nakagawa M (2015) Deep neural networks for recognizing online handwritten mathematical symbols. In: Proceedings of the 3rd IAPR IEEE Asian conference on pattern recognition (ACPR), pp 121–125
Abbas Q, Ibrahim MEA, Jaffar MA (2018) A comprehensive review of recent advances on deep vision systems. Artif Intell Rev. https://doi.org/10.1007/s10462-018-9633-3
Puthussery AR, Haradi KP, Erol BA, Benavidez P, Rad P, Jamshidi M (2017) A deep vision landmark framework for robot navigation. In: Proceedings of the system of systems engineering conference, pp 1–6
Wang T, Wen CK, Wang H, Jiang F, Jin S (2017) Deep learning for wireless physical layer: opportunities and challenges. China Commun 14(11):92–111. https://doi.org/10.1109/CC.2017.8233654
Alwzwazy HA, Albehadili HA, Alwan YS, Islam NE (2016) Handwritten digit recognition using convolutional neural networks. In: Proceedings of international journal of innovative research in computer and communication engineering, vol 4(2), pp 1101–1106
Azar MY, Hamey L (2017) Text summarization using unsupervised deep learning. Expert Syst Appl 68:93–105. https://doi.org/10.1016/j.eswa.2016.10.017
Soniya, Paul S, Singh L (2015) A review on advances in deep learning. In: Proceedings of IEEE workshop on computational intelligence: theories, applications and future directions (WCI), pp 1–6. https://doi.org/10.1109/wci.2015.7495514
Sudholt S, Fink GA (2017) Attribute CNNs for word spotting in handwritten documents. Int J Doc Anal Recognit (IJDAR). https://doi.org/10.1007/s10032-018-0295-0
Santana LMQD, Santos RM, Matos LN, Macedo HT (2018) Deep neural networks for acoustic modeling in the presence of noise. IEEE Latin Am Trans 16(3):918–925. https://doi.org/10.1109/TLA.2018.8358674
Zhu XX, Tuia D, Mou L, Xia GS, Zhang L, Xu F, Fraundorfer F (2017) Deep learning in remote sensing: a comprehensive review and list of resources. IEEE Geosci Remote Sens Mag 5(4):8–36. https://doi.org/10.1109/MGRS.2017.2762307
Loh BCS, Then PHH (2017) Deep learning for cardiac computer-aided diagnosis: benefits, issues & solutions. mHealth. https://doi.org/10.21037/mhealth.2017.09.01
Looks M, Herreshoff M, Hutchins D, Norvig P (2017) Deep learning with dynamic computation graphs. In: Proceedings of the international conference on learning representation, pp 1–12
Salazar F, Oñate E, Toledo MA (2017a) A machine learning based methodology for anomaly detection in dam behaviour, CIMNE, monograph no M170, 250 pp, Barcelona
Wu Z, Swietojanski P, Veaux C, Renals S, King S (2015) A study of speaker adaptation for DNN-based speech synthesis. In: Proceedings of the sixteenth annual conference of the international speech communication association, pp 879–883
Ling ZH, Kang SY, Zen H, Senior A, Schuster M, Qian XJ, Meng HM, Deng L (2015) Deep learning for acoustic modeling in parametric speech generation: a systematic review of existing techniques and future trends. IEEE Signal Process Mag 32(3):35–52. https://doi.org/10.1109/MSP.2014.2359987
Luckow A, Cook M, Ashcraft N, Weill E, Djerekarov E, Vorster B (2017) Deep learning in the automotive industry: applications and tools. In: Proceedings of the IEEE international conference on big data, pp 3759–3768
Roy P, Bhunia AK, Das A, Dey P (2016) HMM-based Indic handwritten word recognition using zone segmentation. Pattern Recognit 60:1057–1075. https://doi.org/10.1016/j.patcog.2016.04.012
Affonso C, Rossi ALD, Vieria FHA, Carvalho ACPDLF (2017) Deep learning for biological image classification. Expert Syst Appl 85:114–122. https://doi.org/10.1016/j.eswa.2017.05.039
9344_CR21
9344_CR65
X Yuan (9344_CR70) 2018; 77
9344_CR20
9344_CR64
LMQD Santana (9344_CR54) 2018; 16
9344_CR63
S Thomas (9344_CR58) 2015; 18
9344_CR69
D Cheng (9344_CR12) 2018; 82
9344_CR24
9344_CR23
9344_CR67
9344_CR26
F Rudin (9344_CR47) 2017; 5
A Sanakoyeu (9344_CR53) 2018; 78
M Papakostas (9344_CR85) 2018; 114
N Markovnikov (9344_CR36) 2018; 789
XX Zhu (9344_CR75) 2017; 5
F Salazar (9344_CR49) 2015; 56
F Salazar (9344_CR52) 2017; 24
9344_CR61
Y Dai (9344_CR15) 2018
9344_CR10
F Salazar (9344_CR50) 2016; 119
9344_CR51
E Chong (9344_CR13) 2017; 83
9344_CR14
9344_CR56
9344_CR18
9344_CR16
A Ignatov (9344_CR22) 2018; 62
B Chandra (9344_CR77) 2016; 63
9344_CR19
K Ota (9344_CR83) 2017; 13
B Yonel (9344_CR68) 2017; 12
9344_CR1
Q Abbas (9344_CR2) 2018
XW Chen (9344_CR9) 2014; 2
9344_CR87
9344_CR42
9344_CR86
9344_CR41
CN Vasconcelos (9344_CR60) 2017
9344_CR84
T Wang (9344_CR62) 2017; 14
9344_CR4
9344_CR7
9344_CR44
Y Ling (9344_CR30) 2018; 6
VJR Ripoll (9344_CR46) 2016; 49
S Prabhanjan (9344_CR43) 2017; 17
A Ucar (9344_CR59) 2017; 93
M Kaushal (9344_CR25) 2018; 70
F Salazar (9344_CR48) 2012; 24
K Noda (9344_CR40) 2015; 42
C Affonso (9344_CR3) 2017; 85
MI Razzak (9344_CR45) 2018; 26
C Zhao (9344_CR72) 2018
S Xue (9344_CR66) 2014; 22
9344_CR82
9344_CR32
9344_CR76
9344_CR31
9344_CR74
9344_CR73
9344_CR35
9344_CR79
9344_CR34
9344_CR78
9344_CR33
9344_CR39
Y LeCun (9344_CR27) 2015; 521
9344_CR38
MY Azar (9344_CR8) 2017; 68
F Falcini (9344_CR17) 2017; 34
9344_CR37
G Amato (9344_CR5) 2017; 72
SH Lee (9344_CR28) 2017; 71
LC Chen (9344_CR11) 2018; 40
O Araque (9344_CR6) 2017; 77
ZH Ling (9344_CR29) 2015; 32
RGD Serizel (9344_CR55) 2016; 1
S Sudholt (9344_CR57) 2017
P Roy (9344_CR80) 2016; 60
L Zhang (9344_CR71) 2016; 4
BCS Loh (9344_CR81) 2017
References_xml – reference: Hamid OA, Jiang H (2013) Rapid and effective speaker adaptation of convolutional neural network based models for speech recognition. In: INTERSPEECH, pp 1248–1252
– reference: Falcini F, Lami G, Costanza AM (2017) Deep learning in automotive software. IEEE Softw 34(3):56–63. https://doi.org/10.1109/MS.2017.79
– reference: Yonel B, Mason E, Yazici B (2017) Deep learning for passive synthetic aperture radar. IEEE J Sel Top Signal Process 12(1):90–103. https://doi.org/10.1109/JSTSP.2017.2784181
– reference: Alwzwazy HA, Albehadili HA, Alwan YS, Islam NE (2016) Handwritten digit recognition using convolutional neural networks. In: Proceedings of international journal of innovative research in computer and communication engineering, vol 4(2), pp 1101–1106
– reference: Kannan RJ, Subramanian S (2015) An adaptive approach of Tamil character recognition using deep learning with big data: a survey. Adv Intell Syst Comput: 557–567
– reference: Mohamed A, Dahl G, Geoffrey H (2009) Deep belief networks for phone recognition. In: Proceedings of the nips workshop on deep learning for speech recognition and related applications, pp 1–9
– reference: Sanakoyeu A, Bautista MA, Ommer B (2018) Deep unsupervised learning of visual similarities. Pattern Recogn 78:331–343. https://doi.org/10.1016/j.patcog.2018.01.036
– reference: Vasconcelos CN, Vasconcelos BN (2017) Experiment using deep learning for dermoscopy image analysis. Pattern Recognit Lett. https://doi.org/10.1016/j.patrec.2017.11.005
– reference: Lopez D, Rivas E, Gualdron O (2017) Primary user characterization for cognitive radio wireless networks using a neural system based on deep learning. Artif Intell Rev: 1–27
– reference: Chen LC, Papandreou G, Kokkinos I, Murphy K, Yuille AL (2018) DeepLab: semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs. IEEE Trans Pattern Anal Mach Intell 40(4):834–848. https://doi.org/10.1109/TPAMI.2017.2699184
– reference: Papakostas M, Giannakopoulos T (2018) Speech-music discrimination using deep visual feature extractors. Expert Syst Appl 114:334–344. https://doi.org/10.1016/j.eswa.2018.05.016
– reference: Zhang XL, Wu J (2013) Denoising deep neural networks based voice activity detection. In: Proceedings of the IEEE international conference on acoustics, speech and signal processing, pp 853–857
– reference: Mohsen AM, El-Makky NM, Ghanem N (2017) Author identification using deep learning. In: Proceedings of the 15th IEEE international conference on machine learning and applications, pp 898–903
– reference: Zhao C, Chen K, Wei Z, Chen Y, Miao D, Wang W (2018) Multilevel triplet deep learning model for person reidentification. Pattern Recogn Lett. https://doi.org/10.1016/j.patrec.2018.04.029
– reference: Poznanski A, Wolf L (2016) CNN-N-gram for handwriting word recognition. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 2305–2314
– reference: Ripoll VJR, Wojdel A, Romero A, Ramos P, Brugada J (2016) ECG assessment based on neural networks with pretraining. Appl Soft Comput 49:399–406. https://doi.org/10.1016/j.asoc.2016.08.013
– reference: Thomas S, Chatelain C, Heutte L, Paquet T, Kessentini Y (2015) A deep HMM model for multiple keywords spotting in handwritten documents. Pattern Anal Appl 18(4):1003–1015. https://doi.org/10.1007/s10044-014-0433-3
– reference: Salazar F, Toledo MA, Morán R, Oñate E (2015) An empirical comparison of machine learning techniques for dam behaviour modelling. Struct Saf 56:9–17. https://doi.org/10.1016/j.strusafe.2015.05.001
– reference: Ucar A, Demir Y, Guzelis C (2017) Object recognition and detection with deep learning for autonomous driving applications. Int Trans Soc Model Simul 93(9):759–769. https://doi.org/10.1177/0037549717709932
– reference: Soniya, Paul S, Singh L (2015) A review on advances in deep learning. In: Proceedings of IEEE workshop on computational intelligence: theories, applications and future directions (WCI), pp 1–6. https://doi.org/10.1109/wci.2015.7495514
– reference: Dai Y, Wang G (2018) A deep inference learning framework for healthcare. Pattern Recogn Lett. https://doi.org/10.1016/j.patrec.2018.02.009
– reference: Wu Z, Swietojanski P, Veaux C, Renals S (2015) A study of speaker adaptation for DNN-based speech synthesis. In: Proceedings of the interspeech conference, pp 1–5
– reference: Gurjar N, Sudholt S, Fink GA (2018) Learning deep representations for word spotting under weak supervision. In: Proceedings of the 13th IAPR international workshop on document analysis systems (DAS), pp 7s–12s
– reference: Santana LMQD, Santos RM, Matos LN, Macedo HT (2018) Deep neural networks for acoustic modeling in the presence of noise. IEEE Latin Am Trans 16(3):918–925. https://doi.org/10.1109/TLA.2018.8358674
– reference: Luckow A, Cook M, Ashcraft N, Weill E, Djerekarov E, Vorster B (2017) Deep learning in the automotive industry: applications and tools. In: Proceedings of the IEEE international conference on big data, pp 3759–3768
– reference: Prabhanjan S, Dinesh R (2017) Deep learning approach for Devanagari script recognition. Int J Image Graph 17(3):1750016. https://doi.org/10.1142/S0219467817500164
– reference: Wang L, Sng D (2015) Deep learning algorithms with applications to video analytics for a smart city: a survey. arXiv, preprint arXiv: 1512.03131
– reference: Wang Y, Liu M, Bao Z (2016) Deep learning neural network for power system fault diagnosis. In: Proceedings of the 35th Chinese control conference, 1–6
– reference: Wu Z, Swietojanski P, Veaux C, Renals S, King S (2015) A study of speaker adaptation for DNN-based speech synthesis. In: Proceedings of the sixteenth annual conference of the international speech communication association, pp 879–883
– reference: Ashiquzzaman A, Tushar AK (2017) Handwritten arabic numeral recognition using deep learning neural networks. In: Proceedings of IEEE international conference on imaging, vision & pattern recognition, pp 1–4. https://doi.org/10.1109/ICIVPR.2017.7890866
– reference: Cheng D, Gong Y, Chang X, Shi W, Hauptmann A, Zheng N (2018) Deep feature learning via structured graph Laplacian embedding for person re-identification. Pattern Recogn 82:94–104. https://doi.org/10.1016/j.patcog.2018.05.007
– reference: Noda K, Yamaguchi Y, Nakadai K, Okuno HG, Ogata T (2015) Audio-visual speech recognition using deep learning. Appl Intell 42(4):722–737. https://doi.org/10.1007/s10489-014-0629-7
– reference: Araque O, Corcuera-Platas I, Sánchez-Rada JF, Iglesias CA (2017) Enhancing deep learning sentiment analysis with ensemble techniques in social applications. Expert Syst Appl 77:236–246. https://doi.org/10.1016/j.eswa.2017.02.002
– reference: Chong E, Han C, Park FC (2017) Deep learning network for stock market analysis and prediction: methodology, data representations and case studies. Expert Syst Appl 83:187–205. https://doi.org/10.1016/j.eswa.2017.04.030
– reference: Chen XW, Lin X (2014) Big data deep learning: challenges and perspectives. IEEE Access 2:514–525. https://doi.org/10.1109/ACCESS.2014.2325029
– reference: Lee SH, Chan CS, Mayo SJ, Remagnino P (2017) How deep learning extracts and learns leaf features for plant classification. Pattern Recogn 71:1–13. https://doi.org/10.1016/j.patcog.2017.05.015
– reference: Xue S, Hamid OA, Jiang H, Dai L, Liu Q (2014) Fast adaptation of deep neural network based on discriminant codes for speech recognition. IEEE/ACM Trans Audio Speech Lang Process 22(12):1713–1725. https://doi.org/10.1109/TASLP.2014.2346313
– reference: Xiao B, Xiong J, Shi Y (2016) Novel applications of deep learning hidden features for adaptive testing. In: Proceedings of the 21st Asia and South Pacific design automation conference, pp 743–748
– reference: Markovnikov N, Kipyatkova I, Karpov A, Filchenkov A (2018) Deep neural networks in Russian speech recognition. Artif Intell Nat Lang Commun Comput Inf Sci 789:54–67. https://doi.org/10.1007/978-3-319-71746-3_5
– reference: Wang T, Wen CK, Wang H, Jiang F, Jin S (2017) Deep learning for wireless physical layer: opportunities and challenges. China Commun 14(11):92–111. https://doi.org/10.1109/CC.2017.8233654
– reference: Zhu XX, Tuia D, Mou L, Xia GS, Zhang L, Xu F, Fraundorfer F (2017) Deep learning in remote sensing: a comprehensive review and list of resources. IEEE Geosci Remote Sens Mag 5(4):8–36. https://doi.org/10.1109/MGRS.2017.2762307
– reference: Azar MY, Hamey L (2017) Text summarization using unsupervised deep learning. Expert Syst Appl 68:93–105. https://doi.org/10.1016/j.eswa.2016.10.017
– reference: Nguyen HD, Le AD, Nakagawa M (2015) Deep neural networks for recognizing online handwritten mathematical symbols. In: Proceedings of the 3rd IAPR IEEE Asian conference on pattern recognition (ACPR), pp 121–125
– reference: Zhou X, Gong W, Fu W, Du F (2017) Application of deep learning in object detection. In: Proceedings of the IEEE/ACIS 16th international conference on computer and information science (ICIS), pp 631–634
– reference: Xing L, Qiao Y (2016) DeepWriter: a multi-stream deep CNN for text-independent writer identification. Comput Vis Pattern Recognit. arXiv:1606.06472
– reference: Krishnan P, Dutta K, Jawahar CV (2018) Word spotting and recognition using deep embedding. In: Proceedings of 13th IAPR international workshop on document analysis systems (DAS). https://doi.org/10.1109/das.2018.70
– reference: Looks M, Herreshoff M, Hutchins D, Norvig P (2017) Deep learning with dynamic computation graphs. In: Proceedings of the international conference on learning representation, pp 1–12
– reference: Amato G, Carrara F, Falchi F, Gennaro C, Meghini C, Vairo C (2017) Deep learning for decentralized parking lot occupancy detection. Expert Syst Appl 72:327–334. https://doi.org/10.1016/j.eswa.2016.10.055
– reference: Makhmudov AZ, Abdukarimov SS (2016) Speech recognition using deep learning algorithms. In: Proceedings of the international conference on informatics: problems, methodology, technologies, pp 10–15
– reference: Zulkarneev M, Grigoryan R, Shamraev N (2013) Acoustic modeling with deep belief networks for Russian speech recognition. In: Proceedings of the international conference on speech and computer, pp 17–24
– reference: Dhieb T, Ouarda W, Boubaker H, Alilmi AM (2016) Deep neural network for online writer identification using Beta-elliptic model. In: Proceedings of the international joint conference on neural networks, pp 1863–1870
– reference: Zhang L, Zhang L, Du B (2016) Deep learning for remote sensing data: a technical tutorial on the state of the art. IEEE Geosci Remote Sens Mag 4(2):22–40. https://doi.org/10.1109/MGRS.2016.2540798
– reference: Sudholt S, Fink GA (2017) Attribute CNNs for word spotting in handwritten documents. Int J Doc Anal Recognit (IJDAR). https://doi.org/10.1007/s10032-018-0295-0
– reference: Salazar F, Toledo MA, González JM, Oñate E (2017) Early detection of anomalies in dam performance: a methodology based on boosted regression trees. Struct Control Health Monit 24(11):e2012. https://doi.org/10.1002/stc.2012
– reference: LeCun Y, Bengio Y, Hinton G (2015) Deep learning. Nature 521:436–444. https://doi.org/10.1038/nature14539
– reference: Ignatov A (2018) Real-time human activity recognition from accelerometer data using convolutional neural networks. Appl Soft Comput 62:915–922. https://doi.org/10.1016/j.asoc.2017.09.027
– reference: Wicht B, Fischer A, Hennebert J (2016) Deep learning features for handwritten keyword spotting. In: Proceedings of the 23rd international conference on pattern recognition (ICPR). https://doi.org/10.1109/icpr.2016.7900165
– reference: Ghosh MMA, Maghari AY (2017) A comparative study on handwriting digit recognition using neural networks. In: Proceedings of the promising electronic technologies (ICPET), pp 77–81
– reference: Serizel R, Giuliani D (2016) Deep-neural network approaches for speech recognition with heterogeneous groups of speakers including children. Nat Lang Eng 1(3):1–26
– reference: Abbas Q, Ibrahim MEA, Jaffar MA (2018) A comprehensive review of recent advances on deep vision systems. Artif Intell Rev. https://doi.org/10.1007/s10462-018-9633-3
– reference: Arevalo A, Niño J, Hernández G, Sandoval J (2016) High-frequency trading strategy based on deep neural networks. In: Proceedings of the international conference on intelligent computing, pp 424–436
– reference: Roy P, Bhunia AK, Das A, Dey P (2016) HMM-based Indic handwritten word recognition using zone segmentation. Pattern Recognit 60:1057–1075. https://doi.org/10.1016/j.patcog.2016.04.012
– reference: Razzak MI, Naz S, Zaib A (2018) Deep learning for medical image processing: overview, challenges and the future. Classif BioApps Lect Notes Comput Vis Biomech 26:323–350
– reference: Salazar F, Moran R, Toledo MA, Oñate E (2017) Data-based models for the prediction of dam behaviour: a review and some methodological considerations. Arch Comput Methods Eng 24(1):1–21. https://doi.org/10.1007/s11831-015-9157-9
– reference: Rudin F, Li GJ, Wang K (2017) An algorithm for power system fault analysis based on convolutional deep learning neural networks. Int J Res Educ Sci Methods 5(9):11–18
– reference: Liu PH, Su SF, Chen MC, Hsiao CC (2015) Deep learning and its application to general image classification. In: Proceedings of the international conference on informatics and cybernetics for computational social systems, pp 1–4
– reference: Ling Y, Jin C, Guoru D, Ya T, Jian Y, Jiachen S (2018) Spectrum prediction based on Taguchi method in deep learning with long short-term memory. IEEE Access 6:45923–45933
– reference: Affonso C, Rossi ALD, Vieria FHA, Carvalho ACPDLF (2017) Deep learning for biological image classification. Expert Syst Appl 85:114–122. https://doi.org/10.1016/j.eswa.2017.05.039
– reference: Ota K, Dao MS, Mezaris V, Natale FGBD (2017) Deep learning for mobile multimedia: a survey. ACM Trans Multimed Comput Commun Appl (TOMM) 13(3s):34
– reference: Jia X (2017) Image recognition method based on deep learning. In: Proceedings of the 29th IEEE Chinese control and decision conference (CCDC), pp 4730–4735
– reference: Yuan X, Xie L, Abouelenien M (2018) A regularized ensemble framework of deep learning for cancer detection from multi-class, imbalanced training data. Pattern Recogn 77:160–172. https://doi.org/10.1016/j.patcog.2017.12.017
– reference: Chen CH, Lee CR, Lu WCH (2016) A mobile cloud framework for deep learning and its application to smart car camera. In: Proceedings of the international conference on internet of vehicles, pp 14–25. https://doi.org/10.1007/978-3-319-51969-22
– reference: Abadi M, Paul B, Jianmin C, Zhifeng C, Andy D, Jeffrey D, Matthieu D (2016) Tensorflow: a system for large-scale machine learning. In: The proceedings of the 12th USENIX symposium on operating systems design and implementation (OSDI’16), vol 16, pp 265–283
– reference: Ling ZH, Kang SY, Zen H, Senior A, Schuster M, Qian XJ, Meng HM, Deng L (2015) Deep learning for acoustic modeling in parametric speech generation: a systematic review of existing techniques and future trends. IEEE Signal Process Mag 32(3):35–52. https://doi.org/10.1109/MSP.2014.2359987
– reference: Chu J, Srihari S (2014) Writer identification using a deep neural network. In: Proceedings of the 2014 Indian conference on computer vision graphics and image processing, pp 1–7
– reference: Puthussery AR, Haradi KP, Erol BA, Benavidez P, Rad P, Jamshidi M (2017) A deep vision landmark framework for robot navigation. In: Proceedings of the system of systems engineering conference, pp 1–6
– reference: Chandra B, Sharma RK (2016) Deep learning with adaptive learning rate using Laplacian score. Expert Syst Appl 63(C):1–7
– reference: Cireşan D, Meier U, Schmidhuber J (2012) Multi-column deep neural networks for image classification. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 3642–3649
– reference: Yadav U, Verma S, Xaxa DK, Mahobiya C (2017) A deep learning based character recognition system from multimedia document. In: Proceedings of the international conference on innovations in power and advanced computing technologies, pp 1–7
– reference: Yu X, Wu X, Luo C, Ren P (2017) Deep learning in remote sensing scene classification: a data augmentation enhanced convolutional neural network framework. GIScience & Remote Sens: 1–19
– reference: Loh BCS, Then PHH (2017) Deep learning for cardiac computer-aided diagnosis: benefits, issues & solutions. mHealth. https://doi.org/10.21037/mhealth.2017.09.01
– reference: Kaushal M, Khehra BS, Sharma A (2018) Soft computing based object detection and tracking approaches: state-of-the-art survey. Appl Soft Comput 70:423–464. https://doi.org/10.1016/j.asoc.2018.05.023
– reference: Gheisari M, Wang G, Bhuiyan MZA (2017) A survey on deep learning in big data. In: Proceedings of the IEEE international conference on embedded and ubiquitous computing (EUC), pp 1–8
– reference: Zhong SH, Li Y, Le B (2015) Query oriented unsupervised multi document summarization via deep learning. Expert Syst Appl, pp 1–10
– reference: Nweke HF, Teh YW, Al-garadi MA, Alo UR (2018) Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: state of the art and research challenges. Expert Syst Appl: 1–87
– reference: Salazar F, Toledo MA, Oñate E, Suárez B (2016) Interpretation of dam deformation and leakage with boosted regression trees. Eng Struct 119:230–251. https://doi.org/10.1016/j.engstruct.2016.04.012
– reference: Salazar F, Oñate E, Toledo MA (2017a) A machine learning based methodology for anomaly detection in dam behaviour, CIMNE, monograph no M170, 250 pp, Barcelona
StartPage 1071
SubjectTerms Algorithms
Computer vision
Deep learning
Engineering
Face recognition
Handwriting recognition
Image classification
Machine learning
Mathematical and Computational Engineering
Natural language processing
Object recognition
Original Paper
Speech recognition
Title A Survey of Deep Learning and Its Applications: A New Paradigm to Machine Learning
URI https://link.springer.com/article/10.1007/s11831-019-09344-w
https://www.proquest.com/docview/2436156874
Volume 27