Bidirectional LSTM-RNN-based hybrid deep learning frameworks for univariate time series classification
| Published in: | The Journal of supercomputing, Volume 77, Issue 7, pp. 7021-7045 |
|---|---|
| Main authors: | Khan, Mehak; Wang, Hongzhi; Riaz, Adnan; Elfatyany, Aya; Karim, Sajida |
| Format: | Journal Article |
| Language: | English |
| Publication details: | New York: Springer US, 01.07.2021 (Springer Nature B.V) |
| Subject: | |
| ISSN: | 0920-8542, 1573-0484 |
| Abstract | Time series classification (TSC) has been a significant research problem for industry practitioners and academic researchers for several decades. Due to the rapid growth of temporal data across a wide range of disciplines, a large number of algorithms have been proposed. This paper proposes robust approaches based on state-of-the-art techniques: the bidirectional long short-term memory (BiLSTM) network, the fully convolutional network (FCN), and the attention mechanism. A BiLSTM captures both forward and backward dependencies, while an FCN has proven to be a strong feature extractor and TSC baseline. We therefore combine BiLSTM and FCN in a hybrid deep learning architecture, BiLSTM-FCN. Moreover, we explore the use of the attention mechanism on BiLSTM-FCN and propose a second model, ABiLSTM-FCN. We validate performance on 85 datasets from the University of California Riverside (UCR) univariate time series archive. The proposed models are evaluated in terms of classification test error and F1-score, and we also compare them with various existing state-of-the-art techniques. The experimental results show that the proposed models perform comprehensively better than existing state-of-the-art methods and baselines. |
|---|---|
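The abstract describes a two-branch hybrid: a bidirectional LSTM that reads the series forward and backward, and an FCN branch that extracts convolutional features, with the two outputs merged for classification. The following is a minimal sketch of such a BiLSTM-FCN classifier, assuming Keras/TensorFlow 2.x; the layer sizes, dropout rate, and the 128/256/128 filters with kernel widths 8/5/3 follow the common LSTM-FCN design in the cited literature, not necessarily the authors' exact configuration, and the attention variant (ABiLSTM-FCN) is omitted.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_bilstm_fcn(seq_len, n_classes, lstm_units=8):
    """Two-branch BiLSTM + FCN classifier for a univariate series of length seq_len."""
    inp = layers.Input(shape=(seq_len, 1))

    # Recurrent branch: a bidirectional LSTM summarises forward and backward dependencies.
    rnn = layers.Bidirectional(layers.LSTM(lstm_units))(inp)
    rnn = layers.Dropout(0.8)(rnn)

    # Convolutional branch (FCN): three Conv1D + BatchNorm + ReLU blocks, then global pooling.
    conv = inp
    for filters, kernel in [(128, 8), (256, 5), (128, 3)]:
        conv = layers.Conv1D(filters, kernel, padding="same")(conv)
        conv = layers.BatchNormalization()(conv)
        conv = layers.Activation("relu")(conv)
    conv = layers.GlobalAveragePooling1D()(conv)

    # Merge both branches and classify with a softmax layer.
    out = layers.concatenate([rnn, conv])
    out = layers.Dense(n_classes, activation="softmax")(out)

    model = models.Model(inp, out)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Example: a model for a hypothetical UCR-style dataset with series of length 140 and 5 classes.
model = build_bilstm_fcn(seq_len=140, n_classes=5)
model.summary()
```

On a UCR dataset, the reported classification test error corresponds to one minus accuracy on the archive's predefined test split, and the F1-score can be computed per class and averaged, for example with scikit-learn's f1_score.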
| Author | Elfatyany, Aya; Riaz, Adnan; Khan, Mehak; Karim, Sajida; Wang, Hongzhi |
| Author details | 1. Khan, Mehak (ORCID 0000-0001-8959-6872; mehakkhan@hit.edu.cn), School of Computer Science and Technology, Harbin Institute of Technology; 2. Wang, Hongzhi, School of Computer Science and Technology, Harbin Institute of Technology; 3. Riaz, Adnan, School of Computer Science and Technology, Dalian University of Technology; 4. Elfatyany, Aya, School of Computer Science and Technology, Harbin Institute of Technology; 5. Karim, Sajida, School of Computer Science and Technology, Harbin Institute of Technology |
| ContentType | Journal Article |
| Copyright | The Author(s), under exclusive licence to Springer Science+Business Media, LLC part of Springer Nature 2021. |
| DOI | 10.1007/s11227-020-03560-z |
| Discipline | Computer Science |
| EISSN | 1573-0484 |
| EndPage | 7045 |
| GrantInformation | National Natural Science Foundation of China (NSFC), grant IDs U1509216, U1866602, 61602129; and Microsoft Research Asia |
| ISICitedReferencesCount | 53 |
| ISSN | 0920-8542 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 7 |
| Keywords | Deep learning; Attention mechanism; Bidirectional long short-term memory recurrent neural network; Time series classification; Convolutional neural network |
| Language | English |
| ORCID | 0000-0001-8959-6872 |
| PageCount | 25 |
| PublicationDate | 2021-07-01 |
| PublicationPlace | New York |
| PublicationSubtitle | An International Journal of High-Performance Computer Design, Analysis, and Use |
| PublicationTitle | The Journal of supercomputing |
| PublicationTitleAbbrev | J Supercomput |
| PublicationYear | 2021 |
| Publisher | Springer US; Springer Nature B.V |
| StartPage | 7021 |
| SubjectTerms | Algorithms; Art techniques; Classification; Compilers; Computer Science; Deep learning; Feature extraction; Interpreters; Machine learning; Processor Architectures; Programming Languages; Time series |
| Title | Bidirectional LSTM-RNN-based hybrid deep learning frameworks for univariate time series classification |
| URI | https://link.springer.com/article/10.1007/s11227-020-03560-z https://www.proquest.com/docview/2543741852 |
| Volume | 77 |