Attending From Foresight: A Novel Attention Mechanism for Neural Machine Translation
Saved in:

| Published in: | IEEE/ACM Transactions on Audio, Speech, and Language Processing, Volume 29, pp. 2606-2616 |
|---|---|
| Main authors: | Li, Xintong; Liu, Lemao; Tu, Zhaopeng; Li, Guanlin; Shi, Shuming; Meng, Max Q.-H. |
| Format: | Journal Article |
| Language: | English |
| Published: | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2021 |
| Subjects: | Artificial intelligence; Attention; Context modeling; Decoding; Machine translation; Natural language processing; NMT; Predictive models; Recurrent neural networks; Task analysis; Unemployment; Word Alignment; Words (language) |
| ISSN: | 2329-9290, 2329-9304 |
| Online access: | Get full text |
| Abstract | Machine translation (MT) is an essential task in natural language processing and, more broadly, in artificial intelligence. Statistical machine translation was the dominant approach to MT for decades, but neural machine translation has recently attracted increasing interest because of its appealing model architecture and impressive translation performance. In neural machine translation, an attention model is used to identify the source words aligned to the next target word, i.e., the target foresight word, in order to select the translation context. However, the attention model makes no use of any information about this target foresight word. Previous work proposed an approach that improves the attention model by explicitly accessing the target foresight word, and demonstrated substantial improvements on word alignment tasks. However, that approach cannot be applied to machine translation tasks, where the target foresight word is unavailable. This paper proposes several novel enhanced attention models that introduce hidden information (such as the part-of-speech tag) of the target foresight word into the translation task. We incorporate the enhanced attention, which employs hidden information about the target foresight word, into both recurrent and self-attention-based neural translation models, and we theoretically justify that such hidden information can make translation prediction easier. Empirical experiments on four datasets further verify that the proposed attention models deliver significant improvements in translation quality. |
|---|---|
| Author | Li, Xintong; Liu, Lemao; Tu, Zhaopeng; Li, Guanlin; Shi, Shuming; Meng, Max Q.-H. |
| Author_xml | 1. Li, Xintong (znculee@gmail.com, ORCID 0000-0001-9303-1110), Linguistics, Ohio State University, Columbus, Ohio, USA; 2. Liu, Lemao (lemaoliu@gmail.com), Tencent AI Lab, Shenzhen, Guangdong, China; 3. Tu, Zhaopeng (tuzhaopeng@gmail.com), Tencent AI Lab, Shenzhen, Guangdong, China; 4. Li, Guanlin (epsilonlee.green@gmail.com, ORCID 0000-0003-3142-5928), Tencent AI Lab, Shenzhen, Guangdong, China; 5. Shi, Shuming (shumingshi@tencent.com), Tencent AI Lab, Shenzhen, Guangdong, China; 6. Meng, Max Q.-H. (max.meng@cuhk.edu.hk, ORCID 0000-0002-5255-5898), Electronic Engineering, Chinese University of Hong Kong, Hong Kong |
| CODEN | ITASD8 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
| DOI | 10.1109/TASLP.2021.3097939 |
| Discipline | Engineering |
| EISSN | 2329-9304 |
| EndPage | 2616 |
| Genre | orig-research |
| ISICitedReferencesCount | 9 |
| ISSN | 2329-9290 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0003-3142-5928 0000-0001-9303-1110 0000-0002-5255-5898 |
| PageCount | 11 |
| PublicationPlace | Piscataway |
| PublicationTitle | IEEE/ACM transactions on audio, speech, and language processing |
| PublicationTitleAbbrev | TASLP |
| PublicationYear | 2021 |
| Publisher | IEEE; The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 2606 |
| SubjectTerms | Artificial intelligence; Attention; Context modeling; Decoding; Machine translation; Natural language processing; NMT; Predictive models; Recurrent neural networks; Task analysis; Unemployment; Word Alignment; Words (language) |
| Title | Attending From Foresight: A Novel Attention Mechanism for Neural Machine Translation |
| URI | https://ieeexplore.ieee.org/document/9490355 https://www.proquest.com/docview/2562323206 |
| Volume | 29 |
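The abstract sketches a mechanism that is easy to state concretely: before emitting the next target word, the decoder predicts a hidden label (such as a part-of-speech tag) for that still-unseen foresight word and feeds it into the attention query. It also appeals to a standard information-theoretic fact: conditioning on an extra variable z_t cannot increase conditional entropy, i.e. H(y_t | c_t, z_t) <= H(y_t | c_t), so informative foresight features can only make the prediction easier. The following is a minimal sketch of that idea, not the authors' implementation: the additive (Bahdanau-style) scoring function, the soft POS predictor, and all names and dimensions are illustrative assumptions.

```python
# Minimal sketch of foresight-aware attention (illustrative assumptions only):
# the attention query is the decoder state concatenated with an *expected*
# embedding of a predicted hidden label (here, a POS tag) of the next word.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d, n_src, n_pos = 8, 5, 4            # hidden size, source length, POS tagset size

H_src = rng.normal(size=(n_src, d))  # encoder states h_1 .. h_n
s_t   = rng.normal(size=d)           # decoder state before emitting word y_t

# 1) Predict a distribution over POS tags of the unseen foresight word.
W_pos = rng.normal(size=(d, n_pos))
p_pos = softmax(s_t @ W_pos)         # P(z_t | s_t)

# 2) Form an expected POS embedding and append it to the attention query,
#    keeping everything differentiable even though y_t is unknown.
E_pos = rng.normal(size=(n_pos, d))
z_t   = p_pos @ E_pos                # soft foresight feature
query = np.concatenate([s_t, z_t])   # enriched query

# 3) Additive (Bahdanau-style) attention with the enriched query.
W_q = rng.normal(size=(2 * d, d))
W_k = rng.normal(size=(d, d))
v   = rng.normal(size=d)
scores  = np.tanh(query @ W_q + H_src @ W_k) @ v
alpha   = softmax(scores)            # alignment weights over source words
context = alpha @ H_src              # context vector used to predict y_t

print("attention weights:", np.round(alpha, 3))
```

Predicting a tag distribution and taking its expected embedding, rather than committing to a single tag, is one plausible way to use foresight information when the true next word is unavailable at decoding time; it keeps the model differentiable end to end.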