TENT: Technique-Embedded Note Tracking for Real-World Guitar Solo Recordings.

Bibliographic Details
Title: TENT: Technique-Embedded Note Tracking for Real-World Guitar Solo Recordings.
Authors: Ting-Wei Su, Yuan-Ping Chen, Li Su, Yi-Hsuan Yang
Source: International Society for Music Information Retrieval Conference Proceedings; 2019, pp. 15-28 (14 pages)
Subject Terms: MUSICAL analysis, GUITAR playing, SIGNAL processing, MUSICAL performance
Abstract: The use of playing techniques such as string bending and vibrato in electric guitar performance makes it difficult to transcribe note events with general note tracking methods. These methods analyze the fundamental frequency contour computed from a given audio signal, but they do not account for the variation in the contour caused by the playing techniques. To address this issue, we present a model called technique-embedded note tracking (TENT) that uses the result of playing technique detection to inform note event estimation. We evaluate the proposed model on a dataset of 42 unaccompanied lead guitar phrases. Our experiments show that TENT can recognize complicated playing techniques in monophonic guitar solos and improves the F-score of note event estimation by 14.7% compared to an existing method. For reproducibility, we share the Python source code of our implementation of TENT at the following GitHub repo: https://github.com/srviest/SoloLa. [ABSTRACT FROM AUTHOR]
Copyright of International Society for Music Information Retrieval Conference Proceedings is the property of Ubiquity Press and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
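Illustrative sketch: the following minimal Python example is not the authors' TENT implementation (see the GitHub repository cited in the abstract); it only sketches the general idea of technique-informed note tracking described there, i.e., using frame-level playing-technique labels so that bends and vibrato do not split an f0 contour into spurious note events. The function name, label values, and threshold below are hypothetical.

import numpy as np

def segment_notes(f0_midi, technique, split_threshold=0.5):
    """Group frames of an f0 contour (in MIDI pitch) into note events.

    f0_midi:         1-D NumPy array of per-frame pitch estimates in MIDI
                     numbers, with np.nan for unvoiced frames.
    technique:       per-frame labels, e.g. "none", "bend", "vibrato".
    split_threshold: pitch jump (in semitones) that starts a new note when
                     no expressive technique is active at that frame.
    Returns a list of (onset_frame, offset_frame, median_pitch) tuples.
    """
    notes = []
    start = None
    for i, pitch in enumerate(f0_midi):
        if np.isnan(pitch):
            # An unvoiced frame closes any open note.
            if start is not None:
                notes.append((start, i, float(np.median(f0_midi[start:i]))))
                start = None
            continue
        if start is None:
            start = i  # open a new note
            continue
        jump = abs(pitch - f0_midi[i - 1])
        # Split only when the pitch jump is large AND the frame is not inside
        # a detected technique region: this is where the technique detector
        # keeps bends and vibrato from being chopped into short false notes.
        if jump > split_threshold and technique[i] == "none":
            notes.append((start, i, float(np.median(f0_midi[start:i]))))
            start = i
    if start is not None:
        notes.append((start, len(f0_midi), float(np.median(f0_midi[start:]))))
    return notes

# Example: a 2-semitone bend that a purely contour-based splitter might
# break into several notes is kept as a single note event here.
f0 = np.array([64.0, 64.0, 64.5, 65.2, 65.9, 66.0, 66.0, np.nan])
tech = ["none", "none", "bend", "bend", "bend", "none", "none", "none"]
print(segment_notes(f0, tech))  # [(0, 7, 65.2)] -> one note over frames 0-7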
Database: Complementary Index
Publication Type: Conference
DOI: 10.5334/tismir.23
Language: English
Accession Number: 139146746