Fast Neural Style Transfer for Motion Data

Published in: IEEE Computer Graphics and Applications, Volume 37, Issue 4, pp. 42-49
Main authors: Holden, Daniel; Habibie, Ikhsanul; Kusajima, Ikuo; Komura, Taku
Medium: Magazine Article
Language: English
Publication details: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), United States, 2017
ISSN: 0272-1716; EISSN: 1558-1756
Abstract Automating motion style transfer can help save animators time by allowing them to produce a single set of motions, which can then be automatically adapted for use with different characters. The proposed fast, efficient technique for performing neural style transfer of human motion data uses a feed-forward neural network trained on a large motion database. The proposed framework can transform the style of motion thousands of times faster than previous approaches that use optimization.
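To illustrate the idea described in the abstract, the following is a minimal sketch, assuming a PyTorch setup: a feed-forward 1D convolutional network that maps a motion clip, represented as per-frame joint features over time, directly to a stylized clip, so that stylization at run time is a single forward pass rather than a per-clip optimization. This is not the authors' published implementation; the class name MotionTransformNet, all layer sizes, and the Gram-matrix style loss are assumptions made for illustration only.

# Minimal illustrative sketch (not the authors' published implementation):
# a feed-forward network that stylizes a motion clip in a single forward pass.
# The class name, layer sizes, and the Gram-matrix style loss are assumptions.
import torch
import torch.nn as nn

class MotionTransformNet(nn.Module):
    """Maps a motion clip of shape (batch, joint_features, frames) to a stylized clip."""
    def __init__(self, channels: int = 66, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=15, padding=7),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=15, padding=7),
            nn.ReLU(),
            nn.Conv1d(hidden, channels, kernel_size=15, padding=7),
        )

    def forward(self, motion: torch.Tensor) -> torch.Tensor:
        # Predict a residual so the output stays close to the input content motion.
        return motion + self.net(motion)

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    # Time-averaged feature correlations, a common stand-in for "style" during training.
    b, c, t = features.shape
    return torch.bmm(features, features.transpose(1, 2)) / t

if __name__ == "__main__":
    model = MotionTransformNet()
    content = torch.randn(1, 66, 240)   # e.g. 66 joint features over 240 frames
    stylized = model(content)           # one forward pass, same shape as input
    print(stylized.shape)

At training time, a network like this would typically be fit on a large motion database by minimizing a content-reconstruction term plus a style term based on Gram matrices of hidden features; once trained, each new clip is stylized by a single forward pass, which is where the reported speedup over optimization-based approaches comes from.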
Author Holden, Daniel; Habibie, Ikhsanul; Kusajima, Ikuo; Komura, Taku
Author_xml – sequence: 1; fullname: Holden, Daniel; email: s0822954@sms.ed.ac.uk; organization: Univ. of Edinburgh, Edinburgh, UK
– sequence: 2; fullname: Habibie, Ikhsanul; email: abie.ikhsan@gmail.com; organization: Univ. of Edinburgh, Edinburgh, UK
– sequence: 3; fullname: Kusajima, Ikuo; email: kusajima@ynl.t.u-tokyo.ac.jp; organization: Univ. of Tokyo, Tokyo, Japan
– sequence: 4; fullname: Komura, Taku; email: tkomura@ed.ac.uk; organization: Univ. of Edinburgh, Edinburgh, UK
CODEN ICGADZ
ContentType Magazine Article
Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2017
DOI 10.1109/MCG.2017.3271464
Discipline Engineering
EISSN 1558-1756
EndPage 49
ExternalDocumentID 28829292
10_1109_MCG_2017_3271464
8013475
Genre orig-research
Journal Article
ISICitedReferencesCount 71
ISSN 0272-1716
1558-1756
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
PMID 28829292
PQID 1930889292
PQPubID 85490
PageCount 8
PublicationCentury 2000
PublicationDate 2017
PublicationDecade 2010
PublicationPlace United States (Los Alamitos)
PublicationTitle IEEE computer graphics and applications
PublicationTitleAbbrev CG-M
PublicationTitleAlternate IEEE Comput Graph Appl
PublicationYear 2017
Publisher IEEE (The Institute of Electrical and Electronics Engineers, Inc.)
StartPage 42
SubjectTerms Analytical models
Animation
computer graphics
Data models
Data transfer (computers)
deep learning
Human motion
machine learning
motion capture
Motion control
Neural networks
Optimization
style transfer
Training
Virtual reality
Title Fast Neural Style Transfer for Motion Data
URI https://ieeexplore.ieee.org/document/8013475
https://www.ncbi.nlm.nih.gov/pubmed/28829292
https://www.proquest.com/docview/1930889292
https://www.proquest.com/docview/1931243098
Volume 37