Evaluation in Neural Style Transfer: A Review

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 43, No. 6
Main Authors: Ioannou, Eleftherios; Maddock, Steve
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 1 September 2024
ISSN: 0167-7055; EISSN: 1467-8659
Abstract
The field of neural style transfer (NST) has witnessed remarkable progress in the past few years, with approaches able to synthesize artistic and photorealistic images and videos of exceptional quality. To evaluate such results, a diverse landscape of evaluation methods and metrics is used, including authors' opinions based on side-by-side comparisons, human evaluation studies that quantify the subjective judgements of participants, and a multitude of quantitative computational metrics which objectively assess the different aspects of an algorithm's performance. However, there is no consensus regarding the most suitable and effective evaluation procedure that can guarantee the reliability of the results. In this review, we provide an in-depth analysis of existing evaluation techniques, identify the inconsistencies and limitations of current evaluation methods, and give recommendations for standardized evaluation practices. We believe that the development of a robust evaluation framework will not only enable more meaningful and fairer comparisons among NST methods but will also enhance the comprehension and interpretation of research findings in the field. This review examines the different evaluation techniques in neural style transfer, provides an in-depth analysis of existing techniques, and gives recommendations for standardized evaluation practices.
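Among the quantitative computational metrics the abstract refers to, a common family in NST compares second-order feature statistics via Gram matrices (in the spirit of Gatys et al.'s style loss). The following is a minimal illustrative NumPy sketch only: it operates on arbitrary (C, H, W) feature arrays rather than real VGG activations, and the function names are hypothetical, not from the reviewed paper.

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """Channel-wise Gram matrix of a (C, H, W) feature map, normalised by H*W."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (h * w)

def style_distance(stylized_feats: np.ndarray, style_feats: np.ndarray) -> float:
    """Mean squared difference between Gram matrices (lower = closer style statistics)."""
    g1 = gram_matrix(stylized_feats)
    g2 = gram_matrix(style_feats)
    return float(np.mean((g1 - g2) ** 2))

# Toy feature maps standing in for network activations.
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8, 8))
other = rng.standard_normal((4, 8, 8))

identical = style_distance(feats, feats)
different = style_distance(feats, other)
```

In practice such distances are computed over activations of a pretrained network at several layers and averaged; the review's point is that scores like this capture only one facet of quality and are typically complemented by human studies.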
Authors
– Eleftherios Ioannou (ORCID: 0000-0003-3892-2492), The University of Sheffield, eioannou1@sheffield.ac.uk
– Steve Maddock (ORCID: 0000-0003-3179-0263), The University of Sheffield, s.maddock@sheffield.ac.uk
Copyright: © 2024 The Author(s). Published by Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd. This article is published under the Creative Commons Attribution 4.0 License (http://creativecommons.org/licenses/by/4.0/).
DOI: 10.1111/cgf.15165
Funding: Engineering and Physical Sciences Research Council (EP/R513313/1)
Pages: 26
– ident: e_1_2_9_94_2
– ident: e_1_2_9_29_2
– ident: e_1_2_9_61_2
  doi: 10.1007/978-3-030-01267-0_11
– ident: e_1_2_9_80_2
– ident: e_1_2_9_19_2
  doi: 10.1109/CVPR.2017.296
– ident: e_1_2_9_118_2
  doi: 10.1007/978-3-031-19787-1_11
– ident: e_1_2_9_114_2
  doi: 10.1007/978-1-4612-4380-9_16
– ident: e_1_2_9_50_2
– ident: e_1_2_9_54_2
  doi: 10.1109/ICCV.2019.00452
– ident: e_1_2_9_123_2
  doi: 10.1109/WACV48630.2021.00113
– ident: e_1_2_9_47_2
  doi: 10.1145/3306346.3323006
– ident: e_1_2_9_62_2
– ident: e_1_2_9_104_2
– ident: e_1_2_9_107_2
  doi: 10.1109/TIP.2018.2831899
– ident: e_1_2_9_89_2
– ident: e_1_2_9_31_2
  doi: 10.1109/CVPR.2017.397
– ident: e_1_2_9_75_2
  doi: 10.1037/h0026141
– ident: e_1_2_9_76_2
  doi: 10.1109/TMM.2021.3063605
– ident: e_1_2_9_92_2
  doi: 10.1007/s11263-018-1089-z
– ident: e_1_2_9_16_2
– ident: e_1_2_9_101_2
  doi: 10.1007/978-3-030-01237-3_43
– ident: e_1_2_9_32_2
– volume: 30
  year: 2017
  ident: e_1_2_9_41_2
  article-title: GANs trained by a two time‐scale update rule converge to a local nash equilibrium
  publication-title: Advances in Neural Information Processing Systems
– ident: e_1_2_9_120_2
  doi: 10.1109/CVPR52729.2023.00972
– ident: e_1_2_9_57_2
  doi: 10.1109/CVPR.2017.622
– ident: e_1_2_9_21_2
– ident: e_1_2_9_78_2
  doi: 10.3390/jimaging8110310
– ident: e_1_2_9_33_2
  doi: 10.1007/978-3-030-20876-9_40
– ident: e_1_2_9_82_2
  doi: 10.1016/j.cag.2017.05.025
– ident: e_1_2_9_69_2
  doi: 10.1109/CVPR.2016.272
– ident: e_1_2_9_102_2
  doi: 10.1080/10447318.2022.2049081
– ident: e_1_2_9_56_2
  doi: 10.1109/ICCVW.2019.00396
– ident: e_1_2_9_45_2
  doi: 10.3390/computers12040069
– ident: e_1_2_9_58_2
  doi: 10.1145/3092919.3092924
– volume-title: Statistical Power Analysis for the Behavioral Sciences
  year: 1988
  ident: e_1_2_9_15_2
– ident: e_1_2_9_97_2
– ident: e_1_2_9_71_2
  doi: 10.24963/ijcai.2022/687
– ident: e_1_2_9_23_2
  doi: 10.1109/CVPR52688.2022.01104
– ident: e_1_2_9_42_2
  doi: 10.1109/CVPR.2017.745
– ident: e_1_2_9_79_2
– ident: e_1_2_9_55_2
  doi: 10.5244/C.28.122
– ident: e_1_2_9_103_2
  doi: 10.1109/CVPR.2018.00841
– ident: e_1_2_9_35_2
  doi: 10.1109/WACV45572.2020.9093420
– ident: e_1_2_9_25_2
  doi: 10.1017/CBO9780511761676
– ident: e_1_2_9_48_2
  doi: 10.1109/CVPR.2013.271
– ident: e_1_2_9_36_2
  doi: 10.1109/CVPR52729.2023.00576
– ident: e_1_2_9_6_2
  doi: 10.1561/0600000106
– ident: e_1_2_9_96_2
  doi: 10.1007/978-3-319-46475-6_7
– ident: e_1_2_9_13_2
  doi: 10.1109/TIP.2019.2936746
– ident: e_1_2_9_110_2
– ident: e_1_2_9_64_2
  doi: 10.1109/CVPR.2019.00393
– ident: e_1_2_9_113_2
  doi: 10.1109/ICCV.2017.136
– ident: e_1_2_9_5_2
  doi: 10.1609/aaai.v34i07.6614
– ident: e_1_2_9_38_2
  doi: 10.1145/3394171.3413853
– ident: e_1_2_9_85_2
  doi: 10.1109/WACV56688.2023.00041
– ident: e_1_2_9_91_2
  doi: 10.1007/978-3-319-45886-1_3
– ident: e_1_2_9_109_2
  doi: 10.1109/CVPR.2017.437
– ident: e_1_2_9_52_2
  doi: 10.1007/s10618-008-0114-1
– ident: e_1_2_9_77_2
  doi: 10.1016/j.aiopen.2023.08.012
SubjectTerms Algorithms
image and video processing
Image quality
Performance evaluation
rendering; non‐photorealistic rendering
Title Evaluation in Neural Style Transfer: A Review
URI https://onlinelibrary.wiley.com/doi/abs/10.1111%2Fcgf.15165
https://www.proquest.com/docview/3108773700
Volume 43
WOSCitedRecordID wos001281438900001