Visual-Tactile Fusion for Object Recognition

Bibliographic Details

Published in: IEEE Transactions on Automation Science and Engineering, Vol. 14, No. 2, pp. 996-1008
Main authors: Huaping Liu, Yuanlong Yu, Fuchun Sun, Jason Gu
Format: Journal Article
Language: English
Published: IEEE, 1 April 2017
ISSN: 1545-5955 (print), 1558-3783 (electronic)
Online access: Full text
Abstract The camera provides rich visual information about objects and has become one of the most widely used sensors in the automation community. However, vision alone is often inapplicable when objects are not visually distinguishable. Tactile sensors, on the other hand, can capture multiple object properties, such as texture, roughness, spatial features, compliance, and friction, and therefore provide another important modality for perception. Nevertheless, effectively combining the visual and tactile modalities remains a challenging problem. In this paper, we develop a visual-tactile fusion framework for object recognition. The tactile sequence is represented with a multivariate-time-series model, and the image is characterized by a covariance descriptor. Further, we design a joint group kernel sparse coding (JGKSC) method to tackle the intrinsically weak pairing problem in visual-tactile data samples. Finally, we develop a visual-tactile data set composed of 18 household objects for validation. The experimental results show that considering both visual and tactile inputs is beneficial and that the proposed method provides an effective fusion strategy.
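The covariance descriptor mentioned in the abstract is a standard region-covariance construction (the paper cites Tuzel et al.'s region covariance work): each pixel is mapped to a low-level feature vector, and an image region is summarized by the covariance matrix of those vectors. Below is a minimal NumPy sketch assuming a grayscale patch and the common feature map [x, y, I, |Ix|, |Iy|]; the paper's exact feature choice may differ.

```python
import numpy as np

def covariance_descriptor(img):
    """Region-covariance descriptor of a grayscale image patch:
    the 5x5 covariance of per-pixel features [x, y, I, |Ix|, |Iy|].
    (A common feature map; the paper's may differ.)"""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]              # pixel coordinates
    iy, ix = np.gradient(img)                # first-order derivatives
    feats = np.stack([xs.ravel(), ys.ravel(), img.ravel(),
                      np.abs(ix).ravel(), np.abs(iy).ravel()])
    return np.cov(feats)                     # symmetric positive semidefinite

# Toy usage: descriptor of a random 64x64 patch
print(covariance_descriptor(np.random.rand(64, 64)).shape)   # (5, 5)
```

Because such descriptors live on the manifold of symmetric positive-definite matrices rather than in a Euclidean space, kernel-based coding schemes such as the one proposed in this paper are a natural fit for comparing them.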
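The JGKSC formulation itself is not reproduced in this record, but its group-sparse core can be illustrated. The sketch below is a hypothetical non-kernel simplification: it solves the group-lasso coding problem min_x 0.5*||y - Dx||^2 + lam * sum_g ||x_g||_2 by proximal gradient descent (ISTA) with group soft-thresholding, where atoms are grouped (e.g., by object class). The paper's joint, kernelized, visual-tactile formulation adds pairing structure on top of this.

```python
import numpy as np

def group_sparse_code(y, D, groups, lam=0.1, n_iter=200):
    """ISTA solver for group-lasso sparse coding:
    min_x 0.5*||y - D@x||^2 + lam * sum_g ||x[g]||_2.
    `groups` is a list of index arrays partitioning the dictionary atoms."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2      # 1 / Lipschitz constant of gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = x - step * (D.T @ (D @ x - y))      # gradient step on the smooth term
        for g in groups:                        # prox step: group soft-thresholding
            n = np.linalg.norm(z[g])
            z[g] = 0.0 if n == 0 else max(0.0, 1.0 - lam * step / n) * z[g]
        x = z
    return x

# Toy usage: 20 unit-norm atoms split into 4 groups of 5
rng = np.random.default_rng(0)
D = rng.standard_normal((15, 20))
D /= np.linalg.norm(D, axis=0)
y = rng.standard_normal(15)
groups = [np.arange(i, i + 5) for i in range(0, 20, 5)]
print(np.round(group_sparse_code(y, D, groups), 3))
```

The group penalty zeroes out whole groups at once, so the reconstruction tends to draw on atoms from only a few classes, which is what makes the coding useful for recognition.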
Author details
– Huaping Liu (hpliu@tsinghua.edu.cn), Dept. of Comput. Sci. & Technol., Tsinghua Univ., Beijing, China
– Yuanlong Yu (yu.yuanlong@fzu.edu.cn), Coll. of Comput., Fuzhou Univ., Fuzhou, China
– Fuchun Sun (fcsun@tsinghua.edu.cn), Dept. of Comput. Sci. & Technol., Tsinghua Univ., Beijing, China
– Jason Gu (jason.gu@dal.ca), Dept. of Electr. & Comput. Eng., Dalhousie Univ., Halifax, NS, Canada
CODEN ITASC7
DOI 10.1109/TASE.2016.2549552
Genre orig-research
GrantInformation
– National Key Project for Basic Research of China, grant 2013CB329403
– National Natural Science Foundation of China, grant 61327809 (funder ID 10.13039/501100001809)
– National High-Tech Research and Development Plan, grant 2015AA042306
ORCID 0000-0002-4042-6044
SubjectTerms Automation; Joint sparse coding; Manipulators; Object recognition; Tactile perception; Tactile sensors; Training; Visual perception; Visualization
URI https://ieeexplore.ieee.org/document/7462208