Identifying Style of 3D Shapes using Deep Metric Learning

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 35, No. 5, pp. 207-215
Main Authors: Lim, Isaak; Gehre, Anne; Kobbelt, Leif
Format: Journal Article
Language:English
Published: Oxford: Blackwell Publishing Ltd, 01.08.2016
ISSN:0167-7055, 1467-8659
Description
Summary: We present a method that expands on previous work in learning human‐perceived style similarity across objects with different structures and functionalities. Unlike previous approaches that tackle this problem with the help of hand‐crafted geometric descriptors, we make use of recent advances in metric learning with neural networks (deep metric learning). This allows us to train the similarity metric on a shape collection directly, since any low‐ or high‐level features needed to discriminate between different styles are identified by the neural network automatically. Furthermore, we avoid the issue of finding and comparing sub‐elements of the shapes. We represent the shapes as rendered images and show how image tuples can be selected, generated and used efficiently for deep metric learning. We also tackle the problem of training our neural networks on relatively small datasets and show that we achieve style classification accuracy competitive with the state of the art. Finally, to reduce annotation effort we propose a method to incorporate heterogeneous data sources by adding annotated photos found online in order to expand or supplant parts of our training data.
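The abstract describes training a similarity metric directly on tuples of rendered shape images. A common objective for this kind of deep metric learning is the triplet margin loss, which pulls embeddings of same-style items together and pushes different-style items at least a margin apart. The NumPy sketch below is purely illustrative and not taken from the paper: the function name, the 2-D toy embeddings, and the margin value are all assumptions standing in for the outputs of a trained network.

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    """Illustrative triplet margin loss (not the paper's exact formulation).

    anchor/positive share a style; negative has a different style.
    Loss is zero once the negative is `margin` farther from the
    anchor than the positive is.
    """
    d_pos = np.linalg.norm(anchor - positive)  # same-style distance
    d_neg = np.linalg.norm(anchor - negative)  # different-style distance
    return max(d_pos - d_neg + margin, 0.0)

# Toy 2-D embeddings, standing in for CNN outputs of rendered images.
anchor   = np.array([0.0, 0.0])
positive = np.array([0.1, 0.0])   # same style: embedded close to anchor
negative = np.array([3.0, 0.0])   # different style: embedded far away
loss = triplet_margin_loss(anchor, positive, negative)
```

With the negative already well separated, the loss is zero; moving the negative closer to the anchor produces a positive loss, which in training would push the network to separate the two styles further.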
DOI:10.1111/cgf.12977