Classification of Two Comic Books based on Convolutional Neural Networks


Full Description

Bibliographic Details
Published in: ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal; Vol. 6, No. 1, pp. 5-12
Main Authors: Ueno, Miki; Suenaga, Toshinori; Isahara, Hitoshi
Format: Journal Article
Languages: English, Japanese
Published: Salamanca: Ediciones Universidad de Salamanca, 12.01.2017
ISSN: 2255-2863
Online Access: Full Text
Description
Summary: Non-photographic images are powerful representations that depict various situations. Understanding intellectual products such as comics and picture books is therefore an important topic in the field of artificial intelligence. Hence, a stepwise analysis of a comic story was pursued, i.e., features of a part of an image, information features, features relating to continuous scenes, etc. In particular, the length and each scene of four-scene comics are limited so as to ensure a clear interpretation of the contents. In this study, as a first step in this direction, the problem of classifying two four-scene comics by the same artists was taken as an example. Several classifiers were constructed using a Convolutional Neural Network (CNN), and the results of classification by a human annotator and by a computational method were compared. These experiments show clearly that a CNN is an efficient way to classify non-photographic grayscale images, and they reveal the characteristic features of the images that were classified incorrectly.
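The abstract gives no implementation details, so purely as an illustration of the kind of classifier it describes, the following is a minimal sketch of a small CNN that assigns a grayscale comic panel to one of two titles. The framework (PyTorch), the 128x128 input resolution, and all layer sizes are assumptions for the example, not taken from the paper.

```python
# Minimal sketch of a binary CNN classifier for grayscale comic panels.
# All architectural choices here are illustrative assumptions.
import torch
import torch.nn as nn


class ComicCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # grayscale input: 1 channel
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 128 -> 64
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 64 -> 32
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),                   # logits over the two comic titles
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


# Usage: a batch of 4 grayscale 128x128 panels -> logits over the two titles.
model = ComicCNN()
logits = model(torch.randn(4, 1, 128, 128))
print(logits.shape)  # torch.Size([4, 2])
```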
DOI: 10.14201/adcaij201761512