Implementation of Facial Feature Extraction Using Viola-Jones Method for Mobile Robot System

Bibliographic Details
Published in: Journal of Physics: Conference Series, Vol. 1500, No. 1, pp. 12011-12018
Main Authors: Zarkasi, Ahmad; Nurmaini, Siti; Setiawan, Deris; Kuswandi, Ahmad; Desy Siswanti, Sri
Format: Journal Article
Language: English
Published: Bristol: IOP Publishing, 01.04.2020
ISSN: 1742-6588, 1742-6596
Description
Summary: To detect faces, the system must identify facial characteristics so that the camera can distinguish face regions from non-face regions. Facial features are identified from parameters such as skin color, facial contours, lighting, and facial pose. Current face detection systems commonly use the Viola-Jones algorithm, which is regarded as highly accurate for face detection and tracking. This research discusses human-machine interaction using images of faces as the medium. Interaction between humans and computers occurs in many settings, and images are one medium for it. Here, the system determines how the robot should act from the position of a person's face recorded by the robot's camera. The system uses the Viola-Jones algorithm, a popular object detection method because its detection process is very fast. In addition, the robot follows a face based on the identity stored in its database, recognizing the face with the Eigenface method. Together, these two algorithms govern the robot's motion. The experimental results show that the robot can follow a face stored in the database quite well.
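
The summary describes a pipeline of Viola-Jones detection, Eigenface recognition against a stored database, and face-following motion. Below is a minimal illustrative sketch of such a pipeline using OpenCV's Haar-cascade implementation of Viola-Jones and the opencv-contrib Eigenface recognizer. It is not the authors' code: the training data, identity labels, FACE_SIZE, and send_motor_command motor interface are hypothetical placeholders.

```python
# Minimal sketch (not the authors' implementation): Viola-Jones detection via
# OpenCV's Haar cascade, Eigenface recognition via opencv-contrib, and a
# placeholder steering rule based on where the face sits in the frame.
# Assumptions: opencv-contrib-python is installed; train_images/train_labels
# would be pre-cropped grayscale faces of a fixed size; send_motor_command()
# stands in for the robot's (unspecified) motor interface.
import cv2
import numpy as np

FACE_SIZE = (100, 100)  # Eigenfaces needs every sample at one fixed size

# Viola-Jones detector shipped with OpenCV
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Eigenface recognizer, to be trained on faces from the database
recognizer = cv2.face.EigenFaceRecognizer_create()
# train_images: list of grayscale face crops resized to FACE_SIZE
# train_labels: integer identity IDs matching the database
# recognizer.train(train_images, np.array(train_labels))

def send_motor_command(turn, forward):
    """Placeholder for the robot's motor interface (not in the paper)."""
    print(f"turn={turn:+.2f} forward={forward:+.2f}")

cap = cv2.VideoCapture(0)  # robot camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # 1. Detect faces with the Viola-Jones cascade
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        crop = cv2.resize(gray[y:y + h, x:x + w], FACE_SIZE)

        # 2. Recognize the identity with Eigenfaces (requires a trained model)
        # label, confidence = recognizer.predict(crop)

        # 3. Steer toward the face: the horizontal offset of the face center
        #    from the image center drives the turn; the face width serves as
        #    a rough distance cue for moving forward.
        cx = x + w / 2.0
        turn = (cx - frame.shape[1] / 2.0) / (frame.shape[1] / 2.0)
        forward = 1.0 if w < frame.shape[1] * 0.3 else 0.0
        send_motor_command(turn, forward)

cap.release()
```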
DOI: 10.1088/1742-6596/1500/1/012011