Weber Law Based Approach for Multi-Class Image Forgery Detection

Bibliographic Details
Published in: Computers, Materials & Continua, Vol. 78, No. 1, pp. 145-166
Authors: Akram, Arslan; Rashid, Javed; Jaffar, Arfan; Hajjej, Fahima; Iqbal, Waseem; Sarwar, Nadeem
Format: Journal Article
Language: English
Published: Henderson: Tech Science Press, 2024
ISSN: 1546-2226, 1546-2218
Online access: Full text
Description
Abstract: Today's forensic science has opened a new research area of digital image analysis for multimedia security. Image authentication issues have arisen because image manipulation software is widely used to obtain illegitimate benefits or to create misleading publicity with tampered images. Existing forgery detection methods can classify only one of the two most widely used forgery types, Copy-Move or splicing; however, an image can contain one or more types of forgery. This study proposes a hybrid method for classifying Copy-Move and spliced images using texture information in the spatial domain. First, images are divided into equal blocks to obtain scale-invariant features. Weber law is then used to extract texture features, and finally XGBoost classifies both Copy-Move and splicing forgeries. The proposed method distinguishes three classes of images, i.e., splicing, Copy-Move, and healthy. Benchmark datasets (CASIA 2.0, MICCF200) and the RCMFD dataset are used for training and testing. On average, the proposed method achieves 97.3% accuracy on the benchmark datasets and 98.3% on the RCMFD dataset under 10-fold cross-validation, which is far better than existing methods.
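
The abstract outlines a three-step pipeline: equal-sized block division, Weber law texture features, and XGBoost classification into splicing, Copy-Move, and healthy classes. The following minimal Python sketch illustrates such a pipeline; the Weber Local Descriptor style differential-excitation histogram, the fixed 256x256 resize, the 64x64 block size, the 32 histogram bins, and the label encoding are illustrative assumptions, not the authors' exact implementation.

import numpy as np
import cv2                          # assumed dependency for image loading and filtering
from xgboost import XGBClassifier   # assumed multi-class classifier backend

def weber_excitation(gray):
    # Differential excitation in the Weber Local Descriptor sense:
    # arctan of the summed intensity differences to the 8 neighbours,
    # normalised by the centre pixel intensity (the Weber ratio).
    gray = gray.astype(np.float64) + 1e-6        # avoid division by zero
    laplace = np.array([[1, 1, 1],
                        [1, -8, 1],
                        [1, 1, 1]], dtype=np.float64)
    diff_sum = cv2.filter2D(gray, -1, laplace)
    return np.arctan(diff_sum / gray)            # values in (-pi/2, pi/2)

def block_weber_features(path, size=256, block=64, bins=32):
    # Resize to a fixed size (assumption), split into equal blocks,
    # and concatenate one excitation histogram per block.
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (size, size))
    exc = weber_excitation(gray)
    feats = []
    for y in range(0, size, block):
        for x in range(0, size, block):
            patch = exc[y:y + block, x:x + block]
            hist, _ = np.histogram(patch, bins=bins,
                                   range=(-np.pi / 2, np.pi / 2), density=True)
            feats.append(hist)
    return np.concatenate(feats)                 # (size/block)^2 * bins values

# Illustrative usage with hypothetical image_paths / labels arrays,
# encoding 0 = healthy, 1 = Copy-Move, 2 = splicing:
# X = np.stack([block_weber_features(p) for p in image_paths])
# clf = XGBClassifier(objective="multi:softmax")
# clf.fit(X, labels)
# predictions = clf.predict(X)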
DOI: 10.32604/cmc.2023.041074