Adversarial attacks in signature verification: a deep learning approach

Authors

A. Hazra, S. Maity, B. Pal, A. Bandyopadhyay, and A. Bandyopadhyay

Keywords

Adversarial attack, Affine transformation, Convolutional neural network, Forensic document analysis, Image augmentation, Signature verification, Writer authentication

Abstract

Handwritten signature recognition is crucial in forensic science for identity and document authentication. A handwritten signature serves as a legal representation of a person’s agreement or consent to the contents of a document; it is used to establish a document’s authenticity, identify forgeries, pinpoint suspects, and support other evidence such as ink or document analysis. This work develops and evaluates a handwritten signature verification system based on a convolutional neural network (CNN) and assesses the model’s robustness against hand-crafted adversarial attacks. Handwritten signatures were first collected from sixteen volunteers, each contributing ten samples, and then normalized and augmented to generate synthetic samples and overcome data scarcity. The proposed model achieved a testing accuracy of 91.35% using an 80:20 train-test split. With five-fold cross-validation, it achieved a robust validation accuracy of nearly 98%. Finally, manually constructed adversarial attacks on the signature images degraded the model’s accuracy to nearly 80%. This highlights the need to consider adversarial resilience when designing deep learning models for classification tasks: exposing the model to realistic look-alike forged samples is critical for testing its robustness and for refining it iteratively.
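This page does not include the paper’s code. As a rough illustration of the steps the abstract names (image normalization, augmentation, and hand-crafted affine-transformation attacks, per the keywords), the minimal Python/OpenCV sketch below shows one plausible way such a pipeline could be assembled. The function names, the choice of OpenCV/NumPy, and all parameter values (image size, rotation angles, noise level) are assumptions for illustration only, not the authors’ implementation.

```python
# Illustrative sketch only (not the paper's released code).
# Parameter values below are assumptions, not taken from the paper.
import cv2
import numpy as np

def normalize(img: np.ndarray, size=(128, 128)) -> np.ndarray:
    """Convert to grayscale, resize, and scale pixel values to [0, 1]."""
    if img.ndim == 3:
        img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    img = cv2.resize(img, size, interpolation=cv2.INTER_AREA)
    return img.astype(np.float32) / 255.0

def augment(img: np.ndarray, max_angle=5.0, max_shift=3) -> np.ndarray:
    """Mild random rotation/translation to enlarge a small training set."""
    h, w = img.shape
    angle = np.random.uniform(-max_angle, max_angle)
    tx, ty = np.random.randint(-max_shift, max_shift + 1, size=2)
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    M[:, 2] += (tx, ty)  # add translation to the affine matrix
    return cv2.warpAffine(img, M, (w, h), borderValue=1.0)

def affine_attack(img: np.ndarray, angle=12.0, scale=0.9,
                  noise_std=0.05) -> np.ndarray:
    """Hand-crafted adversarial sample: a stronger affine warp plus noise,
    keeping the signature visually recognisable to a human reader."""
    h, w = img.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
    warped = cv2.warpAffine(img, M, (w, h), borderValue=1.0)
    noise = np.random.normal(0.0, noise_std, warped.shape).astype(np.float32)
    return np.clip(warped + noise, 0.0, 1.0)
```

In a setup of this kind, `augment()` would be applied at training time to enlarge the sixteen-writer dataset, while `affine_attack()` would generate the look-alike adversarial samples used only when evaluating the trained CNN’s robustness.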

Published

2024-11-01

How to Cite

[1] A. Hazra, S. Maity, B. Pal, A. Bandyopadhyay, and A. Bandyopadhyay, “Adversarial attacks in signature verification: a deep learning approach”, Comput Sci Inf Technol, vol. 5, no. 3, pp. 215–226, Nov. 2024.

Issue

Vol. 5 No. 3 (2024)

Section

Articles
