A feature-based knowledge distillation (FKD) for offline signature feature learning without signatures


Abstract

This paper proposes a novel approach to harnessing the knowledge of pre-trained expert models for training new Convolutional Neural Networks, particularly in domains with limited or unavailable task-specific data. The method is applied to offline handwritten signature verification (OffSV), a biometric field that faces challenges related to data scarcity, often due to regulatory constraints. The proposed Student-Teacher (S-T) framework employs feature-based knowledge distillation (FKD), integrating graph-based similarity for local activations and global similarity measures to guide the student model's training, using only handwritten text data. Notably, the models trained with this method show performance on par with, or exceeding, that of the teacher model across three widely used signature datasets. More importantly, these results are attained without employing any signatures during the feature extraction training process. This study demonstrates the efficacy of leveraging existing expert models to overcome data scarcity challenges in OffSV and potentially other related domains. The proposed methodology is available at https://github.com/dimTsourounis/FKD.
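The abstract describes a distillation loss that combines a graph-based similarity term over local activations with a global similarity term between student and teacher features. As a rough illustration of that idea (not the authors' actual formulation — the function names, the cosine-similarity graph, and the `alpha` weighting below are assumptions; the real implementation is in the linked repository), a batch-level sketch might look like:

```python
import numpy as np

def cosine_similarity_graph(feats):
    """Pairwise cosine-similarity matrix over a batch of feature vectors.

    feats: (N, D) array. The resulting (N, N) matrix can be read as a
    fully connected similarity graph over the batch samples.
    """
    norms = np.linalg.norm(feats, axis=1, keepdims=True) + 1e-8
    normed = feats / norms
    return normed @ normed.T

def fkd_loss(student_feats, teacher_feats, alpha=0.5):
    """Illustrative feature-based distillation loss (assumed form).

    Combines a relational term (matching the batch similarity graphs of
    student and teacher) with a direct feature-matching term. `alpha`
    is a hypothetical mixing weight, not a value from the paper.
    """
    # Relational / graph term: make the student's batch similarity
    # structure mimic the teacher's.
    g_s = cosine_similarity_graph(student_feats)
    g_t = cosine_similarity_graph(teacher_feats)
    relational = np.mean((g_s - g_t) ** 2)
    # Global term: match each sample's feature vector directly.
    direct = np.mean((student_feats - teacher_feats) ** 2)
    return alpha * relational + (1 - alpha) * direct
```

In such a scheme the teacher (a signature-verification expert model) is frozen, and only the student's weights are updated — here the training data would be handwritten text images rather than signatures, as the abstract emphasizes.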

Citation (APA)

Tsourounis, D., Theodorakopoulos, I., Zois, E. N., & Economou, G. (2026). A feature-based knowledge distillation (FKD) for offline signature feature learning without signatures. Expert Systems with Applications, 296. https://doi.org/10.1016/j.eswa.2025.129158
