Abstract
Rescaling the backpropagated gradient of contrastive loss has yielded significant progress in descriptor learning. However, current gradient modulation strategies disregard the varying distribution of global gradients and thus suffer when training phases or datasets change. In this paper, we propose a dynamic gradient modulation, named SDGMNet, for contrastive local descriptor learning. The core of our method is to formulate modulation functions with dynamically estimated statistical characteristics. First, after a deep analysis of the backpropagation of pair-wise losses, we adopt angle as the distance measure. On this basis, auto-focus modulation moderates the impact of statistically uncommon individual pairs in stochastic gradient descent optimization; a probabilistic margin cuts off the gradients of the proportion of triplets that are already sufficiently optimized; and power adjustment balances the total weights of negative and positive pairs. Extensive experiments demonstrate that our novel descriptor surpasses previous state-of-the-art methods on several tasks, including patch verification, retrieval, pose estimation, and 3D reconstruction.
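The three modulation ingredients named above can be illustrated with a minimal NumPy sketch. This is not the paper's exact formulation: the Gaussian auto-focus term, the quantile-based margin, and the exponent `gamma` are illustrative stand-ins for the statistic-based functions the abstract describes, assuming L2-normalized descriptors and angular distance.

```python
import numpy as np

def angular_distance(a, b):
    # Angle between paired descriptors (assumes rows are L2-normalized).
    cos = np.clip(np.sum(a * b, axis=-1), -1.0, 1.0)
    return np.arccos(cos)

def modulation_weights(dists, margin_quantile=0.1, gamma=0.5):
    """Illustrative statistic-based modulation (names and formulas are
    hypothetical, not the paper's):
    - auto-focus: down-weight pairs whose distance lies far from the
      dynamically estimated batch mean (statistically uncommon pairs);
    - probabilistic margin: zero the weight of the easiest fraction of
      pairs, i.e. those already sufficiently optimized;
    - power adjustment: raise weights to a power to rebalance totals.
    """
    mu, sigma = dists.mean(), dists.std() + 1e-8
    focus = np.exp(-0.5 * ((dists - mu) / sigma) ** 2)  # auto-focus term
    cutoff = np.quantile(dists, margin_quantile)        # probabilistic margin
    w = np.where(dists <= cutoff, 0.0, focus)           # drop easiest pairs
    return w ** gamma                                   # power adjustment

# Usage: modulate a batch of 32 random descriptor pairs of dimension 128.
rng = np.random.default_rng(0)
a = rng.normal(size=(32, 128)); a /= np.linalg.norm(a, axis=1, keepdims=True)
b = rng.normal(size=(32, 128)); b /= np.linalg.norm(b, axis=1, keepdims=True)
weights = modulation_weights(angular_distance(a, b))
```

In a training loop, `weights` would scale each pair's gradient contribution instead of using the raw contrastive-loss gradient.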
Citation
Deng, Y., & Ma, J. (2024). SDGMNet: Statistic-Based Dynamic Gradient Modulation for Local Descriptor Learning. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 38, pp. 1510–1518). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v38i2.27916