Abstract
We consider neural network training in applications with many possible classes, where at test time the task is binary: deciding whether a given example belongs to one specific class. We define the Single Logit Classification (SLC) task: training the network so that at test time, it is possible to accurately and efficiently determine whether an example belongs to a given class, based only on the output logit for that class. We propose a natural principle, the Principle of Logit Separation, as a guideline for choosing and designing loss functions that are suitable for SLC. We show that the Principle of Logit Separation is a crucial ingredient for success in the SLC task, and that SLC yields considerable speedups when the number of classes is large.
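To make the efficiency claim concrete, the following is a minimal sketch (not the paper's implementation) of SLC-style inference for a linear output layer: only the single logit of the class of interest is computed and thresholded, so the per-query cost is O(dim) rather than the O(num_classes × dim) cost of evaluating all logits. The weights, biases, and threshold here are hypothetical placeholders standing in for a trained network.

```python
import numpy as np

# Hypothetical trained output-layer parameters: one weight row and one
# bias per class (in practice these come from a network trained with an
# SLC-suitable loss, per the Principle of Logit Separation).
rng = np.random.default_rng(0)
num_classes, dim = 10_000, 128
W = rng.standard_normal((num_classes, dim))
b = rng.standard_normal(num_classes)

def slc_predict(x, class_idx, threshold=0.0):
    """Single Logit Classification at test time: compute only the logit
    for the class of interest and compare it to a fixed threshold,
    skipping the remaining num_classes - 1 output units entirely."""
    logit = W[class_idx] @ x + b[class_idx]  # O(dim), not O(num_classes * dim)
    return logit > threshold

x = rng.standard_normal(dim)
is_class_42 = slc_predict(x, class_idx=42)
```

The speedup comes from the fact that a well-separated logit is informative on its own, without the softmax normalization over all classes that standard cross-entropy training implicitly relies on.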
Keren, G., Sabato, S., & Schuller, B. (2019). A walkthrough for the principle of logit separation. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 6191–6195). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/861