Authors
Xuezhi Liang, Xiaobo Wang, Zhen Lei, Shengcai Liao, Stan Z Li
Publication date
2017/10/26
Book
International Conference on Neural Information Processing
Pages
413-421
Publisher
Springer International Publishing
Description
In deep classification, the softmax loss (Softmax) is arguably one of the most commonly used components for training deep convolutional neural networks (CNNs). However, this widely used loss is limited in that it does not explicitly encourage the discriminability of features. Recently, the large-margin softmax loss (L-Softmax [1]) was proposed to explicitly enhance feature discrimination, but it imposes a hard margin and requires complex forward and backward computations. In this paper, we propose a novel soft-margin softmax (SM-Softmax) loss to improve the discriminative power of features. Specifically, SM-Softmax only modifies the forward pass of Softmax by introducing a non-negative real number m, without changing the backward pass. Thus it can not only adjust the desired continuous soft margin but also be easily optimized by typical stochastic gradient descent (SGD). Experimental results on three benchmark datasets have …
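As a rough illustration of the mechanism described above, here is a minimal PyTorch sketch (not code from the paper) in which a non-negative margin m is subtracted from the target-class logit in the forward pass, while autograd reuses the standard softmax cross-entropy backward. The function name sm_softmax_loss and the value m = 0.3 are illustrative assumptions, not choices reported by the authors.

```python
import torch
import torch.nn.functional as F

def sm_softmax_loss(logits: torch.Tensor, targets: torch.Tensor,
                    m: float = 0.3) -> torch.Tensor:
    """Soft-margin softmax loss (sketch).

    Subtracts a non-negative margin m from each sample's target-class
    logit before the usual softmax cross-entropy, so the target class
    must beat the other classes by at least m. m = 0.3 is an
    illustrative value, not one taken from the paper.
    """
    margin = torch.zeros_like(logits)
    margin.scatter_(1, targets.unsqueeze(1), m)  # place m at each target class
    return F.cross_entropy(logits - margin, targets)

# Example: batch of 4 samples, 10 classes
logits = torch.randn(4, 10, requires_grad=True)
targets = torch.randint(0, 10, (4,))
loss = sm_softmax_loss(logits, targets)
loss.backward()  # gradient keeps the standard softmax form, w.r.t. the shifted logits
```

Because the gradient of cross-entropy with respect to the (shifted) logits retains the usual softmax-minus-one-hot form, this shift-based formulation is consistent with the paper's claim that only the forward computation changes.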
Total citations
Citations by year (2018–2024): 8, 17, 17, 26, 14, 16, 11
Scholar articles
X Liang, X Wang, Z Lei, S Liao, SZ Li - International Conference on Neural Information Processing, 2017