Authors
Badong Chen, Lei Xing, Bin Xu, Haiquan Zhao, Jose C Principe
Publication date
2016/12/22
Journal
IEEE Transactions on Neural Networks and Learning Systems
Volume
29
Issue
3
Pages
731-737
Publisher
IEEE
Description
The minimum error entropy (MEE) criterion is an important and highly effective optimization criterion in information-theoretic learning (ITL). For regression problems, MEE minimizes the entropy of the prediction error so that the estimated model preserves as much information about the data-generating system as possible. In many real-world applications, the MEE estimator can significantly outperform the well-known minimum mean square error (MMSE) estimator and shows strong robustness to noise, especially when the data are contaminated by non-Gaussian (multimodal, heavy-tailed, discrete-valued, and so on) noise. In this brief, we present some theoretical results on the robustness of MEE. For a one-parameter linear errors-in-variables (EIV) model and under some conditions, we derive a region that contains the MEE solution, which suggests that the MEE estimate can be very close to the true value of the …
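To make the MEE idea in the abstract concrete: a common ITL formulation maximizes a Parzen-window estimate of the error's quadratic information potential, which is equivalent to minimizing Rényi's quadratic entropy of the error. The sketch below is illustrative only, not the authors' exact formulation; the Gaussian kernel bandwidth, Student-t noise model, sample size, and learning rate are all assumptions chosen for the demonstration.

```python
import numpy as np

def information_potential(e, sigma=1.0):
    """Parzen estimate of the quadratic information potential V(e).
    Maximizing V(e) is equivalent to minimizing Renyi's quadratic
    entropy of the error, H2(e) = -log V(e)."""
    d = e[:, None] - e[None, :]              # all pairwise error differences
    return float(np.mean(np.exp(-d ** 2 / (4 * sigma ** 2))))

# Illustrative one-parameter linear model y = w_true * x + noise,
# with heavy-tailed (Student-t) noise, where MEE tends to shine.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + 0.1 * rng.standard_t(df=1.5, size=200)

# MEE fit: gradient ascent on the information potential of e = y - w*x.
sigma, lr, w = 1.0, 0.5, 0.0
for _ in range(200):
    e = y - w * x
    d = e[:, None] - e[None, :]              # d_ij = e_i - e_j
    k = np.exp(-d ** 2 / (4 * sigma ** 2))   # pairwise Gaussian kernel
    # dV/dw = mean(k * d_ij * (x_i - x_j)) / (2*sigma^2), since de_i/dw = -x_i
    grad = np.mean(k * d * (x[:, None] - x[None, :])) / (2 * sigma ** 2)
    w += lr * grad                           # ascend the information potential
```

Because the pairwise kernel weights vanish for large error differences, outlier-corrupted samples contribute little to the gradient, which is the intuition behind MEE's robustness to heavy-tailed noise. Note that MEE is shift-invariant; the model here has no intercept, so that subtlety does not arise.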
Total citations
Cited by year (2017–2024): 3, 4, 6, 6, 13, 15, 13, 10