Authors
Daniel J. Hsu, Sham M. Kakade, John Langford, Tong Zhang
Publication date
2009
Journal
Advances in neural information processing systems
Volume
22
Description
We consider multi-label prediction problems with large output spaces under the assumption of output sparsity: the target (label) vectors have small support. We develop a general theory for a variant of the popular error-correcting output code scheme, using ideas from compressed sensing to exploit this sparsity. The method can be regarded as a simple reduction from multi-label regression problems to binary regression problems. We show that the number of subproblems need only be logarithmic in the total number of possible labels, making this approach radically more efficient than others. We also state and prove robustness guarantees for this method in the form of regret transform bounds (in general), and provide a more detailed analysis for the linear prediction setting.
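The core of the reduction described above can be sketched in a few lines: a k-sparse label vector in a d-dimensional label space is compressed by a random Gaussian sensing matrix into m = O(k log d) regression targets, and the label vector is later decoded by a sparse-recovery routine. The sketch below is illustrative only, not the paper's implementation: the dimensions (d, k, m) are made-up values, the small additive noise stands in for the error of the m learned regressors (which are omitted here), and the decoder is a plain orthogonal matching pursuit written in NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 1000   # size of the label space (illustrative value)
k = 5      # output sparsity: at most k active labels per example
m = 120    # number of induced subproblems, O(k log d), far smaller than d

# Random Gaussian sensing matrix: row i defines one regression
# subproblem whose target is the inner product <A_i, y>.
A = rng.normal(size=(m, d)) / np.sqrt(m)

# A k-sparse 0/1 label vector (small support), as the method assumes.
y = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
y[support] = 1.0

# Compressed targets; the small noise stands in for the prediction
# error of the m regressors that would be learned from data.
z = A @ y + 0.01 * rng.normal(size=m)

def omp(A, z, k):
    """Orthogonal matching pursuit: greedily decode a k-sparse y from z ~ A y."""
    residual = z.copy()
    chosen = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        chosen.append(j)
        # Re-fit coefficients on all chosen columns, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, chosen], z, rcond=None)
        residual = z - A[:, chosen] @ coef
    y_hat = np.zeros(A.shape[1])
    y_hat[chosen] = coef
    return y_hat

y_hat = omp(A, z, k)
recovered = set(np.flatnonzero(np.abs(y_hat) > 0.5))
print(sorted(recovered), sorted(int(i) for i in support))
```

With m = 120 measurements of a 1000-dimensional, 5-sparse label vector, the decoder recovers the active label set from roughly a tenth of the coordinates one would otherwise need, which is the efficiency gain the abstract refers to.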
Total citations
[Per-year citation histogram, 2010–2024; individual counts not recoverable]
Scholar articles
DJ Hsu, SM Kakade, J Langford, T Zhang - Advances in neural information processing systems, 2009