Authors
Meng Tang, Federico Perazzi, Abdelaziz Djelouah, Ismail Ben Ayed, Christopher Schroers, Yuri Boykov
Publication date
2018
Conference
Proceedings of the European Conference on Computer Vision (ECCV)
Pages
507-522
Description
Minimization of regularized losses is a principled approach to weak supervision, well established in deep learning in general. However, it is largely overlooked in semantic segmentation, which is currently dominated by methods that mimic full supervision via "fake" fully-labeled masks (proposals) generated from the available partial input. To obtain such full masks, typical methods explicitly apply standard regularization techniques for "shallow" segmentation, e.g. graph cuts or dense CRFs. In contrast, we integrate such standard regularizers directly into the loss functions over partial input. This approach simplifies weakly-supervised training by avoiding extra MRF/CRF inference steps or layers that explicitly generate full masks, while improving both the quality and efficiency of training. This paper proposes and experimentally compares different losses integrating MRF/CRF regularization terms. We juxtapose our regularized losses with earlier proposal-generation methods. Our approach achieves state-of-the-art accuracy in semantic segmentation with near full-supervision quality.
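The core idea of the description above can be sketched in code: train on a partial cross-entropy over the few labeled pixels, plus a regularization term computed directly on the network's soft predictions. The sketch below is a minimal NumPy illustration using a quadratic relaxation of a Potts-style pairwise term over a 4-connected grid; the function name `regularized_loss` and the weight `lam` are illustrative, and this is a simplification of, not a substitute for, the paper's actual dense-CRF relaxed losses.

```python
import numpy as np

def regularized_loss(probs, partial_labels, lam=1.0):
    """Illustrative regularized loss for weakly-supervised segmentation.

    probs: (H, W, K) array of per-pixel softmax outputs.
    partial_labels: (H, W) int array; class index for labeled pixels,
        -1 for unlabeled pixels (e.g. outside scribbles).
    lam: weight of the regularization term (hypothetical default).
    """
    eps = 1e-8
    H, W, K = probs.shape

    # Partial cross-entropy: averaged over labeled pixels only.
    mask = partial_labels >= 0
    ce = 0.0
    if mask.any():
        rows, cols = np.where(mask)
        ce = -np.log(probs[rows, cols, partial_labels[mask]] + eps).mean()

    # Quadratic relaxation of a Potts pairwise term: for each pair of
    # 4-connected neighbors i, j, penalize sum_k p_{i,k} * (1 - p_{j,k}),
    # which encourages neighboring soft label distributions to agree.
    reg = 0.0
    reg += np.sum(probs[:, :-1] * (1.0 - probs[:, 1:]))   # horizontal pairs
    reg += np.sum(probs[:-1, :] * (1.0 - probs[1:, :]))   # vertical pairs
    reg /= (H * W)  # normalize by image size

    return ce + lam * reg
```

A smooth labeling that agrees with the partial annotations yields a low loss, while a labeling that contradicts them is penalized through the partial cross-entropy term; no full "proposal" mask is ever generated.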