
Pattern Recognition | Vol. 58 | Pages 149-158


A non-parametric approach to extending generic binary classifiers for multi-classification

Venkataraman Santhanam, David Harwood, Vlad I. Morariu, Larry S. Davis
Abstract

Ensemble methods, which combine generic binary classifier scores to generate a multi-classification output, are commonly used in state-of-the-art computer vision and pattern recognition systems that rely on multi-classification. In particular, we consider the one-vs-one decomposition of the multi-class problem, where binary classifier models are trained to discriminate every class pair. We describe a robust multi-classification pipeline, which at a high level involves projecting binary classifier scores into compact orthogonal subspaces, followed by a non-linear probabilistic multi-classification step, using Kernel Density Estimation (KDE). We compare our approach against state-of-the-art ensemble methods (DCS, DRCW) on 16 multi-class datasets. We also compare against the most commonly used ensemble methods (VOTE, NEST) on 6 real-world computer vision datasets. Finally, we measure the statistical significance of our approach using non-parametric tests. Experimental results show that our approach gives a statistically significant improvement in multi-classification performance over state-of-the-art.
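
The abstract describes the pipeline only at a high level, so the sketch below is not the authors' implementation; it is an illustrative Python analogue of the three ingredients named above: one-vs-one binary classifier scores, projection of the score vectors into a compact orthogonal subspace, and a per-class Kernel Density Estimation step over the projected scores. The concrete choices here (linear SVMs, PCA as the orthogonal projection, a fixed KDE bandwidth of 0.5, and the iris dataset) are assumptions made purely for illustration and are not taken from the paper.

# Illustrative sketch only: one-vs-one scores -> orthogonal projection -> per-class KDE.
# Estimator choices (SVC, PCA, bandwidth) are placeholders, not the authors' settings.
import numpy as np
from itertools import combinations
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KernelDensity
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
classes = np.unique(y_tr)

# 1. One-vs-one decomposition: train one binary classifier per class pair.
pair_models = {}
for a, b in combinations(classes, 2):
    mask = np.isin(y_tr, [a, b])
    pair_models[(a, b)] = SVC(kernel="linear").fit(X_tr[mask], y_tr[mask])

def score_matrix(X):
    # Stack the signed decision scores of every pairwise classifier into one vector per sample.
    return np.column_stack([m.decision_function(X) for m in pair_models.values()])

S_tr, S_te = score_matrix(X_tr), score_matrix(X_te)

# 2. Project the pairwise score vectors into a compact orthogonal subspace
#    (PCA stands in here for the paper's projection step).
proj = PCA(n_components=2).fit(S_tr)
Z_tr, Z_te = proj.transform(S_tr), proj.transform(S_te)

# 3. Fit a KDE per class on the projected training scores and predict by
#    the class whose density is highest at each test point.
kdes = {c: KernelDensity(bandwidth=0.5).fit(Z_tr[y_tr == c]) for c in classes}
log_dens = np.column_stack([kdes[c].score_samples(Z_te) for c in classes])
y_pred = classes[np.argmax(log_dens, axis=1)]

print("accuracy:", np.mean(y_pred == y_te))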


Cite this article
Venkataraman Santhanam, David Harwood, Vlad I. Morariu, Larry S. Davis. A non-parametric approach to extending generic binary classifiers for multi-classification. Pattern Recognition, 58, 149-158.
