Fast learning rates for plug-in classifiers - Mathematics > Statistics Theory





Abstract: It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
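To make the term "plug-in classifier" concrete: such a classifier first estimates the regression function $\eta(x) = P(Y = 1 \mid X = x)$ and then "plugs" the estimate into the Bayes rule by thresholding at 1/2. The sketch below is a generic illustration of this idea using a simple k-nearest-neighbour estimate of $\eta$; it is not the specific construction analyzed in the paper, and all names and data in it are illustrative.

```python
# Minimal plug-in classifier sketch (illustrative, not the paper's construction):
# 1) estimate eta(x) = P(Y = 1 | X = x) by averaging the labels of the
#    k nearest sample points, 2) predict 1 iff the estimate is >= 1/2.

def knn_eta_hat(x, data, k=3):
    """Estimate eta(x) as the mean label among the k nearest points (1-D)."""
    neighbours = sorted(data, key=lambda p: abs(p[0] - x))[:k]
    return sum(label for _, label in neighbours) / k

def plug_in_classify(x, data, k=3):
    """Plug-in rule: threshold the estimated regression function at 1/2."""
    return 1 if knn_eta_hat(x, data, k) >= 0.5 else 0

# Toy 1-D sample: labels are 0 below x = 0.5 and 1 above it.
sample = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
print(plug_in_classify(0.25, sample))  # -> 0 (near the 0-labelled cluster)
print(plug_in_classify(0.85, sample))  # -> 1 (near the 1-labelled cluster)
```

This contrasts with empirical risk minimization, which searches a class of candidate classifiers directly for the one with the smallest empirical error rather than estimating $\eta$ first.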



Author: Jean-Yves Audibert, Alexandre B. Tsybakov

Source: https://arxiv.org/






