Gibbs posterior for variable selection in high-dimensional classification and data mining - Statistics > Methodology

Abstract: In the popular approach of Bayesian variable selection (BVS), one uses prior and posterior distributions to select a subset of candidate variables to enter the model. A completely new direction will be considered here to study BVS with a Gibbs posterior originating in statistical mechanics. The Gibbs posterior is constructed from a risk function of practical interest, such as the classification error, and aims at minimizing this risk without modeling the data probabilistically. This can improve performance over the usual Bayesian approach, which depends on a probability model that may be misspecified. Conditions will be provided to achieve good risk performance, even in the presence of high dimensionality, when the number of candidate variables $K$ can be much larger than the sample size $n$. In addition, we develop a convenient Markov chain Monte Carlo algorithm to implement BVS with the Gibbs posterior.
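As a rough illustration of the idea (not the authors' algorithm), the sketch below samples variable subsets $\gamma$ from a Gibbs posterior proportional to $\exp\{-\psi\, n\, R_n(\gamma)\}\,\pi(\gamma)$, where $R_n$ is the empirical classification error, using a simple Metropolis-Hastings flip proposal over inclusion indicators. The plug-in classifier, the temperature psi, the sparsity penalty lam, and all function names are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (assumed details, not the paper's implementation):
# Metropolis-Hastings over binary inclusion vectors, targeting a Gibbs
# posterior proportional to exp(-psi * n * empirical_risk) * sparsity prior.
import numpy as np

def classification_risk(gamma, X, y):
    """Empirical 0-1 risk of a simple plug-in classifier using the variables in gamma."""
    if not gamma.any():
        return 0.5  # no variables selected: predict at chance
    Xs = X[:, gamma]
    # Least-squares fit thresholded at 0 as a stand-in classifier (y coded in {-1, +1}).
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    preds = np.sign(Xs @ beta)
    return np.mean(preds != y)

def log_gibbs_posterior(gamma, X, y, psi, lam):
    """log of exp(-psi * n * R_n(gamma)) * prior(gamma), with a geometric-type sparsity prior."""
    n = len(y)
    return -psi * n * classification_risk(gamma, X, y) - lam * gamma.sum()

def mcmc_variable_selection(X, y, psi=1.0, lam=2.0, n_iter=5000, rng=None):
    """Metropolis-Hastings over inclusion vectors: propose flipping one coordinate per step."""
    rng = np.random.default_rng(rng)
    K = X.shape[1]
    gamma = np.zeros(K, dtype=bool)
    log_post = log_gibbs_posterior(gamma, X, y, psi, lam)
    inclusion_counts = np.zeros(K)
    for _ in range(n_iter):
        j = rng.integers(K)                 # propose toggling variable j in/out
        proposal = gamma.copy()
        proposal[j] = ~proposal[j]
        log_post_new = log_gibbs_posterior(proposal, X, y, psi, lam)
        if np.log(rng.random()) < log_post_new - log_post:
            gamma, log_post = proposal, log_post_new
        inclusion_counts += gamma
    return inclusion_counts / n_iter        # posterior inclusion frequencies

# Example usage on synthetic data where only the first 3 of K variables matter.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, K = 100, 200
    X = rng.standard_normal((n, K))
    y = np.sign(X[:, :3] @ np.array([2.0, -1.5, 1.0]) + 0.1 * rng.standard_normal(n))
    freqs = mcmc_variable_selection(X, y, psi=2.0, n_iter=3000, rng=1)
    print("Top variables by inclusion frequency:", np.argsort(freqs)[::-1][:5])
```

The temperature psi controls how strongly the posterior concentrates on low-risk subsets, and the penalty lam plays the role of a sparsity-inducing prior; both are tuning choices assumed here for the sake of the example.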



Authors: Wenxin Jiang, Martin A. Tanner

Source: https://arxiv.org/






