Stochastic Spectral Descent for Discrete Graphical Models





Published in: IEEE Journal of Selected Topics in Signal Processing (ISSN: 1932-4553), vol. 10, no. 2, pp. 296-311. Piscataway, NJ: IEEE, 2016.

Interest in deep probabilistic graphical models has increased in recent years, due to their state-of-the-art performance on many machine learning applications. Such models are typically trained with the stochastic gradient method, which can take a significant number of iterations to converge. Since the computational cost of gradient estimation is prohibitive even for modestly sized models, training becomes slow and practically usable models are kept small. In this paper we propose a new, largely tuning-free algorithm to address this problem. Our approach derives novel majorization bounds based on the Schatten-∞ norm. Intriguingly, the minimizers of these bounds can be interpreted as gradient methods in a non-Euclidean space. We thus propose using a stochastic gradient method in non-Euclidean space. We provide simple conditions under which our algorithm is guaranteed to converge, and we demonstrate empirically that it leads to dramatically faster training and improved predictive ability compared to stochastic gradient descent for both directed and undirected graphical models.
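The key step the abstract describes, a gradient update in the geometry induced by the Schatten-∞ (spectral) norm rather than the Euclidean norm, can be sketched in a few lines. The following is a minimal illustration assuming the standard sharp-operator form used in the spectral descent literature, not the paper's full training algorithm; the function names (sharp_operator, spectral_descent_step), the toy objective, and the step-size constant L are ours for illustration.

```python
import numpy as np

def sharp_operator(G):
    # Sharp operator for the Schatten-inf norm: with the SVD G = U diag(s) V^T,
    # the steepest-descent direction over a unit spectral-norm ball is U V^T,
    # scaled by the dual (nuclear) norm sum(s).
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    return s.sum() * (U @ Vt)

def spectral_descent_step(W, grad, L):
    # One non-Euclidean gradient step: the minimizer of the majorization bound
    # f(W) <= f(W_k) + <grad, W - W_k> + (L/2) * ||W - W_k||_{S-inf}^2.
    return W - sharp_operator(grad) / L

# Toy check on f(W) = 0.5 * ||W - W_target||_F^2, standing in for a stochastic
# gradient estimate of a graphical-model training objective (illustrative only).
rng = np.random.default_rng(0)
W_target = rng.standard_normal((8, 5))
W = np.zeros((8, 5))
for _ in range(100):
    grad = W - W_target                         # exact gradient of the toy f
    W = spectral_descent_step(W, grad, L=5.0)   # L = rank bound suffices here
```

In contrast to ordinary stochastic gradient descent, which subtracts the raw gradient, each step here rescales all singular directions of the gradient equally, which is what the spectral-norm majorization bound prescribes.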

Keywords: Deep Learning; Spectral Descent; Non-Euclidean Algorithms

Reference: EPFL-ARTICLE-214739
DOI: 10.1109/JSTSP.2015.2505684





Authors: Carlson, David; Hsieh, Ya-Ping; Collins, Edo; Carin, Lawrence; Cevher, Volkan

Source: https://infoscience.epfl.ch/record/214739?ln=en






