Riemannian Optimization for Solving High-Dimensional Problems with Low-Rank Tensor Structure


Lausanne: EPFL, 2016

In this thesis, we present a Riemannian framework for the solution of high-dimensional optimization problems with an underlying low-rank tensor structure. Here, the high dimensionality refers to the size of the search space, while the cost function is scalar-valued. Such problems arise, for example, in the reconstruction of high-dimensional data sets and in the solution of parameter-dependent partial differential equations. As the degrees of freedom grow exponentially with the number of dimensions, the so-called curse of dimensionality, directly solving the optimization problem is computationally infeasible even for moderately high-dimensional problems. We constrain the optimization problem by assuming a low-rank tensor structure of the solution, drastically reducing the degrees of freedom. We reformulate this constrained optimization as an optimization problem on a manifold, using the smooth embedded Riemannian manifold structure of the low-rank representations in the Tucker and tensor train formats. Exploiting this smooth structure, we derive efficient gradient-based optimization algorithms. In particular, we propose Riemannian conjugate gradient schemes for the solution of the tensor completion problem, where we aim to reconstruct a high-dimensional data set for which the vast majority of entries are unknown. For the solution of linear systems, we show how to precondition the Riemannian gradient and leverage second-order information in an approximate Newton scheme. Finally, we describe a preconditioned alternating optimization scheme with subspace correction for the solution of high-dimensional symmetric eigenvalue problems.
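To illustrate why the low-rank constraint breaks the curse of dimensionality, the following sketch (not taken from the thesis; `n`, `d`, and `r` are illustrative parameters) counts the degrees of freedom of a d-dimensional tensor with mode size n stored fully versus in the tensor train (TT) format with all TT ranks equal to r:

```python
# Illustrative parameter-count comparison: full tensor vs. tensor train (TT).
# Not from the thesis; n, d, r are example values.

def full_dof(n, d):
    # Full tensor: n^d entries -- grows exponentially in d
    # (the curse of dimensionality).
    return n ** d

def tt_dof(n, d, r):
    # TT cores: the first and last cores have n*r entries each,
    # the d-2 interior cores have r*n*r entries each -- linear in d.
    return 2 * n * r + (d - 2) * r * n * r

# Example: mode size n = 10, dimension d = 10, TT rank r = 5.
print(full_dof(10, 10))   # 10**10 = 10,000,000,000 entries
print(tt_dof(10, 10, 5))  # 2,100 parameters
```

For fixed rank r, the TT storage grows linearly rather than exponentially in d, which is what makes optimization over the TT manifold tractable.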

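The tensor completion problem mentioned in the abstract has a familiar two-dimensional analogue: low-rank matrix completion. The sketch below is a simple projected-gradient method with truncated-SVD retraction, shown only to convey the flavor of optimizing over fixed-rank sets; it is not the Riemannian conjugate gradient scheme of the thesis, and all sizes and the sampling rate are illustrative:

```python
# Hedged d = 2 analogue of low-rank tensor completion (not the thesis
# algorithm): recover a rank-r matrix from a subset of its entries by a
# gradient step on the data fit, followed by retraction onto the set of
# rank-r matrices via truncated SVD.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 30, 30, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # ground truth
mask = rng.random((m, n)) < 0.5                                # observed entries

X = np.zeros((m, n))
for _ in range(200):
    G = mask * (X - A)               # gradient of 0.5 * ||P_Omega(X - A)||_F^2
    U, s, Vt = np.linalg.svd(X - G, full_matrices=False)
    X = (U[:, :r] * s[:r]) @ Vt[:r]  # retraction: truncate to rank r

err = np.linalg.norm(X - A) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

With roughly half the entries observed and a true rank of 2, the iteration drives the relative error down rapidly; the higher-order analogue replaces the truncated SVD with a rank-truncation in the Tucker or TT format.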
Keywords: curse of dimensionality; Riemannian optimization; low-rank structure; Tucker format; tensor train; preconditioning

Thesis, École polytechnique fédérale de Lausanne (EPFL), no. 6958 (2016). Doctoral program in Mathematics, School of Basic Sciences, Institute of Mathematics of Computational Science and Engineering, Numerical Algorithms and High-Performance Computing (CADMOS Chair). Jury: Prof. Kathryn Hess Bellwald (president); Prof. Daniel Kressner (thesis director); Prof. Fabio Nobile, Prof. André Uschmajew, Prof. Bart Vandereycken (examiners). Public defense: 2016-04-11. DOI: 10.5075/epfl-thesis-6958. Print copy in library catalog.

Author: Steinlechner, Michael Maximilian. Advisor: Kressner, Daniel.

Source: https://infoscience.epfl.ch/record/217938?ln=en
