Convex optimization learning of faithful Euclidean distance representations in nonlinear dimensionality reduction





Mathematical Programming, Volume 164, Issue 1–2, pp. 341–381

Received: 03 May 2015 | Accepted: 05 November 2016 | First Online: 11 November 2016 | DOI: 10.1007/s10107-016-1090-7

Cite this article as: Ding, C. & Qi, H.-D. Math. Program. (2017) 164: 341. doi:10.1007/s10107-016-1090-7

Abstract

Classical multidimensional scaling only works well when the noisy distances observed in a high-dimensional space can be faithfully represented by Euclidean distances in a low-dimensional space. Advanced models such as Maximum Variance Unfolding (MVU) and Minimum Volume Embedding (MVE) use Semi-Definite Programming (SDP) to reconstruct such faithful representations. While those SDP models are capable of producing high-quality configurations numerically, they suffer from two major drawbacks. One is that there exist no theoretically guaranteed bounds on the quality of the configuration. The other is that they are slow to compute when the number of data points is beyond moderate size. In this paper, we propose a convex optimization model of Euclidean distance matrices. We establish a non-asymptotic error bound for the random graph model with sub-Gaussian noise, and prove that our model produces a matrix estimator of high accuracy when the order of the uniform sample size is roughly the degrees of freedom of a low-rank matrix up to a logarithmic factor. Our results partially explain why MVU and MVE often work well. Moreover, the convex optimization model can be efficiently solved by a recently proposed 3-block alternating direction method of multipliers (ADMM). Numerical experiments show that the model can produce configurations of high quality on large data sets that the SDP approach would struggle to cope with.
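To make the methods named in the abstract concrete, the sketch below (Python, assuming NumPy and CVXPY are available; the function names classical_mds and mvu_embedding and the edges input are illustrative) shows the standard textbook formulations of classical multidimensional scaling and of an MVU-type SDP that preserves squared distances on a neighbourhood graph while maximizing variance. It is a generic illustration, not the authors' convex EDM model or their 3-block ADMM solver.

    # Minimal sketch, assuming NumPy and CVXPY; generic classical MDS and an
    # MVU-style SDP, not the authors' EDM model or ADMM algorithm.
    import numpy as np
    import cvxpy as cp


    def classical_mds(D2, dim=2):
        """Embed points from a matrix of squared pairwise distances D2.

        Double-center D2 to obtain a Gram matrix, then keep the top
        eigenpairs. Works well only when D2 is close to a true
        low-dimensional Euclidean distance matrix.
        """
        n = D2.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
        G = -0.5 * J @ D2 @ J                    # Gram matrix of centered points
        w, V = np.linalg.eigh(G)
        idx = np.argsort(w)[::-1][:dim]          # largest eigenvalues first
        w_top = np.clip(w[idx], 0.0, None)       # guard against small negatives
        return V[:, idx] * np.sqrt(w_top)        # n x dim configuration


    def mvu_embedding(D2, edges, dim=2):
        """MVU-style SDP sketch: maximize the variance (trace of the Gram
        matrix) while preserving squared distances on the observed
        neighbourhood graph given by the list of index pairs in `edges`."""
        n = D2.shape[0]
        G = cp.Variable((n, n), PSD=True)
        cons = [cp.sum(G) == 0]                  # centre the configuration
        for i, j in edges:
            cons.append(G[i, i] + G[j, j] - 2 * G[i, j] == D2[i, j])
        cp.Problem(cp.Maximize(cp.trace(G)), cons).solve()
        w, V = np.linalg.eigh(G.value)
        idx = np.argsort(w)[::-1][:dim]
        return V[:, idx] * np.sqrt(np.clip(w[idx], 0.0, None))

In this generic form the SDP has on the order of n^2 variables, which is one reason off-the-shelf solvers slow down as the number of data points grows and part of the motivation for the tailored 3-block ADMM used in the paper.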

Keywords: Euclidean distance matrix · Convex matrix optimization · Multidimensional scaling · Nonlinear dimensionality reduction · Low-rank matrix · Error bounds · Random graph models

This work is supported by the Engineering and Physical Sciences Research Council (UK) Project EP/K007645/1. The research of C. Ding is supported by the National Natural Science Foundation of China under Project No. 11671387.

Mathematics Subject Classification: 49M45 · 90C25 · 90C33



Authors: Chao Ding, Hou-Duo Qi

Source: https://link.springer.com/






