Keywords: principal component analysis; block coordinate descent; dimensionality reduction model; set pair analysis; surface plasmon resonance
Abstract: In the first part of this thesis, we examine the Robust Principal Component Analysis (RPCA) problem: given a matrix X that is the sum of a low-rank matrix L* and a sparse noise matrix S*, recover L* and S*. We introduce a block coordinate descent algorithm for this problem and prove a convergence result. The algorithm has low per-iteration complexity and performs well empirically on synthetic datasets. In the second part of this thesis, we examine a variant of ridge regression: unlike in the classical setting, where the parameter of interest is known to lie near a single point, we only know that it lies near a known low-dimensional subspace. We formulate this problem as a convex optimization problem and introduce an efficient block coordinate descent algorithm for solving it.
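To make the two problems concrete, a minimal sketch of standard formulations is given below. These are assumptions based on the usual convex relaxation of RPCA (principal component pursuit) and a projection-based subspace penalty, not necessarily the exact models or notation used in the thesis; the weight \(\lambda\), design matrix \(A\), response \(y\), parameter \(\beta\), and projector \(P_{V}\) are illustrative.

\[
\min_{L,\,S}\ \|L\|_{*} + \lambda\,\|S\|_{1}
\quad\text{subject to}\quad L + S = X,
\]
where \(\|L\|_{*}\) is the nuclear norm (sum of singular values) and \(\|S\|_{1}\) is the entrywise \(\ell_{1}\) norm; a block coordinate descent method alternates updates over blocks of variables such as \(L\) and \(S\). For the regression variant, one natural convex formulation penalizes the distance of the parameter to the known low-dimensional subspace \(V\):
\[
\min_{\beta}\ \|y - A\beta\|_{2}^{2} + \lambda\,\|(I - P_{V})\,\beta\|_{2}^{2},
\]
where \(P_{V}\) is the orthogonal projection onto \(V\), so the penalty shrinks \(\beta\) toward the subspace rather than toward a single point; taking \(V = \{0\}\) recovers classical ridge regression.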