Find the dimension that counts: Fast dimension estimation and Krylov PCA. (arXiv:1810.03733v1 [cs.NA])
Source: arXiv
High dimensional data and systems with many degrees of freedom are often
characterized by covariance matrices. In this paper, we consider the problem of
simultaneously estimating the dimension of the principal (dominant) subspace of
these covariance matrices and obtaining an approximation to the subspace. This
problem arises in the popular principal component analysis (PCA), and in many
applications of machine learning, data analysis, signal and image processing,
and others. We first present a novel method for estimating the dimension of the
principal subspace. We then show how this method can be coupled with a Krylov
subspace method to simultaneously estimate the dimension and obtain an
approximation to the subspace. The dimension estimation is achieved at no
additional cost. The proposed method operates on a model selection framework,
where the novel selection criterion is derived based on random matrix
perturbation theory ideas. We present theoretical analyses which (a) show that
th …
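The coupling the abstract describes can be sketched as follows. This is a hedged illustration, not the paper's method: the random-matrix-perturbation selection criterion is replaced here by a simple largest-spectral-gap rule, and the Krylov solver is SciPy's Lanczos routine `eigsh`. The function name `estimate_principal_dimension` and the gap criterion are assumptions for illustration only.

```python
# Hedged sketch: estimate the dimension of the principal subspace of a
# symmetric covariance matrix and return a basis for it, using a Krylov
# (Lanczos) eigensolver so the dimension estimate comes from the same
# iterations that build the subspace approximation. The gap rule below
# is a stand-in for the paper's random-matrix-theory criterion.
import numpy as np
from scipy.sparse.linalg import eigsh

def estimate_principal_dimension(C, k_max=10):
    """Return (d, U): estimated dimension d and an orthonormal basis U
    for the dominant subspace of the symmetric matrix C."""
    vals, vecs = eigsh(C, k=k_max, which="LM")  # Lanczos: top-k_max pairs
    order = np.argsort(vals)[::-1]              # sort eigenvalues descending
    vals, vecs = vals[order], vecs[:, order]
    gaps = vals[:-1] - vals[1:]                 # consecutive spectral gaps
    d = int(np.argmax(gaps)) + 1                # largest gap marks the cutoff
    return d, vecs[:, :d]

# Synthetic covariance with a clear 3-dimensional dominant subspace.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
spectrum = np.concatenate([[10.0, 9.0, 8.0], 0.1 * np.ones(47)])
C = Q @ np.diag(spectrum) @ Q.T
d, U = estimate_principal_dimension(C)          # d is 3 for this spectrum
```

Because `eigsh` only ever computes the top `k_max` eigenpairs, the dimension estimate is a byproduct of the subspace computation, mirroring the abstract's claim that the estimation comes at no additional cost.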