Low-Rank Matrix Completion is an important problem with several applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix given a small number of its entries. Source: Universal Matrix Completion
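To make the problem statement concrete, the toy script below generates a low-rank matrix, reveals a random subset of its entries, and recovers it with a simple iterative SVD-shrinkage loop (in the spirit of SoftImpute). The sizes, sampling rate, and threshold `tau` are arbitrary illustration choices, not taken from any of the papers listed here.

```python
# Minimal sketch of the matrix completion setup, using NumPy only.
# The solver is a simple iterative SVD-thresholding loop; it is an
# illustration of the problem, not any particular paper's method.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 100, 80, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # ground-truth low-rank matrix

mask = rng.random((m, n)) < 0.3   # observe roughly 30% of the entries
X = np.zeros((m, n))              # current estimate
tau = 2.0                         # singular-value shrinkage threshold (illustration value)

for _ in range(200):
    # fill the unobserved entries with the current estimate
    filled = np.where(mask, M, X)
    # shrink the singular values (nuclear-norm-style surrogate for rank)
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
print(f"relative recovery error: {rel_err:.3f}")
```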
This work formulates and derives a highly efficient, conjugate-gradient-based alternating minimization scheme that solves problems with over 55 million observations up to two orders of magnitude faster than state-of-the-art (stochastic) gradient-descent-based methods.
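As a rough illustration of alternating minimization for matrix completion, the sketch below fixes one factor and solves a small regularized least-squares problem for each row of the other, then swaps. The cited work replaces these inner solves with conjugate gradient iterations for scalability; the direct solves, the regularization `lam`, and the function name `als_complete` are simplifications assumed here.

```python
# Hedged sketch of alternating minimization (ALS-style) for matrix completion.
import numpy as np

def als_complete(M_obs, mask, rank=3, lam=1e-2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    m, n = M_obs.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    I = lam * np.eye(rank)
    for _ in range(iters):
        for i in range(m):                      # update each row of U
            cols = np.flatnonzero(mask[i])
            Vc = V[cols]
            U[i] = np.linalg.solve(Vc.T @ Vc + I, Vc.T @ M_obs[i, cols])
        for j in range(n):                      # update each row of V
            rows = np.flatnonzero(mask[:, j])
            Uc = U[rows]
            V[j] = np.linalg.solve(Uc.T @ Uc + I, Uc.T @ M_obs[rows, j])
    return U @ V.T
```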
Results show that the SVP-Newton method is highly robust to noise and performs impressively under a more realistic power-law sampling scheme for the matrix completion problem.
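The core Singular Value Projection (SVP) iteration alternates a gradient step on the observed entries with a projection onto rank-k matrices via a truncated SVD; a minimal sketch is given below. The Newton-style step-size refinement referred to above is omitted, and the fixed step size `eta` is an assumed illustration value (in practice it is tuned, e.g. on the order of the inverse sampling rate).

```python
# Hedged sketch of the basic SVP iteration: gradient step on the observed
# entries, then projection back to the set of rank-k matrices.
import numpy as np

def svp(M_obs, mask, k=3, eta=1.0, iters=100):
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        grad = mask * (X - M_obs)                   # gradient of 0.5*||P_Omega(X - M)||_F^2
        U, s, Vt = np.linalg.svd(X - eta * grad, full_matrices=False)
        X = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]      # truncated SVD = rank-k projection
    return X
```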
A tensor pattern, an extension of the matrix, is introduced to model traffic data for the first time; this representation fully exploits the spatial–temporal information in traffic data and preserves its multi-way nature.
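A minimal sketch of the tensor pattern idea, under assumed axes (road segment, day, time-of-day): the measurements are kept as a 3-way array rather than flattened into a matrix, so the spatial and temporal modes remain separate. The axis sizes below are made-up illustration values.

```python
# Hedged sketch: traffic measurements arranged as a 3-way tensor.
import numpy as np

n_segments, n_days, n_slots = 50, 30, 288           # e.g. 288 five-minute slots per day
speeds = np.random.default_rng(0).uniform(10, 90, size=n_segments * n_days * n_slots)

# multi-way representation: segment x day x time-of-day
traffic_tensor = speeds.reshape(n_segments, n_days, n_slots)

# a plain matrix view (segment x flattened time) discards the day / time-of-day split
traffic_matrix = traffic_tensor.reshape(n_segments, n_days * n_slots)

# mode-2 unfolding (days as rows), the kind of view tensor completion methods operate on
unfold_days = np.moveaxis(traffic_tensor, 1, 0).reshape(n_days, -1)
print(traffic_tensor.shape, traffic_matrix.shape, unfold_days.shape)
```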
An efficient and scalable low-rank matrix completion algorithm is proposed that extends the orthogonal matching pursuit method from the vector case to the matrix case, together with an economic version that introduces a novel weight-updating rule to reduce time and storage complexity.
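A hedged sketch of the matrix analogue of orthogonal matching pursuit appears below: each step extracts the top singular pair of the residual on the observed entries as a new rank-one atom, then refits all atom weights by least squares on those entries. The economic weight-updating rule mentioned above, which touches only two weights per step, is not reproduced here.

```python
# Hedged sketch of a rank-one matrix pursuit loop.
import numpy as np

def rank_one_pursuit(M_obs, mask, steps=10):
    obs = mask.astype(bool)
    full_atoms, obs_atoms = [], []
    X = np.zeros_like(M_obs)
    for _ in range(steps):
        R = np.where(obs, M_obs - X, 0.0)          # residual on the observed entries
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        atom = np.outer(U[:, 0], Vt[0])            # new rank-one atom u v^T
        full_atoms.append(atom)
        obs_atoms.append(atom[obs])
        A = np.stack(obs_atoms, axis=1)            # refit all weights on Omega
        w, *_ = np.linalg.lstsq(A, M_obs[obs], rcond=None)
        X = sum(wi * a for wi, a in zip(w, full_atoms))
    return X
```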
Experiments on two widely used datasets with different dimensions of textual features demonstrate that the low-rank matrix completion approach significantly outperforms both the baseline and state-of-the-art methods.
A low-gradient regularization method is proposed in which the penalty for small gradients is reduced while nonzero gradients are still penalized, allowing for gradual depth changes.
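One way to write the idea down, as a generic objective rather than the paper's exact formulation: with $d$ the estimated depth map, $d^{\mathrm{obs}}$ the observations, $\lambda$ a weight, and $\phi$ a gradient penalty, all of which are notation assumed for this sketch,

$$
\min_{d}\; \sum_i \bigl(d_i - d^{\mathrm{obs}}_i\bigr)^2 \;+\; \lambda \sum_i \phi\!\bigl(\lvert \nabla_i d\rvert\bigr),
$$

where $\phi(0) = 0$ and $\phi$ assigns a reduced cost to small gradient magnitudes compared with a hard ($\ell_0$-style) gradient penalty, so gradual depth changes are preserved while isolated nonzero gradients are still discouraged.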
This paper proposes a novel Riemannian extension of the Euclidean stochastic variance reduced gradient algorithm (R-SVRG) to compact manifold search spaces and develops it concretely on the Grassmann manifold.
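A hedged sketch of the overall loop on the Grassmann manifold is given below, using a toy PCA-style objective, a QR-based retraction, and projection as vector transport. The objective, step size, epoch length, and function names (`rsvrg`, `rgrad`, `retract`, `transport`) are all illustration choices, not the paper's setup.

```python
# Hedged sketch of Riemannian SVRG on the Grassmann manifold, on the toy
# subspace-estimation objective f(U) = -mean_i ||U^T x_i||^2, where U has
# orthonormal columns.
import numpy as np

def rgrad(U, X_batch):
    """Riemannian gradient at U: project the Euclidean gradient onto the tangent space."""
    G = -2.0 * X_batch.T @ (X_batch @ U) / len(X_batch)   # Euclidean gradient
    return G - U @ (U.T @ G)

def retract(U, xi):
    Q, R = np.linalg.qr(U + xi)                           # QR-based retraction
    return Q * np.sign(np.diag(R))                        # fix column signs

def transport(U_new, xi):
    return xi - U_new @ (U_new.T @ xi)                    # project onto the new tangent space

def rsvrg(X, r=2, eta=0.05, epochs=10, inner=100, seed=0):
    rng = np.random.default_rng(seed)
    N, m = X.shape
    U = np.linalg.qr(rng.standard_normal((m, r)))[0]
    for _ in range(epochs):
        U_snap = U.copy()
        g_full = rgrad(U_snap, X)                         # full gradient at the snapshot
        for _ in range(inner):
            i = rng.integers(N)
            xi = X[i:i + 1]
            # variance-reduced stochastic Riemannian gradient
            v = rgrad(U, xi) - transport(U, rgrad(U_snap, xi)) + transport(U, g_full)
            U = retract(U, -eta * v)
    return U
```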
This work considers a generalization of low-rank matrix completion to the case where the data belongs to an algebraic variety, i.e., each data point is a solution to a system of polynomial equations, and proposes an efficient matrix completion algorithm that minimizes a convex or non-convex surrogate of the rank of the matrix of monomial features.
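The structural fact behind this approach can be illustrated briefly: lifting each data point through a monomial feature map makes the lifted data matrix rank-deficient whenever the points satisfy polynomial equations. The example below uses degree-2 monomials and points on the unit circle; the completion step itself, which minimizes a rank surrogate of the lifted matrix, is not implemented.

```python
# Hedged sketch: a monomial feature lift exposes low rank for data on a variety.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, size=200)
X = np.vstack([np.cos(theta), np.sin(theta)])      # 2 x 200 points on the unit circle

def monomials_deg2(P):
    """Rows [1, x, y, x^2, x*y, y^2] evaluated at each point (x, y)."""
    x, y = P
    return np.vstack([np.ones_like(x), x, y, x**2, x * y, y**2])

L = monomials_deg2(X)                              # 6 x 200 lifted data matrix
# rank is 5, not 6: the circle equation x^2 + y^2 - 1 = 0 removes one dimension
print(np.linalg.matrix_rank(L, tol=1e-8))
```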