Classical matrix completion methods are not effective in recovering missing entries of data drawn from multiple subspaces because the resulting matrices are often of high rank. Recently, a few advanced matrix completion methods have been proposed to address this problem, but they do not scale to large matrices and big-data problems. This paper proposes a sparse factorization method for matrix completion on multiple-subspace data. The method factorizes the given incomplete matrix into a dense matrix and a sparse matrix while minimizing the factorization error on the observed entries. To solve the optimization problem, an accelerated proximal alternating linearized minimization (APALM) algorithm is proposed. The convergence of APALM is proved, a non-trivial task owing to the alternation, linearization, nonconvexity, and extrapolation. APALM can solve a large class of optimization problems, such as matrix factorization with nonsmooth regularizations. In addition, we show that, to recover an m×n matrix consisting of data drawn from k subspaces of dimension r, the number of observed entries required by our matrix completion method is O(nr log k log n), whereas conventional methods require O(nrk log n), which theoretically establishes the superiority of our method on multiple-subspace data and high-rank matrices. The proposed matrix completion method is compared with state-of-the-art methods on synthetic data and real collaborative filtering problems. The experimental results corroborate that the proposed method can handle large matrices efficiently and provides high recovery accuracy.
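One plausible way to write the sparse factorization objective described above is the following sketch; the symbols X, D, S, λ, and the sampling operator P_Ω are our illustrative notation for the incomplete data matrix, the dense factor, the sparse factor, a regularization weight, and the projection onto observed entries, not necessarily the paper's own formulation:

```latex
\min_{D \in \mathbb{R}^{m \times d},\; S \in \mathbb{R}^{d \times n}}
\; \frac{1}{2} \left\| P_{\Omega}\!\left( X - D S \right) \right\|_F^2
\;+\; \lambda \left\| S \right\|_1
```

Here P_Ω zeros out the unobserved entries, so only the factorization error on observed entries is penalized; the ℓ1 term promotes sparsity in S, and the nonsmoothness of this term is the kind of regularization an alternating linearized scheme such as APALM is designed to handle via proximal steps.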