ABSTRACT: Let $X_0$ be an unknown $M$ by $N$ matrix. In matrix recovery, one takes $n < MN$ linear measurements $y_1,\dots,y_n$ of $X_0$, where $y_i = \mathrm{Tr}(A_i^T X_0)$ and each $A_i$ is an $M$ by $N$ matrix. A popular approach for matrix recovery is nuclear norm minimization (NNM): solving the convex optimization problem $\min \|X\|_*$ subject to $y_i = \mathrm{Tr}(A_i^T X)$ for all $1 \le i \le n$, where $\|\cdot\|_*$ denotes the nuclear norm, namely, the sum of the singular values. Empirical work reveals a phase transition curve, stated in terms of the undersampling fraction $\delta(n,M,N) = n/(MN)$, rank fraction $\rho = \mathrm{rank}(X_0)/\min\{M,N\}$, and aspect ratio $\beta = M/N$. Specifically, when the measurement matrices $A_i$ have independent standard Gaussian random entries, a curve $\delta^*(\rho) = \delta^*(\rho;\beta)$ exists such that, if $\delta > \delta^*(\rho)$, NNM typically succeeds for large $M,N$, whereas if $\delta < \delta^*(\rho)$, it typically fails. An apparently quite different problem is matrix denoising in Gaussian noise, in which an unknown $M$ by $N$ matrix $X_0$ is to be estimated based on direct noisy measurements $Y = X_0 + Z$, where the matrix $Z$ has independent and identically distributed Gaussian entries. A popular matrix denoising scheme solves the unconstrained optimization problem $\min \|Y - X\|_F^2/2 + \lambda \|X\|_*$. When optimally tuned, this scheme achieves the asymptotic minimax mean-squared error $M(\rho;\beta) = \lim_{M,N \to \infty} \inf_{\lambda} \sup_{\mathrm{rank}(X) \le \rho \cdot M} \mathrm{MSE}(X, \hat{X}_\lambda)$, where $M/N \to \beta$. We report extensive experiments showing that the phase transition $\delta^*(\rho)$ in the first problem, matrix recovery from Gaussian measurements, coincides with the minimax risk curve $M(\rho) = M(\rho;\beta)$ in the second problem, matrix denoising in Gaussian noise: $\delta^*(\rho) = M(\rho)$, for any rank fraction $0 < \rho < 1$ (at each common aspect ratio $\beta$). Our experiments considered matrices belonging to two constraint classes: real $M$ by $N$ matrices of various ranks and aspect ratios, and real symmetric positive-semidefinite $N$ by $N$ matrices of various ranks.
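To make the two optimization problems concrete, here is a minimal illustrative sketch (not the authors' code) in Python with NumPy and CVXPY. The toy dimensions $M, N$, rank $r$, measurement count $n$, and the threshold `lam` are assumed values chosen only for demonstration; the denoising program is solved in closed form by soft-thresholding the singular values of $Y$, since that is the proximal operator of the nuclear norm.

```python
# Illustrative sketch of the two problems in the abstract (assumed toy sizes).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
M, N, r, n = 12, 16, 2, 120          # n < M*N, so delta = n/(MN) = 0.625

# Rank-r ground truth X0 and Gaussian measurement matrices A_i.
X0 = rng.standard_normal((M, r)) @ rng.standard_normal((r, N))
A = rng.standard_normal((n, M, N))
y = np.einsum('imn,mn->i', A, X0)    # y_i = Tr(A_i^T X0)

# Problem 1: nuclear norm minimization (NNM) for matrix recovery.
X = cp.Variable((M, N))
constraints = [cp.trace(A[i].T @ X) == y[i] for i in range(n)]
cp.Problem(cp.Minimize(cp.normNuc(X)), constraints).solve()
print("NNM recovery error:", np.linalg.norm(X.value - X0, 'fro'))

# Problem 2: matrix denoising, min_X ||Y - X||_F^2 / 2 + lam * ||X||_*.
# The minimizer soft-thresholds the singular values of Y at level lam.
Y = X0 + rng.standard_normal((M, N))
lam = 2.0 * np.sqrt(N)               # assumed threshold, not optimally tuned
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
X_hat = U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt
print("Denoising MSE:", np.mean((X_hat - X0) ** 2))
```

In this toy configuration the undersampling fraction $\delta = 0.625$ lies well above the empirical phase transition for rank fraction $\rho = 2/12$, so NNM should typically recover $X_0$; shrinking $n$ below the transition should make it fail, in line with the phase transition behavior described above.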