Local reconstruction of low‐rank matrices and subspaces

Bibliographic Details
Published in:Random structures & algorithms Vol. 51; no. 4; pp. 607 - 630
Main Authors: David, Roee, Goldenberg, Elazar, Krauthgamer, Robert
Format: Journal Article
Language:English
Published: Hoboken: Wiley Subscription Services, Inc., 01.12.2017
Subjects:
ISSN:1042-9832, 1098-2418
Description
Summary: We study the problem of reconstructing a low‐rank matrix, where the input is an n × m matrix M over a field F and the goal is to reconstruct a (near‐optimal) matrix M′ that is low‐rank and close to M under some distance function Δ. Furthermore, the reconstruction must be local, i.e., it provides access to any desired entry of M′ by reading only a few entries of the input M (ideally, a number independent of the matrix dimensions n and m). Our formulation of this problem is inspired by the local reconstruction framework of Saks and Seshadhri (SICOMP, 2010). Our main result is a local reconstruction algorithm for the case where Δ is the normalized Hamming distance (between matrices). Given M that is ε‐close to a matrix of rank d < 1/ε (together with d and ε), this algorithm computes with high probability a rank‐d matrix M′ that is O(dε)‐close to M. This is a local algorithm that proceeds in two phases. The preprocessing phase reads only Õ(d/ε³) random entries of M and stores a small data structure. The query phase deterministically outputs a desired entry M′_{i,j} by reading only the data structure and 2d additional entries of M. We also consider local reconstruction in an easier setting, where the algorithm can read an entire matrix column in a single operation. When Δ is the normalized Hamming distance between vectors, we derive an algorithm that runs in polynomial time by applying our main result for matrix reconstruction. For comparison, when Δ is the truncated Euclidean distance and F = ℝ, we analyze sampling algorithms using statistical learning tools. A preliminary version of this paper appears in ECCC, see: http://eccc.hpi-web.de/report/2015/128/ © 2017 Wiley Periodicals, Inc. Random Struct. Alg., 51, 607–630, 2017
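To make the problem formulation concrete, the following is a minimal illustrative sketch (not the authors' algorithm): it builds a rank-d matrix, corrupts roughly an ε fraction of its entries, and measures the normalized Hamming distance, i.e., the fraction of entries in which two matrices differ. The helper name `hamming_dist` and the toy instance are assumptions made for illustration only.

```python
import numpy as np

def hamming_dist(A, B):
    """Normalized Hamming distance: fraction of entries where A and B differ."""
    assert A.shape == B.shape
    return float(np.mean(A != B))

rng = np.random.default_rng(0)
n, m, d, eps = 200, 200, 1, 0.01

# A rank-d matrix (here d = 1: an outer product of two integer vectors).
u = rng.integers(1, 5, size=(n, 1))
v = rng.integers(1, 5, size=(1, m))
M_low_rank = u @ v

# Corrupt roughly an eps fraction of entries; the result M is then
# eps-close (in normalized Hamming distance) to a rank-d matrix.
M = M_low_rank.copy()
corrupt = rng.random((n, m)) < eps
M[corrupt] += 1

print(np.linalg.matrix_rank(M_low_rank))  # d = 1
print(hamming_dist(M, M_low_rank))        # close to eps
```

The reconstruction task asks for a rank-d matrix M′ with hamming_dist(M, M′) = O(dε), where each entry of M′ is computable from only a few probes into M.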
Bibliography:Supported by European Research Council (the European Union's Seventh Framework Programme FP/2007–2013; to E.G.) (616787); US‐Israel BSF (to R.K.) (#2010418); Israel Science Foundation (to R.K.) (#897/13).
DOI:10.1002/rsa.20720