
Tensor completion and tensor estimation

Andrea Montanari and Nike Sun

Stanford University and UC Berkeley

June 25, 2017


When did I first meet David?

2005: MSRI?

2007? ‘There are some interesting mathematical things happening’


Tensor estimation: General question

Unknown tensor

$X \in \mathbb{R}^{d_1} \otimes \cdots \otimes \mathbb{R}^{d_k}$

$X = (X_{i_1,\dots,i_k})_{i_1 \le d_1,\, \dots,\, i_k \le d_k}$

Estimate X from noisy/incomplete observations


To simplify notation...

- $d_1 = d_2 = \cdots = d_k \equiv d$

- Tensors will be symmetric, e.g.:

  $X_{i_1, i_2, i_3} = X_{i_2, i_1, i_3} = X_{i_1, i_3, i_2} = \dots$

- Results generalize.


Two concrete models:


Model #1: Spiked tensors

$Y = X + W = \beta\, v_0^{\otimes k} + W$

Signal: $v_0 \in S^{d-1} \equiv \{x \in \mathbb{R}^d : \|x\|_2 = 1\}$.

Noise: $(W_{i_1, i_2, \dots, i_k})_{i_1 < i_2 < \cdots < i_k} \sim_{\mathrm{iid}} \mathsf{N}(0, 1/n)$

SNR: $\beta$

Given Y , estimate v0

[Montanari, Richard, 2015]

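For concreteness, a minimal numpy sketch that samples from this model. The iid (rather than symmetrized over $i_1 < \cdots < i_k$) noise and the explicit noise_std parameter are simplifying assumptions of this sketch, not part of the model definition above.

```python
import numpy as np

def spiked_tensor(d, k, beta, noise_std, seed=0):
    """Sample Y = beta * v0^{(tensor power k)} + W and return (Y, v0).

    noise_std is left as a free parameter (the slides use both 1/sqrt(n) and,
    later, 1/sqrt(d) normalizations); the noise is iid over all entries rather
    than symmetrized, which is enough for illustration.
    """
    rng = np.random.default_rng(seed)
    v0 = rng.standard_normal(d)
    v0 /= np.linalg.norm(v0)                    # v0 uniform on the unit sphere S^{d-1}
    spike = v0
    for _ in range(k - 1):
        spike = np.multiply.outer(spike, v0)    # builds the rank-one tensor v0 x ... x v0
    W = noise_std * rng.standard_normal((d,) * k)
    return beta * spike + W, v0
```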

Model #1: Spiked tensors

$k = 1$ (sequence model): $\quad y_i = \beta\, v_{0,i} + w_i$

$k = 2$ (spiked matrix model): $\quad Y_{ij} = \beta\, v_{0,i}\, v_{0,j} + W_{ij}$

$k = 3$ (spiked tensor model): $\quad Y_{ijl} = \beta\, v_{0,i}\, v_{0,j}\, v_{0,l} + W_{ijl}$


Model #2: Tensor completion

$X = \sum_{\ell=1}^{r} v_\ell^{\otimes k}$

Observed entries $E \subseteq [d]^k \equiv \{1, \dots, d\}^k$

$\pi_E(X)_{i_1,\dots,i_k} = \begin{cases} X_{i_1,\dots,i_k} & \text{if } (i_1,\dots,i_k) \in E, \\ 0 & \text{otherwise} \end{cases}$

Given $Y = \pi_E(X)$, estimate $X$.

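A small numpy sketch of this observation model: a random low-rank symmetric tensor and the map $\pi_E$ revealing $n$ uniformly random entries. The Gaussian factors are an illustrative choice, not part of the model.

```python
import numpy as np

def low_rank_symmetric(d, k, r, seed=0):
    """X = sum_{l=1}^r v_l^{(tensor power k)} with iid Gaussian factors (illustrative choice)."""
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((d, r))
    modes = [chr(ord('a') + i) for i in range(k)]           # e.g. 'a', 'b', 'c' for k = 3
    spec = ','.join(m + 'z' for m in modes) + '->' + ''.join(modes)
    return np.einsum(spec, *[V] * k)                        # 'az,bz,cz->abc' for k = 3

def observe_uniform(X, n, seed=1):
    """pi_E: reveal n entries of X chosen uniformly at random without replacement.
    Returns the zero-filled observation Y and the boolean mask E."""
    rng = np.random.default_rng(seed)
    mask = np.zeros(X.size, dtype=bool)
    mask[rng.choice(X.size, size=n, replace=False)] = True
    mask = mask.reshape(X.shape)
    return np.where(mask, X, 0.0), mask
```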

Outline

1. Why?
2. Tensor completion
3. Overcomplete tensors
4. Spiked tensor model (a.k.a. tensor PCA)
5. Conclusion

arXiv:1612.07866, CPAM; arXiv:1411.1076, NIPS


Why?


A cartoon application (not realistic!)


Image denoising

[Figure: noisy image = original image + noise]

$Y = X + W$


Image denoising

Idea (not necessarily a good one): View $X$ as a $28 \times 28$ matrix

Singular values:

[Plot: normalized singular values $\sigma_\ell / \sigma_1$ versus index $\ell = 0, \dots, 30$]

[Dozens of references]


Singular value thresholding

Noisy image $Y \in \mathbb{R}^{28 \times 28}$:

$Y = \sum_{\ell=1}^{28} \sigma_\ell\, u_\ell v_\ell^{\mathsf T}$

Denoised image:

$\hat{X} = \sum_{\ell=1}^{r_0} \sigma_\ell\, u_\ell v_\ell^{\mathsf T}$

[...; Candès, Sing-Long, Trzasko 2013; Donoho, Gavish 2014; Chatterjee 2015; ...]

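A minimal numpy sketch of this denoiser: keep the top $r_0$ singular triplets of the noisy image. Choosing the truncation rank (for instance by thresholding the singular values, as in the references above) is left to the caller.

```python
import numpy as np

def svd_denoise(Y, r0):
    """Truncated-SVD denoising: keep the top r0 singular triplets of Y."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r0] * s[:r0]) @ Vt[:r0]

# toy usage on a 28 x 28 image-sized matrix
Y = np.random.randn(28, 28)
X_hat = svd_denoise(Y, r0=12)
```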

How well does it work? $r_0 = 12$, $\sigma = 30$

[Figure: Original | Noisy | Denoised images]


How well does it work? $\sigma = 100$

[Plot: MSE versus rank of the truncated SVD, rank $= 0, \dots, 30$; MSE between $0.0$ and $0.7$]

:(


In reality we have many similar images

- $X_1, X_2, \dots, X_n \in \mathbb{R}^{d \times d}$

- Can we leverage similarities between images?


Idea

1. Stack images in a tensor:

   $X = \big[\, X_1 \,\big|\, X_2 \,\big|\, \cdots \,\big|\, X_n \,\big] \in \mathbb{R}^d \otimes \mathbb{R}^d \otimes \mathbb{R}^n$

2. Do tensor denoising

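Step 1 is a one-liner in numpy; the placeholder image list below is only for illustration.

```python
import numpy as np

noisy_images = [np.random.randn(28, 28) for _ in range(100)]   # placeholder for X_1, ..., X_n
X = np.stack(noisy_images, axis=-1)                            # d x d x n tensor, here (28, 28, 100)
```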

A better application

Collaborative filtering:

Users × Hotel × Keyword


Tensor completion


Model

- Unknown tensor $X \in (\mathbb{R}^d)^{\otimes k}$:

  $X = \sum_{\ell=1}^{r} v_\ell^{\otimes k}$

- Entries $E \subseteq [d]^k$, $|E| = n$ uniformly random

- Observations $Y = \pi_E(X)$

Number of parameters $\asymp r\, d$


What do we know about matrices? (k = 2)

Theorem (Gross 2011)

If $X \in \mathbb{R}^{d_1 \times d_2}$ and $\mathrm{rank}(X) \le r$ with $\mu$-incoherent factors, then we can reconstruct $X$ exactly from

$n \ge C(\mu)\, r\, (d_1 \vee d_2) \big(\log (d_1 \vee d_2)\big)^2$

random entries. This is achieved by nuclear norm minimization.

[Candès, Recht, 2009; Candès, Tao, 2010; Keshavan, Montanari, Oh, 2010; Recht 2011; ...]


Let us try to redo the same with tensors!

- Number of unknown parameters $\asymp r\, d$

- For $r = O(1)$ and nice factors, maximum likelihood succeeds for

  $n \ge C\, r\, d \log d$

  [Do not know a reference]

- Generalizations of the nuclear norm:

  $n \ge C \big(r^{k-1} \vee r^{(k-1)/2}\, d^{1/2}\big)\, d\, (\log d)^2$

  [Yuan, Zhang, 2015; 2016]

  Evaluating the tensor nuclear norm is NP-hard!


Idea #1: What about reducing to $k = 2$? Take $k = a + b$:

$\mathrm{unfold}_{a \times b} : (\mathbb{R}^d)^{\otimes k} \to \mathbb{R}^{d^a} \otimes \mathbb{R}^{d^b}, \qquad Y \mapsto \mathrm{unfold}_{a \times b}(Y)$

$\mathrm{unfold}_{a \times b}(Y)_{i_1 \cdots i_a,\, j_1 \cdots j_b} \equiv Y_{i_1 \cdots i_a j_1 \cdots j_b}$

- $M = \mathrm{unfold}_{a \times b}(Y)$
- Do matrix completion (e.g. nuclear norm minimization) of $M$
- Fold back

[Tomioka, Hayashi, Kashima, 2010; Tomioka, Suzuki, Hayashi, Kashima, 2011; Liu, Musialski, Wonka, Ye, 2013; Gandy, Recht, Yamada, 2011]
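For a cubic tensor the unfolding is just a reshape that groups the first $a$ indices into the row index; a minimal numpy sketch:

```python
import numpy as np

def unfold(Y, a):
    """unfold_{a x b}: reshape a k-way tensor with equal side lengths d into a
    d^a x d^b matrix (b = k - a), grouping the first a indices as rows."""
    d, k = Y.shape[0], Y.ndim
    return Y.reshape(d ** a, d ** (k - a))

def fold(M, d, k):
    """Inverse of unfold for side length d and order k."""
    return M.reshape((d,) * k)
```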


Idea #1: What about reducing to k = 2?

Corollary

If $X \in (\mathbb{R}^d)^{\otimes k}$, $\mathrm{rank}(X) \le r$, is such that $\mathrm{unfold}_{a \times b}(X)$ satisfies incoherence, then it can be reconstructed whp from $n$ random entries, provided

$n \ge C\, r\, d^{\,a \vee b}\, (\log d)^2$

Insights (?)

- Optimal choice $a = \lfloor k/2 \rfloor$, $b = \lceil k/2 \rceil$.
- Gap: $r d \ll n \ll r\, d^{\lceil k/2 \rceil}$.
- Unfolding cannot beat the barrier $n \gtrsim r\, d^{\lceil k/2 \rceil}$.


Idea #2: Non-convex optimization

$X = v_0^{\otimes k}$, $\|v_0\|_2 = 1$. Maximum likelihood:

maximize $\ \mathcal{L}_n(\theta) = -\tfrac{1}{2}\, \big\| \pi_E\big(Y - \theta^{\otimes k}\big) \big\|_F^2$,
subject to $\ \|\theta\|_2 = 1$.

Heuristic analysis:

$\nabla \mathcal{L}_n(\theta) = k\, \pi_E(Y)\big[\theta^{\otimes (k-1)}\big] + \text{smaller terms}$


Idea #2: Non-convex optimization

$\pi_E(Y) = \mathbb{E}\,\pi_E(Y) + \big(\pi_E(Y) - \mathbb{E}\,\pi_E(Y)\big) = \frac{n}{d^k}\, v_0^{\otimes k} + W$

Assume random initialization $\langle \theta, v_0 \rangle = c\, d^{-1/2}$, $c = O(1)$:

$\langle v_0, \nabla \mathcal{L}_n(\theta) \rangle = k\, \big\langle \pi_E(Y),\, v_0 \otimes \theta^{\otimes(k-1)} \big\rangle + \text{smaller terms}$

$= k\, \frac{n}{d^k}\, \langle v_0, \theta \rangle^{k-1} \pm k\, \big\langle W,\, v_0 \otimes \theta^{\otimes(k-1)} \big\rangle + \dots$

$\approx \frac{c^{k-1}\, k\, n}{d^{(3k-1)/2}} \pm \frac{n^{1/2}}{d^{k}}$

Gradient points in a random direction unless $n \gtrsim d^{\,k-1}$

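To make Idea #2 concrete, here is an illustrative projected-gradient-ascent sketch for $k = 3$ and rank one. The step size, iteration count, and dense-mask implementation are arbitrary choices for this sketch; as the heuristic above indicates, the method is not expected to escape a random initialization unless $n \gtrsim d^{k-1}$.

```python
import numpy as np

def projected_gradient_ascent(Y_obs, mask, steps=500, lr=0.1, seed=0):
    """Projected gradient ascent on the sphere for the rank-one, k = 3 objective
    L_n(theta) = -1/2 ||pi_E(Y - theta x theta x theta)||_F^2 (symmetric case)."""
    d = Y_obs.shape[0]
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal(d)
    theta /= np.linalg.norm(theta)                 # random initialization on S^{d-1}
    for _ in range(steps):
        # residual pi_E(Y - theta x theta x theta); a sparse implementation would touch only E
        R = mask * (Y_obs - np.einsum('i,j,k->ijk', theta, theta, theta))
        grad = 3.0 * np.einsum('ijk,j,k->i', R, theta, theta)   # gradient for symmetric R
        theta += lr * grad
        theta /= np.linalg.norm(theta)             # project back onto the sphere
    return theta
```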

Is there a practical estimator for $n \asymp r\, d^{1.1}$?

Barak, Moitra, 2014 ($k = 3$):

- Under 'Feige's hypothesis,' no polynomial algorithm exists for $n \ll d^{3/2}$

- Degree-6 Sum-of-Squares works (approximate reconstruction) if $n \ge C\, r^2\, d^{3/2} (\log d)^4$

- Complexity $O(d^{15})$

Practical algorithms? Better rank dependency?


A theorem

A1: $\displaystyle \max_{i_1 \dots i_k} |X_{i_1 \dots i_k}|^2 \le \frac{\mu}{d^k}\, \|X\|_F^2$

A2: $\displaystyle \|\mathrm{unfold}_{a \times b}(X)\|_{\mathrm{op}}^2 \le \frac{\alpha}{r}\, \|\mathrm{unfold}_{a \times b}(X)\|_F^2$

Theorem (Montanari, Sun, 2017)

Under the above assumptions there exists a spectral algorithm achieving $\|\hat{X}(Y) - X\|_F \le \varepsilon\, \|X\|_F$ whp, provided* $r \le d^{\,c(k)}$ and

$n \ge C(\alpha, \mu, \varepsilon)\, r\, d^{k/2} (\log d)^9.$

*$c(3) = 3/4$; $c(k) = k/2$ ($k$ even); $c(k) = (k/2) - 1$ ($k \ge 5$ odd).

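For concreteness, a small helper computing the two ratios that A1 and A2 bound. Reading the constants as an entrywise incoherence parameter $\mu$ (A1) and a spectral parameter $\alpha$ (A2) is an assumption of this sketch, suggested by the constant $C(\alpha, \mu, \varepsilon)$ in the theorem.

```python
import numpy as np

def a1_a2_ratios(X, a):
    """Return (mu, rho): mu is the smallest constant satisfying A1 for X, and
    rho = ||unfold_{a x b}(X)||_op^2 / ||X||_F^2 is the ratio that A2 bounds by alpha / r."""
    d, k = X.shape[0], X.ndim
    fro2 = np.sum(X ** 2)
    mu = np.max(X ** 2) * d ** k / fro2
    M = X.reshape(d ** a, d ** (k - a))
    rho = np.linalg.norm(M, 2) ** 2 / fro2      # squared spectral norm over Frobenius mass
    return mu, rho
```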

A theorem

Theorem (Montanari, Sun, 2017)

Under the above assumptions there exists a spectral algorithm achieving $\|\hat{X}(Y) - X\|_F \le \varepsilon\, \|X\|_F$ whp, provided* $r \le d^{\,c(k)}$ and

$n \ge C(\alpha, \mu, \varepsilon)\, r\, d^{k/2} (\log d)^9.$

*$c(3) = 3/4$; $c(k) = k/2$ ($k$ even); $c(k) = (k/2) - 1$ ($k \ge 5$ odd).

- FALSE: Optimal choice $a = \lfloor k/2 \rfloor$, $b = \lceil k/2 \rceil$.
- FALSE: Gap: $r d \ll n \ll r\, d^{\lceil k/2 \rceil}$.
- FALSE: Unfolding cannot beat the barrier $n \gtrsim r\, d^{\lceil k/2 \rceil}$.


Simulations: k = 3, r = 4

[Plots: MSE versus $n$ (left) and versus $n / d^{3/2}$ (right), on log-log axes, for $d = 100, 150, 200, 250, 300$]


Algorithm idea

k = a + b, a < b

$M = \mathrm{unfold}_{a \times b}(Y)$

$d^a \ll n \asymp d^{k/2}$:

- Cannot complete $M$
- Can estimate the top left singular space!


Algorithm, for $k = 3$ ($\delta = n/d^3$)

1. Compute $M = \mathrm{unfold}_{1 \times 2}(Y)$ and

   $A = \frac{1}{\delta^2}\, \mathrm{diag}^{\perp}(M M^{\mathsf T}) + \frac{1}{\delta}\, \mathrm{diag}(M M^{\mathsf T})$

   (here $\mathrm{diag}^{\perp}$ denotes the off-diagonal part)

2. Eigenvalue decomposition:

   $A = \sum_{i=1}^{d} \lambda_i\, u_i u_i^{\mathsf T}$

3. Projector (for a spectral threshold $\Lambda$):

   $Q = \sum_{i=1}^{d} \mathbf{1}_{\{\lambda_i \ge \Lambda\}}\, u_i u_i^{\mathsf T}$

4. Let $\mathcal{Q} \equiv Q \otimes Q \otimes Q$ and return

   $\hat{X} = \frac{1}{\delta}\, \mathcal{Q}(Y)$

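A literal numpy transcription of the four steps, as a sketch: the zero-filled observation tensor, the boolean mask, and the user-supplied threshold $\Lambda$ are assumptions of this illustration rather than the tuned choices of the paper.

```python
import numpy as np

def spectral_completion_k3(Y_obs, mask, Lam):
    """Spectral estimator sketch for k = 3: Y_obs is the d x d x d tensor with
    unobserved entries set to zero, mask is the boolean observation pattern E,
    and Lam is the eigenvalue threshold."""
    d = Y_obs.shape[0]
    delta = mask.mean()                          # empirical sampling rate n / d^3

    # Step 1: unfold and form the debiased second-moment matrix A.
    M = Y_obs.reshape(d, d * d)
    MMt = M @ M.T
    A = MMt / delta ** 2                         # off-diagonal part scaled by 1/delta^2
    np.fill_diagonal(A, np.diag(MMt) / delta)    # diagonal part scaled by 1/delta

    # Steps 2-3: eigendecomposition and projector onto eigenvalues above Lam.
    eigvals, eigvecs = np.linalg.eigh(A)
    U = eigvecs[:, eigvals >= Lam]
    Q = U @ U.T

    # Step 4: rescale the observations and apply Q to each of the three modes.
    X_hat = Y_obs / delta
    X_hat = np.einsum('ai,ijk->ajk', Q, X_hat)
    X_hat = np.einsum('bj,ajk->abk', Q, X_hat)
    X_hat = np.einsum('ck,abk->abc', Q, X_hat)
    return X_hat
```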

Similar method developed independently by Yuan, Xia, 2017


Back to image denoising


Image-by-image matrix denoising

[Plot: MSE versus rank, rank $= 0, \dots, 30$; MSE between $0.0$ and $0.7$]

:(


Tensor denoising

[Plot: MSE versus rank, rank $= 0, \dots, 30$; MSE between $0.0$ and $0.7$]

:)


Overcomplete tensors


k = 3

$X = \sum_{\ell=1}^{r} v_\ell \otimes v_\ell \otimes v_\ell$

$X$ can have rank larger than $d$!!!


Can we use spectral methods?

$Y = \pi_E(X)$

$M = \mathrm{unfold}_{1 \times 2}(Y)$

Seems a lost cause:

$\mathrm{rank}(X) \gg d \;\Rightarrow\; \mathrm{rank}(M) = d$

Structure is lost!


Idea: Do something before spectral analysis

$A = (A_{i_1 i_2,\, j_1 j_2})_{i_1, i_2, j_1, j_2 \le d} \in \mathbb{R}^{d^2 \times d^2}, \qquad A_{i_1 i_2,\, j_1 j_2} \equiv \sum_{\ell=1}^{d} Y_{i_1 j_1 \ell}\, Y_{i_2 j_2 \ell}.$

NOT EQUAL TO:

$M = \mathrm{unfold}_{1 \times 2}(Y) \in \mathbb{R}^{d \times d^2}, \qquad B = M^{\mathsf T} M$

Claim: $\mathrm{rank}_*(A) \approx r$, while $\mathrm{rank}(B) \le d$

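A numpy sketch contrasting the two constructions. For brevity it uses the fully observed, noiseless tensor rather than $\pi_E(Y)$, so it only illustrates how $A$ is built and why $B$ cannot see the overcomplete structure.

```python
import numpy as np

d, r = 20, 40                                   # overcomplete: r > d
rng = np.random.default_rng(0)
V = rng.standard_normal((d, r))
Y = np.einsum('iz,jz,kz->ijk', V, V, V)         # Y = sum_l v_l x v_l x v_l

# Lifted d^2 x d^2 matrix: A_{(i1,i2),(j1,j2)} = sum_l Y_{i1 j1 l} Y_{i2 j2 l}
A = np.einsum('acl,bdl->abcd', Y, Y).reshape(d * d, d * d)

# Naive Gram matrix of the unfolding: its rank is at most d.
M = Y.reshape(d, d * d)
B = M.T @ M

print(np.linalg.matrix_rank(B))                 # <= d: the overcomplete structure is invisible
print(np.linalg.eigvalsh(A)[::-1][:r + 5])      # inspect the leading spectrum of A (cf. the rank_* claim)
```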

$X = \sum_{\ell=1}^{r} v_\ell^{\otimes 3}$

Theorem (Montanari, Sun, 2017)

Assume $(v_\ell)_{\ell \le r} \sim_{\mathrm{iid}} \mathsf{N}(0, I_d)$ and $r \le d^2$. Then there exists a spectral algorithm achieving $\|\hat{X} - X\|_F \le \varepsilon\, \|X\|_F$ whp, for $n$ random entries, provided

$n \ge C(\varepsilon)\, (d \vee r)\, d^{3/2} (\log d)^{c_0}$

Similar ideas: Hopkins, Schramm, Shi, Steurer 2016; Raghavendra, Schramm 2016


Simulations: $k = 3$, $r = 1.2\, d$

[Plots: MSE versus $n$ (left) and versus $n/(r\, d^{3/2})$ (right), on log-log axes, for $(d, r) = (10, 12), (20, 24), (30, 36), (40, 48), (50, 60)$]


Spiked tensor model (a.k.a. tensor PCA)


Reminder: A much simpler model

$Y = \beta\, v_0^{\otimes k} + W$

Signal: $v_0 \in S^{d-1} \equiv \{x \in \mathbb{R}^d : \|x\|_2 = 1\}$.

Noise: $(W_{i_1, i_2, \dots, i_k})_{i_1 < i_2 < \cdots < i_k} \sim_{\mathrm{iid}} \mathsf{N}(0, 1/d)$

SNR: $\beta$

Given Y , estimate v0

[Montanari, Richard, 2015]


A lot of information from statistical physics

Gibbs measure:

$\mu_{\beta,\lambda}(\mathrm{d}\theta) = \frac{1}{Z(\beta, \lambda)} \exp\!\big\{ \lambda\, \langle Y, \theta^{\otimes k} \rangle \big\}\, \mu_0(\mathrm{d}\theta)$

- $\lambda = \infty$: maximum likelihood
- $\lambda = \beta / k!$: Bayes posterior
- $\mu_0(\,\cdot\,)$: uniform measure on $S^{d-1}$

[Crisanti, Sommers, 1992, 1995; Auffinger, Ben Arous, Cerny, 2013; Chen, 2013; Subag, 2016; Krzakala, Lelarge, Miolane, Zdeborova, 2016; ...]

Theorem

There exists $\beta_{\mathrm{Bayes}}(k)$ (explicit!) such that:

$\lim_{n \to \infty} \mathbb{E}\, |\langle v_0, \hat{v}_{\mathrm{Bayes}}(Y) \rangle| = \begin{cases} 0 & \text{if } \beta < \beta_{\mathrm{Bayes}}(k), \\ > 0 & \text{if } \beta > \beta_{\mathrm{Bayes}}(k). \end{cases}$


What about polynomial-time estimators?

Theorem (Montanari, Richard, 2014; Hopkins, Shi, Steurer, 2015)

There exists a poly-time estimator achieving $\mathbb{E}\{ |\langle \hat{\theta}_{\mathrm{Poly}}, \theta_0 \rangle| \} \ge 1 - \varepsilon$, provided

$\beta \ge C(\varepsilon)\, n^{(k-2)/4}.$

No poly-time algorithm known for $1 \ll \beta \ll n^{(k-2)/4}$.

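One simple polynomial-time approach in this regime is tensor unfolding, analyzed by Montanari and Richard: take the top left singular vector of the $d \times d^{k-1}$ unfolding of $Y$. A toy numpy sketch for $k = 3$, with an unsymmetrized Gaussian noise convention chosen only for brevity:

```python
import numpy as np

def unfolding_estimator(Y):
    """Estimate the spike direction from Y = beta * v0 x v0 x v0 + W by taking
    the top left singular vector of the d x d^2 unfolding of Y."""
    d = Y.shape[0]
    U, s, Vt = np.linalg.svd(Y.reshape(d, d * d), full_matrices=False)
    return U[:, 0]

# toy check: the overlap |<v_hat, v0>| is large at this SNR
d, beta = 50, 30.0
rng = np.random.default_rng(0)
v0 = rng.standard_normal(d); v0 /= np.linalg.norm(v0)
W = rng.standard_normal((d, d, d)) / np.sqrt(d)     # iid N(0, 1/d) entries, not symmetrized
Y = beta * np.einsum('i,j,k->ijk', v0, v0, v0) + W
print(abs(unfolding_estimator(Y) @ v0))
```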

A case study in failure: Maximum likelihood

maximize $\ \langle Y, \theta^{\otimes k} \rangle$
subject to $\ \theta \in S^{d-1}$

$\mathcal{N}(x; m) \equiv \#\big\{ \text{critical points with } \langle Y, \theta^{\otimes k} \rangle \ge x,\ \langle v_0, \theta \rangle \approx m \big\}$


A peek at complexity

[Figure panels: $\beta = 0$, $\beta = 1$, $\beta = 10$]

$\mathbb{E}\, \mathcal{N}(x; m) = e^{d\, \Sigma(x; m) + o(d)}, \qquad \Sigma(x; m) = \text{explicit expression}$

[Ben Arous, Mei, Montanari, Nica, in progress]


Conclusion


- Tensors are useful for modeling multivariate data
- Estimation requires entirely new ideas
- Information-computation gap
- Many open problems

Thanks! Happy birthday Dave!
