TRANSCRIPT
Ryan O’Donnell
Carnegie Mellon University
Part 1:
A. Fourier expansion basics
B. Concepts:
Bias, Influences, Noise Sensitivity
C. Kalai’s proof of Arrow’s Theorem
10 Minute Break
Part 2:
A. The Hypercontractive Inequality
B. Algorithmic Gaps
Sadly no time for:
Learning theory
Pseudorandomness
Arithmetic combinatorics
Random graphs / percolation
Communication complexity
Metric / Banach spaces
Coding theory
etc.
1A. Fourier expansion basics
f : {0,1}ⁿ → {0,1}
f : {−1,+1}ⁿ → {−1,+1}
[Figures (slides 9–19): the cube {−1,+1}³ drawn in ℝ³, with vertices (±1,±1,±1) labeled by f-values in {−1,+1}, illustrating several example functions on the hypercube.]
Proposition:
Every f : {−1,+1}ⁿ → {−1,+1} (indeed, every f : {−1,+1}ⁿ → ℝ) can be expressed (uniquely) as a multilinear polynomial,

 f(x) = Σ_{S ⊆ [n]} f̂(S) ∏_{i∈S} xᵢ.

That’s it. That’s the “Fourier expansion” of f.
⇓
Rest of the Fourier coefficients: 0.
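Since each coefficient is just an average, f̂(S) = avg_x f(x)·∏_{i∈S} xᵢ, the whole expansion can be computed by brute force for small n. A small sketch in Python (the function names are mine, not from the slides), using Maj₃ as the example:

```python
import math
from itertools import combinations, product

def fourier_coefficients(f, n):
    """f-hat(S) = average over x in {-1,+1}^n of f(x) * prod_{i in S} x_i."""
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for k in range(n + 1):
        for S in combinations(range(n), k):
            chi_sum = sum(f(x) * math.prod(x[i] for i in S) for x in points)
            coeffs[S] = chi_sum / len(points)
    return coeffs

maj3 = lambda x: 1 if sum(x) > 0 else -1
# Maj(x1,x2,x3) = (1/2)x1 + (1/2)x2 + (1/2)x3 - (1/2)x1*x2*x3
print(fourier_coefficients(maj3, 3))
```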
Why?
The Fourier coefficients encode useful information about f.
When?
1. When uniform probability on {−1,+1}ⁿ is involved
2. When Hamming distances are relevant
Parseval’s Theorem:
Let f : {−1,+1}ⁿ → {−1,+1}. Then
 avg_x { f(x)² } = Σ_{S⊆[n]} f̂(S)² = 1.
“Weight” of f on S ⊆ [n]:
 f̂(S)².
By Parseval, the weights of a Boolean function sum to 1.
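A quick numerical sanity check of Parseval and the weight distribution (a sketch; the helper names are mine), with Parity₃ as the example:

```python
import math
from itertools import combinations, product

def fourier_weights(f, n):
    """Return {S: f-hat(S)^2}; by Parseval these sum to 1 for Boolean f."""
    points = list(product([-1, 1], repeat=n))
    w = {}
    for k in range(n + 1):
        for S in combinations(range(n), k):
            coeff = sum(f(x) * math.prod(x[i] for i in S) for x in points) / len(points)
            w[S] = coeff ** 2
    return w

parity3 = lambda x: x[0] * x[1] * x[2]
w = fourier_weights(parity3, 3)
print(sum(w.values()))   # 1.0 (Parseval)
print(w[(0, 1, 2)])      # 1.0 (Parity has all its weight on {1,2,3})
```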
[Figures (slides 29–34): the subsets ∅, {1}, {2}, {3}, {1,2}, {1,3}, {2,3}, {1,2,3} arranged by level, picturing how f’s Fourier weight f̂(S)² is distributed over them.]
1B. Concepts:
Bias, Influences, Noise Sensitivity
Social Choice:
Two candidates, ±1
n voters
Votes are independent and uniformly random
f : {−1,+1}ⁿ → {−1,+1}
is the “voting rule”
Bias of f:
 avg f(x) = Pr[+1 wins] − Pr[−1 wins]
Fact:
 f̂(∅) = avg f(x), so the weight on ∅ is Bias(f)²: it measures “imbalance”.
Influence of i on f:
 Inf_i(f) = Pr[ f(x) ≠ f(x^(⊕i)) ]
 = Pr[voter i is a swing voter]
Fact:
 Inf_i(f) = Σ_{S ∋ i} f̂(S)².
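The swing-voter definition is easy to check exhaustively for small n. A sketch (Maj₃ as the running example: each voter is pivotal exactly when the other two disagree, so Inf_i = 1/2):

```python
from itertools import product

def influence(f, n, i):
    """Inf_i(f) = Pr over uniform x of f(x) != f(x with coordinate i flipped)."""
    points = list(product([-1, 1], repeat=n))
    flips = 0
    for x in points:
        y = list(x)
        y[i] = -y[i]
        flips += f(x) != f(tuple(y))
    return flips / len(points)

maj3 = lambda x: 1 if sum(x) > 0 else -1
print([influence(maj3, 3, i) for i in range(3)])  # [0.5, 0.5, 0.5]
```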
[Figure (slide 39): the Fourier weight of Maj(x₁,x₂,x₃) on the subset lattice: 1/4 on each of {1}, {2}, {3}, and {1,2,3}; 0 elsewhere.]
Inf_i(f) = Pr[ f(x) ≠ f(x^(⊕i)) ]
[Figures (slides 41–42): the hypercube with f-values on the vertices; Inf_i(f) is the fraction of direction-i edges whose endpoints disagree.]
avg Inf_i(f) = fraction of edges which are cut edges
LMN Theorem:
If f is in AC⁰ (poly-size, constant-depth circuits),
then avg Inf_i(f) ≤ polylog(n)/n.
avg Inf_i(Parityₙ) = 1
⇒ Parity ∉ AC⁰
avg Inf_i(Majₙ) = Θ(1/√n)
⇒ Majority ∉ AC⁰
KKL Theorem:
If Bias(f) = 0,
then max_i Inf_i(f) ≥ Ω(log n / n).
Corollary:
Assuming f monotone,
−1 or +1 can bribe o(n) voters
and win w.p. 1−o(1).
Noise Sensitivity of f at ϵ:
NS_ϵ(f) = Pr[ f(x) ≠ f(y) ] = Pr[wrong winner wins],
where y is x with each vote independently misrecorded w/prob ϵ:

 f( + − + + − − + − − )
 f( − − + + + + + − − )
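NS_ϵ is easy to estimate by sampling. A Monte Carlo sketch (sample sizes and names are mine; for Majority the large-n value is arccos(1−2ϵ)/π, about 0.2 at ϵ = 0.1):

```python
import random

def noise_sensitivity(f, n, eps, trials=20000, seed=0):
    """Monte Carlo estimate of NS_eps(f) = Pr[f(x) != f(y)],
    where y is x with each coordinate flipped independently w.p. eps."""
    rng = random.Random(seed)
    disagree = 0
    for _ in range(trials):
        x = [rng.choice([-1, 1]) for _ in range(n)]
        y = [-xi if rng.random() < eps else xi for xi in x]
        disagree += f(x) != f(y)
    return disagree / trials

maj = lambda x: 1 if sum(x) > 0 else -1
print(noise_sensitivity(maj, 101, 0.1))  # roughly 0.2: majority is noise-stable
```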
Learning Theory principle:
[LMN’93, …, KKMS’05]
If all f ∈ C have small NS_ϵ(f),
then C is efficiently learnable.
![Page 50: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/50.jpg)
[Figure (slide 50): the subset lattice ∅, {1}, {2}, {3}, {1,2}, {1,3}, {2,3}, {1,2,3} = [3].]
Proposition:
for small ϵ, NS_ϵ(Majₙ) = Θ(√ϵ);
with an Electoral College (majority of majorities), the noise sensitivity is even larger.
1C. Kalai’s proof of Arrow’s Theorem
Ranking 3 candidates
Condorcet [1785] Election:
Each voter i answers “A > B?”, “B > C?”, “C > A?” with (x_i, y_i, z_i) ∈ {−1,+1}³;
a consistent ranking ⇒ (x_i, y_i, z_i) is Not All Equal (no +1+1+1 or −1−1−1).
Condorcet: Try f = Maj. The outcome can be “irrational”: A > B > C > A. [easy e.g.]
Maybe some other f?
[Table (slide 54): sample voters with rankings “C > A > B”, “A > B > C”, “B > C > A”; each ranking is encoded as ±1 answers to “A > B?”, “B > C?”, “C > A?”, giving vote vectors x, y, z across the voters.]
f(x) = +, f(y) = +, f(z) = − ⇒ Society: “A > B > C”
On another profile: f(x) = +, f(y) = +, f(z) = + ⇒ Society: “A > B > C > A”? An irrational outcome.
Arrow’s Impossibility Theorem [1950]:
If f : {−1,+1}ⁿ → {−1,+1} never gives an irrational outcome in Condorcet elections,
then f is a Dictator or a negated-Dictator.
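For n = 3 voters the theorem can be verified by brute force: among all 256 functions f : {−1,+1}³ → {−1,+1}, exactly the 3 dictators and 3 negated-dictators avoid irrational outcomes. A sketch:

```python
from itertools import product

# The 6 Not-All-Equal triples = the 6 consistent rankings of 3 candidates.
NAE = [t for t in product([-1, 1], repeat=3) if len(set(t)) > 1]

def never_irrational(f):
    """Does f avoid cyclic outcomes on every 3-voter Condorcet profile?"""
    for votes in product(NAE, repeat=3):
        x, y, z = zip(*votes)              # vote vectors for A>B?, B>C?, C>A?
        if len({f(x), f(y), f(z)}) == 1:   # all-equal outcome = a cycle
            return False
    return True

points = list(product([-1, 1], repeat=3))
good = []
for table in product([-1, 1], repeat=8):   # all truth tables on {-1,+1}^3
    lookup = dict(zip(points, table))
    if never_irrational(lambda v, lookup=lookup: lookup[tuple(v)]):
        good.append(table)
print(len(good))  # 6
```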
Gil Kalai’s Proof [2002]:
Draw each voter’s (Not-All-Equal) vote triple uniformly at random. Then
 Pr[rational outcome] = 3/4 − (3/4)·Stab_{−1/3}(f),
where Stab_ρ(f) = Σ_{S⊆[n]} f̂(S)² ρ^{|S|} is the noise stability of f.
Since Σ_S f̂(S)² = 1 (Parseval) and (−1/3)^{|S|} attains its minimum −1/3 only at |S| = 1,
 Stab_{−1/3}(f) ≥ −1/3, hence Pr[rational outcome] ≤ 3/4 + 1/4 = 1.
Gil Kalai’s Proof, concluded:
f never gives irrational outcomes ⇒ equality
⇒ all Fourier weight “at level 1”
⇒ f(x) = ±xj for some j (exercise).
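The exercise (a Boolean function with all its Fourier weight at level 1 must be ±x_j) can itself be checked by brute force for n = 3. A sketch:

```python
from itertools import product

points = list(product([-1, 1], repeat=3))

def level1_weight(f):
    """Sum over i of f-hat({i})^2, with f given as a truth-table dict."""
    return sum((sum(f[x] * x[i] for x in points) / 8) ** 2 for i in range(3))

good = [t for t in product([-1, 1], repeat=8)
        if abs(level1_weight(dict(zip(points, t))) - 1) < 1e-9]
print(len(good))  # 6: the functions x1, x2, x3 and their negations
```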
⇓
Guilbaud’s Theorem [1952]:
For f = Majₙ, Pr[rational outcome] → Guilbaud’s Number = 3/4 + (3/2π)·arcsin(1/3) ≈ .912.
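For 3 voters the probability of a rational Majority outcome can be computed exactly: it is 17/18 (the classic 1/18 chance of the Condorcet paradox); as n grows it tends to Guilbaud’s number. A sketch:

```python
from fractions import Fraction
from itertools import product

NAE = [t for t in product([-1, 1], repeat=3) if len(set(t)) > 1]
maj = lambda v: 1 if sum(v) > 0 else -1

profiles = list(product(NAE, repeat=3))
rational = 0
for votes in profiles:
    x, y, z = zip(*votes)
    rational += len({maj(x), maj(y), maj(z)}) > 1   # Not-All-Equal outcome = rational
print(Fraction(rational, len(profiles)))  # 17/18
```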
Corollary of “Majority Is Stablest” [MOO05]:
If Inf_i(f) ≤ o(1) for all i,
then
Pr[rational outcome with f] ≤ Guilbaud’s Number + o(1).
Part 2:
A. The Hypercontractive Inequality
B. Algorithmic Gaps
2A. The Hypercontractive Inequality
AKA Bonami-Beckner Inequality
all use “Hypercontractive Inequality”
Hoeffding Inequality:
Let
F = c₀ + c₁x₁ + c₂x₂ + ··· + cₙxₙ,
where the xᵢ’s are indep., unif. random ±1. Then
 Pr[ |F − c₀| ≥ t·σ ] ≤ 2·exp(−t²/2), where σ² = c₁² + ··· + cₙ².
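The bound can be checked exactly for small n by enumerating all 2ⁿ sign patterns (the coefficients below are arbitrary choices of mine):

```python
import math
from itertools import product

n = 12
c = [1.0 / (i + 1) for i in range(n)]                 # arbitrary coefficients
sigma = math.sqrt(sum(ci * ci for ci in c))
points = list(product([-1, 1], repeat=n))
for t in (1.0, 2.0, 3.0):
    tail = sum(abs(sum(ci * xi for ci, xi in zip(c, x))) >= t * sigma
               for x in points) / len(points)
    assert tail <= 2 * math.exp(-t * t / 2)           # Hoeffding's bound
print("Hoeffding tail bound holds")
```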
Mean: μ = c₀. Variance: σ² = c₁² + ··· + cₙ².
Hypercontractive Inequality*:
Let F = Σ_{|S| ≤ d} c_S ∏_{i∈S} xᵢ be a polynomial of degree ≤ d in indep., unif. random ±1 xᵢ’s.
Mean: μ = c_∅. Variance: σ² = Σ_{S ≠ ∅} c_S².
Then for all q ≥ 2,
 ‖F‖_q ≤ (√(q−1))^d · ‖F‖₂.
In particular, F is a “reasonable” random variable: its higher moments are controlled by its variance.
These results all use the Hypercontractive Inequality; in fact, many just use the “q = 4” version:
“q = 4” Hypercontractive Inequality:
Let F be a polynomial of degree ≤ d in n i.i.d. ±1 r.v.’s. Then
 E[F⁴] ≤ 9^d · E[F²]².
Proof [MOO’05]: Induction on n.
Obvious step.
Use induction hypothesis.
Use Cauchy-Schwarz on the obvious thing.
Use induction hypothesis.
Obvious step.
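The q = 4 inequality can be verified exactly for small n by enumerating all 2ⁿ points for random low-degree polynomials (a sketch; the parameters are mine):

```python
import random
from itertools import combinations, product

def prod(x, S):
    """Product of the coordinates of x indexed by S (1 for S empty)."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def check_q4(n=6, d=2, trials=25, seed=1):
    """Exactly compute E[F^2], E[F^4] and check E[F^4] <= 9^d * E[F^2]^2."""
    rng = random.Random(seed)
    points = list(product([-1, 1], repeat=n))
    subsets = [S for k in range(d + 1) for S in combinations(range(n), k)]
    for _ in range(trials):
        c = {S: rng.gauss(0, 1) for S in subsets}
        vals = [sum(cS * prod(x, S) for S, cS in c.items()) for x in points]
        m2 = sum(v * v for v in vals) / len(points)
        m4 = sum(v ** 4 for v in vals) / len(points)
        assert m4 <= 9 ** d * m2 ** 2 + 1e-9
    return True

print(check_q4())  # True
```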
2B. Algorithmic Gaps
[Diagram (slide 84): Opt vs. best poly-time guarantee, a factor-ln(N) gap.]
“Set-Cover is NP-hard to approximate to factor ln(N)”
[Diagram (slide 85): Opt vs. LP-Rand-Rounding guarantee, a factor-ln(N) gap.]
“Factor ln(N) Algorithmic Gap for LP-Rand-Rounding”
[Diagram (slide 86): Opt(S) vs. LP-Rand-Rounding(S), a factor-ln(N) gap.]
“Algorithmic Gap Instance S for LP-Rand-Rounding”
Algorithmic Gap instances are often “based on” {−1,+1}ⁿ.
Sparsest-Cut:
Algorithm: Arora-Rao-Vazirani SDP.
Guarantee: Factor O(√log n).
Opt = 1/n
f(x) = sgn(r_1x_1 + ••• + r_nx_n)
ARV gets Θ(log n) · Opt
gap: Θ(log n)
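The Opt = 1/n claim is easy to check by brute force for small n: a dictator cut x ↦ x_i cuts exactly the 1/n fraction of hypercube edges in direction i, while a majority-style cut (the kind of "generic mixture" a rounding scheme tends to produce) cuts many more. A minimal Python sketch, not from the talk:

```python
from itertools import product

def cut_fraction(f, n):
    """Fraction of hypercube edges cut by f: {-1,1}^n -> {-1,1}.
    Equals the average influence (1/n) * sum_i Inf_i(f)."""
    cut = total = 0
    for x in product((-1, 1), repeat=n):
        for i in range(n):
            y = x[:i] + (-x[i],) + x[i + 1:]   # flip coordinate i
            total += 1
            cut += f(x) != f(y)
    return cut / total

n = 5
dictator = lambda x: x[0]
majority = lambda x: 1 if sum(x) > 0 else -1
# dictator cuts exactly the edges in one direction: fraction 1/n
print(cut_fraction(dictator, n))   # 0.2 = 1/5
# majority cuts a Theta(1/sqrt(n)) fraction -- strictly worse
print(cut_fraction(majority, n))   # 0.375 for n = 5
```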
![Page 96: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/96.jpg)
Algorithmic Gaps → Hardness-of-Approx
LP / SDP-rounding Alg. Gap instance
• n optimal “Dictator” solutions
• “generic mixture of Dictators” much worse
+ PCP technology
= same-gap hardness-of-approximation
![Page 98: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/98.jpg)
KKL / Talagrand Theorem:
If f is balanced and
Inf_i(f) ≤ 1/n^{.01} for all i,
then
avg_i Inf_i(f) ≥ Ω(log n / n).
Gap: Θ(log n) = Θ(log log N).
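Both quantities in the theorem can be computed directly for small n. The sketch below (illustrative only) computes coordinate influences by brute force and checks, for majority, that every individual influence is small while the average influence still comfortably exceeds 1/n:

```python
from itertools import product

def influences(f, n):
    """Inf_i(f) = Pr_x[f(x) != f(x with coordinate i flipped)]."""
    inf = [0.0] * n
    pts = list(product((-1, 1), repeat=n))
    for x in pts:
        for i in range(n):
            y = x[:i] + (-x[i],) + x[i + 1:]
            inf[i] += f(x) != f(y)
    return [v / len(pts) for v in inf]

n = 7
majority = lambda x: 1 if sum(x) > 0 else -1
inf = influences(majority, n)
# each influence is Theta(1/sqrt(n)) -- no single coordinate dominates...
print(max(inf))        # 0.3125 = C(6,3)/2^6 for n = 7
# ...yet the average influence is well above 1/n, as KKL predicts
print(sum(inf) / n)    # 0.3125 > 1/7
```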
![Page 99: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/99.jpg)
[CKKRS05]: KKL + Unique Games Conjecture
⇒ Ω(log log log N) hardness-of-approx.
![Page 100: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/100.jpg)
2-Colorable 3-Uniform hypergraphs:
Input: 2-colorable, 3-unif. hypergraph
Output: 2-coloring
Obj: Max. fraction of legally colored hyperedges
![Page 101: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/101.jpg)
2-Colorable 3-Uniform hypergraphs:
Algorithm: SDP [KLP96].
Guarantee: ≈ .912 [Zwick99]
![Page 102: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/102.jpg)
Algorithmic Gap Instance
Vertices: {−1,+1}^n
6^n hyperedges: { (x,y,z) : poss. prefs in a Condorcet election }
(i.e., triples s.t. (x_i, y_i, z_i) is NAE for all i)
![Page 103: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/103.jpg)
Elts: {−1,+1}^n   Edges: Condorcet votes (x,y,z)
2-coloring = f : {−1,+1}^n → {−1,+1}
frac. legally colored hyperedges = Pr[“rational” outcome with f]
Instance 2-colorable? ✔
(2n optimal solutions: ±Dictators)
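This instance is small enough to enumerate for, say, n = 3. A Python sketch (helper names are hypothetical) that builds the 6^n NAE hyperedges and confirms that dictators legally color every hyperedge while majority does not:

```python
from itertools import product

def nae(a, b, c):
    """Not-All-Equal predicate on three ±1 values."""
    return not (a == b == c)

n = 3
cube = list(product((-1, 1), repeat=n))
# hyperedges: voter profiles (x, y, z) with (x_i, y_i, z_i) NAE per voter
edges = [(x, y, z) for x in cube for y in cube for z in cube
         if all(nae(x[i], y[i], z[i]) for i in range(n))]
print(len(edges))   # 6^3 = 216: six NAE triples per coordinate

def frac_legal(f):
    """Fraction of legally colored hyperedges = Pr[rational outcome]."""
    return sum(nae(f(x), f(y), f(z)) for x, y, z in edges) / len(edges)

dictator = lambda x: x[0]
majority = lambda x: 1 if sum(x) > 0 else -1
print(frac_legal(dictator))   # 1.0 -- a dictator is always rational
print(frac_legal(majority))   # < 1.0 -- Condorcet paradoxes occur
```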
![Page 104: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/104.jpg)
Elts: {−1,+1}^n   Edges: Condorcet votes (x,y,z)
SDP rounding alg. may output f(x) = sgn(r_1x_1 + ••• + r_nx_n)
Random weighted majority also rational-with-prob.-.912! [same CLT arg.]
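The .912 constant is Guilbaud's number, 3/(2π)·arccos(−1/3) ≈ .9123: the coordinates of a uniform NAE triple have pairwise correlation −1/3, so by the CLT the three weighted sums behave like standard Gaussians with pairwise correlation −1/3, and the orthant formula gives the probability of a rational (NAE) sign pattern. A quick numerical sketch, not from the talk:

```python
import math
import random

# Closed form: rationality prob = 3/(2*pi) * arccos(-1/3) ~= 0.9123
guilbaud = 3 / (2 * math.pi) * math.acos(-1 / 3)
print(guilbaud)

# Monte-Carlo check of the CLT argument: random Gaussian weights r_i,
# random NAE profile; f(x) = sgn(sum r_i x_i) is rational ~91.2% of the time.
NAE = [t for t in product_triples] if False else [
    (a, b, c) for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)
    if not a == b == c
]
rng = random.Random(0)
n, trials, rational = 51, 20000, 0
for _ in range(trials):
    r = [rng.gauss(0, 1) for _ in range(n)]
    prof = [rng.choice(NAE) for _ in range(n)]
    sx = 1 if sum(ri * p[0] for ri, p in zip(r, prof)) > 0 else -1
    sy = 1 if sum(ri * p[1] for ri, p in zip(r, prof)) > 0 else -1
    sz = 1 if sum(ri * p[2] for ri, p in zip(r, prof)) > 0 else -1
    rational += not (sx == sy == sz)
print(rational / trials)   # close to 0.912
```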
![Page 105: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/105.jpg)
Algorithmic Gaps → Hardness-of-Approx
LP / SDP-rounding Alg. Gap instance
• n optimal “Dictator” solutions
• “generic mixture of Dictators” much worse
+ PCP technology
= same-gap hardness-of-approximation
![Page 106: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/106.jpg)
Corollary of Majority Is Stablest:
If Inf_i(f) ≤ o(1) for all i,
then
Pr[rational outcome with f] ≤ .912 + o(1).
Cor: this + Unique Games Conjecture
⇒ .912 hardness-of-approx*
![Page 107: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/107.jpg)
2C. Future Directions
![Page 108: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/108.jpg)
Develop the “structure vs. pseudorandomness”
theory for Boolean functions.
![Page 109: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e7a5503460f94b7a71f/html5/thumbnails/109.jpg)