On Gallager’s problem: New Bounds for Noisy Communication
Navin Goyal & Mike Saks
Joint work with Guy Kindler
Microsoft Research


Page 1:

On Gallager’s problem: New Bounds for Noisy Communication

Navin Goyal & Mike Saks
Joint work with Guy Kindler
Microsoft Research

Page 2:

Ambrose Bierce

1842 – 1914(?)

“Noise is the chief product and the authenticating sign of civilization”

In CS: Noise appears in the study of information theory, network design, learning theory, cryptography, quantum computation, hardness of approximation, theory of social choice, embeddings of metric spaces, privacy in databases…

Page 3:

In this talk

[El Gamal ’84]: The noisy broadcast network model.
[Gallager ’88]: An n·loglog(n) algorithm for identity.
Main result: Gallager’s algorithm is tight.
Proof by reduction:
  Generalized noisy decision trees (gnd-trees).
  Lower bound for gnd-trees.

Page 4:

First, a Fourier-analytic result

Definition (Fourier): Let f:{-1,1}^n -> {-1,1} be a Boolean function. The i’th Fourier coefficient of f:
  f_i = E_{x~U}[f(x)·x_i].

[Talagrand ’96]: Let p = Pr_{x~U}[f(x)=1] (p < 1/2). Then Σ_i (f_i)² ≤ C·p²·log(1/p) for a universal constant C.

Crucial for our result!
(as hinted in slide #26..)
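A brute-force sketch (my own illustration, not from the slides) of these definitions: it enumerates {-1,1}^n, computes the level-1 coefficients f_i = E[f(x)·x_i], and compares Σ_i f_i² with p²·log(1/p) for one small biased function; the example function f_and and the choice n = 4 are arbitrary assumptions.

    import itertools
    import math

    def level1_coefficients(f, n):
        """Brute-force level-1 Fourier coefficients f_i = E_{x~U}[f(x) * x_i]."""
        pts = list(itertools.product([-1, 1], repeat=n))
        return [sum(f(x) * x[i] for x in pts) / len(pts) for i in range(n)]

    # Illustrative biased function: f_and(x) = 1 iff every coordinate equals 1.
    def f_and(x):
        return 1 if all(xi == 1 for xi in x) else -1

    n = 4
    coeffs = level1_coefficients(f_and, n)
    pts = list(itertools.product([-1, 1], repeat=n))
    p = sum(1 for x in pts if f_and(x) == 1) / len(pts)   # p = Pr[f = 1] < 1/2

    lhs = sum(c * c for c in coeffs)          # sum_i (f_i)^2
    rhs = p * p * math.log2(1 / p)            # p^2 log(1/p); the theorem allows a constant C
    print(f"sum f_i^2 = {lhs:.4f},  p^2 log(1/p) = {rhs:.4f}")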

Page 5:

What’s next:
  Communication under noise - examples
  The noisy broadcast model
  Gallager: the algorithm and the problem
  Gnd-trees: Generalized Noisy Decision Trees
  Our results
  About the proof

Page 6:

Noisy computation: case 1

[Figure: the string 0110001100 to be transmitted over a noisy channel.]

1. Noiseless channel: n transmissions.
2. Naïve (repetition): n·log(n) transmissions (error is polynomially small in n).
3. [Shannon ’48]: c·n transmissions (error is exponentially small in n).

Aggregation of bits: big advantage.
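A small simulation sketch (illustrative, not from the slides) of the naïve scheme in item 2: each bit is repeated r times over a binary symmetric channel with flip probability eps and decoded by majority, so r on the order of log(n) drives the per-bit error below 1/n; the parameter values and function names are assumptions.

    import math
    import random

    def send_repeated(bit, r, eps, rng):
        """Transmit one bit r times over a BSC(eps) and decode by majority vote."""
        votes = sum(bit ^ (rng.random() < eps) for _ in range(r))
        return 1 if votes > r / 2 else 0

    def naive_protocol(x, eps, rng, reps):
        """Repetition scheme: n bits, reps transmissions each, so n*reps in total."""
        return [send_repeated(b, reps, eps, rng) for b in x]

    rng = random.Random(0)
    n, eps = 1000, 0.1
    x = [rng.randint(0, 1) for _ in range(n)]
    reps = 3 * math.ceil(math.log2(n))        # ~ c * log(n) repetitions per bit
    y = naive_protocol(x, eps, rng, reps)
    print("errors:", sum(a != b for a, b in zip(x, y)), "out of", n)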

Page 7:

Noisy computation: case 2

x = 01100        y = 10101
Goal: compute f(x,y)

1. Noiseless channel: k transmissions.
2. Naïve: k·log(k).
3. [Schulman ’96]: c·k (error is exponentially small in k).

Page 8:

The Noisy Broadcast Model [El Gamal ’84]

[Figure: players x_1,..,x_10 arranged in a circle; one player broadcasts a 1, and each other player receives it through its own independent noisy channel, so a few hear a 0.]

Input: x_1,..,x_n (player i holds x_i).
One bit is transmitted at a time; each reception is flipped independently with probability ε.
Error rate: ε (a small constant).
Goal: compute g(x_1,..,x_n).
In this talk: we want to compute x_1,..,x_n (the identity function).
The order of transmissions is predefined.
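A minimal sketch of the model just described (my own illustration; the class and function names such as NoisyBroadcastNetwork and naive_identity are assumptions): player i broadcasts one bit per step, and every other player receives an independently flipped copy with probability eps.

    import random

    class NoisyBroadcastNetwork:
        """Toy model of El Gamal's noisy broadcast network: one bit per time step,
        each listener hears the bit through its own independent BSC(eps)."""

        def __init__(self, inputs, eps, seed=0):
            self.x = list(inputs)           # x_1, .., x_n, one bit per player
            self.n = len(inputs)
            self.eps = eps
            self.rng = random.Random(seed)
            # received[j] collects the (speaker, noisy bit) pairs player j has heard
            self.received = [[] for _ in range(self.n)]

        def broadcast(self, speaker, bit):
            """Player `speaker` broadcasts `bit`; all other players hear noisy copies."""
            for j in range(self.n):
                if j == speaker:
                    continue
                heard = bit ^ (self.rng.random() < self.eps)
                self.received[j].append((speaker, heard))

    # Naive identity protocol: every player broadcasts its own bit r times.
    def naive_identity(net, r):
        for _ in range(r):
            for i in range(net.n):
                net.broadcast(i, net.x[i])

    net = NoisyBroadcastNetwork([1, 0, 1, 1, 0, 0, 1, 0], eps=0.1)
    naive_identity(net, r=15)
    # Player 0 decodes player 3's bit by majority over what it heard from player 3.
    votes = [b for (spk, b) in net.received[0] if spk == 3]
    print("player 0's guess for player 3's bit:", int(sum(votes) > len(votes) / 2),
          "truth:", net.x[3])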

Page 9:

Some history

Computing identity:
  Naïve solution: n log n (repetition).
  [Gallager ’88]: n loglog n.
  [Yao ’97]: Try thresholds first.
  [KM ’98]: Any threshold in O(n).

Fails for “adversarial noise”.

In the adversarial model:
  [FK ’00]: OR in O(n·log* n).
  [N ’04]: OR in O(n).

[Figure: the circle of players x_1,..,x_10 broadcasting their bits.]

Gallager’s problem: Can this be made linear?

Page 10:

What’s next:
  Communication under noise - examples
  The noisy broadcast model
  Gallager: an algorithm and a problem
  Gnd-trees: Generalized Noisy Decision Trees
  Statement of results
  About the proof

Page 11:

Generalized Noisy Decision (gnd) Trees

[Figure: a binary tree; internal nodes query functions such as f(x^1), f(x^2); leaves output g(x) = y.]

Input: x, but access is only to noisy copies x^1, x^2, x^3, …
  x^i = x ⊕ N^i  (N^i flips each x_j independently w.p. ε).

Any Boolean queries are allowed: the node reached by the answer string v (e.g. v = “01”) queries an arbitrary Boolean function f_v of one of the noisy copies.

Goal: compute g(x), minimizing depth(T).
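A sketch of evaluating such a tree (my own illustration; the tree encoding, the query functions, and the helper names are assumptions): the noisy copies x^i = x ⊕ N^i are drawn up front, and the node reached by the answer string v applies an arbitrary Boolean query f_v to one of the copies.

    import random

    def noisy_copy(x, eps, rng):
        """x^i = x XOR N^i, where N^i flips each coordinate independently w.p. eps."""
        return [b ^ (rng.random() < eps) for b in x]

    def run_gnd_tree(x, queries, eps, depth, seed=0):
        """Walk a gnd-tree of the given depth.
        `queries[v]` maps the answer string v seen so far to a pair (copy_index, f_v),
        where f_v is an arbitrary Boolean function of one noisy copy."""
        rng = random.Random(seed)
        copies = [noisy_copy(x, eps, rng) for _ in range(depth)]   # x^1, x^2, ...
        v = ""
        for _ in range(depth):
            copy_index, f_v = queries[v]
            a = f_v(copies[copy_index])
            v += str(a)
        return v   # the leaf reached; a real tree would attach an output g(x) to it

    # Toy example on 2 bits: first query the OR of copy 0, then coordinate 0 of copy 1.
    queries = {
        "":  (0, lambda c: int(c[0] or c[1])),
        "0": (1, lambda c: c[0]),
        "1": (1, lambda c: c[0]),
    }
    print(run_gnd_tree([1, 0], queries, eps=0.1, depth=2))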

Page 12:

Generalized Noisy Decision (gnd) Trees

Noisy decision trees [FPRU ’94]:
  Queries are restricted to noisy coordinates of x (a query to coordinate j returns x_j flipped w.p. ε).
  Identity is computable in n·log(n) queries.

[Figure: the gnd-tree from the previous slide.]

Page 13:

Some bounds for noisy trees

function     noisy decision trees      gnd-trees
OR           Θ(n) [FPRU]               O(n)
PARITY       Θ(n log n) [FPRU]         O(n) [GKS]
MAJORITY     Θ(n log n) [FPRU]         O(n) [KM*]
IDENTITY     Θ(n log n) [FPRU]         Θ(n log n) [GKS]

Page 14:

Our results

Main theorem: An Ω(n·loglog(n)) lower bound for identity in the noisy broadcast network.

Lower bound for gnd-trees: An Ω(n·log(n)) lower bound for computing identity in generalized noisy decision trees.

Reduction theorem:
  a kn-step protocol in the ε-noise noisy broadcast network
  ⟹ a gnd-tree of depth 2kn for noise ε^{ck}.

Proof of main theorem (applying the gnd-tree lower bound with noise ε^{ck}, constants absorbed):
  2kn ≥ ε^{ck}·n·log n
  ⟹ 2k·(1/ε)^{ck} ≥ log n
  ⟹ k = Ω(loglog(n))
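A numerical illustration (mine, not from the paper) of the last step: the smallest k with 2k·(1/ε)^{ck} ≥ log n grows like loglog(n); the concrete values eps = 0.1 and c = 1 below are arbitrary assumptions.

    import math

    def min_k(log2_n, eps=0.1, c=1.0):
        """Smallest integer k with 2k * (1/eps)^(c*k) >= log2(n)."""
        k = 1
        while 2 * k * (1 / eps) ** (c * k) < log2_n:
            k += 1
        return k

    for log2_n in [10, 100, 10_000, 10**8]:     # i.e. n = 2**log2_n
        print(f"log n = {log2_n:>9}  ->  k = {min_k(log2_n)}"
              f"  (loglog n ~ {math.log2(log2_n):.1f})")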

Page 15:

What’s next:
  About communication under noise
  The noisy broadcast model
  Gallager: the algorithm and the problem
  Generalized Noisy Decision Trees (gnd-trees)
  Our results
  About the proof

Page 16:

About the proof:

The reduction:
  A series of transformations from a broadcast protocol into a gnd-tree protocol.

Page 17:

About the proof:

The reduction:
  A series of transformations from a broadcast protocol into a gnd-tree protocol.

Gnd-tree lower bound:
  Defining a knowledge measure.
  Bounding the knowledge measure by the depth of the tree.

Page 18:

Lower bound for gnd-trees

Our claim: A gnd-tree which computes identity on x = x_1,..,x_n requires Ω(n·log n) depth.

We actually prove: If depth(T) ≤ ε·n·log n then
  Pr_{x~U}[T returns x] < β(ε),   where lim_{ε→0} β(ε) = 0.

Page 19:

The big picture

We prove: If depth(T) ≤ ε·n·log n then
  Pr_{x~U}[T returns x] < β(ε),   where lim_{ε→0} β(ε) = 0.

Structure of such proofs:
1. Define: a knowledge measure M_x(v).
2. Show: T is correct only if w.h.p. M_x(ℓ) > t, where ℓ is the leaf reached by T.
3. Show: If depth(T) << n·log n, then w.h.p. M_x(ℓ) < t.

In our case: t = log(n), and typically M_x(v,a) - M_x(v) ≤ 1/(ε³·n).

Disclaimer: We consider the case where each noisy copy is queried once…
(more work is needed in the general case)

[Figure: the gnd-tree from slide 11.]

Page 20:

Perceived probability

Perceived probability (“likelihood”) of x:
  L_x(v) = Pr[x | visit(v)]

Pr[x | visit(v)] is “multiplicative”.

[Figure: the gnd-tree from slide 11.]

Page 21:

Knowledge measure: 1st attempt

Log likelihood of x:
  LL_x(v) = n + log(L_x(v))
  LL_x(root) = 0,   LL_x(ℓ) ≥ n - const

We’d like to show: typically, LL_x(v,a) - LL_x(v) < 1/log(n).
But: after n coordinate queries, LL_x is already Θ(n).
Reason: x is quickly separated from far-away points.
Separating x from its neighbors is the hardest.

[Figure: the gnd-tree from slide 11.]

Page 22:

Knowledge measure: seriously

Log-likelihood “gradient” at x:
  M^i_x(v) = log(L_x(v)) - log(L_{x⊕i}(v))
  M_x(v)   = AVG_i ( M^i_x(v) )
           = log(L_x(v)) - AVG_i ( log(L_{x⊕i}(v)) )

  M_x(root) = 0,   M_x(ℓ) ≥ log(n) - c

All that is left: the typical gain in M_x is at most 1/n.
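A brute-force sketch of these quantities on a tiny example (my own illustration; the query path, the parameter values, and the helper names are assumptions, and it covers only the "each copy queried once" case from the disclaimer): it computes the posterior L_x(v) = Pr[x | answers along v] under a uniform prior and then the measure M_x(v).

    import itertools
    import math

    def p_answer(f, x, a, eps):
        """Pr[f(x XOR N) = a], where N flips each coordinate independently w.p. eps."""
        total = 0.0
        for noise in itertools.product([0, 1], repeat=len(x)):
            p = 1.0
            for b in noise:
                p *= eps if b else 1 - eps
            if f([xi ^ b for xi, b in zip(x, noise)]) == a:
                total += p
        return total

    def likelihoods(path, n, eps):
        """L_x(v) = Pr[x | the answers seen along v], uniform prior on x.
        Each query is assumed to act on a fresh noisy copy."""
        post = {}
        for x in itertools.product([0, 1], repeat=n):
            p = 1.0
            for f, a in path:
                p *= p_answer(f, x, a, eps)
            post[x] = p
        z = sum(post.values())
        return {x: p / z for x, p in post.items()}

    def knowledge(post, x):
        """M_x(v) = log L_x(v) - AVG_i log L_{x XOR e_i}(v)."""
        n = len(x)
        neighbors = [tuple(b ^ (j == i) for j, b in enumerate(x)) for i in range(n)]
        return math.log2(post[x]) - sum(math.log2(post[y]) for y in neighbors) / n

    n, eps = 3, 0.2
    x = (1, 0, 1)
    path = [(lambda c: c[0], 1), (lambda c: c[0] ^ c[2], 0)]   # two example queries + answers
    post = likelihoods(path, n, eps)
    print("L_x(v) =", round(post[x], 4), "  M_x(v) =", round(knowledge(post, x), 4))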

Page 23: On Gallager’s problem: New Bounds for Noisy Communication. Navin Goyal & Mike Saks Joint work with Guy Kindler Microsoft Research

aa=1=1..aa=1=1..

v

f(x5 )

v

f(x5 )v

f(x5 )

v

f(x5 )

v,1v,1 v,0v,0v,1v,1

Gain in knowledge measureGain in knowledge measureGain in knowledge measureGain in knowledge measure

Page 24:

Gain in knowledge measure

M^i_x(v,a) - M^i_x(v) =
    [ log(L_x(v,a)) - log(L_{x⊕i}(v,a)) ]
  - [ log(L_x(v))   - log(L_{x⊕i}(v))   ]

[Figure: node v querying f(x^5), children (v,0) and (v,1); answer a = 1.]

Page 25:

Gain in knowledge measure

M_x(v,a) - M_x(v) = AVG_i ( M^i_x(v,a) - M^i_x(v) )

This expression depends only on f and x!

The coup de grâce: For every query f_v and every x,
  E[ M_x(v,a) - M_x(v) ]    ≤ 1/(ε³·n)
  E[ (M_x(v,a) - M_x(v))² ] ≤ 1/(ε³·n)

Proof: an adaptation of [Talagrand ’96].

[Figure: node v querying f(x^5), children (v,0) and (v,1).]

Page 26:

Main open problem

Show a lower bound for computing a Boolean function.
  Not known even for a random function!

Generalize to other network designs.

Page 27:

Thank You!

Page 28:

Gallager’s solution, simplified

1. Partition the players into groups of size log(n).
2. Each player sends its bit loglog(n) times.

[Figure: a group of players; each repeats its own bit, e.g. a player holding 1 sends 1,1,1 and a player holding 0 sends 0,0,0.]

Page 29:

Gallager’s solution, simplified

1. Partition the players into groups of size log(n).
2. Each player sends its bit loglog(n) times.

[Figure: each group member hears noisy copies of its neighbors’ repeated bits, e.g. 0,0,0 may be heard as 010.]

Page 30:

Gallager’s solution, simplified

1. Partition the players into groups of size log(n).
2. Each player sends its bit loglog(n) times.

W.h.p., in all groups, almost all players know all the group’s bits.

[Figure: most group members now hold their group’s bit string, e.g. 1001.]

Page 31:

Gallager’s solution, simplified

3. Each group transmits an error-correcting code of its bits:
   each player transmits a constant number of bits.
4. W.h.p. all players now know all bits of all groups.

[Figure: suppose code(1001) = 100 111 100 011; each group member broadcasts one block of the codeword.]
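A simplified simulation sketch of steps 1-2 (my own code, with assumed names and parameter choices; step 3’s error-correcting-code phase is only described in a comment, not simulated):

    import math
    import random

    def gallager_phase_one(x, eps, rng):
        """Steps 1-2 of the simplified scheme: partition the players into groups of
        size ~log(n); each player broadcasts its own bit ~loglog(n) times and the
        other members of its group decode it by majority vote."""
        n = len(x)
        g = max(1, round(math.log2(n)))                 # group size ~ log n
        r = max(1, round(math.log2(math.log2(n))))      # repetitions ~ loglog n
        views, transmissions = {}, 0
        for start in range(0, n, g):
            group = range(start, min(start + g, n))
            for listener in group:
                view = []
                for speaker in group:
                    # listener hears r independently corrupted copies of x[speaker]
                    votes = sum(x[speaker] ^ (rng.random() < eps) for _ in range(r))
                    view.append(1 if votes > r / 2 else 0)
                views[listener] = view
            transmissions += len(group) * r
        # Step 3 (not simulated): each group then broadcasts an error-correcting
        # encoding of its ~log(n) bits, O(1) code bits per player, keeping the
        # total at O(n loglog n) transmissions while spreading every group's
        # bits to all players.
        return views, transmissions

    rng = random.Random(1)
    n, eps = 1024, 0.05
    x = [rng.randint(0, 1) for _ in range(n)]
    views, cost = gallager_phase_one(x, eps, rng)

    g = round(math.log2(n))
    wrong = sum(views[i] != x[(i // g) * g:(i // g) * g + g] for i in range(n))
    print(f"{cost} transmissions; {wrong}/{n} players mis-decode some bit of their group")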

Page 32:

The reduction

The program:
  Start with a noisy broadcast protocol with kn steps.
  Gradually, simulate the protocol in more “tree-like” models.
  W.l.o.g., assume each node performs 10k transmissions.

First step: each transmission is replaced by three, only one of which is noisy.

Page 33:

The reduction

First step: each transmission is replaced by three, only one of which is noisy.

[Figure: player 3’s bit b is a function of x_3 and of its past receptions; it is replaced by a noisy transmission of x_3 plus two bits b(0), b(1), which are transmitted noise-free.]

Page 34:

The reduction

Second step: the noisy transmissions are moved to the beginning of the protocol.

[Figure: player 3 first broadcasts x_3, x_3, x_3, … (the noisy part), and only later the noise-free bits b(0), b(1).]

Page 35:

The reduction

Second step: the noisy transmissions are moved to the beginning of the protocol.

After the noisy phase: each player has 10k noisy copies of each bit.
Equivalent to having a single ε^{ck}-noisy copy of x.
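A quick numeric sketch (my own illustration) of why many independent ε-noisy copies of a bit behave like one copy whose noise is exponentially small in k, in line with the ε^{ck} noise parameter in the reduction theorem: it tabulates the exact error of the majority of 10k copies.

    import math

    def majority_error(m, eps):
        """Exact probability that the majority of m independent eps-noisy copies
        of a bit is wrong (ties counted as errors, to stay on the safe side)."""
        return sum(math.comb(m, j) * eps**j * (1 - eps) ** (m - j)
                   for j in range((m + 1) // 2, m + 1))

    eps = 0.1
    for k in [1, 2, 3, 4]:
        m = 10 * k
        print(f"k={k}: majority of {m} copies errs w.p. {majority_error(m, eps):.2e}"
              f"  (compare eps^k = {eps**k:.0e})")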

Page 36:

The reduction

Third step: each player begins with an ε^{ck}-noisy copy of x.

Each transmission depends on the transmitter’s noisy copy and on past transmissions (and perhaps a random decision).

Equivalent to a gnd-tree!

[Figure: a player holding the noisy copy x ⊕ N^3 broadcasts the noise-free bits b(0), b(1).]

Page 37:

Gain in progress measure

M^i_x(v) = log(Pr[x | visit(v)]) - log(Pr[x⊕i | visit(v)])

[Figure: node v querying f(x^5), children (v,0) and (v,1).]

Page 38:

Gain in progress measure

M^i_x(v,a) - M^i_x(v) =
    [ log(Pr[x | visit(v,a)]) - log(Pr[x⊕i | visit(v,a)]) ]
  - [ log(Pr[x | visit(v)])   - log(Pr[x⊕i | visit(v)])   ]

a = f(x^5) : a random variable.
This gain only depends on f!

[Figure: node v querying f(x^5), children (v,0) and (v,1).]