TRANSCRIPT
Linear deterministic models Why is this model worthwhile to investigate? Simplification: a combination network model Linear deterministic BC Final remarks
Broadcasting and Multicasting Nested Message Sets
Shirin Saeedi Bidokhti (TUM), joint work with Vinod Prabhakaran (TIFR) and Suhas Diggavi (UCLA)
May 12, 2013
Problem setup

[Figure: a transmitter communicates with five receivers over a communication medium.]
Problem setup: broadcasting

[Figure: the transmitter reaches the five receivers through a broadcast channel.]
Problem setup: multicasting

[Figure: the transmitter multicasts to the five receivers.]
Problem setup: nested message sets

[Figure: the transmitter holds messages W1, W2, W3 and communicates over the medium; two receivers demand W1 only, one receiver demands (W1, W2), and two receivers demand (W1, W2, W3).]
Practical motivation

- video streaming forms more than 50% of the traffic over mobile networks
- receiver devices have different QoS demands, and one can gain by coding for these demands
- we consider the nested message multicast setting, motivated by applications in Scalable Video Coding (SVC): a base layer and several refinement layers
Previous work

[Figure: an encoder maps (W0, W1, W2) to X; the broadcast channel is p(y1, y2 | x); decoder 1 observes Y1 and decodes (W0, W1); decoder 2 observes Y2 and decodes (W0, W2).]

- single-hop broadcast channel: Cover (1972); special cases [Cov72, Ber73, Gal74], [KM77]; multilevel broadcast channels with nested message sets [BZT07, NE09]; 3-receiver linear deterministic BC with nested message sets [PDT07]
- multi-hop network: classical (wireline) networks [ACLY00], [EF03, NY04, RW09]
In this talk...

We look at the broadcast channel through deterministic models:
- the linear deterministic model
- our goal in this work is not to find approximate solutions, but to investigate new techniques that may be adapted to more general channels

The type of question we are interested in is finding the ultimate rates of communication in the nested message set broadcast and multicast scenarios.
Outline

1 Linear deterministic models
2 Why is this model worthwhile to investigate?
3 Simplification: a combination network model (the challenge; linear encoding schemes; optimality results)
4 Linear deterministic BC (the linear deterministic channel more generally; capacity region)
5 Final remarks
Linear deterministic broadcast channels

[Figure: an encoder maps (W1, W2) to X; decoder i observes Y_i. Decoders 1 and 2 demand W1; decoders 3 and 4 demand (W1, W2).]

W1 ∈ F^{R1}, W2 ∈ F^{R2}, X ∈ F^m, H_i ∈ F^{n_i × m}, Y_i = H_i X

I1 = {1, 2}: public receivers
I2 = {3, 4}: private receivers

Example:

Y_1 = (Y_1^1, Y_1^2)^T = H_1 (x1, x2, x3, x4)^T with H_1 = [1 0 0 0; 0 0 1 1], giving Y_1 = (x1, x3 + x4)^T

Y_2 = (Y_2^1, Y_2^2)^T = H_2 (x1, x2, x3, x4)^T with H_2 = [1 0 0 0; 0 1 0 0], giving Y_2 = (x1, x2)^T

...
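The channel law above is easy to sketch numerically. A minimal example, assuming for illustration that the field F is GF(2):

```python
def apply_channel(H, x, p=2):
    """Compute Y = Hx over GF(p), with H given as a list of rows."""
    return [sum(h * xi for h, xi in zip(row, x)) % p for row in H]

# the two example channel matrices from the slide
H1 = [[1, 0, 0, 0],
      [0, 0, 1, 1]]
H2 = [[1, 0, 0, 0],
      [0, 1, 0, 0]]

x = [1, 0, 1, 1]           # a channel input (x1, x2, x3, x4)
y1 = apply_channel(H1, x)  # (x1, x3 + x4) -> [1, 0]
y2 = apply_channel(H2, x)  # (x1, x2)      -> [1, 0]
```

The same helper applies to any of the H_i matrices in the talk, over any prime field.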
Linear deterministic broadcast channels: objective

Y_1 = H_1 (x1, x2, x3, x4)^T = (x1)
Y_2 = H_2 (x1, x2, x3, x4)^T = (x1 + 2x2, x3)^T
Y_3 = H_3 (x1, x2, x3, x4)^T = (x1, x2 + x3, x4)^T
Y_4 = H_4 (x1, x2, x3, x4)^T = (x1, x1 + x2, x2 + 3x3 + 2x4)^T

Can the source convey the message W = (w1, w2, w3)^T, with w1 common and w2, w3 private?

What we control: the source operations.

X = A W, where A is a 4 × 3 encoding matrix whose entries a_1, a_2, ... are free design parameters.

This is a matrix completion problem...
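The matrix completion view lends itself to a brute-force sketch. The check below is hypothetical in two respects: it works over GF(2) (reducing the example's coefficients 2 and 3 mod 2), and it assumes, per the earlier slide, that public receivers 1 and 2 demand only w1 while private receivers 3 and 4 demand all of (w1, w2, w3):

```python
from itertools import product

P = 2  # illustration field GF(2)

# channel matrices of the example, reduced mod 2
H = {
    1: [[1, 0, 0, 0]],                              # Y1 = (x1)
    2: [[1, 0, 0, 0], [0, 0, 1, 0]],                # Y2 = (x1 + 2x2, x3) mod 2
    3: [[1, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 1]],  # Y3 = (x1, x2 + x3, x4)
    4: [[1, 0, 0, 0], [1, 1, 0, 0], [0, 1, 1, 0]],  # Y4 mod 2
}
# assumed demands: public receivers want w1 (index 0); private want everything
demands = {1: [0], 2: [0], 3: [0, 1, 2], 4: [0, 1, 2]}

def rank(rows):
    """Rank over GF(P) by Gaussian elimination (rows: list of lists)."""
    rows = [list(r) for r in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] % P), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] % P:
                rows[i] = [(a - b) % P for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) % P for col in zip(*B)]
            for row in A]

def decodes(HA, j):
    """A receiver observing HA·w recovers w_j iff e_j lies in rowspace(HA)."""
    e = [0, 0, 0]
    e[j] = 1
    return rank(HA + [e]) == rank(HA)

def valid(A):
    return all(decodes(matmul(H[i], A), j) for i in H for j in demands[i])

def search():
    """Brute-force the 4x3 encoding matrix A (matrix completion by search)."""
    for rows in product(product(range(P), repeat=3), repeat=4):
        A = [list(r) for r in rows]
        if valid(A):
            return A
    return None

solution = search()
```

Over GF(2) the search space has only 2^12 = 4096 candidate matrices, and a valid completion exists for this instance.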
Motivation I: MIMO broadcast channel in the high SNR regime

[Figure: a transmitter and three receivers with Y_i = H_i X + Z_i, i = 1, 2, 3.]

In the high SNR regime, we study the interaction of the signals (rather than the background noise).
Motivation II: wireline networks

[Figure: a source connected to receivers 1, 2, 3 through a wireline network; the internal edges carry X1, X2, X3, X4.]

Y_1 = (X1, X4)^T = H_1 X
Y_2 = (X1 + 2X2, X4)^T = H_2 X
Y_3 = (X1 + 2X2, X1 + X4, X3, X4)^T = H_3 X

Each receiver's observation is a linear transform of the edge signals, so the wireline network is itself a linear deterministic broadcast channel.
A class of linear deterministic channels

Y_1 = [1 0 0 0] (x1, x2, x3, x4)^T = (x1)
Y_2 = [0 1 0 0; 0 0 1 0] (x1, x2, x3, x4)^T = (x2, x3)^T
Y_3 = [1 0 0 0; 0 1 0 0; 0 0 0 1] (x1, x2, x3, x4)^T = (x1, x2, x4)^T
Y_4 = [1 0 0 0; 0 0 1 0; 0 0 0 1] (x1, x2, x3, x4)^T = (x1, x3, x4)^T

[Figure: a combination network; the source S sends x1, x2, x3, x4 on its outgoing edges, and each destination D1, ..., D4 observes a subset of them.]
Notation

[Figure: the combination network with the edge classes E_{1}, E_{2}, E_φ marked; E^3_{2} denotes the part of E_{2} connected to private receiver 3.]

- E_S, S ⊆ I1: all resources connected to every public receiver i ∈ S and not connected to any public receiver j ∉ S
- E^p_S, S ⊆ I1, p ∈ I2: the resources in E_S that are connected to private receiver p
- X_S (resp. X^p_S): the vector of symbols carried over E_S (resp. E^p_S)
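The grouping of resources into the classes E_S can be sketched directly from a connectivity table. The topology below is hypothetical, chosen only to illustrate the definitions:

```python
from collections import defaultdict

public = {1, 2}  # I1
# edge -> set of receivers it reaches (hypothetical topology, for illustration)
edges = {
    "x1": {1, 3},
    "x2": {1, 2, 4},
    "x3": {2, 3, 4},
    "x4": {3, 4},
}

E = defaultdict(list)
for e, rx in edges.items():
    S = frozenset(rx & public)   # which public receivers see this edge
    E[S].append(e)               # e belongs to the class E_S

def E_p(S, p):
    """E_S^p: the part of E_S that also reaches private receiver p."""
    return [e for e in E[S] if p in edges[e]]
```

For this table, x1 lands in E_{1}, x2 in E_{1,2}, x3 in E_{2}, and x4 in E_φ.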
The challenge

[Figure: a combination network with source S and destinations D1, ..., D4; W1 = [w_{1,1}], W2 = [w_{2,1}, w_{2,2}]; the four edges carry w_{1,1}, w_{1,1} + w_{2,1}, w_{2,1}, and w_{2,2}.]

Mixing of the common and private messages is necessary, but in a controlled manner:
one has to reveal (partial) information about the private message to the public receivers!
Rate splitting and linear encoding scheme

- let W = [w_{1,1} ... w_{1,R1}  w_{2,1} ... w_{2,R2}]^T
- let X = A · W
- reveal information about the private messages to public receivers through a zero-structured encoding matrix

[Figure: the matrix A, with columns split into blocks of widths R1, α_{1,2}, α_{1}, α_{2}, α_φ and rows split into blocks of heights |E_{1,2}|, |E_{1}|, |E_{2}|, |E_φ| (refined to |E^p_{1,2}|, |E^p_{1}|, |E^p_{2}|, |E^p_φ| in a later overlay); prescribed blocks are set to zero.]

R2 = α_{1,2} + α_{2} + α_{1} + α_φ

- choose appropriate parameters, and complete the matrix
At first glance...

A rate pair (R1, R2) is achievable if there exist variables α_S, S ⊆ I1, s.t.

Structural constraints:
  α_S ≥ 0  for all S ⊆ I1
  R2 = Σ_S α_S

Decoding constraints at public receiver i ∈ I1:
  R1 + Σ_{S ∋ i} α_S ≤ Σ_{S ∋ i} |E_S|

Decoding constraints at private receiver p ∈ I2:
  R2 ≤ Σ_{S ∈ T} α_S + Σ_{S ∈ T^c} |E^p_S|  for all superset-saturated T ⊆ 2^{I1}
  R1 + R2 ≤ Σ_{S ⊆ I1} |E^p_S|
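This feasibility test can be sketched for two public receivers. The sketch certifies achievability by exhibiting non-negative *integer* α_S, which is sufficient but not necessary (the region also allows fractional α); the resource sizes |E_S| and |E^p_S| below are hypothetical:

```python
from itertools import product, combinations

# subsets S ⊆ I1 for I1 = {1, 2}
PUB = [frozenset({1, 2}), frozenset({1}), frozenset({2}), frozenset()]
E = {S: 1 for S in PUB}            # |E_S|: hypothetical resource sizes
Ep = {p: dict(E) for p in (3, 4)}  # |E_S^p|: hypothetical (every edge reaches p)

def families():
    """All superset-saturated families T ⊆ 2^{I1}."""
    for r in range(len(PUB) + 1):
        for T in combinations(PUB, r):
            if all(U in T for S in T for U in PUB if S < U):
                yield T

def achievable(R1, R2):
    for alphas in product(range(R2 + 1), repeat=len(PUB)):
        if sum(alphas) != R2:          # structural: R2 = sum of alpha_S
            continue
        a = dict(zip(PUB, alphas))
        if any(R1 + sum(a[S] for S in PUB if i in S) >
               sum(E[S] for S in PUB if i in S) for i in (1, 2)):
            continue                   # a public decoding constraint fails
        if all(R1 + R2 <= sum(Ep[p].values()) and
               all(R2 <= sum(a[S] for S in T) +
                   sum(Ep[p][S] for S in PUB if S not in T)
                   for T in families())
               for p in (3, 4)):
            return True                # all private constraints hold too
    return False
```

With unit resource sizes, for instance, (1, 2) is certified achievable while (2, 2) is not.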
At first glance...

(R1, R2) is achievable if

  R1 ≤ min over receivers i of mincut_i
  R1 + R2 ≤ min over private receivers p of mincut_p
  2R1 + R2 ≤ min over private receivers p of { mincut_1 + mincut_2 + remaining resources of receiver p }

and this turns out to be optimal for two public and any number of private receivers, characterizing the capacity region.
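The three min-cut bounds reduce to a simple membership test. The values below are hypothetical (in a combination network, mincut_i is just the number of edges reaching receiver i):

```python
# hypothetical min-cut values: receivers 1, 2 are public, 3, 4 are private
mincut = {1: 2, 2: 2, 3: 3, 4: 3}
remaining = {3: 2, 4: 2}  # hypothetical "remaining resources" of receiver p

def in_region(R1, R2):
    return (R1 <= min(mincut.values())
            and R1 + R2 <= min(mincut[p] for p in (3, 4))
            and 2 * R1 + R2 <= min(mincut[1] + mincut[2] + remaining[p]
                                   for p in (3, 4)))
```

For these numbers, (1, 2) satisfies all three bounds while (2, 2) violates the sum-rate bound.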
When there are more than two public receivers...

(0, 2) is not achievable using the previous scheme!

[Figure: a combination network with source S and destinations D1, ..., D6; W1 = [] and W2 = [w_{2,1}, w_{2,2}]; three of the edges carry w_{2,1}, w_{2,1} + w_{2,2}, and w_{2,2}.]

The private information revealed to different subsets of public receivers need not be independent.
Appropriate pre-encoding

[Figure: the same network; the symbol blocks X_{1}, X_{2}, X_{3} are marked on three edges.]

- pre-encode W2 = [w_{2,1}, w_{2,2}]^T into W'2 = [w'_{2,1}, w'_{2,2}, w'_{2,3}]
- now use a structured encoding matrix:
  (X_{1}, X_{2}, X_{3})^T = [1 0 0; 0 1 0; 0 0 1] (w'_{2,1}, w'_{2,2}, w'_{2,3})^T
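The pre-encoding step can be sketched over GF(2), consistent with the edge labels w_{2,1}, w_{2,1} + w_{2,2}, w_{2,2} from the previous slide: any two of the three pre-encoded symbols determine W2.

```python
from itertools import product

def pre_encode(w21, w22):
    """W2 = (w21, w22) -> W2' = (w21, w21 + w22, w22) over GF(2)."""
    return (w21, (w21 + w22) % 2, w22)

def decode_from_pair(i, j, vi, vj):
    """Recover (w21, w22) from components i and j of the pre-encoded vector."""
    for w in product(range(2), repeat=2):  # brute force; the match is unique
        c = pre_encode(*w)
        if c[i] == vi and c[j] == vj:
            return w
```

Uniqueness holds because every pair of coordinates of the map is invertible, which is the MDS-like property the scheme exploits.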
Achievable rate-region

A rate pair (R1, R2) is achievable if there exist variables α_S, S ⊆ I1, s.t.

Structural constraints:
  α_S ≥ 0  for all φ ≠ S ⊆ I1
  R2 = Σ_S α_S

Decoding constraints at public receiver i ∈ I1:
  R1 + Σ_{S ∋ i} α_S ≤ Σ_{S ∋ i} |E_S|

Decoding constraints at private receiver p ∈ I2:
  R2 ≤ Σ_{S ∈ T} α_S + Σ_{S ∈ T^c} |E^p_S|  for all superset-saturated T ⊆ 2^{I1}
  R1 + R2 ≤ Σ_{S ⊆ I1} |E^p_S|

The converse holds for three (or fewer) public and any number of private receivers, characterizing the capacity region.
Optimality results
Example

Is the rate pair (R1 = 1, R2 = 2) achievable?

[Figure: a combination network with source S and destinations D1, ..., D5.]

Is (R1 = 1, R2 = 2) in the inner bound? NO
Rate pair (R1 = 1, R2 = 2) is not in the inner bound

Structural constraints:
  α_{1,2,3}, ..., α_{1} ≥ 0
  R2 = α_{1,2,3} + ... + α_φ

Decoding constraints at public receivers:
  R1 + α_{1,2,3} + α_{1,3} + α_{1,2} + α_{1} ≤ Σ_{S ∋ 1} |E_S|   (used with weight 2)
  R1 + α_{1,2,3} + α_{2,3} + α_{1,2} + α_{2} ≤ Σ_{S ∋ 2} |E_S|   (used with weight 1)
  R1 + α_{1,2,3} + α_{2,3} + α_{1,3} + α_{3} ≤ Σ_{S ∋ 3} |E_S|   (used with weight 1)

Decoding constraints at private receivers p ∈ I2:
  R2 ≤ α_{1} + α_{2} + α_{3} + α_{1,2} + α_{1,3} + α_{2,3} + α_{1,2,3} + |E^p_φ|   (used with weight 1, p = 5)
  R2 ≤ α_{1} + α_{1,2} + α_{1,3} + α_{2,3} + α_{1,2,3} + |E^p_{2}| + |E^p_{3}| + |E^p_φ|   (used with weight 1, p = 4)
  ...
  R1 + R2 ≤ Σ_S |E^p_S|

Summing the marked inequalities with the indicated weights (together with α ≥ 0) cancels all α parameters and yields 4R1 + 2R2 ≤ 7, which (R1, R2) = (1, 2) violates: 4 · 1 + 2 · 2 = 8 > 7.
The idea: an outer bound similar to the inner bound

Inner bound:
  α_{1,2,3} ≥ 0
  R1 ≤ 1 − Σ_{S ∋ 1} α_S
  R1 ≤ 2 − Σ_{S ∋ 2} α_S
  R1 ≤ 2 − Σ_{S ∋ 3} α_S
  R2 ≤ Σ_{S ∈ {{1},{1,2},{1,3},{2,3},{1,2,3}}} α_S + Σ_{S ∈ {{2},{3},φ}} |E^p_S|
  R2 ≤ Σ_{S ∈ {{1},{2},{3},{1,2},{1,3},{2,3},{1,2,3}}} α_S + Σ_{S ∈ {φ}} |E^p_S|

Outer bound:
  (1/n) H(X_{1,2,3} | W1) ≥ 0
  R1 ≤ 1 − (1/n) H(X_{1}, X_{1,2}, X_{1,3}, X_{1,2,3} | W1)
  R1 ≤ 2 − (1/n) H(X_{2}, X_{2,3} | W1)
  R1 ≤ 2 − (1/n) H(X_{2,3}, X_{3} | W1)
  R2 ≤ (1/n) H(X_{1}, X_{2,3} | W1) + 1
  R2 ≤ (1/n) H(X_{1}, X_{3}, X_{2,3}, X_{2} | W1)
The idea: submodularity of the entropy function

4R1 + 2R2
  ≤ 7 − (1/n) ( 2H(X{1}|W1) + H(X{2}, X{2,3}|W1) + H(X{2,3}, X{3}|W1)
                − H(X{1}, X{2,3}|W1) − H(X{1}, X{3}, X{2,3}, X{2}|W1) )
  ≤ 7   (by sub-modularity)
26 / 36
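The submodularity step can be made explicit. One way to chain H(A) + H(B) ≥ H(A ∪ B) + H(A ∩ B) that yields exactly the bracketed term above (all entropies conditioned on W1; this particular chaining is a reconstruction, not necessarily the one used in the talk):

```latex
\begin{align*}
H(X_{\{1\}}) + H(X_{\{2\}}, X_{\{2,3\}})
  &\ge H(X_{\{1\}}, X_{\{2\}}, X_{\{2,3\}}) \\
H(X_{\{1\}}) + H(X_{\{2,3\}}, X_{\{3\}})
  &\ge H(X_{\{1\}}, X_{\{2,3\}}, X_{\{3\}}) \\
H(X_{\{1\}}, X_{\{2\}}, X_{\{2,3\}}) + H(X_{\{1\}}, X_{\{2,3\}}, X_{\{3\}})
  &\ge H(X_{\{1\}}, X_{\{2\}}, X_{\{3\}}, X_{\{2,3\}}) + H(X_{\{1\}}, X_{\{2,3\}})
\end{align*}
```

Summing the three inequalities and cancelling the intermediate terms gives 2H(X{1}) + H(X{2}, X{2,3}) + H(X{2,3}, X{3}) ≥ H(X{1}, X{3}, X{2,3}, X{2}) + H(X{1}, X{2,3}), i.e. the bracketed term is nonnegative and 4R1 + 2R2 ≤ 7 follows.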
Summary of the converse proof

1 Write upper bounds on R1 and R2 using standard techniques and find bounds similar to the inner bounds
2 Take the appropriate weighted sum that cancels out all α parameters
3 Use sub-modularity of entropy
  - uses the fact that the α parameters all cancel out
  - some technicality
  - does not extend to I1 = {1, 2, 3, 4}

27 / 36
Up to now...

general channels
linear deterministic model
combination network model
28 / 36
Outline

1 Linear deterministic models
2 Why is this model worthwhile to investigate?
3 Simplification: a combination network model
  - The challenge
  - Linear encoding schemes
  - Optimality results
4 Linear deterministic BC
  - linear deterministic channel: more generally
  - Capacity region
5 Final remarks
29 / 36
linear deterministic channels: more generally

With input x = [x1, x2, x3, x4]ᵀ:

Y1 = H1 x = [ x1 ]
Y2 = H2 x = [ x1 + 2x2 ; x3 ]
Y3 = H3 x = [ x1 ; x2 + x3 ; x4 ]
Y4 = H4 x = [ x1 ; x1 + x2 ; x2 + 3x3 + 2x4 ]

30 / 36
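As a sanity check, the receiver matrices of this example can be written out and their ranks computed; here over the reals with numpy (the actual model works over a finite field F, so this is only an illustration):

```python
import numpy as np

# Channel matrices of the four receivers, read off the example above.
H = {
    1: np.array([[1, 0, 0, 0]]),                  # Y1 = x1
    2: np.array([[1, 2, 0, 0],
                 [0, 0, 1, 0]]),                  # Y2 = (x1 + 2x2, x3)
    3: np.array([[1, 0, 0, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 1]]),                  # Y3 = (x1, x2 + x3, x4)
    4: np.array([[1, 0, 0, 0],
                 [1, 1, 0, 0],
                 [0, 1, 3, 2]]),                  # Y4 = (x1, x1 + x2, x2 + 3x3 + 2x4)
}

ranks = {i: int(np.linalg.matrix_rank(Hi)) for i, Hi in H.items()}
print(ranks)  # {1: 1, 2: 2, 3: 3, 4: 3}
```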
Towards more general linear deterministic models

[Figure: nullspaces of H1 and H2, and of the stacked matrix H12, partitioning the input space]

V = [ Vφ | V{1} | V{2} | V{1,2} ]
31 / 36
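For two receivers, this adapted basis can be built explicitly. A minimal numerical sketch under my reading of the slide (Vφ spans the common nullspace, V{1} the directions seen only by receiver 1, and so on), using the earlier H1 and H2 over the reals, whereas the model itself is over a finite field:

```python
import numpy as np

H1 = np.array([[1, 0, 0, 0]])                # receiver 1 sees x1
H2 = np.array([[1, 2, 0, 0], [0, 0, 1, 0]])  # receiver 2 sees x1 + 2x2 and x3

def null_space(A, tol=1e-10):
    """Orthonormal basis (as columns) of the nullspace of A, via SVD."""
    _, s, vt = np.linalg.svd(A)
    rank = int((s > tol).sum())
    return vt[rank:].T

def extend(basis, candidates):
    """Columns of `candidates` that are linearly independent of `basis`."""
    cur, new = basis, []
    for v in candidates.T:
        trial = np.column_stack([cur, v])
        if np.linalg.matrix_rank(trial) > np.linalg.matrix_rank(cur):
            cur, new = trial, new + [v]
    return np.column_stack(new) if new else np.zeros((basis.shape[0], 0))

V_phi = null_space(np.vstack([H1, H2]))   # directions seen by neither receiver
V_1 = extend(V_phi, null_space(H2))       # killed by H2 only -> seen by receiver 1 alone
V_2 = extend(V_phi, null_space(H1))       # killed by H1 only -> seen by receiver 2 alone
V_12 = extend(np.column_stack([V_phi, V_1, V_2]), np.eye(4))  # seen by both

print([B.shape[1] for B in (V_phi, V_1, V_2, V_12)])  # dimensions of the four blocks
```

For this particular H1, H2 the block dimensions come out as 1, 1, 2 and 0: the nullspaces of H1 and H2 already span the whole input space, so V{1,2} is empty.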
Sketch of the achievability proof

- change of basis to create simpler equivalent channels: the source sends X = V A W.
- techniques of rate splitting and superposition coding at the source through a zero-structured code A with parameters to be designed.
- decodability conditions at each user and their translation to rank conditions on the channel matrices.
- one choice of the structured code that satisfies all the rank conditions.
32 / 36
Capacity region

The capacity region of a linear deterministic broadcast channel with two public receivers (in I1 = {1, 2}) and any number of private receivers (in I2 = {3, · · · , K}) is given by

  R1 ≤ min_{i ∈ I1} r{i}
  R1 + R2 ≤ min_{p: private} r{p}
  2R1 + R2 ≤ min_{p: private} { r{1} + r{2} + r{1,2,p} − r{1,2} }

The rates above are expressed in units of log|F|(·), with

  r{i} ≜ rank(Hi),   r{i1, · · · , i|S|} ≜ rank [Hi1 ; · · · ; Hi|S|]   (the receivers' matrices stacked).
33 / 36
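Applied to the four-receiver example from earlier, the three bounds can be evaluated directly. Treating receivers 1 and 2 as public and 3 and 4 as private is an assumption made here for illustration, and the ranks are again computed over the reals rather than over F:

```python
import numpy as np

H = {1: np.array([[1, 0, 0, 0]]),
     2: np.array([[1, 2, 0, 0], [0, 0, 1, 0]]),
     3: np.array([[1, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 1]]),
     4: np.array([[1, 0, 0, 0], [1, 1, 0, 0], [0, 1, 3, 2]])}

def r(*idx):
    """r{i1,...,ik}: rank of the stacked channel matrices."""
    return int(np.linalg.matrix_rank(np.vstack([H[i] for i in idx])))

public, private = [1, 2], [3, 4]

b1 = min(r(i) for i in public)                                  # bound on R1
b2 = min(r(p) for p in private)                                 # bound on R1 + R2
b3 = min(r(1) + r(2) + r(1, 2, p) - r(1, 2) for p in private)   # bound on 2R1 + R2

print(b1, b2, b3)  # prints: 1 3 4
```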
Summary

- took a deterministic approach to the problem of broadcasting nested messages
- characterized a general achievable rate region over the combination-network BC: the capacity region when there are three (or fewer) public and any number of private receivers
- discussed techniques to use sub-modularity in a more systematic way to show optimality results
- characterized the capacity region of the linear deterministic BC for two public and any number of private receivers
35 / 36
Interesting open questions

- How to extend the results to MIMO Gaussian broadcast channels?
- How may the schemes (over the combination network model) be insightful for linear deterministic broadcast channels with more than two public receivers?
36 / 36