Bounding the Entropic Region via Information Geometry John MacLaren Walsh & Yunshu Liu Department of Electrical and Computer Engineering Drexel University Philadelphia, PA [email protected] & [email protected] Thanks to NSF CCF-1016588, NSF CCF-1053702, & AFOSR FA9550-12-1-0086.


Page 1: Bounding the Entropic Region via Information Geometry

Bounding the Entropic Region via Information Geometry

John MacLaren Walsh & Yunshu Liu

Department of Electrical and Computer Engineering, Drexel University, Philadelphia, PA

[email protected] & [email protected]

Thanks to NSF CCF-1016588, NSF CCF-1053702, & AFOSR FA9550-12-1-0086.

Page 2:

Outline

1. Entropic Vectors Review: What are they, and why are they important?

2. Entropy Vector Region: What is known/unknown?

3. Structure of the unknown part of Γ∗4

4. How can one parameterize distributions giving extremal entropic vectors via information geometry?

5. Which distributions give entropy vectors in the unknown part of Γ∗4?

Page 3:

Region of Entropic Vectors Γ∗N – What is it?

1. X = (X1, . . . , XN ) N discrete RVs

2. every subset XA = (Xi, i ∈ A), A ⊆ {1, . . . , N} ≡ [N ], has joint entropy h(XA).

3. h = (h(XA) | A ⊆ [N ]) ∈ R^(2^N − 1) entropic vector

• Example: for N = 3, h = (h1, h2, h3, h12, h13, h23, h123).

4. a vector ho ∈ R^(2^N − 1) is entropic if ∃ a joint PMF pX s.t. h(pX) = ho.

5. Region of entropic vectors = Γ∗N

6. Closure Γ∗N is a convex cone [1].

REV for N = 2 (figure axes: H(X), H(Y), H(XY)):

H(X) ≤ H(XY )

H(Y ) ≤ H(XY )

H(XY ) ≤ H(X) + H(Y )

Γ∗N is an unknown non-polyhedral convex cone for N ≥ 4.
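As a concrete illustration of items 1-4 above, here is a minimal sketch (in Python, with numpy assumed as a dependency; the function name `entropic_vector` is ours, not from the talk) that maps a joint PMF to its entropic vector and can be checked against the three REV inequalities for N = 2:

```python
import itertools

import numpy as np

def entropic_vector(p):
    """Map a joint PMF (one array axis per random variable) to its
    entropic vector: a dict sending each nonempty A subset of [N] to h(X_A)."""
    N = p.ndim
    h = {}
    for r in range(1, N + 1):
        for A in itertools.combinations(range(N), r):
            # marginalize out the variables outside A
            rest = tuple(i for i in range(N) if i not in A)
            pA = p.sum(axis=rest) if rest else p
            q = pA[pA > 0]  # 0 log 0 = 0 convention
            h[A] = float(-(q * np.log2(q)).sum())
    return h

# N = 2 sanity check: X a uniform bit and Y = X,
# so H(X) = H(Y) = H(XY) = 1 bit and all three REV inequalities hold.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
h = entropic_vector(p)
```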

Page 4:

Why are Entropic Vectors Important?

Network Coding

[Figure: a network coding instance with sources s ∈ S carrying source variables Ys, intermediate nodes i with incoming/outgoing edge sets In(i)/Out(i), edges e ∈ E carrying Ue at rate Re, and sinks t ∈ T demanding Yβ(t).]

(Roughly) [1, 2, 3, 4]: intersect Γ∗N with

  hYs ≥ ωs, s ∈ S,    hYS = Σ_{s∈S} hYs,
  hUOut(s)|Ys = 0, s ∈ S,
  hUOut(i)|UIn(i) = 0, i ∈ V \ (S ∪ T ),
  hUe ≤ Re, e ∈ E,
  hYβ(t)|UIn(t) = 0, t ∈ T,

and project onto (ωs, Re).

Distributed Storage (MDCS DSCSC)

[Figure: a distributed storage (MDCS) instance with sources Sj carrying variables Yj, encoders l ∈ E producing Zl at rate Rl, and decoders m ∈ D recovering (Yj, j ∈ Fm) from (Zl, l ∈ Vm).]

(Roughly) [5]: intersect Γ∗N with

  h(Yj, j∈S) = Σ_{j∈S} hYj,
  hZl|(Yj, j∈Ul) = 0, l ∈ E,
  h(Yj, j∈Fm)|(Zl, l∈Vm) = 0, m ∈ D,
  hYj > H(Xj), j ∈ S,
  hZl ≤ Rl, l ∈ E,

and project onto {H(Xj), Rl}.


Page 5:

Entropic Vector Region: What is known/unknown? – From Outside

[Figure legend: ΓN: Shannon outer bound; ZN: non-Shannon outer bound; Γ∗N: region of entropic vectors; SN: subspace ranks bound; Φ4: binary entropic vectors; conv(Φ4): their convex hull; MqN: GF(q)-representable matroid bound.]

• Shannon Outer Bound ΓN: entropy is submodular: I(XA;XB|XC) ≥ 0 ∀ A, B, C. A subset of these, the elemental inequalities, ⇒ the rest:

  I(Xi;Xj|XK) ≥ 0,   H(Xi|X[N ]\{i}) ≥ 0   (1)

  Γ2 = Γ∗2, Γ3 = Γ∗3, but ΓN ≠ Γ∗N for N ≥ 4.

• Non-Shannon Outer Bounds: [6, 7, 8, 9, 10, 11, 12] Zhang & Yeung, Dougherty & Freiling & Zeger, Matus:

  1. Start with 4 unconstrained r.v.s
  2. add r.v.s obeying distribution match & Markov conditions
  3. Intersect ΓN for N ≥ 5 w/ the Markov & distribution match constraints
  4. Project back to the original 4 unconstrained variables

  obtain new information inequalities this way!

• Infinitely Many *Linear* Information Inequalities: [11] Matus exhibits a sequence of these inequalities & a curve in Γ∗4 ⇒ Γ∗N is a non-polyhedral convex cone for N ≥ 4.
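The elemental inequalities in (1) are few enough to enumerate directly. A sketch (Python; the helper names are ours) that generates them as coefficient vectors over subsets and tests a candidate point against the Shannon outer bound ΓN:

```python
import itertools

def elemental_inequalities(N):
    """Yield each elemental Shannon inequality as a dict mapping
    frozenset(A) -> coefficient, meaning sum(c_A * h_A) >= 0."""
    full = frozenset(range(N))
    # H(X_i | X_{[N]\{i}}) >= 0  <=>  h([N]) - h([N]\{i}) >= 0
    for i in range(N):
        yield {full: 1, full - {i}: -1}
    # I(X_i; X_j | X_K) >= 0  <=>  h(iK) + h(jK) - h(ijK) - h(K) >= 0
    for i, j in itertools.combinations(range(N), 2):
        rest = sorted(full - {i, j})
        for r in range(len(rest) + 1):
            for K in itertools.combinations(rest, r):
                K = frozenset(K)
                yield {K | {i}: 1, K | {j}: 1, K | {i, j}: -1, K: -1}

def in_shannon_bound(h, N, tol=1e-9):
    """h: dict frozenset -> entropy, with h[frozenset()] == 0."""
    return all(sum(c * h[A] for A, c in ineq.items()) >= -tol
               for ineq in elemental_inequalities(N))

# N independent uniform bits: h_A = |A|, which lies in Gamma_N.
N = 3
h = {frozenset(A): len(A)
     for r in range(N + 1) for A in itertools.combinations(range(N), r)}
```

For N = 3 this produces the N + C(N,2)·2^(N−2) = 9 elemental inequalities.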

Page 6:

Entropic Vector Region: What is known/unknown? – From Inside (Linear)

[Figure: the same nested-bounds diagram and legend as on the previous slide.]

• Matroids: (rank function) submodular, non-decreasing r : 2^[N ] → Z, r(A) ≤ |A|. (These are the integer points of ΓN w/ h({i}) ≤ 1.)

• (Fq-)Representable Matroid: r(A) := rank(A:,A), A ∈ Fq^(r([N ])×N ). ∝ entropic: [13, 14, 15]

  X = UA, U uniform on Fq^(r([N ])) ⇒ h(XA) = r(A) log2(q);

  take conic hull to get inner bound.

• characterize: “forbidden minors” for q ∈ {2, 3, 4} [16]. cardinality & integer requirements ⇒ a small proper subset of Γ∗4.

• Ingleton’s Inequality: [17] if h(·) = r(·) for a representable matroid, then

  Ingletonkl ≜ I(i; j|k) + I(i; j|l) + I(k; l) − I(i; j) ≥ 0

  (not an information inequality); derive w/ common information [18].

• Subspace Bounds: Group variables. (vector linear)

Conic hull for 4-rvs =Ingleton Inner bnd [19, 20, 21].

S4 , Γ4 ∩ij⊂{1,2,3,4} Ingletonij ≥ 0DFZ obtained S5 w/ extra ineq.s Unknown ≥ 6.
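The construction X = UA and Ingleton's inequality are both easy to check numerically. A sketch (Python over F2, i.e. q = 2; all names are ours) that computes h(XA) = rank(A:,A)·log2(q) for a binary matrix and evaluates the Ingleton expression on the resulting entropic vector:

```python
from itertools import combinations

def gf2_rank(vecs):
    """Rank over GF(2); each vector is an int bitmask of its entries."""
    vecs = list(vecs)
    rank = 0
    while vecs:
        pivot = vecs.pop()
        if pivot:
            rank += 1
            low = pivot & -pivot  # lowest set bit serves as the pivot
            vecs = [v ^ pivot if v & low else v for v in vecs]
    return rank

def matroid_entropy(cols, A):
    """h(X_A) = rank(A_{:,A}) * log2(q); here q = 2, so log2(q) = 1."""
    return gf2_rank([cols[i] for i in A])

def ingleton(h, i, j, k, l):
    """Ingleton_kl = I(i;j|k) + I(i;j|l) + I(k;l) - I(i;j),
    with I(a;b|K) = h(aK) + h(bK) - h(abK) - h(K) read from dict h."""
    def I(a, b, K=()):
        K = frozenset(K)
        return (h[K | {a}] + h[K | {b}] - h[K | {a, b}]
                - (h[K] if K else 0.0))
    return I(i, j, (k,)) + I(i, j, (l,)) + I(k, l) - I(i, j)

# columns of a 2x4 binary matrix: e1, e2, e1+e2, e1 (as bitmasks)
cols = [0b01, 0b10, 0b11, 0b01]
h = {frozenset(A): matroid_entropy(cols, A)
     for r in range(1, 5) for A in combinations(range(4), r)}
```

Since h comes from a representable matroid, all six Ingleton expressions on it are nonnegative, consistent with [17].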


Page 7:

Entropic Vector Region: What is known/unknown? – From Inside (Groups)

The only provably exhaustive inner bound is due to Chan [22]: given a finite group G and subgroups G1, G2, . . . , GN, the group characterizable entropic vector is

  hA ≜ log( |G| / |∩i∈A Gi| ).

Collecting these vectors over all groups and subgroups gives ΥN, and Γ∗N = conic(ΥN).
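Chan's construction is direct to compute for small groups. A sketch (Python; the function name is ours) using G = Z2 × Z2 with its two coordinate subgroups, which yields the entropic vector of two independent uniform bits:

```python
from itertools import combinations
from math import log2

def group_char_vector(G, subgroups):
    """Chan-Yeung group characterizable entropic vector:
    h_A = log|G| - log|intersection of G_i over i in A|.
    G and each G_i are plain sets of group elements; only
    intersections and cardinalities are needed."""
    N = len(subgroups)
    h = {}
    for r in range(1, N + 1):
        for A in combinations(range(N), r):
            inter = set(G)
            for i in A:
                inter &= subgroups[i]
            h[A] = log2(len(G)) - log2(len(inter))
    return h

# G = Z2 x Z2 (componentwise XOR), with its two coordinate subgroups
G = {(a, b) for a in (0, 1) for b in (0, 1)}
G1 = {(0, 0), (1, 0)}   # fixes the second coordinate
G2 = {(0, 0), (0, 1)}   # fixes the first coordinate
h = group_char_vector(G, [G1, G2])
# two independent uniform bits: h1 = h2 = 1, h12 = 2
```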


Page 8:

Structure of the Unknown Part of Γ∗4

Γ4 Shannon Outer Bound; Γ∗4 Region of Entropic Vectors; S4 Ingleton Inner Bound

  Pij ≜ Γ4 ∩ {Ingletonij ≤ 0}
  P∗ij ≜ Γ∗4 ∩ {Ingletonij ≤ 0}
  S=ij ≜ S4 ∩ {Ingletonij = 0}

Matus 1995 Conditional Independence Relations [23]:

• faces of Γ4 have an entropic point in their relative interior
• h ∈ Γ4, Ingletonij < 0 ⇒ Ingletonkl ≥ 0 for the other pairs
• Γ∗4 = S4 ∪ij P∗ij
• 6 symmetric Pij, each w/ 1 Ingleton-violating extreme ray.
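Matus's observation that an entropic point can violate at most one of the six Ingleton inequalities can be spot-checked empirically. A sketch (Python with numpy; the helper names are ours) that samples random 4-variable PMFs and confirms at most one Ingleton expression goes negative:

```python
import itertools

import numpy as np

def H(p, axes):
    """Entropy (bits) of the marginal of joint pmf array p on `axes`."""
    if not axes:
        return 0.0
    drop = tuple(i for i in range(p.ndim) if i not in axes)
    q = p.sum(axis=drop) if drop else p
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

def I(p, i, j, K=()):
    # I(Xi; Xj | XK) = h(iK) + h(jK) - h(ijK) - h(K)
    K = tuple(K)
    return H(p, (i,) + K) + H(p, (j,) + K) - H(p, (i, j) + K) - H(p, K)

def ingleton(p, k, l):
    i, j = sorted(set(range(4)) - {k, l})
    return (I(p, i, j, (k,)) + I(p, i, j, (l,))
            + I(p, k, l) - I(p, i, j))

rng = np.random.default_rng(0)
for _ in range(200):
    p = rng.random((2, 2, 2, 2))
    p /= p.sum()
    negative = [kl for kl in itertools.combinations(range(4), 2)
                if ingleton(p, *kl) < -1e-9]
    assert len(negative) <= 1  # at most one Ingleton inequality violated
```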

Walsh EII 2013 [24]:

• Even though 15-dimensional: ∀ hA appearing in Ingletonij,

  proj\hA Pij = proj\hA P∗ij = proj\hA S=ij

• Only one necessary nonlinear information inequality!
• Two forms: if (−hA) ∈ Ingletonij, then

  P∗ij = Pij ∩ { hA ≤ gupA(ho\A) ≜ max_{h∈P∗ij, h\A=ho\A} hA }   (2)

  and if (+hA) ∈ Ingletonij, then

  P∗ij = Pij ∩ { hA ≥ gdnA(ho\A) ≜ min_{h∈P∗ij, h\A=ho\A} hA }   (3)


Page 9:

Overall Aim of the Paper/Talk

re-parameterization between distributions and entropies:

  m-coordinates: η ≜ [ pX(x) ]
  e-coordinates: θ ≜ [ log( pX(x) / pX(0) ) ]
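The two coordinate systems above are a bijection away from zero probabilities. A minimal numeric sketch (Python with numpy; the function names are ours) of the m-coordinates η and e-coordinates θ, using outcome 0 as the reference:

```python
import numpy as np

def to_e(p):
    """e-coordinates: theta_x = log(p(x)/p(0)) for x != 0.
    The reference outcome 0 is dropped, since theta_0 = 0 always."""
    return np.log(p[1:] / p[0])

def to_m(theta):
    """Invert: the pmf is proportional to (1, exp(theta))."""
    unnorm = np.concatenate(([1.0], np.exp(theta)))
    return unnorm / unnorm.sum()

p = np.array([0.1, 0.2, 0.3, 0.4])
theta = to_e(p)
p_back = to_m(theta)
```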


Page 10:

Information Geometric Properties of Distributions on Shannon Facets

[Figure: a distribution p, m-geodesics from p to its projections ΠE⊥A∪B(p) and ΠE↔,⊥A,B(p), and an e-geodesic within the e-autoparallel submanifolds E↔,⊥A,B and E⊥A∪B.]

entropy submodularity: hA + hB ≥ hA∪B + hA∩B

  E↔,⊥A,B = { θ | pX = pXA\B|XA∩B · pXB · pX(A∪B)^c }
  E⊥A∪B = { θ | pX = pXA∪B · pX(A∪B)^c }

• Shannon outer bound: I(XA;XB|XC) ≥ 0
• Hence, on the Shannon facet: I(XA;XB|XC) = 0
• means XA ↔ XC ↔ XB is a Markov chain
• This is an e-autoparallel submanifold of the pXA∪B∪C!
• ⇒ those pXA∪B∪C on this boundary (affine set) of entropy have a parameterization in which they are also affine (known A, b)
• Sometimes X ≠ XA∪B∪C, so we also need the structure of having a particular marginal pXA∪B∪C (m-autoparallel)
• mutually dual foliations
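The key fact on this slide, that I(XA;XB|XC) = 0 holds exactly when the joint factors through the Markov chain XA ↔ XC ↔ XB, can be checked numerically. A sketch (Python with numpy; the names are ours) that builds a random distribution of the factored form p(a|c)p(b|c)p(c) and confirms the conditional mutual information vanishes:

```python
import numpy as np

rng = np.random.default_rng(1)

def normalize(x, axis=None):
    return x / x.sum(axis=axis, keepdims=axis is not None)

nA, nB, nC = 3, 3, 2
pC = normalize(rng.random(nC))
pA_C = normalize(rng.random((nA, nC)), axis=0)  # columns are p(a|c)
pB_C = normalize(rng.random((nB, nC)), axis=0)  # columns are p(b|c)

# p(a, b, c) = p(a|c) p(b|c) p(c): the Markov chain A <-> C <-> B
p = np.einsum('ac,bc,c->abc', pA_C, pB_C, pC)

def H(p, axes):
    drop = tuple(i for i in range(p.ndim) if i not in axes)
    q = p.sum(axis=drop) if drop else p
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

# I(A;B|C) = h(AC) + h(BC) - h(ABC) - h(C)
I_AB_C = H(p, (0, 2)) + H(p, (1, 2)) - H(p, (0, 1, 2)) - H(p, (2,))
# vanishes (up to roundoff) for every distribution of this factored form
```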


Page 11:

Interesting 4-atom Distribution Support

Left: support (0000)(0110)(1010)(1111) in m-coordinates; Right: the same support in e-coordinates.

The whole 3D space of distributions on this support:

  p(0000) = α, p(0110) = β − α, p(1010) = γ − α, p(1111) = 1 + α − γ − β,

with marginals p(x3 = 0) = γ and p(x4 = 0) = β.

Sets marked in the figure:
  • Hyperplane E: I(x3; x4) = 0, where α = β · γ
  • 4 atoms uniform (point): α = 0.25, β = γ = 0.5
  • DFZ 4 atom conjecture (point): α ≈ 0.35, β = γ = 0.5
  • Matus’s curve in ISIT07 (line): α = β · γ, γ = 0.5
  • Given marginals of x3 and x4 (line): p(x3 = 0) = γ, p(x4 = 0) = β
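The 4-atom family above can be pushed through the entropy map directly. A sketch (Python with numpy; the helper names are ours) parameterizing the support by (α, β, γ) and confirming the well-known fact that the 4-atom uniform point (α = 0.25, β = γ = 0.5) violates an Ingleton inequality:

```python
import itertools

import numpy as np

def four_atom_pmf(alpha, beta, gamma):
    """Joint pmf on {0,1}^4 supported on (0000)(0110)(1010)(1111)."""
    p = np.zeros((2, 2, 2, 2))
    p[0, 0, 0, 0] = alpha
    p[0, 1, 1, 0] = beta - alpha
    p[1, 0, 1, 0] = gamma - alpha
    p[1, 1, 1, 1] = 1 + alpha - beta - gamma
    return p

def H(p, axes):
    drop = tuple(i for i in range(p.ndim) if i not in axes)
    q = p.sum(axis=drop) if drop else p
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

def I(p, i, j, K=()):
    K = tuple(K)
    hK = H(p, K) if K else 0.0
    return H(p, (i,) + K) + H(p, (j,) + K) - H(p, (i, j) + K) - hK

def min_ingleton(p):
    """Smallest of the six Ingleton_kl expressions."""
    vals = []
    for k, l in itertools.combinations(range(4), 2):
        i, j = sorted(set(range(4)) - {k, l})
        vals.append(I(p, i, j, (k,)) + I(p, i, j, (l,))
                    + I(p, k, l) - I(p, i, j))
    return min(vals)

p_uniform = four_atom_pmf(0.25, 0.5, 0.5)
# the 4-atom uniform distribution violates Ingleton: min is about -0.1226
```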


Page 12:

Structure of Γ∗4: Matus Notation for Pij

[Figure: the extreme rays of Pij in Matus’s notation, grouped as:]

  VM = (r^13_1, r^14_1, r^23_1, r^24_1, r^1_2, r^2_2)
  VN = (r^1_1, r^2_1, r^12_1)
  VK = (r^123_1, r^124_1, r^134_1, r^234_1)
  VR = (r^3_1, r^4_1, r^∅_3)
  VP = (VM, VK, VR, r^∅_1, f34)
     = (r^∅_1, r^3_1, r^4_1, r^13_1, r^14_1, r^23_1, r^24_1, r^123_1, r^124_1, r^134_1, r^234_1, r^1_2, r^2_2, r^∅_3, f34)


Page 13:

Information Geometry of Ingleton Violation for 4-atom Support

[Figure: the same extreme-ray coordinates as the previous slide, with the 4-atom family overlaid:]

  VM = (r^13_1, r^14_1, r^23_1, r^24_1, r^1_2, r^2_2), VN = (r^1_1, r^2_1, r^12_1),
  VK = (r^123_1, r^124_1, r^134_1, r^234_1), VR = (r^3_1, r^4_1, r^∅_3),
  VP = (VM, VK, VR, r^∅_1, f34)

Marked sets:
  • Hyperplane E: I(x3; x4) = 0, where α = β · γ
  • Given marginals: β = γ = 0.5
  • 4 atoms uniform: α = 0.25, β = γ = 0.5
  • DFZ 4 atom conjecture: α ≈ 0.35, β = γ = 0.5
  • Hyperplane Ingleton12 = 0, separating Ingleton12 > 0 from Ingleton12 < 0


Page 14:

References

[1] Raymond W. Yeung, Information Theory and Network Coding. Springer, 2008.

[2] Xijin Yan, Raymond W. Yeung, and Zhen Zhang, “The Capacity Region for Multi-source Multi-sink Network Coding,” in IEEE International Symposium on Information Theory (ISIT), Jun. 2007, pp. 116–120.

[3] ——, “An Implicit Characterization of the Achievable Rate Region for Acyclic Multisource Multisink Network Coding,” IEEE Trans. on Information Theory, vol. 58, no. 9, pp. 5625–5639, Sep. 2012.

[4] T. Chan and A. Grant, “Dualities between entropy functions and network codes,” in Fourth Workshop on Network Coding, Theory and Applications (NetCod), Jan. 2008.

[5] R. W. Yeung and Zhen Zhang, “Distributed source coding for satellite communications,” IEEE Trans. on Information Theory, vol. 45, no. 4, pp. 1111–1120, 1999.

[6] Raymond W. Yeung, “A Framework for Linear Information Inequalities,” IEEE Trans. on Information Theory, vol. 43, no. 6, Nov. 1997.

[7] Zhen Zhang and Raymond W. Yeung, “On Characterization of Entropy Function via Information Inequalities,” IEEE Trans. on Information Theory, vol. 44, no. 4, Jul. 1998.

[8] ——, “A Non-Shannon-Type Conditional Inequality of Information Quantities,” IEEE Trans. on Information Theory, vol. 43, no. 6, Nov. 1997.

[9] K. Makarychev, Y. Makarychev, A. Romashchenko, and N. Vereshchagin, “A new class of non-Shannon-type inequalities for entropies,” Communications in Information and Systems, vol. 2, no. 2, pp. 147–166, Dec. 2002.

[10] Weidong Xu, Jia Wang, and Jun Sun, “A projection method for derivation of non-Shannon-type information inequalities,” in IEEE International Symposium on Information Theory (ISIT), 2008, pp. 2116–2120.

[11] Frantisek Matus, “Infinitely Many Information Inequalities,” in IEEE Int. Symp. Information Theory (ISIT), Jun. 2007, pp. 41–44.

[12] Randall Dougherty, Chris Freiling, and Kenneth Zeger, “Non-Shannon Information Inequalities in Four Random Variables,” Apr. 2011, arXiv:1104.3602v1. [Online]. Available: http://arxiv.org/pdf/1104.3602.pdf

[13] Babak Hassibi, Sormeh Shadbakht, and Matthew Thill, “On Optimal Design of Network Codes,” in Information Theory and Applications, UCSD, Feb. 2010, presentation.

[14] Congduan Li, John MacLaren Walsh, and Steven Weber, “A computational approach for determining rate regions and codes using entropic vector bounds,” in 50th Annual Allerton Conference on Communication, Control and Computing, Oct. 2012. [Online]. Available: http://www.ece.drexel.edu/walsh/Allerton2012LJW.pdf

[15] Congduan Li, Jayant Apte, John MacLaren Walsh, and Steven Weber, “A new computational approach for determining rate regions and optimal codes for coded networks,” in The 2013 IEEE International Symposium on Network Coding (NetCod 2013), Jun. 2013. [Online]. Available: http://www.ece.drexel.edu/walsh/LiNetCod13.pdf

Page 15:

[16] James Oxley, Matroid Theory, 2nd ed. Oxford University Press, 2011.

[17] A. W. Ingleton, “Representation of Matroids,” in Combinatorial Mathematics and its Applications, D. J. A. Welsh, Ed. San Diego: Academic Press, 1971, pp. 149–167.

[18] D. Hammer, A. Romashchenko, A. Shen, and N. Vereshchagin, “Inequalities for Shannon Entropy and Kolmogorov Complexity,” Journal of Computer and System Sciences, vol. 60, pp. 442–464, 2000.

[19] D. Hammer, A. Romashchenko, A. Shen, and N. Vereshchagin, “Inequalities for Shannon Entropy and Kolmogorov Complexity,” Journal of Computer and System Sciences, vol. 60, pp. 442–464, 2000.

[20] Ryan Kinser, “New Inequalities for Subspace Arrangements,” J. of Comb. Theory Ser. A, vol. 118, no. 1, pp. 152–161, Jan. 2011.

[21] Randall Dougherty, Chris Freiling, and Kenneth Zeger, “Linear rank inequalities on five or more variables,” submitted to SIAM J. Discrete Math., arXiv:0910.0284.

[22] T. Chan and R. Yeung, “On a relation between information inequalities and group theory,” IEEE Trans. on Information Theory, vol. 48, no. 7, pp. 1992–1995, Jul. 2002.

[23] F. Matus and M. Studeny, “Conditional Independences among Four Random Variables I,” Combinatorics, Probability and Computing, no. 4, pp. 269–278, 1995.

[24] J. M. Walsh, “Information Geometry, Polyhedral Computation, and Entropic Vectors,” in First Workshop on Entropy and Information Inequalities, The Chinese University of Hong Kong, Apr. 2013. [Online]. Available: http://www.youtube.com/watch?v=gYkS73jyvPU&feature=youtu.be