
EE 7950: Statistical Communication Theory, University of Alberta (hcdc/Library/MIMOchClass/ChannelCapacity.pdf)

Shannon’s Channel Capacity

Shannon derived the following capacity formula (1948) for the additive white Gaussian noise (AWGN) channel:

C = W log2 (1 + S/N) [bits/second]

• W is the bandwidth of the channel in Hz

• S is the signal power in watts

• N is the total noise power of the channel in watts
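The formula is easy to evaluate numerically. The voice-band numbers below (W = 3000 Hz, SNR = 30 dB) are illustrative values chosen for this sketch, not taken from the notes:

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = W * log2(1 + S/N) in bits/second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative voice-band numbers (not from the notes):
# W = 3000 Hz and S/N = 30 dB, i.e. a linear ratio of 1000.
snr = 10 ** (30 / 10)
print(f"C = {awgn_capacity(3000, snr):.0f} bit/s")  # roughly 30 kbit/s
```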

Channel Coding Theorem (CCT): The theorem has two parts.

1. Its direct part says that for any rate R < C there exists a coding system with arbitrarily low block and bit error rates as we let the code length N → ∞.

2. The converse part states that for R ≥ C the bit and block error rates are strictly bounded away from zero for any coding system.

The CCT therefore establishes rigid limits on the maximal supportable transmission rate of an AWGN channel in terms of power and bandwidth.

Bandwidth Efficiency characterizes how efficiently a system uses its allotted bandwidth and is defined as

η = (Transmission Rate)/(Channel Bandwidth) = R/W [bits/s/Hz].

From it we calculate the Shannon limit as

ηmax = C/W = log2(1 + S/N) [bits/s/Hz].   (1)

Note: In order to calculate η, we must suitably define the channel bandwidth W. One commonly used definition is the 99% bandwidth definition, i.e., W is defined such that 99% of the transmitted signal power falls within the band of width W.
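The 99% definition can be made concrete with a small numerical sketch. Assuming, for illustration only, a rectangular pulse of duration T, whose power spectral density is proportional to sinc²(fT), we can integrate numerically and see how much power falls within a given band:

```python
import math

def sinc2(x):
    """sinc^2(x) with sinc(x) = sin(pi x)/(pi x): the shape of the power
    spectral density of a rectangular pulse of duration T (here x = f*T)."""
    if x == 0.0:
        return 1.0
    return (math.sin(math.pi * x) / (math.pi * x)) ** 2

def power_fraction(b, n=50_000):
    """Fraction of total power within |f*T| < b, via the trapezoidal rule.
    The total two-sided integral of sinc^2 equals 1 (Parseval)."""
    h = b / n
    s = 0.5 * (sinc2(0.0) + sinc2(b)) + sum(sinc2(i * h) for i in range(1, n))
    return 2.0 * s * h  # double the one-sided integral

for b in (0.5, 1.0, 5.0, 10.3):
    print(f"|fT| < {b:4}: {100 * power_fraction(b):6.2f}% of the power")
```

This reproduces the familiar facts that the main lobe (|f| < 1/T) holds only about 90% of the power, and that the 99% bandwidth of a rectangular pulse is roughly 20/T (two-sided).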


Average Signal Power S can be expressed as

S = kEb/T = REb,

where

• Eb is the energy per bit

• k is the number of bits transmitted per symbol

• T is the duration of a symbol

• R = k/T is the transmission rate of the system in bits/s.

• S/N is called the signal-to-noise ratio

• N = N0W is the total noise power

• N0 is the one-sided noise power spectral density

Substituting S = REb and N = N0W, we obtain the Shannon limit in terms of the bit energy and noise power spectral density:

ηmax = log2(1 + REb/(N0W)) = log2(1 + ηmax Eb/N0),

since R/W = ηmax at the limit.

This can be resolved to obtain the minimum bit energy required for reliable transmission, called the Shannon bound:

Eb/N0 ≥ (2^ηmax − 1)/ηmax.

Fundamental limit: For infinite amounts of bandwidth, i.e., ηmax → 0, we obtain

Eb/N0 ≥ lim(ηmax→0) (2^ηmax − 1)/ηmax = ln(2) = −1.59 dB.

This is the absolute minimum signal energy to noise power spectral density ratio required to reliably transmit one bit of information.
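The bound and its limit are easy to tabulate. The following sketch evaluates Eb/N0 ≥ (2^ηmax − 1)/ηmax for a few bandwidth efficiencies:

```python
import math

def ebn0_min_db(eta):
    """Shannon bound: minimum Eb/N0 in dB for reliable transmission
    at bandwidth efficiency eta = R/W in bits/s/Hz."""
    return 10 * math.log10((2 ** eta - 1) / eta)

for eta in (8.0, 4.0, 2.0, 1.0, 0.5, 1e-6):
    print(f"eta = {eta:g} bits/s/Hz  ->  Eb/N0 >= {ebn0_min_db(eta):6.2f} dB")
# As eta -> 0 the required Eb/N0 approaches 10*log10(ln 2) = -1.59 dB.
```

At η = 1 bit/s/Hz the bound is exactly 0 dB; raising the efficiency drives the required Eb/N0 up steeply, while unlimited bandwidth buys only the last 1.59 dB.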


Normalized Capacity

The dependence on the arbitrary definition of the bandwidth W is not satisfactory. We prefer to normalize our formulas per signal dimension, as given in [5]. This is useful when the question of waveforms and pulse shaping is not a central issue, since it allows one to eliminate these considerations by treating signal dimensions [2].

Cd = (1/2) log2(1 + 2RdEb/N0) [bits/dimension]

Cc = log2(1 + RcEb/N0) [bits/complex dimension]

where Rd and Rc are the rates per real and per complex dimension, respectively.

Applying similar manipulations as above, we obtain the Shannon bound normalized per dimension as

Eb/N0 ≥ (2^(2Cd) − 1)/(2Cd);   Eb/N0 ≥ (2^Cc − 1)/Cc.

System Performance Measure: In order to compare different communication systems we need a parameter expressing the performance level. It is the information bit error probability Pb, which typically falls into the range 10^−3 ≥ Pb ≥ 10^−6.


Examples:

Spectral efficiencies versus power efficiencies of various coded and uncoded digital transmission systems, plotted against the theoretical limits imposed by the discrete constellations.

[Figure: spectral efficiency Cc [bits/complex dimension] versus Eb/N0 [dB], showing the Shannon bound and the unachievable region, the capacities of the discrete constellations BPSK, QPSK, 8PSK, 16PSK, 16QAM, and 32QAM, and the operating points of various schemes: convolutional codes ((2,1,14) CC, (4,1,14) CC), sequential decoding (2^14 Seqn.), TCM (8PSK, 16QAM), BTCM (16QAM, 32QAM, 64QAM), TTCM, BTC BPSK, and the turbo code with N = 65536.]


Discrete Capacities

In the case of discrete constellations, Shannon's formula needs to be altered. We begin with its basic formulation:

C = max_q [H(Y) − H(Y|X)]

  = max_q [ −∫y Σak q(ak) p(y|ak) log( Σa′k q(a′k) p(y|a′k) ) dy + ∫y Σak q(ak) p(y|ak) log( p(y|ak) ) dy ]

  = max_q Σak q(ak) ∫y p(y|ak) log[ p(y|ak) / Σa′k q(a′k) p(y|a′k) ] dy,

where {ak} are the K discrete signal points, q(ak) is the probability with which ak is selected, and

p(y|ak) = (1/√(2πσ²)) exp( −(y − ak)²/(2σ²) )

in the one-dimensional case, and

p(y|ak) = (1/(2πσ²)) exp( −|y − ak|²/(2σ²) )

in the complex case.

Symmetrical Capacity: When q(ak) = 1/K, then

C = log(K) − (1/K) Σak ∫n p(n) log[ Σa′k exp( −((n − a′k + ak)² − n²)/(2σ²) ) ] dn,

where p(n) is the Gaussian noise density.

This equation has to be evaluated numerically.
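One simple way to evaluate it is Monte Carlo integration over the noise: draw n ~ N(0, σ²), average the log-sum term, and subtract the average from log2(K). The sketch below (an illustration, not from the notes) does this for an arbitrary equiprobable real constellation, using log base 2 so the result is in bits; with BPSK points ±1 the estimate approaches 1 bit at high SNR:

```python
import math
import random

def symmetric_capacity(points, sigma, trials=50_000, seed=1):
    """Monte Carlo estimate (in bits) of the symmetric capacity
    C = log2(K) - (1/K) * sum_k E_n[ log2 sum_k' exp(
        -((n - a_k' + a_k)**2 - n**2) / (2*sigma**2) ) ],
    with n ~ N(0, sigma^2) and equiprobable real signal points."""
    rng = random.Random(seed)
    K = len(points)
    acc = 0.0
    for _ in range(trials):
        n = rng.gauss(0.0, sigma)
        a = points[rng.randrange(K)]  # uniform choice implements the 1/K average
        s = sum(math.exp(-((n - ap + a) ** 2 - n ** 2) / (2 * sigma ** 2))
                for ap in points)
        acc += math.log2(s)
    return math.log2(K) - acc / trials

# BPSK points +-1; at high SNR the estimate approaches 1 bit.
print(symmetric_capacity([-1.0, 1.0], sigma=0.5))
```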


Code Efficiency

Shannon et al. [4] proved the following lower bound on the codeword error probability PB:

PB > 2^(−N(Esp(R) + o(N))),   where   Esp(R) = max_q max_(ρ>1) ( E0(q, ρ) − ρR ).

The bound is plotted for rate R = 1/2 and BPSK modulation [3], together with selected Turbo coding schemes and classic concatenated methods:

[Figure: required Eb/N0 [dB] versus block length N (10 to 10^6), showing the sphere-packing bound, the Shannon capacity, and the unachievable region, together with 4- and 16-state turbo codes of block lengths N = 448, 1334, 2048, 10200, 16384, and 65536, an N = 1024 block Turbo code using (32,26,4) BCH codes, and concatenated schemes: N = 2040 with (2,1,6) or (2,1,8) CC + RS (255,223), and N = 4599 with (2,1,8) CC + RS (511,479).]


Code Performance

The performance of various coding schemes is shown below (see [2]). Note the steep behavior of the turbo codes, known as the Turbo cliff.

[Figure: bit error probability (BER) versus Eb/N0 [dB] (0 to 10 dB, BER from 1 down to 10^−6) for uncoded BPSK, convolutional codes with 4, 16, 64, 2^14, and 2^40 states, and a Turbo code.]

References

[1] R.G. Gallager, Information Theory and Reliable Communication, John Wiley & Sons, Inc., New York, 1968.

[2] C. Schlegel, Trellis Coding, IEEE Press, Piscataway, NJ, 1997.

[3] C. Schlegel and L.C. Perez, "On error bounds and turbo codes," IEEE Communications Letters, Vol. 3, No. 7, July 1999.

[4] C.E. Shannon, R.G. Gallager, and E.R. Berlekamp, "Lower bounds to error probabilities for coding on discrete memoryless channels," Inform. Contr., vol. 10, pt. I, pp. 65–103, 1967; also vol. 10, pt. II, pp. 522–552, 1967.

[5] J.M. Wozencraft and I.M. Jacobs, Principles of Communication Engineering, John Wiley & Sons, Inc., New York, 1965; reprinted by Waveland Press, 1993.