Capacity Limits of Wireless Channels with Multiple Antennas: Challenges, Insights, and New Mathematical Methods
Andrea Goldsmith, Stanford University
Co-Authors: T. Holliday, S. Jafar, N. Jindal, S. Vishwanath
Princeton-Rutgers Seminar Series, Rutgers University
April 23, 2003
Future Wireless Systems
Nth Generation Cellular
Nth Generation WLANs
Wireless Entertainment
Wireless Ad Hoc Networks
Sensor Networks
Smart Homes/Appliances
Automated Cars/Factories
Telemedicine/Learning
All this and more…
Ubiquitous Communication Among People and Devices
Challenges
The wireless channel is a randomly-varying broadcast medium with limited bandwidth.
Fundamental capacity limits and good protocol designs for wireless networks are open problems.
Hard energy and delay constraints change fundamental design principles.
Many applications fail miserably with a “generic” network approach: need for cross-layer design.
Outline
Wireless Channel Capacity
Capacity of MIMO Channels: imperfect channel information; channel correlations
Multiuser MIMO Channels: duality and dirty paper coding
Lyapunov Exponents and Capacity
Wireless Channel Capacity
Fundamental Limit on Data Rates
Main drivers of channel capacity: bandwidth and power; statistics of the channel; channel knowledge and how it is used; number of antennas at TX and RX.
Capacity: The set of simultaneously achievable rates {R1,…,Rn}
[Figure: capacity regions in (R1, R2, R3) rate space]
MIMO Channel Model
[Figure: n TX antennas x1,…,xn; m RX antennas y1,…,ym; channel gain h_ij from TX antenna j to RX antenna i]

y = Hx + n,  n ~ N(0, σ²I_m)

where x is the n×1 transmit vector, y the m×1 receive vector, and H = [h_ij] the m×n channel matrix (n TX antennas, m RX antennas).

Model applies to any channel described by a matrix (e.g. ISI channels).
What’s so great about MIMO?
Fantastic capacity gains (Foschini/Gans ’96, Telatar ’99): capacity grows linearly with antennas when the channel is known perfectly at TX and RX.
Vector codes (or scalar codes with SIC) are optimal.
Assumptions: perfect channel knowledge; spatially uncorrelated fading: Rank(H^T Q H) = min(n, m).
C = max_{Q: Tr(Q) ≤ P} log |I_B + H Q H^T| = max_{p_i: Σ_i p_i ≤ P} Σ_{i=1}^B log(1 + λ_i p_i),  B = Rank(H^T Q H)
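The water-filling capacity above can be evaluated numerically. A small Python sketch (all function names are illustrative, not from the talk) that water-fills power over the eigenvalues of H^T H and averages over i.i.d. random channel draws, showing capacity growing with the number of antennas:

```python
# Sketch: capacity of an n x n MIMO channel with perfect CSI at TX and RX,
# C = max_{sum p_i <= P} sum_i log2(1 + lam_i p_i), via water-filling over
# the eigenvalues lam_i of H^T H. Names and parameters are illustrative.
import numpy as np

def waterfill(lam, P):
    """Water-filling power allocation over channel eigenvalues lam."""
    lam = np.sort(lam)[::-1]
    for k in range(len(lam), 0, -1):
        mu = (P + np.sum(1.0 / lam[:k])) / k   # candidate water level
        p = mu - 1.0 / lam[:k]
        if p[-1] >= 0:                          # all k powers nonnegative
            return float(np.sum(np.log2(1.0 + lam[:k] * p)))
    return 0.0

def mimo_capacity(H, P):
    lam = np.linalg.eigvalsh(H.T @ H)
    return waterfill(lam[lam > 1e-12], P)

rng = np.random.default_rng(0)
caps = []
for n in (1, 2, 4, 8):
    # average over i.i.d. Rayleigh-like channel draws
    c = np.mean([mimo_capacity(rng.standard_normal((n, n)), P=10.0)
                 for _ in range(200)])
    caps.append(c)
print(caps)  # average capacity grows roughly linearly in min(n, m)
```

This is only a sanity check of the linear-growth claim under the idealized assumptions (perfect CSI, uncorrelated fading), not the talk's own computation.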
What happens when these assumptions are relaxed?
Realistic Assumptions
No transmitter knowledge of H: capacity is much smaller.
No receiver knowledge of H: capacity does not increase as the number of antennas increases (Marzetta/Hochwald ’99).
Will the promise of MIMO be realized in practice?
Partial Channel Knowledge
Model the channel as H ~ N(μ, Σ). The receiver knows the channel H perfectly; the transmitter has only partial information about H:

y = Hx + n,  n ~ N(0, σ²I),  H ~ p(H)

[Figure: transmitter → channel → receiver; the receiver observes y and knows H, the transmitter knows only the distribution of H]
Partial Information Models
Channel mean information (mean measured, covariance unknown): H ~ N(μ, αI)
Channel covariance information (mean unknown, covariance measured): H ~ N(0, Σ)
We have developed necessary and sufficient conditions for the optimality of beamforming, obtained for both MISO and MIMO channels; the optimal transmission strategy is also known.
Beamforming
Scalar codes with transmit precoding

[Figure: one symbol weighted by c1,…,cn across the TX antennas; the receiver sees an effective scalar channel]

• Transforms the MIMO system into a SISO system.
• Greatly simplifies encoding and decoding.
• Channel indicates the best direction to beamform.
• Need “sufficient” knowledge for optimality.
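A minimal numeric illustration of this MIMO-to-SISO reduction: for a MISO channel h known at the transmitter, beamforming along c = h/||h|| maximizes the effective scalar gain. The 4-antenna setup below is an illustrative assumption, not from the talk:

```python
# Sketch of transmit beamforming: a scalar symbol is precoded onto the
# antennas along a unit direction c. For a MISO channel h known at the TX,
# the matched direction c = h/||h|| maximizes receive power, reducing the
# vector channel to a scalar channel with gain ||h||.
import numpy as np

rng = np.random.default_rng(3)
h = rng.standard_normal(4)                   # illustrative 4-antenna MISO channel
c_matched = h / np.linalg.norm(h)            # beamform along the channel

def rx_gain(c):
    return abs(h @ c) ** 2                   # effective scalar channel power

# matched beamforming attains ||h||^2; any other unit vector does no better
assert abs(rx_gain(c_matched) - np.linalg.norm(h) ** 2) < 1e-9
for _ in range(100):
    c = rng.standard_normal(4)
    c /= np.linalg.norm(c)
    assert rx_gain(c) <= rx_gain(c_matched) + 1e-9
print("matched beamforming maximizes the effective SISO gain")
```

The inequality is just Cauchy-Schwarz: |h·c| ≤ ||h|| for any unit c, with equality in the matched direction.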
Optimality of Beamforming
Mean Information
Optimality of Beamforming
Covariance Information
No Tx or Rx Knowledge
Increasing n_T beyond the coherence time T in a block fading channel does not increase capacity (Marzetta/Hochwald ’99); this assumes uncorrelated fading.
We have shown that with correlated fading, adding TX antennas always increases capacity: small transmit antenna spacing is good!
Impact of spatial correlations on channel capacity:
Perfect Rx and Tx knowledge: hurts (Boche/Jorswieck ’03)
Perfect Rx knowledge, no Tx knowledge: hurts (BJ ’03)
Perfect Rx knowledge, Tx knows correlation: helps
Tx and Rx only know correlation: helps
Gaussian Broadcast and Multiple Access Channels
Broadcast (BC): One Transmitter to Many Receivers.
Multiple Access (MAC): Many Transmitters to One Receiver.
[Figure: downlink BC and uplink MAC with time-varying channel gains h_i(t)]

• Transmit power constraint
• Perfect Tx and Rx knowledge
Differences: shared vs. individual power constraints; near-far effect in MAC.
Similarities: optimal BC “superposition” coding is also optimal for the MAC (sum of Gaussian codewords); both decoders exploit successive decoding and interference cancellation.
Comparison of MAC and BC
[Figure: MAC region under individual powers P1, P2 vs. BC region under total power P]
MAC-BC Capacity Regions
MAC capacity region known for many cases: a convex optimization problem.
BC capacity region typically only known for (parallel) degraded channels; the formulas are often not convex.
Can we find a connection between the BC and MAC capacity regions? Duality.
Dual Broadcast and MAC Channels
[Figure: Broadcast Channel (BC): one input x(n) with power P, gains h1(n),…,hM(n), noises z1(n),…,zM(n), outputs y1(n),…,yM(n). Multiple-Access Channel (MAC): inputs x1(n),…,xM(n) with powers P1,…,PM, the same gains, noise z(n), output y(n).]

Gaussian BC and MAC with the same channel gains and the same noise power at each receiver.
The BC from the MAC
[Plot: blue = BC region; red = MAC regions for power splits (P1, P2) = (1, 1), (1.5, 0.5), (0.5, 1.5); gains h1, h2]

C_MAC(P1, P2; h1, h2) ⊆ C_BC(P1 + P2; h1, h2)

MAC with sum-power constraint: taking the union over all splits of the total power P,

C_BC(P; h1, h2) = ∪_{0 ≤ P1 ≤ P} C_MAC(P1, P − P1; h1, h2)
Sum-Power MAC
MAC with sum power constraint: power pooled between the MAC transmitters, but no transmitter coordination.

[Figure: MAC with total power P and BC with total power P: same capacity region!]

C_BC(P; h1, h2) = ∪_{0 ≤ P1 ≤ P} C_MAC(P1, P − P1; h1, h2) = C_MAC^{Sum}(P; h1, h2)
BC to MAC: Channel Scaling
Scale the channel gain by √α and the power by 1/α: the MAC capacity region is unaffected by the scaling, since every h²P term is unchanged. The scaled MAC capacity region is a subset of the scaled BC capacity region for any scaling, so the MAC region lies inside the scaled BC region for every α > 0.

[Figure: MAC with powers P1, P2 and gains h1, h2; BC with sum power P1/α + P2 and gains √α h1, h2]
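The invariance step can be checked directly (under the interpretation above that gains scale by √α and powers by 1/α, so every h²P product is preserved). A small sketch with arbitrary illustrative numbers:

```python
# Check that the two-user Gaussian MAC region is invariant under the
# scaling h_i -> sqrt(alpha)*h_i, P_i -> P_i/alpha: every h^2*P term in the
# pentagon constraints is unchanged. Values below are illustrative.
import math

def mac_region_corners(P1, P2, h1, h2):
    # individual-rate and sum-rate constraints of the two-user MAC pentagon
    a, b = h1**2 * P1, h2**2 * P2
    return (math.log2(1 + a), math.log2(1 + b), math.log2(1 + a + b))

base = mac_region_corners(4.0, 2.0, 1.0, 3.0)
for alpha in (0.1, 0.5, 2.0, 10.0):
    scaled = mac_region_corners(4.0 / alpha, 2.0 / alpha,
                                math.sqrt(alpha) * 1.0, math.sqrt(alpha) * 3.0)
    assert all(abs(x - y) < 1e-9 for x, y in zip(base, scaled))
print("MAC pentagon unchanged under the scaling")
```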
The BC from the MAC
[Plot: blue = scaled BC regions; red = MAC region; sweeping the scaling α from 0 to ∞ traces out the MAC region]

C_MAC(P1, P2; h1, h2) = ∩_{α > 0} C_BC(P1/α + P2; √α h1, h2)
BC in terms of MAC:
C_BC(P; h1, h2) = ∪_{0 ≤ P1 ≤ P} C_MAC(P1, P − P1; h1, h2)

MAC in terms of BC:
C_MAC(P1, P2; h1, h2) = ∩_{α > 0} C_BC(P1/α + P2; √α h1, h2)
Duality: Constant AWGN Channels
What is the relationship between the optimal transmission strategies? Equate the rates and solve for the powers.

Opposite decoding order: the stronger user (User 1) is decoded last in the BC; the weaker user (User 2) is decoded last in the MAC.

Transmission Strategy Transformations

R1 = log(1 + h1² P1^B) = log(1 + h1² P1^M / (1 + h2² P2^M))
R2 = log(1 + h2² P2^B / (1 + h2² P1^B)) = log(1 + h2² P2^M)
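The rate-equating transformation can be verified numerically. The sketch below (gains and powers are arbitrary illustrative values) maps each sum-power MAC corner point to a BC superposition point with the same sum power and identical rates:

```python
# Check the scalar BC/MAC duality: each corner point of the MAC (user 2
# decoded last) maps to a BC superposition point with the SAME sum power
# and the SAME rates. The power transformation is derived by equating rates.
import math

h1, h2, P = 2.0, 1.0, 10.0   # illustrative gains and sum power, h1 > h2

def mac_corner(P1m, P2m):
    # user 1 decoded first (sees user 2 as interference), user 2 decoded last
    R1 = math.log2(1 + h1**2 * P1m / (1 + h2**2 * P2m))
    R2 = math.log2(1 + h2**2 * P2m)
    return R1, R2

def bc_point(P1b, P2b):
    # superposition coding: strong user 1 decoded last (no interference)
    R1 = math.log2(1 + h1**2 * P1b)
    R2 = math.log2(1 + h2**2 * P2b / (1 + h2**2 * P1b))
    return R1, R2

for P1m in (0.0, 2.5, 5.0, 7.5, 10.0):
    P2m = P - P1m
    P1b = P1m / (1 + h2**2 * P2m)     # rate-equating power transformation
    P2b = P - P1b                      # total power is conserved
    assert all(abs(a - b) < 1e-9
               for a, b in zip(mac_corner(P1m, P2m), bc_point(P1b, P2b)))
print("every MAC corner lies on the BC boundary under the dual power split")
```

A short algebra check confirms the sum-power conservation: with P1b = P1m/(1 + h2²P2m) and P2b = P2m(1 + h2²P1b), the powers add back to P1m + P2m.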
Duality Applies to Different Fading Channel Capacities
Ergodic (Shannon) capacity: maximum rate averaged over all fading states.
Zero-outage capacity: maximum rate that can be maintained in all fading states.
Outage capacity: maximum rate that can be maintained in all non-outage fading states.
Minimum rate capacity: minimum rate maintained in all states; maximize the average rate in excess of the minimum.
Explicit transformations between transmission strategies
Duality: Minimum Rate Capacity
BC region known; the MAC region can only be obtained by duality.

[Plot: blue = scaled BC, red = MAC; the MAC minimum-rate capacity region in terms of the BC]
What other unknown capacity regions can be obtained by duality?
Dirty Paper Coding (Costa’83)
Dirty Paper Coding

[Figure: writing on a clean channel vs. a dirty channel]

Basic premise: if the interference is known at the transmitter, channel capacity is the same as if there were no interference.
Accomplished by cleverly distributing the writing (codewords) and coloring their ink.
The decoder must know how to read these codewords.
Modulo Encoding/Decoding
Received signal Y = X + S, with −1 ≤ X ≤ 1; S is known to the transmitter, not the receiver.
The modulo operation removes the interference effects: set X so that Y mod [−1, 1] equals the desired message (e.g. 0.5); the receiver demodulates modulo [−1, 1].

[Figure: number line …, −7, −5, −3, −1, +1, +3, +5, +7, …: each message in [−1, 1] repeats with period 2, and adding S shifts X to another representative of the same message]
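A toy sketch of this modulo trick, with hypothetical helper names: the transmitter pre-subtracts the known interference S modulo [−1, 1], and the receiver's modulo operation recovers the message exactly in the noise-free case:

```python
# Modulo precoding sketch: X = (message - S) mod [-1, 1), so the channel
# output Y = X + S equals the message up to a multiple of 2, which the
# receiver's modulo removes. Noise-free toy; helper names are illustrative.
def mod_interval(v, half_width=1.0):
    """Fold v into [-half_width, half_width)."""
    span = 2 * half_width
    return (v + half_width) % span - half_width

def transmit(message, interference):
    return mod_interval(message - interference)   # X always lies in [-1, 1)

def receive(y):
    return mod_interval(y)                        # demodulate modulo [-1, 1)

for msg in (-0.75, 0.0, 0.5):
    for S in (-4.2, 0.0, 3.3, 7.0):
        X = transmit(msg, S)
        Y = X + S                                  # channel adds S back
        assert abs(receive(Y) - msg) < 1e-9
print("known interference perfectly removed by the modulo operation")
```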
Broadcast MIMO Channel
y1 = H1 x + n1,  H1: r1 × t1
y2 = H2 x + n2,  H2: r2 × t1
n1 ~ N(0, I_{r1}),  n2 ~ N(0, I_{r2})

t1 TX antennas; r1, r2 RX antennas.

Non-degraded broadcast channel
Perfect CSI at TX and RX
Capacity Results
Non-degraded broadcast channel: receivers are not necessarily “better” or “worse” due to multiple transmit/receive antennas; the capacity region for the general case is unknown.
Pioneering work by Caire/Shamai (Allerton ’00): two TX antennas, two RXs (one antenna each); dirty paper coding/lattice precoding*, which is computationally very complex; a MIMO version of the Sato upper bound.
*Extended by Yu/Cioffi
Dirty-Paper Coding (DPC) for MIMO BC

Coding scheme: choose a codeword for user 1; treat this codeword as known interference for user 2; pick the signal for user 2 using “pre-coding”.

Receiver 2 experiences no interference:
R2 = log det(I + H2 Σ2 H2^T)

The signal for receiver 2 interferes with receiver 1:
R1 = log [ det(I + H1 (Σ1 + Σ2) H1^T) / det(I + H1 Σ2 H1^T) ]

The encoding order can be switched.
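The two DPC rate expressions are easy to evaluate numerically. The following sketch draws arbitrary (hypothetical) channel matrices and PSD input covariances and computes (R1, R2):

```python
# Numeric sketch of the two-user DPC rates: user 2 is pre-coded against
# user 1's codeword, so receiver 2 sees no interference, while user 2's
# signal appears as noise to receiver 1. Matrices below are illustrative.
import numpy as np

def dpc_rates(H1, H2, S1, S2):
    I1 = np.eye(H1.shape[0])
    I2 = np.eye(H2.shape[0])
    R2 = np.log2(np.linalg.det(I2 + H2 @ S2 @ H2.T))
    R1 = np.log2(np.linalg.det(I1 + H1 @ (S1 + S2) @ H1.T)
                 / np.linalg.det(I1 + H1 @ S2 @ H1.T))
    return R1, R2

rng = np.random.default_rng(1)
H1 = rng.standard_normal((2, 2))
H2 = rng.standard_normal((2, 2))
A1 = rng.standard_normal((2, 2)); S1 = A1 @ A1.T    # PSD covariances
A2 = rng.standard_normal((2, 2)); S2 = A2 @ A2.T
R1, R2 = dpc_rates(H1, H2, S1, S2)
print(R1, R2)   # both DPC rates are positive for these full-rank draws
```

(Base-2 logs are used here so the rates come out in bits; the talk's formulas leave the log base unspecified.)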
Dirty Paper Coding in Cellular
Does DPC achieve capacity?
DPC yields an achievable region for the MIMO BC; we call this the dirty-paper region.
Is this region the capacity region?
We use duality, dirty paper coding, and Sato’s upper bound to address this question.
MIMO MAC with sum power
MAC with sum power: the transmitters code independently but share a total power P:

C_MAC^{Sum}(P) = ∪_{0 ≤ P1 ≤ P} C_MAC(P1, P − P1)

Theorem: the dirty-paper BC region equals the dual sum-power MAC region:

C_BC^{DPC}(P) = C_MAC^{Sum}(P)
Transformations: MAC to BC
Show that any rate vector achievable in the sum-power MAC is also achievable with DPC for the BC:
A sum-power MAC strategy for a point (R1,…,RN) has a given input covariance matrix and encoding order.
We find the corresponding PSD covariance matrices and encoding order that achieve (R1,…,RN) with DPC on the BC.
The rank-preserving transform “flips the effective channel” and reverses the encoding order.
Side result: beamforming is optimal for the BC with one RX antenna at each mobile.

DPC BC ⊇ Sum MAC:  C_BC^{DPC}(P) ⊇ C_MAC^{Sum}(P)
Transformations: BC to MAC
Show that any rate vector achievable with DPC on the BC is also achievable in the sum-power MAC: we find a transformation between the optimal DPC strategy and the optimal sum-power MAC strategy; “flip the effective channel” and reverse the encoding order.

DPC BC ⊆ Sum MAC:  C_BC^{DPC}(P) ⊆ C_MAC^{Sum}(P)
Computing the Capacity Region
The DPC region is hard to compute directly (Caire/Shamai ’00), but the MIMO MAC capacity region is “easy” to compute: obtain the DPC region by solving for the sum-power MAC and applying the theorem C_BC^{DPC}(P) = C_MAC^{Sum}(P). Fast iterative algorithms have been developed, greatly simplifying calculation of the DPC region and the associated transmit strategy.
Sato Upper Bound on the BC Capacity Region

Based on receiver cooperation: BC sum-rate capacity ≤ cooperative capacity.

[Figure: x → H1, H2 with noises n1, n2 → outputs y1, y2 combined in a joint receiver]

C_BC^{sum rate}(P, H) ≤ max_{Σx: Tr(Σx) ≤ P} log |I + H Σx H^T|
The Sato Bound for MIMO BC
Introduce noise correlation between the receivers: the BC capacity region is unaffected, since it depends only on the noise marginals.
Tight bound (Caire/Shamai ’00): cooperative capacity with the worst-case noise correlation.
There is an explicit formula for the worst-case noise covariance; by Lagrangian duality, the cooperative BC sum rate equals the sum-rate capacity of the MIMO MAC.

C_BC^{sum rate}(P, H) ≤ inf_{Σz} max_{Σx: Tr(Σx) ≤ P} log |I + Σz^{−1/2} H Σx H^T Σz^{−1/2}|
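The cooperative bound can be checked numerically in its simplest form (uncorrelated noise, Σz = I): the DPC sum rate for any valid covariance pair must stay below the capacity of the point-to-point channel with the stacked receive matrix [H1; H2]. All matrices below are illustrative draws:

```python
# Sketch of the (uncorrelated-noise) Sato-type bound: the sum rate of any
# BC strategy, DPC included, is at most the capacity of the fully
# cooperative point-to-point channel with receive matrix [H1; H2].
import numpy as np

def coop_capacity(H, P):
    # water-fill total power P over the eigenmodes of H^T H
    lam = np.sort(np.linalg.eigvalsh(H.T @ H))[::-1]
    lam = lam[lam > 1e-12]
    for k in range(len(lam), 0, -1):
        mu = (P + np.sum(1 / lam[:k])) / k
        p = mu - 1 / lam[:k]
        if p[-1] >= 0:
            return float(np.sum(np.log2(1 + lam[:k] * p)))
    return 0.0

rng = np.random.default_rng(2)
P = 10.0
H1 = rng.standard_normal((2, 2))
H2 = rng.standard_normal((2, 2))
A1 = rng.standard_normal((2, 2)); S1 = A1 @ A1.T     # arbitrary PSD pair
A2 = rng.standard_normal((2, 2)); S2 = A2 @ A2.T
scale = P / np.trace(S1 + S2)                         # enforce Tr(S1+S2) = P
S1, S2 = scale * S1, scale * S2

# DPC sum rate for this (generally suboptimal) covariance pair
R2 = np.log2(np.linalg.det(np.eye(2) + H2 @ S2 @ H2.T))
R1 = np.log2(np.linalg.det(np.eye(2) + H1 @ (S1 + S2) @ H1.T)
             / np.linalg.det(np.eye(2) + H1 @ S2 @ H1.T))
bound = coop_capacity(np.vstack([H1, H2]), P)
assert R1 + R2 <= bound + 1e-9
print("DPC sum rate <= cooperative upper bound")
```

This only exercises the Σz = I instance of the bound; the infimum over noise covariances is what makes the bound tight.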
Sum-Rate Proof
C_BC^{DPC}(P) ⊆ C_BC(P)   (obvious: DPC is achievable)
C_BC^{DPC}(P) = C_MAC^{Sum}(P)   (duality; compute from the MAC)
C_MAC^{Sum}(P) = C_CoopBC^{sum rate}(P)   (Lagrangian duality)
C_BC(P) ⊆ C_CoopBC(P)   (Sato bound)

Combining these, the DPC sum rate equals the BC sum-rate capacity:
C_BC^{sum rate}(P) = C_BC^{DPC, sum rate}(P)

*Same result by Vishwanath/Tse for 1 RX antenna
MIMO BC Capacity Bounds
[Plot: the dirty-paper achievable region, single-user capacity bounds, and the Sato upper bound, which touches at the BC sum-rate point]
Does the DPC region equal the capacity region?
Full Capacity Region
DPC gives us an achievable region
Sato bound only touches at sum-rate point
We need a tighter bound to prove DPC is optimal
A Tighter Upper Bound
Give the data of one user to the other users: the channel becomes a degraded BC, whose capacity region is known, yielding a tight upper bound on the original channel capacity.
This bound and duality prove that DPC achieves capacity under a Gaussian input restriction; it remains to be shown that Gaussian inputs are optimal.

[Figure: x → H1, H2 with noises n1, n2 → y1, y2; one receiver is also given the other receiver’s output]
Full Capacity Region Proof
C_BC(P) ⊆ C_BC^{DSM}(P)   (tight upper bound: degraded same-marginals BC)
C_BC^{DSM}(P) = C_MAC^{DSM}(P)   (duality)
C_MAC^{DSM}(P) = C_MAC(P)   (worst-case noise diagonalizes)
C_MAC(P) = C_BC^{DPC}(P)   (duality; compute from the MAC)

Final result, for Gaussian inputs:
C_BC^{DPC}(P) = C_BC(P)
Time-varying Channels with Memory

Time-varying channels with finite memory induce infinite memory in the channel output.
Capacity for time-varying infinite-memory channels is only known in terms of a limit:

C = lim_{n→∞} max_{p(X^n)} (1/n) I(X^n; Y^n)

Closed-form capacity solutions are only known in a few cases: the Gilbert/Elliott and finite-state Markov channels.
A New Characterization of Channel Capacity
Capacity using Lyapunov exponents:

C = max_{p(x)} [ λ(X) + λ(Y) − λ(X,Y) ]

where the Lyapunov exponent

λ(X) = lim_{n→∞} (1/n) log || B_{X1} B_{X2} ⋯ B_{Xn} ||

for B_{Xi} a random matrix whose entries depend on the input symbol Xi. Similar definitions hold for λ(Y) and λ(X,Y); the matrices B_{Yi} and B_{XiYi} depend on the input and the channel.
Lyapunov Exponents and Entropy
The Lyapunov exponent equals the entropy rate under certain conditions: entropy as a product of random matrices, a connection between information theory and dynamical systems theory.
We still have a limiting expression for the entropy, and the sample entropy has poor convergence properties:

H(X) = −lim_{n→∞} (1/n) log P(X1,…,Xn)
H(Y) = −lim_{n→∞} (1/n) log P(Y1,…,Yn)
H(X,Y) = −lim_{n→∞} (1/n) log P((X1,Y1),…,(Xn,Yn))
Lyapunov Direction Vector
The vector p_n is the “direction” associated with λ(X); it also defines the conditional channel state probability P(Z_n | X1,…,Xn):

p_n = B_{X1} B_{X2} ⋯ B_{Xn} / || B_{X1} B_{X2} ⋯ B_{Xn} ||

The vector has a number of interesting properties:
It is the standard prediction filter in hidden Markov models.
Under certain conditions we can use its stationary distribution to directly compute λ(X).
Computing Lyapunov Exponents
Define π as the stationary distribution of the “direction vector” p_n.
We prove that these Lyapunov exponents can be computed in closed form as

λ(X) = E_{π,X}[ log || B_X p || ]

This result is a significant advance in the theory of Lyapunov exponent computation.

[Figure: evolution of the direction vectors p_n, p_{n+1}, p_{n+2}]
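The limiting expression for λ(X) can also be estimated directly by multiplying random matrices with periodic renormalization. As a sketch (a brute-force estimate, not the talk's closed-form method), the example below uses randomly scaled rotations, for which the exponent is exactly E[log a]:

```python
# Estimate the top Lyapunov exponent lambda = lim (1/n) log ||B_1 ... B_n||
# by accumulating log-norms and renormalizing to avoid overflow. For the
# check we use scaled 2D rotations: each factor multiplies the norm by
# exactly its gain a, so the exponent is E[log a]. Setup is illustrative.
import numpy as np

def lyapunov_estimate(sample_matrix, n=20000, seed=0):
    rng = np.random.default_rng(seed)
    acc, M = 0.0, np.eye(2)
    for _ in range(n):
        M = sample_matrix(rng) @ M
        norm = np.linalg.norm(M, 2)
        acc += np.log(norm)          # accumulate log-growth
        M /= norm                    # renormalize to keep M bounded
    return acc / n

def scaled_rotation(rng):
    a = rng.choice([0.5, 2.0])       # random gain with E[log a] = 0
    th = rng.uniform(0, 2 * np.pi)
    c, s = np.cos(th), np.sin(th)
    return a * np.array([[c, -s], [s, c]])

est = lyapunov_estimate(scaled_rotation)
print(est)  # sample average converges to E[log a] = 0
```

Because rotations preserve the spectral norm, the estimate here is exactly the sample mean of log a; for general matrices the same code gives the standard Monte Carlo estimate, with the slow convergence the talk attributes to sample-entropy methods.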
Computing Capacity
Closed-form formula for mutual information:

I(X;Y) = λ(X) + λ(Y) − λ(X,Y)

We prove continuity of the Lyapunov exponents with respect to the input distribution and the channel; mutual information can thus be maximized over the channel input distribution to obtain capacity.
Numerical results for time-varying SISO and MIMO channel capacity have been obtained.
We also develop a new CLT and confidence-interval methodology for the sample entropy.
Sensor Networks
Energy is a driving constraint.
Data flows to a centralized location.
Low per-node rates, but up to 100,000 nodes.
Data is highly correlated in time and space.
Nodes can cooperate in transmission and reception.
Energy-Constrained Network Design
Each node can only send a finite number of bits: the transmit energy per bit is minimized by sending each bit over many dimensions (time/bandwidth product), giving delay vs. energy tradeoffs for each bit.
Short-range networks must consider transmit, analog hardware, and processing energy together: sophisticated modulation and coding techniques are not necessarily energy-efficient, and sleep modes save energy but complicate networking.
New network design paradigm: bit allocation must be optimized across all protocols; delay vs. throughput vs. node/network lifetime tradeoffs; optimization of node cooperation (coding, MIMO, etc.).
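The "many dimensions" point is the classic wideband energy tradeoff: at a fixed rate, spreading each bit over more bandwidth lowers the required energy per bit toward the Shannon limit N0·ln 2. A small sketch with normalized (assumed) units:

```python
# Sketch of the energy/dimension tradeoff: for an AWGN link at fixed rate R,
# the required energy per bit Eb = P/R falls as bandwidth W grows, and
# approaches the Shannon limit N0*ln(2) (about -1.59 dB Eb/N0).
import math

N0, R = 1.0, 1.0          # normalized noise density and target rate (illustrative)

def power_needed(W):
    # invert R = W * log2(1 + P / (N0 * W)) for the transmit power P
    return N0 * W * (2 ** (R / W) - 1)

ebs = [power_needed(W) / R for W in (0.5, 1.0, 4.0, 100.0)]
assert all(x > y for x, y in zip(ebs, ebs[1:]))   # monotone decreasing in W
assert abs(ebs[-1] - N0 * math.log(2)) < 0.01     # near the wideband limit
print(ebs)
```

This ignores the circuit and processing energy that, as noted above, dominates at short range and changes the conclusion.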
Results to Date
Modulation optimization: adaptive MQAM vs. MFSK for a given delay and rate, taking into account RF hardware/processing tradeoffs.
MIMO vs. MISO vs. SISO for constrained energy: SISO has the best performance at short distances (<100 m).
Optimal Adaptation with Delay/Energy Constraints
Minimum Energy Routing
Conclusions
Shannon capacity gives fundamental data-rate limits for wireless channels.
Many open capacity problems for time-varying multiuser MIMO channels
Duality and dirty paper coding are powerful tools to solve new capacity problems and simplify computation
Lyapunov exponents are a powerful new tool for solving capacity problems.
Cooperative communications in sensor networks is an interesting new area of research