low-complexity parameter estimation algorithms using cooperation and sparsity



II International Workshop on Challenges and Trends on Broadband Wireless Mobile Access Networks – Beyond LTE-A


Page 1: Low-complexity parameter estimation algorithms using cooperation and sparsity


Low-Complexity Parameter Estimation Algorithms using Cooperation and Sparsity

Vítor H. Nascimento

Laboratório de Processamento de Sinais

Departamento de Eng. de Sistemas Eletrônicos

Escola Politécnica

Universidade de São Paulo

November 6, 2014

Page 2

Contents

1 Introduction

Parameter estimation

Echo cancellation

Channel estimation

2 Prior information

Sparsity

Low-cost solutions

Other priors

3 Collaborative estimation

Distributed estimation

Low-complexity algorithms

Page 3

Parameter estimation

Putting numbers to mathematical models.

• Adaptive filtering

• Kalman filtering

• Current research goals:

• Reduce complexity — save energy and chip area

• Improve performance — better tracking without knowledge of underlying variation model

• Sparse system identification

• Distributed estimation

Page 4

Example: acoustic echo

[Figure: speech and echo waveforms recorded in a car, amplitude vs. time (s)]



Page 7

Echo model

[Diagram: User B's far-end signal x(n) drives the loudspeaker; the microphone picks up User A's speech s(n), the echo y(n), and noise: d(n) = s(n) + y(n) + noise]

y(n) = a0x(n− τ0) + a1x(n− τ1) + . . .
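This sparse-delay FIR model is easy to simulate directly. A minimal NumPy sketch, with made-up gains aᵢ and delays τᵢ chosen only for illustration (not values from the talk):

```python
import numpy as np

def echo(x, gains, delays):
    """Sparse FIR echo model: y(n) = sum_i a_i * x(n - tau_i)."""
    y = np.zeros(len(x))
    for a, tau in zip(gains, delays):
        # each reflection is a delayed, scaled copy of the far-end signal
        y[tau:] += a * x[:len(x) - tau]
    return y

# Hypothetical two-reflection echo (illustrative values only)
x = np.ones(8)
y = echo(x, gains=[-0.4, 0.2], delays=[3, 6])
```

Only the taps at the reflection delays are nonzero, which is exactly the sparsity exploited later in the talk.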


Page 10

Echo model


y(n) = a0 x(n) + a1 x(n − 1) + a2 x(n − 2) + ⋯ + a999 x(n − 999)

Page 11

Echo model


y(n) = [a0 a1 …] [x(n − τ0) x(n − τ1) …]ᵀ = aᵀxn

Page 12

Echo model


y(n) = [0 0 0 −0.4 0 … 0 0.2 0 …] xn

Page 13

Echo impulse response

[Plot: echo impulse response an vs. n for 0 ≤ n ≤ 180, mostly near zero with a few significant taps between −0.4 and 0.3]

Page 14

Echo impulse response

[Plot: a second echo impulse response an vs. n for 0 ≤ n ≤ 180, with taps between −0.5 and 0.2]


Page 16

Challenges

• Long impulse responses ⇒ high complexity & low tracking speed

• Fast changes in impulse response

• Fast changes in signal and noise power

• No model available for time variation, i.e., for how the ai vary with time


Page 18

Channel estimation

Challenges:

• Quick acquisition (reduce use of pilots)

• Track fast mobile users

• Keep complexity low (especially for massive MIMO)

Page 19

Prior information

• Sparsity of the solution

• Smoothness of solution

• Statistical properties

• Model for time variation (of the ai)

Advantages

• Requires less data

• Better noise rejection

• Faster tracking

• More robust against model uncertainties

Page 20

Least squares

Standard method for parameter estimation (Gauß, 1795):

Minimize the squared error

min_x ‖Ax − b‖₂²

[Figure: least-squares geometry — observation b, projection Ax, and residual Ax − b]
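The least-squares problem above can be sketched numerically with NumPy's solver; the matrix sizes and noise level below are arbitrary illustration values:

```python
import numpy as np

# Synthetic overdetermined system b = A x_true + noise (illustrative sizes)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(20)

# min_x ||Ax - b||_2^2, equivalently the normal equations A'A x = A'b
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With 20 noisy measurements of 3 unknowns, x_hat lands very close to x_true.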

Page 21

Measures of distance — norms

[Figure: vector x in the plane with components a and b]

How far is x from 0?

• Euclidean distance (ℓ2-norm): ‖x‖2 = √(a² + b²)
• Manhattan distance (ℓ1-norm): ‖x‖1 = |a| + |b|
• Maximum (ℓ∞-norm): ‖x‖∞ = max{|a|, |b|}
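The three norms can be checked numerically, e.g. for x = (3, −4):

```python
import numpy as np

x = np.array([3.0, -4.0])        # a = 3, b = -4

l2 = np.linalg.norm(x, 2)        # sqrt(a^2 + b^2)
l1 = np.linalg.norm(x, 1)        # |a| + |b|
linf = np.linalg.norm(x, np.inf) # max(|a|, |b|)
```

This gives ℓ2 = 5, ℓ1 = 7, ℓ∞ = 4 — the same point is "closer" or "farther" from the origin depending on the chosen norm.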



Page 28

Regularization — a method for incorporating prior knowledge

Regularized least squares

min_x { ‖Ax − b‖₂² + f(x) }

f(x) measures how far x is from the kind of solution we expect.

Examples:

• f(x) = ‖x‖₂²: if we know that x is not too far from the origin
• f(x) = ‖x‖₁: if we know that x is sparse
• f(x) = TV(x): if we know that x is smooth
• f(x) = ‖x‖∞: optimize the worst-case scenario
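Of these, the ‖x‖₂² (ridge) regularizer even admits a closed-form solution via the modified normal equations (AᵀA + λI)x = Aᵀb. A minimal sketch, with arbitrary illustrative sizes and λ:

```python
import numpy as np

def ridge(A, b, lam):
    """min_x ||Ax - b||_2^2 + lam*||x||_2^2, via (A'A + lam*I) x = A'b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Illustrative problem: sparse-ish ground truth, small measurement noise
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 5))
b = A @ np.array([2.0, 0.0, 0.0, -1.0, 0.0]) + 0.01 * rng.standard_normal(30)

x_ls = ridge(A, b, 0.0)      # lam = 0: plain least squares
x_rg = ridge(A, b, 10.0)     # lam > 0: estimate shrunk toward the origin
```

Increasing λ pulls the estimate toward the origin, which is exactly the prior this f(x) encodes; the ℓ1 and TV cases have no closed form and need the iterative methods discussed next.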

Page 29

Sparsity

Sparse (or approximately sparse) systems occur frequently:

• Echo impulse responses

• Channels in mobile communications

• Underwater channels

• Directions of interferers in beamforming

• Radar targets

• Image processing and acquisition

Page 30

Sparsity-inducing regularizations

[Plot: true solution vs. the noisy least-squares solution]

Page 31

Sparsity-inducing regularizations

[Plot: true solution vs. the ℓ1-regularized solution]

Page 32

Sparsity-inducing regularizations

Pointy shapes:

[Plot: true solution]

Page 33

Sparsity-inducing regularizations

[Plot: true solution vs. the ℓ2-regularized solution]

Page 34

Sparsity-inducing regularizations

Pointy shapes: ℓ0.5

[Plot: the ℓ0.5 unit ball]


Page 36

Sparsity-inducing regularizations

Pointy shapes: ℓ0?

[Plot: the limiting ℓ0 case]

Page 37

Solution to regularized problems

Difficulties:

• Lack of a closed-form solution
• For ℓ1: non-differentiable cost function
• For ℓp with p < 1:
  • non-convex cost function
  • finding the global minimum: NP-hard
  • finding a local minimum: polynomial time

Solution: use iterative algorithms, starting from good initial conditions.

Example: homotopy algorithms
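One standard iterative algorithm for the non-differentiable ℓ1 case is proximal gradient (ISTA), built on the soft-thresholding operator. The talk does not specify a particular solver, so this is only an illustrative choice:

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """min_x ||Ax - b||_2^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    L = 2 * np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # gradient step on the smooth term, then shrink with the prox
        x = soft(x - 2 * A.T @ (A @ x - b) / L, lam / L)
    return x

# Toy check with A = I, where the minimizer is soft(b, lam/2) in closed form
b = np.array([2.0, 0.1, -1.0, 0.0, 0.05])
x_l1 = ista(np.eye(5), b, lam=1.0)
```

Note how the small entries of b are thresholded exactly to zero — the mechanism by which ℓ1 regularization induces sparsity.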

Page 38

Homotopy

Solve a sequence of problems

min_xk { ‖Axk − b‖₂² + λk‖xk‖p }.

1. Start with a large value of λ0 ⇒ x0 = 0.
2. Reduce λk slowly, using xk−1 as the initial condition for each new optimization.

Problem: must solve a large number of systems of linear equations.

Solution: use DCD.
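The continuation idea can be sketched for p = 1, with a proximal-gradient inner solver standing in for the (unspecified) per-stage optimizer; the geometric λ schedule and stage counts below are illustrative assumptions:

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def homotopy_l1(A, b, lam_final, n_stages=10, inner_iters=50):
    """Continuation for min_x ||Ax - b||_2^2 + lam*||x||_1 (the p = 1 case).

    lam0 is chosen large enough that x0 = 0 is the exact minimizer; lam is
    then shrunk geometrically, warm-starting each stage from the previous
    stage's solution.
    """
    L = 2 * np.linalg.norm(A, 2) ** 2
    lam0 = 2 * np.max(np.abs(A.T @ b))   # for lam >= lam0 the minimizer is 0
    x = np.zeros(A.shape[1])
    for lam in np.geomspace(lam0, lam_final, n_stages):
        for _ in range(inner_iters):     # proximal-gradient inner solver
            x = soft(x - 2 * A.T @ (A @ x - b) / L, lam / L)
    return x

# Same A = I toy problem: the final answer matches soft(b, lam_final/2)
b = np.array([2.0, 0.1, -1.0, 0.0, 0.05])
x_h = homotopy_l1(np.eye(5), b, lam_final=1.0)
```

The warm start is the point of the scheme: each stage begins close to its own minimizer, so very few inner iterations per stage are needed.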

Page 39

Dichotomous coordinate-descent algorithm

Zakharov & Tozer, 2004.

• Iterative method for solving systems of linear equations
• Avoids multiplications and divisions (only shifts and adds)
• Easy to implement in hardware (FPGAs or ASICs)
• Many problems in engineering are solved by sequentially solving systems of linear equations

Why is DCD useful in our case?

• Use the previous estimate xk−1 as a warm start for DCD at iteration k
• Very few DCD iterations are necessary (1 to 8)
• Each DCD iteration is very cheap
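A floating-point sketch of cyclic DCD for Rx = β, following the structure summarized above (the parameters n_bits and h are illustrative). Every step size d is a power of two, so in a fixed-point hardware implementation the updates reduce to shifts and additions:

```python
import numpy as np

def dcd(R, beta, n_bits=15, h=1.0, max_updates=1000):
    """Cyclic dichotomous coordinate descent for R x = beta (R sym. pos. def.).

    A coordinate is stepped by +/- d only when that step is guaranteed to
    reduce the quadratic cost; when no coordinate can improve, d is halved.
    """
    n = len(beta)
    x = np.zeros(n)
    r = beta.copy()               # residual: beta - R x
    d = h / 2.0                   # current power-of-two step size
    updates = 0
    for _ in range(n_bits):
        improved = True
        while improved and updates < max_updates:
            improved = False
            for i in range(n):
                if abs(r[i]) > (d / 2.0) * R[i, i]:   # step reduces the cost
                    s = np.sign(r[i])
                    x[i] += s * d                     # shift-add in hardware
                    r -= s * d * R[:, i]              # cheap residual update
                    updates += 1
                    improved = True
        d /= 2.0                  # no progress at this resolution: halve d
    return x

# Small illustrative system whose solution is exactly representable
R = np.array([[2.0, 0.5], [0.5, 1.0]])
beta = R @ np.array([0.5, -0.25])
x_dcd = dcd(R, beta)
```

In the homotopy loop above, the previous xk−1 would seed x (and r) instead of zero, which is why so few DCD iterations suffice per stage.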

Page 40

Example

[Plots: MSE (dB) and number of operations vs. number of non-zeros, for Hl1-DCD with Nu = 1, 2, 8, 32, compared against MP, YALL1, and the oracle]

Parameters of the scenario: M = 64 measurements, N = 256 unknowns, noise standard deviation σ = 0.01 (variance 10⁻⁴).

Page 41

Other ways of using sparsity

Split the set of coefficients and use several independent estimations.

[Diagram: tapped-delay-line filter with coefficients w1–w4 split into sub-blocks, each with its own error signal; plot: learning curves for NLMS, SSLMS, and the VL algorithm]

• Needs less data to acquire a good estimate.

Page 42

Regularization for smooth solutions

TV regularization (derivative): acoustic images

Page 43

Blind equalization

Prof. Magno Silva, Maria Miranda & João Mendes

Page 44

Collaborative estimation

[Diagram: convex combination of two adaptive filters — w1(n) and w2(n) run in parallel on x(n); their outputs y1(n) and y2(n) are mixed by λ(n) and 1 − λ(n) to form y(n), with component errors e1(n), e2(n) and overall error e(n) against d(n)]

• Exchange of information between different algorithms
• Increased diversity leads to improved performance
• More robust against parameter choices and changes in the environment
• Overall performance the same as or better than that of each component filter
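A sketch of one common collaborative scheme, the convex combination of a fast and a slow LMS filter. The sigmoid parametrization of λ(n) and the step sizes are standard textbook choices, not necessarily those used in the talk:

```python
import numpy as np

def combo_lms(x, d, M=8, mu_fast=0.1, mu_slow=0.01, mu_a=1.0):
    """Convex combination y = lam*y1 + (1-lam)*y2 of two LMS filters.

    lam(n) = sigmoid(a(n)) is adapted by stochastic gradient descent on
    the overall squared error e(n)^2.
    """
    w1, w2, a = np.zeros(M), np.zeros(M), 0.0
    e_hist = []
    for n in range(M, len(x)):
        u = x[n - M:n][::-1]                      # current regressor
        y1, y2 = w1 @ u, w2 @ u
        lam = 1.0 / (1.0 + np.exp(-a))
        e = d[n] - (lam * y1 + (1.0 - lam) * y2)  # overall error
        w1 += mu_fast * (d[n] - y1) * u           # fast component filter
        w2 += mu_slow * (d[n] - y2) * u           # slow component filter
        # mixing-parameter update (clipped to keep lam away from 0 and 1)
        a = np.clip(a + mu_a * e * (y1 - y2) * lam * (1.0 - lam), -4.0, 4.0)
        e_hist.append(e)
    return np.array(e_hist)

# Illustrative identification of a random 8-tap system (made-up scenario)
rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
w_true = rng.standard_normal(8)
d = np.zeros(2000)
for n in range(8, 2000):
    d[n] = w_true @ x[n - 8:n][::-1]
e = combo_lms(x, d)
```

The mixing parameter lets the combination lean on whichever component is doing better at the moment: the fast filter during transients, the slow one in steady state.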

Page 45

Diversity gain

Page 46

Robustness

[Plot: MSD vs. time for the combination and two variable step-size filters (VSS a, VSS b), over scenarios µo = µf, µo = µs, µo > µf, µo < µs, and µs < µo < µf]

Page 47

Advantages

• May approach the performance of model-aided algorithms using model-free methods
• Highly parallelizable
• Very robust to environment conditions
• Redundancy in the filters can be used to limit complexity

Page 48

Distributed estimation / sensor networks

[Diagram: a 10-node sensor network with weighted links]

Collaboration of several nodes

• Information exchange: better performance

• How much information exchange is necessary?

• Parallel estimation algorithms
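One concrete instance of such node collaboration is adapt-then-combine diffusion LMS, sketched below; the 3-node topology and combination weights are made up for illustration:

```python
import numpy as np

def diffusion_lms(U, d, C, mu=0.05):
    """Adapt-then-combine diffusion LMS over a network of N nodes.

    U: (N, T, M) regressors, d: (N, T) measurements,
    C: (N, N) row-stochastic combination matrix (row k holds node k's
    weights over its neighbours' intermediate estimates).
    """
    N, T, M = U.shape
    W = np.zeros((N, M))
    for t in range(T):
        # adapt: each node takes a local LMS step on its own data
        psi = np.array([W[k] + mu * (d[k, t] - U[k, t] @ W[k]) * U[k, t]
                        for k in range(N)])
        # combine: each node mixes its neighbours' intermediate estimates
        W = C @ psi
    return W

# Made-up 3-node network, all nodes observing the same 4-tap vector
rng = np.random.default_rng(3)
w_true = rng.standard_normal(4)
U = rng.standard_normal((3, 1500, 4))
d = np.einsum('ktm,m->kt', U, w_true) + 0.01 * rng.standard_normal((3, 1500))
C = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
W = diffusion_lms(U, d, C)
```

Each node only exchanges its intermediate estimate with its neighbours, yet all nodes converge on the common parameter vector — which is what makes the "how much information exchange is necessary?" question above interesting.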


Page 50

Distributed MIMO

Massive MIMO

Page 51

Distributed MIMO

Device-to-device communications


Page 53

Distribution field estimation

Prof. Cassio Lopes

Page 54

Low-cost estimation algorithms

Better performance:

• Prior information

• Cooperative estimation

• Diversity gain: different algorithms

• Robustness: different settings

Lower-cost algorithms:

• Reduce redundancy → differential cooperation

• Avoid multiplications and divisions → DCD.

Page 55

Thank you.