DSP-CIS
Chapter-12: Least Squares & RLS Estimation
Marc Moonen, Dept. E.E./ESAT, KU Leuven
[email protected] www.esat.kuleuven.be/scd/
Part-III : Optimal & Adaptive Filters

Chapter-11 : Optimal & Adaptive Filters - Intro
• General Set-Up
• Applications
• Optimal (Wiener) Filters

Chapter-12 : Least Squares & Recursive Least Squares Estimation
• Least Squares Estimation
• Recursive Least Squares (RLS) Estimation
• Square-Root Algorithms

Chapter-13 : Least Mean Squares (LMS) Algorithm

Chapter-14 : Fast Recursive Least Squares Algorithms

Chapter-15 : Kalman Filtering
1. Least Squares (LS) Estimation
Prototype optimal/adaptive filter revisited
[Figure: prototype adaptive filter — the filter input is fed through the filter (adjustable filter parameters); the filter output is subtracted from the desired signal to form the error]
filter structure ? → FIR filters (= pragmatic choice)
cost function ? → quadratic cost function (= pragmatic choice)
FIR filters (=tapped-delay line filter/‘transversal’ filter)
[Figure: FIR tapped-delay line — input samples u[k], u[k-1], u[k-2], u[k-3] are multiplied by the taps w0[k], w1[k], w2[k], w3[k] and summed; the filter output is subtracted from the desired signal to form the error]
y_k = \sum_{l=0}^{N-1} w_l \cdot u_{k-l}

y_k = w^T \cdot u_k

where

w^T = \begin{bmatrix} w_0 & w_1 & w_2 & \cdots & w_{N-1} \end{bmatrix}

u_k^T = \begin{bmatrix} u_k & u_{k-1} & u_{k-2} & \cdots & u_{k-N+1} \end{bmatrix}
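A minimal numerical sketch of this filter equation (not from the slides; the taps and input below are made-up example values):

import numpy as np

N = 4                                  # number of filter taps
w = np.array([0.5, -0.2, 0.1, 0.05])   # example filter coefficients w_0 .. w_{N-1}
u = np.random.randn(100)               # example filter input sequence

# y_k = w^T u_k  with  u_k = [u_k, u_{k-1}, ..., u_{k-N+1}]^T  (zeros before the start)
def fir_output(w, u, k):
    uk = np.array([u[k - l] if k - l >= 0 else 0.0 for l in range(len(w))])
    return w @ uk

y = np.array([fir_output(w, u, k) for k in range(len(u))])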
Quadratic cost function
MMSE : (see Lecture 8)
J_{MSE}(w) = E\{e_k^2\} = E\{|d_k - y_k|^2\} = E\{|d_k - w^T u_k|^2\}
Least-squares (LS) criterion : if statistical info is not available, one may use an alternative 'data-based' criterion...
J_{LS}(w) = \sum_{k=1}^{L} e_k^2 = \sum_{k=1}^{L} |d_k - y_k|^2 = \sum_{k=1}^{L} |d_k - w^T u_k|^2
Interpretation? : see below
filter input sequence : u_1, u_2, u_3, \ldots, u_L
corresponding desired response sequence : d_1, d_2, d_3, \ldots, d_L

\underbrace{\begin{bmatrix} e_1 \\ e_2 \\ \vdots \\ e_L \end{bmatrix}}_{\text{error signal } e}
=
\underbrace{\begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_L \end{bmatrix}}_{d}
-
\underbrace{\begin{bmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_L^T \end{bmatrix}}_{U}
\cdot
\underbrace{\begin{bmatrix} w_0 \\ w_1 \\ \vdots \\ w_{N-1} \end{bmatrix}}_{w}

cost function : J_{LS}(w) = \sum_{k=1}^{L} e_k^2 = \|e\|_2^2 = \|d - Uw\|_2^2

⇒ linear least squares problem : \min_w \|d - Uw\|_2^2
J_{LS}(w) = \sum_{k=1}^{L} e_k^2 = \|e\|_2^2 = e^T e = \|d - Uw\|_2^2

minimum obtained by setting gradient = 0 :

0 = \left[\frac{\partial J_{LS}(w)}{\partial w}\right]_{w=w_{LS}}
  = \left[\frac{\partial}{\partial w}\left(d^T d + w^T U^T U w - 2 w^T U^T d\right)\right]_{w=w_{LS}}
  = \left[2 \underbrace{U^T U}_{X_{uu}} w - 2 \underbrace{U^T d}_{X_{du}}\right]_{w=w_{LS}}

X_{uu} \cdot w_{LS} = X_{du} \;\Rightarrow\; w_{LS} = X_{uu}^{-1} X_{du} \quad \text{('normal equations')}
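As an illustration (not from the slides; example data), the normal equations can be checked numerically. The data matrix U is built row by row from the delay-line vectors u_k, and the result coincides with a generic LS solver:

import numpy as np

rng = np.random.default_rng(0)
N, L = 4, 200
u = rng.standard_normal(L)                      # filter input
w_true = np.array([0.5, -0.2, 0.1, 0.05])       # example 'true' filter
d = np.convolve(u, w_true)[:L] + 0.01 * rng.standard_normal(L)   # desired signal

# build U with rows u_k^T = [u_k, u_{k-1}, ..., u_{k-N+1}] (zeros before the start)
U = np.zeros((L, N))
for k in range(L):
    for l in range(N):
        if k - l >= 0:
            U[k, l] = u[k - l]

Xuu = U.T @ U                                   # X_uu
Xdu = U.T @ d                                   # X_du
w_ls = np.linalg.solve(Xuu, Xdu)                # normal equations

w_lstsq = np.linalg.lstsq(U, d, rcond=None)[0]  # same solution from a standard solver
print(np.allclose(w_ls, w_lstsq))               # True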
Note : correspondences with Wiener filter theory ?
→ estimate \bar{X}_{uu} and \bar{X}_{du} by time-averaging (ergodicity!)

\text{estimate}\{\bar{X}_{uu}\} = \frac{1}{L}\sum_{k=1}^{L} u_k \cdot u_k^T = \frac{1}{L} U^T \cdot U = \frac{1}{L} X_{uu}

\text{estimate}\{\bar{X}_{du}\} = \frac{1}{L}\sum_{k=1}^{L} u_k \cdot d_k = \frac{1}{L} U^T \cdot d = \frac{1}{L} X_{du}

leads to same optimal filter :

\text{estimate}\{w_{WF}\} = (\tfrac{1}{L} X_{uu})^{-1} \cdot (\tfrac{1}{L} X_{du}) = X_{uu}^{-1} \cdot X_{du} = w_{LS}
Note : correspondences with Wiener filter theory ? (continued)
→ Furthermore (for ergodic processes!) :

\bar{X}_{uu} = \lim_{L\to\infty} \frac{1}{L}\sum_{k=1}^{L} u_k \cdot u_k^T = \lim_{L\to\infty} \frac{1}{L} X_{uu}

\bar{X}_{du} = \lim_{L\to\infty} \frac{1}{L}\sum_{k=1}^{L} u_k \cdot d_k = \lim_{L\to\infty} \frac{1}{L} X_{du}

so that

\lim_{L\to\infty} w_{LS} = w_{WF}.
2. Recursive Least Squares (RLS)
For a fixed data segment 1...L, least squares problem is
\min_w \|d - Uw\|_2^2

d = \begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_L \end{bmatrix} \qquad U = \begin{bmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_L^T \end{bmatrix}

w_{LS} = [U^T U]^{-1} \cdot U^T d.
Wanted : recursive/adaptive algorithms for L → L + 1, sliding windows, etc...
• given optimal solution at time L ...
w_{LS}(L) = [X_{uu}(L)]^{-1} X_{du}(L) = [U(L)^T U(L)]^{-1}[U(L)^T d(L)]

d(L) = \begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_L \end{bmatrix} \qquad U(L) = \begin{bmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_L^T \end{bmatrix}
• ...compute optimal solution at time L + 1
w_{LS}(L + 1) = \ldots

d(L + 1) = \begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_{L+1} \end{bmatrix} \qquad U(L + 1) = \begin{bmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_{L+1}^T \end{bmatrix}
Wanted : O(N^2) instead of O(N^3) updating scheme
2.1 Standard RLS
It is observed that X_{uu}(L + 1) = X_{uu}(L) + u_{L+1} u_{L+1}^T

The matrix inversion lemma states that

[X_{uu}(L+1)]^{-1} = [X_{uu}(L)]^{-1} - \frac{[X_{uu}(L)]^{-1} u_{L+1} u_{L+1}^T [X_{uu}(L)]^{-1}}{1 + u_{L+1}^T [X_{uu}(L)]^{-1} u_{L+1}}

Result :

w_{LS}(L + 1) = w_{LS}(L) + \underbrace{[X_{uu}(L + 1)]^{-1} u_{L+1}}_{\text{Kalman gain}} \cdot \underbrace{(d_{L+1} - u_{L+1}^T w_{LS}(L))}_{\text{a priori residual}}

= standard recursive least squares (RLS) algorithm

Remark : O(N^2) operations per time update
Remark : square-root algorithms with better numerical properties, see below
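A sketch of one such time update (illustrative only; the function and variable names are ours). A common initialization, not shown on this slide, is w = 0 and P = (1/δ)·I for a small δ; iterating the update over the data then tracks the LS solution:

import numpy as np

def rls_update(w, P, u_new, d_new):
    # w : current estimate w_LS(L);  P : current [X_uu(L)]^{-1}
    # u_new, d_new : new input vector u_{L+1} and desired sample d_{L+1}
    Pu = P @ u_new
    k = Pu / (1.0 + u_new @ Pu)     # Kalman gain  [X_uu(L+1)]^{-1} u_{L+1}
    e = d_new - u_new @ w           # a priori residual
    w = w + k * e
    P = P - np.outer(k, Pu)         # matrix inversion lemma
    return w, P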
2.2 Sliding Window RLS
Sliding window RLS
J_{LS}(w) = \sum_{k=L-M+1}^{L} e_k^2

M = length of the data window

w_{LS}(L) = \underbrace{[U(L)^T U(L)]^{-1}}_{[X_{uu}(L)]^{-1}} \underbrace{[U(L)^T d(L)]}_{X_{du}(L)}

with

d(L) = \begin{bmatrix} d_{L-M+1} \\ d_{L-M+2} \\ \vdots \\ d_L \end{bmatrix} \qquad U(L) = \begin{bmatrix} u_{L-M+1}^T \\ u_{L-M+2}^T \\ \vdots \\ u_L^T \end{bmatrix}.
It is observed that

X_{uu}(L + 1) = \underbrace{X_{uu}(L) - u_{L-M+1} u_{L-M+1}^T}_{X_{uu}^{L|L+1}} + u_{L+1} u_{L+1}^T
• leads to : updating (cfr supra) + similar downdating
• downdating is not well behaved numerically, hence to be avoided...
• simple alternative : exponential weighting
2.3 Exponentially Weighted RLS
Exponentially weighted RLS
J_{LS}(w) = \sum_{k=1}^{L} \lambda^{2(L-k)} e_k^2

0 < \lambda < 1 is the weighting factor or forget factor

\frac{1}{1-\lambda} is a 'measure of the memory of the algorithm'

w_{LS}(L) = \underbrace{[U(L)^T U(L)]^{-1}}_{[X_{uu}(L)]^{-1}} \underbrace{[U(L)^T d(L)]}_{X_{du}(L)}

with

d(L) = \begin{bmatrix} \lambda^{L-1} d_1 \\ \lambda^{L-2} d_2 \\ \vdots \\ \lambda^0 d_L \end{bmatrix} \qquad U(L) = \begin{bmatrix} \lambda^{L-1} u_1^T \\ \lambda^{L-2} u_2^T \\ \vdots \\ \lambda^0 u_L^T \end{bmatrix}
It is observed that X_{uu}(L + 1) = \lambda^2 X_{uu}(L) + u_{L+1} u_{L+1}^T

hence

[X_{uu}(L+1)]^{-1} = \frac{1}{\lambda^2}[X_{uu}(L)]^{-1} - \frac{\frac{1}{\lambda^2}[X_{uu}(L)]^{-1} u_{L+1} u_{L+1}^T \frac{1}{\lambda^2}[X_{uu}(L)]^{-1}}{1 + \frac{1}{\lambda^2} u_{L+1}^T [X_{uu}(L)]^{-1} u_{L+1}}

w_{LS}(L + 1) = w_{LS}(L) + [X_{uu}(L + 1)]^{-1} u_{L+1} \cdot (d_{L+1} - u_{L+1}^T w_{LS}(L)).
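The same update with exponential weighting, again as an illustrative sketch (names are ours; note that the slides' definition puts λ^2 inside X_uu, so λ^2 appears in the recursion):

import numpy as np

def ew_rls_update(w, P, u_new, d_new, lam=0.99):
    # P = [X_uu(L)]^{-1}, 0 < lam < 1 is the forget factor
    lam2 = lam * lam
    Pu = (P @ u_new) / lam2          # (1/lambda^2) [X_uu(L)]^{-1} u_{L+1}
    k = Pu / (1.0 + u_new @ Pu)      # gain vector [X_uu(L+1)]^{-1} u_{L+1}
    e = d_new - u_new @ w            # a priori residual
    w = w + k * e
    P = P / lam2 - np.outer(k, Pu)   # inversion lemma for lambda^2 X_uu + u u^T
    return w, P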
3. Square-root Algorithms
• Standard RLS exhibits unstable round-off error accumulation, hence is not the algorithm of choice in practice
• Alternative algorithms ('square-root algorithms'), which have been proved to be numerically stable, are based on orthogonal matrix decompositions, namely the QR decomposition (+ QR updating, inverse QR updating, see below)
3.1 QRD-based RLS Algorithms
QR Decomposition for LS estimation
least squares problem
\min_w \|d - Uw\|_2^2

least squares solution

w_{LS} = [U^T U]^{-1} \cdot U^T d.
avoid matrix squaring for numerical stability !
‘square-root algorithms’ based on QR decomposition (QRD) :
\underbrace{U}_{L \times N} = \underbrace{Q}_{L \times N} \cdot \underbrace{R}_{N \times N}

Q^T \cdot Q = I \; (Q \text{ is orthogonal}), \quad R \text{ is upper triangular}
Everything you need to know about QR decomposition
Example :

\underbrace{\begin{bmatrix} 1 & 6 & 10 \\ 2 & 7 & -11 \\ 3 & 8 & 12 \\ 4 & 9 & -13 \end{bmatrix}}_{U} =
\underbrace{\begin{bmatrix} 0.182 & 0.816 & 0.174 \\ 0.365 & 0.408 & -0.619 \\ 0.547 & 0 & 0.716 \\ 0.730 & -0.408 & -0.270 \end{bmatrix}}_{Q} \cdot
\underbrace{\begin{bmatrix} 5.477 & 14.605 & -5.112 \\ 0 & 4.082 & 8.981 \\ 0 & 0 & 20.668 \end{bmatrix}}_{R}
Remark : QRD ↔ Gram-Schmidt orthogonalization
Remark : U^T · U = R^T · R
R is the Cholesky factor or square-root of U^T · U ⇒ 'square-root' algorithms !
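This example is easy to reproduce with an off-the-shelf QR routine (a quick check, not part of the slides; NumPy's sign convention may flip the signs of some columns of Q and rows of R):

import numpy as np

U = np.array([[1., 6., 10.],
              [2., 7., -11.],
              [3., 8., 12.],
              [4., 9., -13.]])

Q, R = np.linalg.qr(U)                 # 'thin' QRD: Q is 4x3, R is 3x3 upper triangular
print(R)                               # matches the R above (up to row signs)
print(np.allclose(Q @ R, U))           # True
print(np.allclose(U.T @ U, R.T @ R))   # True: R is a Cholesky factor of U^T U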
QRD for LS estimation
if

U = Q \cdot R

then

Q^T \cdot \begin{bmatrix} U & d \end{bmatrix} = \begin{bmatrix} R & z \end{bmatrix}

with this

w_{LS} = R^{-1} \cdot z
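A sketch of LS estimation via the QRD on example data (our own variable names), avoiding the explicit formation of U^T U; the triangular system R w = z is solved by back-substitution:

import numpy as np

rng = np.random.default_rng(1)
L, N = 50, 4
U = rng.standard_normal((L, N))
d = rng.standard_normal(L)

Q, R = np.linalg.qr(U)          # U = Q R
z = Q.T @ d                     # Q^T [U d] = [R z]

# back-substitution: solve R w = z for upper triangular R
w = np.zeros(N)
for i in range(N - 1, -1, -1):
    w[i] = (z[i] - R[i, i+1:] @ w[i+1:]) / R[i, i]

print(np.allclose(w, np.linalg.lstsq(U, d, rcond=None)[0]))   # True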
QR-updating for RLS estimation
Assume we have computed the QRD at time k
\tilde{Q}(k)^T \cdot \begin{bmatrix} U(k) & d(k) \end{bmatrix} = \begin{bmatrix} R(k) & z(k) \end{bmatrix}

The corresponding LS solution is w_{LS}(k) = [R(k)]^{-1} \cdot z(k)

Our aim is to update the QRD into

\tilde{Q}(k+1)^T \cdot \begin{bmatrix} U(k+1) & d(k+1) \end{bmatrix} = \begin{bmatrix} R(k+1) & z(k+1) \end{bmatrix}

and then compute w_{LS}(k+1) = [R(k+1)]^{-1} \cdot z(k+1)
It is proved that the relevant QRD-updating problem is
\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & \epsilon \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & z(k) \\ u_{k+1}^T & d_{k+1} \end{bmatrix}
PS: This is based on a QR-factorization as follows:
\begin{bmatrix} R(k) \\ u_{k+1}^T \end{bmatrix}_{(N+1)\times N} = \underline{Q}(k+1)_{(N+1)\times N} \cdot R(k+1)_{N\times N}

\begin{bmatrix} R(k) \\ u_{k+1}^T \end{bmatrix} = \underbrace{\begin{bmatrix} \underline{Q}(k+1)_{(N+1)\times N} & q(k+1)_{(N+1)\times 1} \end{bmatrix}}_{Q(k+1)_{(N+1)\times(N+1)}} \cdot \begin{bmatrix} R(k+1) \\ 0 \end{bmatrix}_{(N+1)\times N}
\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & \epsilon \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & z(k) \\ u_{k+1}^T & d_{k+1} \end{bmatrix}

R(k + 1) \cdot w_{LS}(k + 1) = z(k + 1).
= square-root (information matrix) RLS
Remark : with exponential weighting

\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & \epsilon \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} \lambda R(k) & \lambda z(k) \\ u_{k+1}^T & d_{k+1} \end{bmatrix}
QRD updating

\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & \epsilon \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & z(k) \\ u_{k+1}^T & d_{k+1} \end{bmatrix}
basic tool is Givens rotation
G_{i,j,\theta} \;\stackrel{\text{def}}{=}\;
\begin{bmatrix}
I_{i-1} & 0 & 0 & 0 & 0 \\
0 & \cos\theta & 0 & \sin\theta & 0 \\
0 & 0 & I_{j-i-1} & 0 & 0 \\
0 & -\sin\theta & 0 & \cos\theta & 0 \\
0 & 0 & 0 & 0 & I_{m-j}
\end{bmatrix}
\quad \text{(rows/columns } i \text{ and } j \text{ carry the } \cos\theta, \sin\theta \text{ entries)}
Givens rotation applied to a vector \tilde{x} = G_{i,j,\theta} \cdot x :

\tilde{x}_i = \cos\theta \cdot x_i + \sin\theta \cdot x_j
\tilde{x}_j = -\sin\theta \cdot x_i + \cos\theta \cdot x_j
\tilde{x}_l = x_l \quad \text{for } l \neq i, j

\tilde{x}_j = 0 \;\text{ iff }\; \tan\theta = \frac{x_j}{x_i}
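A small sketch of one such rotation with this convention (illustrative; the helper name and test values are ours):

import numpy as np

def givens_apply(x, i, j):
    # rotate entries i and j of x so that the j-th entry becomes zero
    theta = np.arctan2(x[j], x[i])        # tan(theta) = x_j / x_i
    c, s = np.cos(theta), np.sin(theta)
    x = x.copy()
    x[i], x[j] = c * x[i] + s * x[j], -s * x[i] + c * x[j]
    return x, c, s

x = np.array([3.0, 5.0, 4.0])
x_rot, c, s = givens_apply(x, 0, 2)
print(x_rot)                                       # [5. 5. 0.] : entry j is annihilated
print(np.linalg.norm(x_rot), np.linalg.norm(x))    # norms are equal (orthogonal transform)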
\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & \epsilon \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & z(k) \\ u_{k+1}^T & d_{k+1} \end{bmatrix}
sequence of Givens transformations
\begin{bmatrix} x & x & x & x \\ & x & x & x \\ & & x & x \\ x & x & x & x \end{bmatrix} \rightarrow
\begin{bmatrix} x & x & x & x \\ & x & x & x \\ & & x & x \\ 0 & x & x & x \end{bmatrix} \rightarrow
\begin{bmatrix} x & x & x & x \\ & x & x & x \\ & & x & x \\ 0 & 0 & x & x \end{bmatrix} \rightarrow
\begin{bmatrix} x & x & x & x \\ & x & x & x \\ & & x & x \\ 0 & 0 & 0 & \epsilon \end{bmatrix}
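A sketch of this updating step (helper and variable names are ours; each Givens rotation annihilates one element of the appended row, exactly as in the pattern above). Feeding in the data rows one by one reproduces the batch LS solution:

import numpy as np

def qrd_rls_update(R, z, u_new, d_new):
    # rotate the appended row [u_new^T, d_new] into the triangular factor [R | z]
    N = len(u_new)
    A = np.zeros((N + 1, N + 1))
    A[:N, :N], A[:N, N] = R, z
    A[N, :N], A[N, N] = u_new, d_new
    for i in range(N):                       # zero element i of the bottom row
        theta = np.arctan2(A[N, i], A[i, i])
        c, s = np.cos(theta), np.sin(theta)
        Ai, An = A[i].copy(), A[N].copy()
        A[i], A[N] = c * Ai + s * An, -s * Ai + c * An
    return A[:N, :N], A[:N, N], A[N, N]      # R(k+1), z(k+1), eps

rng = np.random.default_rng(2)
L, N = 40, 3
U, d = rng.standard_normal((L, N)), rng.standard_normal(L)
R, z = np.zeros((N, N)), np.zeros(N)
for k in range(L):
    R, z, eps = qrd_rls_update(R, z, U[k], d[k])
w = np.linalg.solve(R, z)                    # back-substitution (R is upper triangular)
print(np.allclose(w, np.linalg.lstsq(U, d, rcond=None)[0]))   # True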
\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & \epsilon \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & z(k) \\ u_{k+1}^T & d_{k+1} \end{bmatrix}

[Figure: systolic array for QRD updating — a triangular array of memory cells storing R_{11}…R_{44} and z_{11}…z_{41}, connected by rotation cells and delay elements; the inputs u(1), u(2), u(3), u(4) and d are fed in from the top]
Residual extraction
\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & \epsilon \end{bmatrix} = Q(k+1)^T \cdot \begin{bmatrix} R(k) & z(k) \\ u_{k+1}^T & d_{k+1} \end{bmatrix}

From this it is proved that the 'a posteriori residual' is

d_{k+1} - u_{k+1}^T w_{LS}(k+1) = \epsilon \cdot \prod_{i=1}^{N} \cos(\theta_i)

and the 'a priori residual' is

d_{k+1} - u_{k+1}^T w_{LS}(k) = \epsilon \,/ \prod_{i=1}^{N} \cos(\theta_i)
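Both relations can be checked numerically; below is a sketch with our own variable names, where the cos(θ_i) are recorded while the appended row is annihilated:

import numpy as np

rng = np.random.default_rng(3)
N, L = 3, 30
U, d = rng.standard_normal((L, N)), rng.standard_normal(L)

Q, R = np.linalg.qr(U)                         # QRD of the data up to time k
z = Q.T @ d
w_old = np.linalg.solve(R, z)                  # w_LS(k)

u_new, d_new = rng.standard_normal(N), rng.standard_normal()
A = np.block([[R, z[:, None]], [u_new[None, :], np.array([[d_new]])]])
cosines = []
for i in range(N):                             # annihilate the appended row
    theta = np.arctan2(A[N, i], A[i, i])
    c, s = np.cos(theta), np.sin(theta)
    cosines.append(c)
    Ai, An = A[i].copy(), A[N].copy()
    A[i], A[N] = c * Ai + s * An, -s * Ai + c * An
eps = A[N, N]
w_new = np.linalg.solve(A[:N, :N], A[:N, N])   # w_LS(k+1)

gamma = np.prod(cosines)
print(d_new - u_new @ w_new, eps * gamma)      # a posteriori residual = eps * prod(cos)
print(d_new - u_new @ w_old, eps / gamma)      # a priori residual     = eps / prod(cos)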
Residual extraction (v1.0) :
[Figure: systolic array for QRD updating with residual extraction — the rotation cells propagate their cos(θ_i) values alongside the array; inputs u(1)…u(4), d and a constant 1 are applied, and the output is ε·∏cos(θ_i)]

d_{k+1} - u_{k+1}^T w_{LS}(k+1) = \epsilon \cdot \prod_{i=1}^{N} \cos(\theta_i)
Residual extraction (v1.1) :
[Figure: alternative systolic array arrangement for residual extraction (v1.1), producing the a posteriori residual directly at the array output]

d_{k+1} - u_{k+1}^T w_{LS}(k+1) = \epsilon \cdot \prod_{i=1}^{N} \cos(\theta_i)
Example :

[Figure: QRD-based RLS array used for acoustic echo cancellation — the far-end signal provides the filter inputs u[k], u[k-1], u[k-2], u[k-3], the near-end microphone signal is the desired signal d, and the array output is the near-end signal plus residual echo]
3.2 Inverse Updating RLS Algorithms
RLS with inverse updating
bottom line = track inverse of compound triangular matrix
\begin{bmatrix} R(k+1) & z(k+1) \\ 0 & -1 \end{bmatrix}^{-1}
= \begin{bmatrix} [R(k+1)]^{-1} & [R(k+1)]^{-1} \cdot z(k+1) \\ 0 & -1 \end{bmatrix}
= \begin{bmatrix} [R(k+1)]^{-1} & w_{LS}(k+1) \\ 0 & -1 \end{bmatrix}.

= implicit backsubstitution
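The identity behind this 'implicit backsubstitution' is easy to verify numerically (a quick check on example data, not part of the slides):

import numpy as np

rng = np.random.default_rng(4)
N = 4
U, d = rng.standard_normal((20, N)), rng.standard_normal(20)
Q, R = np.linalg.qr(U)
z = Q.T @ d
w_ls = np.linalg.solve(R, z)

# compound triangular matrix [R z; 0 -1] and its inverse
T = np.block([[R, z[:, None]], [np.zeros((1, N)), -np.ones((1, 1))]])
T_inv = np.linalg.inv(T)

print(np.allclose(T_inv[:N, :N], np.linalg.inv(R)))   # top-left block is R^{-1}
print(np.allclose(T_inv[:N, N], w_ls))                # top-right column is w_LS
print(np.isclose(T_inv[N, N], -1.0))                  # bottom-right entry stays -1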
(Skip this slide)
First, it is proved that
\begin{bmatrix} R(k+1) & [R(k+1)]^{-T} \\ 0 & \ast \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & [R(k)]^{-T} \\ u_{k+1}^T & 0 \end{bmatrix}

i.e. the Q(k+1)^T that is used to update the triangular factor R(k) with a new row u_{k+1}^T may also be used to update [R(k)]^{-T}, provided an all-zero row is substituted for the u_{k+1}^T.
(Skip this slide)
\begin{bmatrix} R(k+1) & [R(k+1)]^{-T} \\ 0 & \ast \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & [R(k)]^{-T} \\ u_{k+1}^T & 0 \end{bmatrix}

[Figure: systolic array implementing the combined update — a triangular array of cells R_{11}…R_{44} for R and a second array of cells S_{11}…S_{44} for [R]^{-T}, sharing the same rotation cells; the inputs u(1)…u(4) enter the R part and an all-zero row enters the [R]^{-T} part]
(Skip this slide)
Square-root information/covariance RLS
\begin{bmatrix} R(k+1) & [R(k+1)]^{-T} \\ 0 & \ast \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} R(k) & [R(k)]^{-T} \\ u_{k+1}^T & 0 \end{bmatrix}

Note : numerical stability !
If S(k) is the stored version of [R(k)]^{-T}, and

S(k)^T \cdot R(k) = I + \Delta I,

then the error 'survives' the update, i.e.

S(k+1)^T \cdot R(k+1) = I + \Delta I.

⇒ linear round-off error build-up !!
(Skip this slide)
Square-root covariance RLS
Aim : derive updating scheme that works without R
It is proved that

\begin{bmatrix} 0 & [R(k+1)]^{-T} \\ \ast & \ast \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} -[R(k)]^{-T} \cdot u_{k+1} & [R(k)]^{-T} \\ 1 & 0 \end{bmatrix}

i.e. the Q(k+1)^T that is used to update the triangular factors R(k) and [R(k)]^{-T} may be computed from the matrix-vector product -[R(k)]^{-T} \cdot u_{k+1}.
(Skip this slide)
\begin{bmatrix} 0 & [R(k+1)]^{-T} \\ \ast & \ast \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} -[R(k)]^{-T} \cdot u_{k+1} & [R(k)]^{-T} \\ 1 & 0 \end{bmatrix}

[Figure: systolic array for this update — multiply-add cells form the products -[R(k)]^{-T}·u_{k+1} (v_1…v_4) from the stored cells S_{11}…S_{44} and the inputs u_1…u_4, and rotation cells then update the S cells while producing the (negated) gain entries -k_1…-k_4]
(Skip this slide)
Square-root covariance RLS

\begin{bmatrix} 0 & [R(k+1)]^{-T} \\ \ast & \ast \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} -[R(k)]^{-T} \cdot u_{k+1} & [R(k)]^{-T} \\ 1 & 0 \end{bmatrix}

It is proved that \ast \rightarrow Kalman gain vector

\begin{bmatrix} 0 & [R(k+1)]^{-T} \\ \beta & -\beta \cdot k(k+1)^T \end{bmatrix} \leftarrow Q(k+1)^T \cdot \begin{bmatrix} -[R(k)]^{-T} \cdot u_{k+1} & [R(k)]^{-T} \\ 1 & 0 \end{bmatrix}

recall

w_{LS}(k+1) = w_{LS}(k) + k(k+1) \cdot \underbrace{(d_{k+1} - u_{k+1}^T \cdot w_{LS}(k))}_{e_{k+1}}
(Skip this slide)
[Figure: full systolic array for square-root covariance RLS with weight updating — the S cells store [R(k)]^{-T}, multiply-add and multiply-subtract cells compute the gain entries -k_1…-k_4 from the inputs u_1…u_4 and d, and the filter weights w_1…w_4 are updated accordingly]
(Skip this slide)
Example : acoustic echo cancellation

[Figure: square-root covariance RLS array applied to acoustic echo cancellation — the far-end signal provides the inputs u[k], u[k-1], u[k-2], u[k-3], the adaptive weights w0[k]…w3[k] model the echo path, and the output is the near-end signal plus residual echo]
(Skip this slide)
Example : decision-directed equalization

[Figure: square-root covariance RLS array applied to decision-directed equalization — inputs u_1…u_4 and weights w_1…w_4, with a decision device f[.] and a training sequence providing the desired signal]
(Skip this slide)