ARTICLE IN PRESS

Neurocomputing 69 (2005) 1–2
www.elsevier.com/locate/neucom

Guest Editorial

Neural Networks in Signal Processing
This special issue on "Neural Networks for Signal Processing" features a selection of extended and updated versions of papers that were originally presented at the 2003 IEEE International Workshop on Neural Networks for Signal Processing (NNSP03) in Toulouse, France (September 17–19, 2003) (now called the Workshop on Machine Learning for Signal Processing, MLSP). The authors were invited to contribute to this special issue on the basis of the originality, technical quality, and relevance of the papers presented at the workshop. The invited papers have been subjected to the usual rigorous peer review process by anonymous reviewers. The editors of this special issue are convinced that this selection of excellent papers provides the reader with an up-to-date account of what neural networks have to offer for today's challenging signal processing problems.

The papers can be grouped into the following categories: (1) clustering, classification & regression, (2) adaptive filtering, noise estimation & system identification, (3) recursive learning, and (4) blind inverse problem solving.
1. Clustering, classification & regression: In the first paper of this category, Guerrero-Curieses and co-workers focus on multiclass decision problems and introduce a parametric family of loss functions that provides accurate estimates for the posterior class probabilities near the decision regions. Kaski and co-workers introduce a new method for clustering of continuous data with the help of discrete-valued auxiliary data in such a way that the discrete data effectively supervises the clustering. Lazaro and co-workers consider the problem of simultaneously approximating a function and its derivatives in the framework of support vector machines. They show that the problem can be solved by the ε-insensitive loss function and by introducing additional constraints in the approximation of the derivative.
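The ε-insensitive loss used in this support-vector framework is standard in support vector regression: residuals smaller than ε cost nothing, while larger residuals are penalized linearly. A minimal sketch, purely for illustration (the function name and values are ours, not from any of the papers):

```python
def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Standard epsilon-insensitive loss: zero inside the eps-tube
    around the target, growing linearly outside it."""
    r = abs(y_true - y_pred)
    return max(0.0, r - eps)

# Errors within the tube cost nothing; larger errors grow linearly.
print(eps_insensitive_loss(1.0, 1.05))           # 0.0 (inside the tube)
print(eps_insensitive_loss(1.0, 1.30, eps=0.1))  # ~0.2
```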
2. Adaptive filtering, noise estimation & system identification: Yamazaki and Watanabe develop an algebraic geometrical method for analyzing non-regular and non-identifiable models and apply this to HMMs. Ypma and Heskes formulate the problem of inference in nonlinear dynamical systems and propose an algorithm that leads to an unscented Kalman smoother for which the dynamics need not be inverted explicitly. Pelckmans and co-workers develop a method for
0925-2312/$ - see front matter © 2005 Elsevier B.V. All rights reserved.
doi:10.1016/j.neucom.2005.07.001
noise variance estimation, which they apply to model selection and the parameter tuning of Least Squares Support Vector Machines. Finally, Tipping and Lawrence show, for regression and ICA, that a variational approximation scheme enables the inference of the form of the noise distribution in the regression case, and the form of the source distributions in the ICA case.
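As a point of reference for the noise-variance estimation theme in this category, a textbook baseline (not the method of any of the papers above) estimates the noise variance from the residuals of an ordinary least squares fit, correcting for the degrees of freedom spent on the model:

```python
import random

def estimate_noise_variance(xs, ys):
    """Fit y = a*x + b by ordinary least squares and estimate the
    noise variance from the residuals, using n - 2 degrees of freedom."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx                 # slope
    b = my - a * mx               # intercept
    rss = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    return rss / (n - 2)

# Synthetic data: y = 3x + 1 plus Gaussian noise with variance 0.25.
random.seed(0)
xs = [i / 100 for i in range(1000)]
ys = [3.0 * x + 1.0 + random.gauss(0.0, 0.5) for x in xs]
print(estimate_noise_variance(xs, ys))  # close to 0.25 (= 0.5**2)
```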
3. Recursive learning: Asirvadam and co-workers develop an efficient sequential learning algorithm by decomposing the existing recursive learning algorithms on a layer by layer and neuron by neuron basis. Rao and co-workers derive a fast Quasi-Newton type of recursive algorithm based on their Error Whitening Criterion for linear parameter estimation in the presence of additive white noise. They also extend their approach to colored noise. Finally, Maeda and Wakamura introduce a recursive learning scheme for Bidirectional Associative Memories and discuss its implementation in FPGAs.
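To give a flavor of recursive parameter estimation in the spirit of this category, here is a minimal scalar recursive least squares sketch: a generic textbook scheme, not any of the specific algorithms above. Each sample updates the estimate in place, with no batch refit:

```python
import random

def rls_scalar(data, lam=1.0):
    """Recursive least squares for a one-parameter model y = w*x + noise.
    Processes one (x, y) pair at a time; lam is the forgetting factor."""
    w = 0.0   # current parameter estimate
    P = 1e6   # scalar "covariance"; large initial value = uninformative prior
    for x, y in data:
        k = P * x / (lam + x * P * x)  # gain
        w += k * (y - w * x)           # correct estimate with the innovation
        P = (1.0 - k * x) * P / lam    # update covariance
    return w

# Synthetic stream: y = 2.5*x plus small Gaussian noise.
random.seed(1)
data = [(x / 10, 2.5 * (x / 10) + random.gauss(0.0, 0.01))
        for x in range(1, 200)]
print(rls_scalar(data))  # converges near the true parameter 2.5
```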
4. Blind inverse problem solving: Luengo and co-workers present a unified view on three closely related blind inverse problems for sparse signals: blind deconvolution, blind equalization, and blind source separation (BSS), and propose an efficient algorithm.
The editors would like to thank all the authors for their excellent papers, and the anonymous reviewers for their comments and useful suggestions that improved the papers. Special thanks go to Dr. Tom Heskes for inviting us to edit this special issue, and to Vera Kamphuis from the Neurocomputing Editorial Office for her help in putting the issue together.
Marc M. Van Hulle
K.U.Leuven, Belgium
E-mail address: [email protected]

Jan Larsen
Technical University of Denmark, Denmark
E-mail address: [email protected]