ESCUELA TÉCNICA SUPERIOR DE
INGENIERÍA Y SISTEMAS DE
TELECOMUNICACIÓN
BACHELOR'S FINAL PROJECT (PROYECTO FIN DE GRADO)
TITLE: BCI based FES system for stroke neurorehabilitation: Comparison of SBCSP and CSSBP algorithms
AUTHOR: Angel Post Vinuesa
DEGREE: Grado en Ingeniería de Sistemas de Telecomunicación
TUTOR: Sadasivan Puthusserypady
UNIVERSITY: Technical University of Denmark (DTU)
CENTRE: Department of Electrical Engineering (BME)
COUNTRY: Denmark
Date of defense: 4 July 2016
Grade:
The Mobility Coordinator,
Resumen
Many people experience muscle weakness or paralysis after a stroke, which can affect
their mobility and balance, usually on one side of the body or in a single arm or leg.
A functional electrical stimulation (FES) system can be used to recover motor skills,
using electrical currents to activate the nerves that innervate the affected limbs.
This FES device is controlled by a brain-computer interface (BCI), which provides a
communication system between the human brain and external devices using the patients'
EEG signals. Different imagined activities, such as limb movement, can be classified
based on the changes produced in the µ and β frequency bands and in their spatial
distributions.
With respect to the topographic patterns of brain rhythm modulations, the Common
Spatial Patterns (CSP) algorithm has proven very useful for extracting subject-specific,
discriminative spatial filters. However, CSP is limited in many situations and is not
optimized for the EEG classification problem. To overcome this limitation, we use an
alternative method based on the Sub-band Common Spatial Pattern (SBCSP) algorithm,
integrating its results by score fusion.
In parallel, we use another method, the Common Spatial-Spectral Boosting Pattern
(CSSBP) algorithm, and compare the results of both methods.
The aim of this project is to design a FES system controlled by a BCI to improve the
motor skills of the fingers of one hand in post-stroke patients, understanding the BCI
systems and the EEG signals used, implementing the SBCSP and CSSBP algorithms for
feature extraction and classification in MATLAB, and carrying out a final evaluation.
Abstract
Many people experience muscle weakness or paralysis after a stroke, which can affect
their mobility and balance, usually on one side of their body or in just one arm or leg.
A functional electrical stimulation (FES) system can be used to regain motor skills, us-
ing electrical currents to activate nerves innervating the affected extremities. This FES
device is controlled by a brain-computer interface (BCI), which provides a communi-
cation system between the human brain and external devices, using the EEG signals of
the patients.
Different imaginary activities, like limb movement, can be classified based on the changes
in µ and β rhythms and their spatial distributions. With respect to the topographic
patterns of brain rhythm modulations, the Common Spatial Patterns (CSP) algorithm
has proven to be very useful to extract subject-specific, discriminative spatial filters.
However, CSP is limited in many situations and it is not optimized for the EEG classi-
fication problem. To overcome this limitation, we will use an alternative method based
on Sub-band CSP (SBCSP) and score fusion.
This method will be compared with the Common Spatial-Spectral Boosting Pattern (CSSBP)
algorithm. The aim of this project is to use a BCI-controlled FES system to improve
the motor skills of the fingers of one hand in post-stroke patients, understanding the
BCI systems and EEG signals used, implementing the SBCSP and CSSBP algorithms
for feature extraction and classification in MATLAB, and performing a final evaluation.
Acknowledgements
I would like to extend my appreciation and gratitude to the people who have helped
and supported me in making this study possible.
Sadasivan Puthusserypady, my supervisor, for his help, patience and guidance during the
study.
Helle K. Iversen, for the opportunity to collaborate with Glostrup Hospital and her guid-
ance during the project.
All the students who have contributed to this project, and all the volunteer subjects.
My family, thank you for the opportunity of being here, encouraging me in all of my
pursuits and inspiring me to follow my dreams.
My friends, thank you for listening, offering me advice, supporting me through this
entire process and also for all the unforgettable moments during the year.
A heartfelt thanks goes out to María for all your love, support, patience and under-
standing despite the distance.
Contents
Resumen
Abstract
Acknowledgements
List of Figures
List of Tables
Abbreviations
1 Introduction
2 Background
2.1 The brain
2.1.1 Mapping the brain
2.1.2 Neurons and the action potential
2.1.3 Brain disorders and stroke
2.2 BCI
2.2.1 BCI definition
2.2.2 Electroencephalography
2.2.3 BCI operation
2.2.4 BCI in stroke
2.2.5 Functional Electrical Stimulation (FES)
3 State of the art
3.1 BCI on stroke patients
3.2 BCI imagery finger movements
3.3 Algorithms based on CSP
4 Signal processing
4.1 Preprocessing
4.1.1 Data loading
4.1.2 Data handling
4.2 Algorithms
4.2.1 CSP
4.2.2 SBCSP [1]
4.2.2.1 Filtering
4.2.2.2 Sub-band CSP
4.2.2.3 Sub-band score (LDA)
4.2.2.4 Sub-band score fusion
4.2.3 CSSBP [2]
4.2.3.1 Objective
4.2.3.2 Spatial channel selection
4.2.3.3 Frequency band selection
4.2.3.4 Combination
4.2.3.5 Training step
4.2.3.6 Greedy optimization step
4.2.3.7 Whole process
4.2.3.8 Parameter estimation
4.3 Conclusion
5 Method
5.1 BCI Competition III: Dataset IVa
5.2 BCI Competition IV: Dataset IIa
5.3 Self-acquired dataset
5.3.1 Materials
5.3.2 Subjects
5.3.3 Visual stimulus
5.3.4 Data acquisition
5.3.4.1 Electrodes
5.3.4.2 Artefacts
5.3.4.3 G.tec system
5.3.4.4 Data storage
5.3.5 Experimental set-up
5.4 Conclusions
6 Results and discussion
6.1 Results
6.1.1 BCI Competition III: Dataset IVa
6.1.2 BCI Competition IV: Dataset IIa
6.1.3 Self-acquired dataset
6.1.3.1 Real movement
6.1.3.2 IM
6.2 Discussion
6.2.1 BCI Competition III: Dataset IVa
6.2.2 BCI Competition IV: Dataset IIa
6.2.3 Self-acquired dataset
6.2.4 Future work
7 Conclusion
A Questionnaire
B MATLAB Code
B.1 Interface
B.2 Signal processing
C Interface
D Interface JS Code
List of Figures
2.1 Brain divided into 3 parts: fore-brain, mid-brain and hind-brain. [3]
2.2 Lobes of the cerebral cortex [4]
2.3 Parts of a neuron: dendrites (receiving), cell body (integrating), axon and synapse (transmitting). [5]
2.4 Neuronal communication. [6]
2.5 Hemorrhagic and ischemic stroke. [7]
2.6 BCI system.
2.7 Delta frequency [8, 9]
2.8 Theta frequency [8, 9]
2.9 Alpha frequency [8, 9]
2.10 Mu frequency [8, 9]
2.11 Beta frequency [8, 9]
2.12 Gamma frequency [8, 9]
2.13 ERD and ERS phenomena [10]
2.14 Electrode placement over the scalp. [11]
2.15 SCP BCI. [10]
2.16 P300 BCI. [10]
2.17 Upper panel: superimposed band power time courses computed for three different frequency bands (10–12 Hz, 14–18 Hz, and 36–40 Hz) from EEG trials recorded from electrode position C3 during right index finger lifting. EEG data are triggered with respect to movement offset (vertical line at t = 0 s). Lower panel: examples of ongoing EEG recorded during right finger movement. Movement onset at t = 0 s. [12]
2.18 Cortical neuronal activity [10]
2.19 Implantable FES system for upper limbs [13]
4.1 Signal processing overview
4.2 Screen capture of frame between 316.1 and 316.5 seconds
4.3 SBCSP: system flowchart
4.4 LDA: maximizing the component axes for class separation [14]
4.5 Model training
4.6 EEG classification
4.7 Slider
5.1 Paradigm of each task [15]
5.2 Left: electrode montage corresponding to the international 10-20 system. Right: electrode montage of the three monopolar EOG channels. [15]
5.3 Self-acquired dataset procedure
5.4 Index finger movement
5.5 G.tec system
5.6 Electrode positioning. Modified from [16]
5.7 Amplifier used in the recording system. [17]
5.8 Paradigm's timing scheme.
6.1 Accuracy in a first approximation of the SBCSP algorithm
6.2 Accuracy in a second approximation of the SBCSP algorithm
6.3 Accuracy in a third approximation of the SBCSP algorithm
6.4 Accuracy of SBCSP for the 4-class dataset
6.5 Accuracy of CSSBP for the 4-class dataset
6.6 Accuracy of SBCSP for the 5-class dataset: actual movement
6.7 Accuracy of CSSBP for the 5-class dataset: actual movement
6.8 Accuracy of SBCSP for the 5-class dataset: IM
6.9 Accuracy of CSSBP for the 5-class dataset: IM
A.1 Questionnaire: before the trial
A.2 Questionnaire: after the trial
C.1 Background of the interface
C.2 Part of the hand (1)
C.3 Part of the hand (2)
C.4 Part of the hand (3)
C.5 Part of the hand (4)
C.6 Part of the hand (5)
C.7 Part of the hand (6)
C.8 Part of the hand (7)
C.9 Part of the hand (8)
C.10 Part of the hand (9)
C.11 Part of the hand (10)
C.12 Part of the hand (11)
C.13 Part of the hand (12)
C.14 Part of the hand (13)
C.15 Part of the hand (14)
List of Tables
2.1 Functions of the lobes of the cerebral cortex
2.2 Effects of stroke. [18]
2.3 Neuroimaging methods. [18]
4.1 Bandpass frequencies SBCSP
5.1 Materials.
5.2 Subjects.
5.3 Satisfaction survey results.
6.1 Bandpass frequencies: first attempt
6.2 Bandpass frequencies: second attempt
6.3 Confusion matrix of BCI Competition IV: dataset IIa
6.4 Confusion matrix: actual movement
6.5 Confusion matrix: IM
Abbreviations
EEG Electroencephalogram
EMG Electromyogram
MEG Magnetoencephalography
ECoG Electrocorticography
fMRI Functional Magnetic Resonance Imaging
NIRS Near-Infrared Spectroscopy
BCI Brain Computer Interface
FES Functional Electrical Stimulation
CSP Common Spatial Pattern
SBCSP Sub-Band Common Spatial Pattern
CSSBP Common Spatial-Spectral Boosting Pattern
CSSD Common Spatial Subspace Decomposition
DSP Discriminative Spatial Pattern
TIA Transient Ischemic Attack
VEP Visual Evoked Potentials
ERD Event Related Desynchronisation
ERS Event Related Synchronisation
MI Motor Imagery
ME Motor Execution
SCI Spinal Cord Injury
SMR Sensorimotor Rhythm
CA Classification Accuracy
MRC Medical Research Council score
SVM Support Vector Machine
RFE Recursive Feature Elimination
tDCS Transcranial Direct Current Stimulation
RCT Randomized Controlled Trial
ICA Independent Component Analysis
BD Bhattacharyya Distance
MD Mahalanobis Distance
ANN Artificial Neural Network
PCA Principal Component Analysis
DCA Detectable Cortical Activity
DCR Direct Cortical Response
CSSP Common Spatio-spectral Pattern
CSSSP Common Sparse Spatio-spectral Pattern
FIR Finite Impulse Response
ISSPL Iterative Spatio-spectral Patterns Learning
FBCSP Filter Bank Common Spatial Pattern
LDA Linear Discriminant Analysis
Chapter 1
Introduction
According to the World Health Organization, 15 million people suffer stroke worldwide
each year. Of these, 5 million die, and another 5 million are permanently disabled.
A stroke occurs when the blood supply to a part of the brain is suddenly cut off.
Consequently, the cells in this area become damaged or die, which can cause severe
symptoms or death.
The effects vary depending on which part of the brain is injured, and how severely. They
can include sudden weakness (including paralysis of one side of the body; loss of
movement in one hand, one leg, or both; weakness and twisting of one side of the face),
problems with balance, or difficulty seeing or speaking.
Stroke is a leading cause of serious long-term disability. The sudden nature of stroke
means that sufferers and their families cannot prepare for the tremendous blow to their
lives and to their quality of life.
Recovery of motor skills in post-stroke patients is crucial for improving daily
living activities, and there are several studies on this issue, most of them about brain-
computer interface (BCI) systems based on electroencephalographic (EEG) signals.
A brain-computer interface (BCI) is a device that responds to neural processes from
the brain to provide communication between the brain and an external device. This
technology has developed enormously in recent years, focusing on improving the lives of
people with neurological disorders, most of them post-stroke patients.
One of the most effective ways to help patients regain motor skills is motor imagery.
During imagination of limb movement, EEG activity is suppressed over the corresponding
region of the motor and somatosensory cortex due to a loss of synchrony in the µ and β
bands, classically defined as 12-16 Hz and 18-24 Hz respectively; this suppression is
termed event-related desynchronization (ERD).
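To illustrate how ERD can be quantified, the sketch below band-pass filters a synthetic single-channel signal and measures the percentage power drop in the imagery window relative to a rest baseline. This is a minimal NumPy/SciPy illustration, not the thesis pipeline (which is implemented in MATLAB); the sampling rate, window lengths and the 8-12 Hz band edges are all assumptions for the toy example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250  # assumed sampling rate (Hz)
t = np.arange(0, 6, 1 / fs)  # 2 s rest baseline followed by 4 s of imagery

# Synthetic single-channel EEG: a 10 Hz rhythm whose amplitude halves
# during the imagery window (t >= 2 s), plus background noise.
amplitude = np.where(t < 2.0, 1.0, 0.5)
rng = np.random.default_rng(0)
eeg = amplitude * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

# Band-pass filter to an assumed 8-12 Hz mu band.
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
mu = filtfilt(b, a, eeg)

# Mean band power in the baseline and imagery windows.
power = mu ** 2
p_rest = power[t < 2.0].mean()
p_task = power[t >= 2.0].mean()

# ERD expressed as a percentage power decrease relative to baseline.
erd = 100 * (p_rest - p_task) / p_rest
print(f"mu-band ERD: {erd:.1f}%")  # roughly 75% for a halved amplitude
```

In a real recording the same ratio would be computed per trial and per electrode, typically over the contralateral motor cortex (e.g. C3 for right-hand movement).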
This brain rhythm can be used as a control signal for assistive devices to help regain
motor skills. One effective system that can be controlled by a BCI is the functional
electrical stimulation (FES) system. FES is a technique that uses electrical currents
to activate nerves innervating extremities affected by paralysis caused by stroke or other
disorders.
In this project we address a solution for these post-stroke paralysis disorders. In particu-
lar, we design a BCI-controlled FES system to improve the motor skills of the fingers of
one hand in post-stroke patients. This system combines FES stimulation and mental
imagery, helping to reconstruct the neural circuit between the paralysed limbs and the
corresponding pathological brain area of the subject.
Within this system, there are many techniques for the extraction and classification of the
EEG signals. For the ERD brain rhythm, one of the main problems in extracting and
analyzing these signals is that the informative frequency bands vary between subjects. The
Common Spatial Patterns (CSP) algorithm has proven very useful for extracting ERD.
However, it can only be applied to the informative frequency band, which is not the same
for all subjects.
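For reference, CSP finds spatial filters that maximize the variance of one class while minimizing that of the other, which reduces to a generalized eigenvalue problem on the two class covariance matrices. Below is a hedged NumPy sketch on toy data (the thesis implementation is in MATLAB; the function and variable names here are illustrative, not the thesis code):

```python
import numpy as np

def csp_filters(trials_a, trials_b):
    """Compute CSP spatial filters for two classes.

    trials_x: array of shape (n_trials, n_channels, n_samples).
    Rows of the returned matrix are spatial filters, sorted so the first
    maximises class-A variance relative to class B and the last does the
    opposite.
    """
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))  # normalise per-trial power
        return np.mean(covs, axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Whiten the composite covariance, then diagonalise class A in the
    # whitened space (equivalent to the generalised eigenvalue problem).
    evals, evecs = np.linalg.eigh(ca + cb)
    whiten = evecs @ np.diag(evals ** -0.5) @ evecs.T
    d, u = np.linalg.eigh(whiten @ ca @ whiten.T)
    order = np.argsort(d)[::-1]      # descending eigenvalues
    return u[:, order].T @ whiten    # rows are spatial filters

# Toy data: 3 channels; class A has high variance on channel 0,
# class B on channel 2.
rng = np.random.default_rng(1)
a = rng.standard_normal((20, 3, 500)) * np.array([3.0, 1.0, 1.0])[:, None]
b = rng.standard_normal((20, 3, 500)) * np.array([1.0, 1.0, 3.0])[:, None]
W = csp_filters(a, b)

# Log-variance of the CSP-filtered trial is the usual feature vector.
feat = np.log(np.var(W @ a[0], axis=1))
```

On this toy data the first filter weights channel 0 most heavily and the last weights channel 2, matching the variance structure of the two classes.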
To address this limitation, the Common Spatio-Spectral Pattern (CSSP) and the Common
Sparse Spatio-Spectral Pattern (CSSSP) algorithms were proposed, in which a spatial and
a spectral filter are jointly optimized. However, the flexibility of the frequency filters is
still limited, and these methods are not very effective when used with people suffering
from neurological diseases.
To overcome this limitation, we use a method based on Sub-band CSP (SBCSP) and score
fusion, decomposing the EEG signals into sub-bands using a filter bank and deriving the
final decision from the fusion of the scores from each sub-band.
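The filter-bank idea can be sketched as follows: each trial is band-pass filtered into several sub-bands, a score is computed per sub-band, and the final decision fuses the per-band scores. In the real SBCSP pipeline each sub-band score comes from CSP features fed to an LDA classifier; the minimal Python sketch below substitutes mean log band power as a stand-in score, and the band edges and fusion weights are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical sub-bands (Hz); the thesis uses its own filter bank.
BANDS = [(4, 8), (8, 12), (12, 16), (16, 20), (20, 24), (24, 28)]
FS = 250  # assumed sampling rate (Hz)

def sub_band_scores(trial, weights):
    """Decompose one EEG trial (n_channels x n_samples) into sub-bands
    and return (per-band scores, fused score).

    Each per-band 'score' here is simply the mean log band power across
    channels, standing in for the LDA score of the real SBCSP pipeline;
    fusion is a weighted sum of the per-band scores.
    """
    scores = []
    for lo, hi in BANDS:
        b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        filtered = filtfilt(b, a, trial, axis=-1)
        scores.append(np.log(np.var(filtered, axis=-1)).mean())
    scores = np.array(scores)
    return scores, float(scores @ weights)

rng = np.random.default_rng(2)
trial = rng.standard_normal((3, 1000))  # toy 3-channel trial
scores, fused = sub_band_scores(trial, weights=np.ones(len(BANDS)) / len(BANDS))
```

The attraction of this structure is that no single "informative band" has to be chosen per subject: every sub-band contributes, and the fusion stage learns how much to trust each one.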
In addition, we compare it with an adaptive boosting algorithm that performs autonomous
selection of key channels and frequency bands.
Finally, we evaluate the system in two different ways:
- On public datasets available in the BNCI database.
- On data recorded from 12 subjects.
The aim is for the system to work properly, obtaining the best possible results, under-
standing all the processes, comparing the results of the two proposed algorithms and
drawing a conclusion.
Chapter 2
Background
2.1 The brain
The brain is the body's control centre, managing almost everything we do. Inside our
heads is an organ weighing about 1.5 kg and consisting of billions of tiny cells; it is the
most complex organ of the body. It controls body activities ranging from heart rate and
sexual function to emotion, learning, and memory. [19, 20]
2.1.1 Mapping the brain
We can divide the brain into three different parts: the fore-brain, the mid-brain and the
hind-brain. [19]
Figure 2.1: Brain divided into 3 parts: Fore-brain, Mid-brain and Hind-brain. [3]
The fore-brain is credited with the highest intellectual functions. There we find the
cerebrum, the largest part of the human brain. Interpreting touch, vision, hearing and
speech, as well as emotions, learning, reasoning and fine control of movement, are the
higher functions of the cerebrum.
The cerebrum is composed of right and left hemispheres connected by the corpus callo-
sum, with each hemisphere controlling the opposite side of the body. The surface of the
cerebrum has a folded appearance called the cortex; referred to as grey matter because of
its grey colour, it contains about 70% of the brain's 100 billion nerve cells.
The cerebral cortex can be divided into different lobes [19].
Figure 2.2: Lobes of the cerebral cortex [4]
Each lobe has different functions, but the lobes do not function in isolation.
Lobe | Functions
Frontal lobe | Initiating and coordinating motor movements; higher cognitive skills; personality, emotions; speech (speaking and writing).
Parietal lobe | Sensory processes; language; interpreting signals from vision, hearing, motor, sensory and memory systems; spatial and visual perception.
Occipital lobe | Processing visual information.
Temporal lobe | Processing auditory information; memory; hearing; sequencing and organization.
Table 2.1: Functions of the lobes of the cerebral cortex
The fore-brain also has other parts: the basal ganglia (cerebral nuclei deep in the cerebral
cortex that coordinate muscle movements), the thalamus (which prioritizes information
and relays it to and from the cerebral cortex) and the hypothalamus (which controls
behaviours and regulates body temperature, blood pressure, emotions and hormones).
The mid-brain handles visual and auditory reflexes and relays this information to the
hypothalamus.
Finally, the hind-brain is composed of the pons, the medulla oblongata, the cerebellum
and the spinal cord. The pons and medulla oblongata control respiration, heart rhythms
and blood glucose; the cerebellum coordinates muscle movements and maintains posture
and balance; and the spinal cord is the pathway that receives sensory information from
all parts of the body. [5, 19, 20]
2.1.2 Neurons and the action potential
Neurons are the basic working units of the brain, transmitting information and cooperating
and competing with each other in regulating the overall state of the nervous system.
All neurons consist of a cell body, dendrites, an axon and synaptic terminals. [20]
Figure 2.3: Parts of a neuron: dendrites (receiving), cell body (integrating), axon and synapse (transmitting). [5]
Chemical signals received at the dendrites from the axons that contact them are trans-
formed into electrical signals, which are integrated with the electrical signals from all the
other synapses to decide whether to fire. Electrical potentials then travel from the den-
drites to the synapses, and the process repeats. [5]
To communicate from one neuron to another, the axons of neurons transmit electrical
pulses called action potentials. These are possible because of the ion channels contained
in the axonal membrane, which can open and close to let through electrically charged
ions. The flow of ions creates an electrical current that produces voltage changes across
the neuron's cell membrane. [5, 19]
Figure 2.4: Neuronal communication.[6]
2.1.3 Brain disorders and stroke
More than 1000 disorders of the brain and nervous system result in more hospitalizations
than other disease group. [19]
When the brain is damaged, the results can be devastating.
We can divide brain disorders into four categories: brain injuries, brain tumours,
neurodegenerative diseases and mental disorders. [21]
Brain injuries are often caused by blunt trauma, damaging brain tissue, neurons and
nerves and affecting the person's main abilities. Examples of brain injuries are
haematomas, contusions, cerebral oedema and strokes. They can be treated with medication,
rehabilitation or brain surgery. [21]
A stroke occurs when blood flow to an area of the brain is cut off and the brain cells,
deprived of the oxygen and glucose needed to survive, die, so some abilities controlled
by these cells are lost. There are two types of stroke: ischemic (clot) and haemorrhagic
(bleed). An ischemic stroke is caused by a blockage cutting off the blood supply to the
brain, while a haemorrhagic stroke occurs when a blood vessel bursts within or on the
surface of the brain. [7, 18, 22, 23]
Figure 2.5: Hemorrhagic and Ischemic Stroke.[7]
Stroke causes a greater range of disabilities than any other condition. According to the
World Health Organization, 15 million people suffer a stroke worldwide each year. Of
these, 5 million die, and another 5 million are permanently disabled. [18, 22]
Difficulty | % of people affected
Upper limb/arm weakness | 77%
Lower limb/leg weakness | 72%
Visual problems | 60%
Facial weakness | 54%
Slurred speech | 50%
Bladder control | 50%
Swallowing | 45%
Aphasia | 33%
Depression | 33%
Bowel control | 33%
Dementia | 30%
Inattention/neglect | 28%
Emotionalism within six months | 20%
Emotionalism after six months | 10%
Table 2.2: Effects of stroke. [18]
A transient ischemic attack (TIA) is like a mini-stroke, in which symptoms resolve within
24 hours.
There are many symptoms of stroke, such as sudden numbness or weakness in some part of
the body, sudden confusion, trouble speaking or understanding, sudden trouble seeing in
one or both eyes, sudden trouble walking, loss of balance, or a sudden severe headache. [18, 23, 24]
Hemiparesis is a very common condition after a stroke, consisting of some degree of
difficulty moving, or weakness on, one side of the body. People with hemiparesis may
have trouble moving their arms and legs, trouble walking, and loss of balance. Right-sided
hemiparesis occurs when the left side of the brain is injured, and it affects language and
speaking. Left-sided hemiparesis, on the other hand, occurs when the right side of the
brain is injured, and it affects memory, attention, non-verbal communication and learning.
Damage to the lower part of the brain affects movement and is called ataxia. [18, 23, 24]
There are two types of hemiparesis: pure motor hemiparesis (face, arm and leg weakness)
and ataxic hemiparesis syndrome (weakness or clumsiness on one side of the body). [24]
Strokes are life-changing events that can affect a person physically and emotionally. The
goal of stroke rehabilitation is to help patients relearn skills that were lost. Rehabilitation
activities that can lead to partial or full recovery include speech therapy, physical therapy,
occupational therapy, joining a support group for mental health, and support from friends
and family.
Furthermore, there are technology-assisted physical activities: functional electrical stim-
ulation, robotic technology, wireless technology, virtual reality and noninvasive brain
stimulation. Finally, there are experimental therapies, such as biological therapies (stem
cells) and alternative medicine (massage, acupuncture, etc.). [22]
Brain tumours can also be very dangerous. They can be malignant or benign, and the
type of treatment depends on many factors; it can involve surgery, chemotherapy and
radiation therapy.
Neurodegenerative diseases cause the brain and nerves to deteriorate over time. Examples
are Alzheimer's disease, Huntington's disease, amyotrophic lateral sclerosis, Parkinson's
disease and dementia. All of them cause permanent damage but can be managed with
treatments such as medication. Finally, mental disorders such as anxiety, depression and
schizophrenia are very common, and they can be treated with medication or therapy. [19, 21]
2.2 BCI
For many years people have dreamed of and speculated about the possibility of communi-
cating with and controlling computers or robots directly with the human brain. Over the
past twenty-five years, this dream has been brought to fruition by several research and
scientific programs, and it is nowadays one of the fastest-growing areas of scientific
research. This technology is called the Brain-Computer Interface (BCI). [10, 25, 26]
2.2.1 BCI definition
A brain-computer interface is a device that responds to neural processes from the brain
to provide a direct communication pathway between the brain and an external device
without the use of the normal neuromuscular pathways. [27] The possibility that a
BCI allows a person to communicate with or control the external world without using
common neuromuscular pathways brings hope to people who suffer from neurological
disorders, such as severe motor disabilities, amyotrophic lateral sclerosis, spinal cord
injury or stroke. [26]
A BCI can thus be defined as a system that measures and analyses brain signals and
converts them in real time into outputs that do not depend on the normal output pathways
of peripheral nerves and muscles. [26]
Figure 2.6: BCI system.
We can distinguish between two types: dependent and independent BCIs. A dependent
BCI does not use the brain's normal output pathways to carry the message, but activity
in these pathways is needed to generate the brain activity that does carry it; an example
is a system that uses visual evoked potentials (VEPs) to detect gaze direction.
In contrast, an independent BCI does not depend in any way on the brain's normal
output pathways: the message is not carried by muscles and nerves, and activity in
these pathways is not needed to generate the brain activity that carries the message; an
example is a system that uses P300 evoked potentials produced by the user's
intent. [10, 26]
A variety of neurophysiological signals reflecting in-vivo brain activity can be recorded
and used to drive a BCI. Two types of brain activity may be monitored: electrophysiological
and hemodynamic. Electrophysiological activity is generated by electrochemical
transmitters exchanging information between neurons, while the hemodynamic
response is a process in which the blood releases glucose to active neurons at a greater
rate than to inactive neurons. [26, 28]
The classification of neuroimaging modalities is summarized in the table below, including
electrophysiological methods such as electroencephalography, electrocorticography,
magnetoencephalography, and electrical signal acquisition in single neurons, and metabolic
methods such as functional magnetic resonance imaging and near-infrared spectroscopy. [28]
Neuroimaging method | Activity measured | Measurement | Temporal resolution | Spatial resolution | Risk | Portability
EEG | Electrical | Direct | 0.05 s | 10 mm | Non-invasive | Portable
MEG | Magnetic | Direct | 0.05 s | 5 mm | Non-invasive | Non-portable
ECoG | Electrical | Direct | 0.003 s | 1 mm | Invasive | Portable
Intracortical neuron recording | Electrical | Direct | 0.003 s | 0.05-0.5 mm | Invasive | Portable
fMRI | Metabolic | Indirect | 1 s | 10 mm | Non-invasive | Non-portable
NIRS | Metabolic | Indirect | 1 s | 10 mm | Non-invasive | Portable
Table 2.3: Neuroimaging methods. [18]
2.2.2 Electroencephalography
Electroencephalography is the neurophysiological measurement of the electrical activity
of the brain using electrodes placed on the scalp. The resulting traces are known as the
electroencephalogram (EEG), and they represent an electrical signal (post-synaptic
potentials) from a large number of neurons. [9]
EEG methods are based on electrophysiological signals and can function in most
environments, requiring relatively simple and inexpensive equipment, which makes them a good
choice for many research projects and BCI systems. EEG is considered a non-invasive method
because its signals are recorded from the scalp. The EEG records electrical activity on
the scalp produced by the firing of millions of similarly oriented (radial) neurons
within the brain; it is the outcome of oscillations of these neuronal assemblies, which
occur at different frequencies. [10, 26, 29]
1. DELTA - 3 Hz and below (deep sleep; pathological when awake). Can be extracted
from the frontal lobe.
Figure 2.7: Delta frequency [8, 9]
2. THETA - 3.5-7.5 Hz (creativity, falling asleep). Can be extracted from the central
and occipital lobes.
Figure 2.8: Theta frequency [8, 9]
3. ALPHA - 8-13 Hz (relaxation, closed eyes). Can be extracted from the occipital
lobe.
Figure 2.9: Alpha frequency [8, 9]
4. MU - 8-13 Hz (immobility). Can be extracted from the frontal lobe.
Figure 2.10: Mu frequency [8, 9]
5. BETA - 14-30 Hz (concentration, logical and analytical thinking, fidgeting).
Can be extracted from the frontal and temporal lobes.
Figure 2.11: Beta frequency [8, 9]
6. GAMMA - greater than 30 Hz (simultaneous processing). Can be extracted from
the frontal lobe.
Figure 2.12: Gamma frequency [8, 9]
It is well known that each frequency band has its own characteristics. For example,
alpha waves increase when a person feels comfortable or closes his or her eyes, and
decrease again when the eyes are opened. Mu waves, in particular, decrease during
imagined movement. Beta waves increase during mental activities, and gamma waves
increase when a person concentrates on something. [8, 30]
Changes in the power spectrum of different frequency bands before, during or after an
event reflect changes in the firing pattern of groups of neurons. A decrease in power
in a frequency band is called event-related desynchronisation (ERD); conversely, an
increase in power in a frequency band is called event-related synchronisation (ERS). [8]
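This ERD/ERS quantification can be sketched numerically: band-pass filter the signal, square it to obtain the instantaneous band power, and compare the mean power of an event interval against a reference (baseline) interval. The function name and the synthetic trial below are our own illustration, not code from the thesis.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def erd_percent(x, fs, band, ref_mask, event_mask):
    """Relative band-power change: negative values indicate ERD (a power
    decrease relative to baseline), positive values indicate ERS."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    power = filtfilt(b, a, x) ** 2            # instantaneous band power
    ref = power[ref_mask].mean()              # reference (baseline) interval
    event = power[event_mask].mean()          # event interval
    return (event - ref) / ref * 100.0

# Toy single-channel trial: the mu-band amplitude drops after t = 2 s,
# mimicking desynchronisation during imagined movement.
np.random.seed(0)
fs = 250
t = np.arange(0, 4, 1 / fs)
amp = np.where(t < 2, 1.0, 0.3)
x = amp * np.sin(2 * np.pi * 10 * t) + 0.05 * np.random.randn(t.size)

erd = erd_percent(x, fs, (8, 13), t < 2, t >= 2)  # strongly negative -> ERD
```

With the amplitude dropping to 30% of baseline, the band power falls to roughly 9%, so the returned value is strongly negative, which is exactly the ERD signature described above.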
Figure 2.13: ERD and ERS phenomena [10]
EEG is recorded by electrodes placed over the scalp, commonly according to the
International 10-20 system, which has been standardized by the American
Electroencephalographic Society. This system uses two reference points on the head to define
the electrode locations: the nasion (at the top of the nose) and the inion (the bony
lump at the base of the skull). The transverse and median planes divide the
skull between these two points, and the electrode locations are determined by marking these
planes at intervals of 10% and 20%. The letter at each location corresponds to a specific
brain region: A represents the ear lobe, C the central region, Pg the
nasopharyngeal area, P the parietal lobe, F the frontal lobe, Fp the frontal polar region, and O
the occipital area. [11]
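This labelling convention can be captured in a small helper (purely illustrative; the function and lookup table are ours). Beyond the region letters listed above, the standard convention is that odd electrode numbers lie over the left hemisphere, even numbers over the right, and a "z" suffix marks the midline.

```python
# Region letters as listed in the text (section 2.2.2).
REGIONS = {
    "Fp": "frontal polar", "Pg": "nasopharyngeal", "A": "ear lobe",
    "C": "central", "P": "parietal", "F": "frontal", "O": "occipital",
}

def decode_electrode(label):
    """Split a 10-20 label such as 'C3' into (brain region, hemisphere)."""
    # Try the two-letter prefixes ('Fp', 'Pg') before the one-letter ones.
    prefix = next(p for p in sorted(REGIONS, key=len, reverse=True)
                  if label.startswith(p))
    suffix = label[len(prefix):]
    if suffix == "z":
        side = "midline"
    else:
        side = "left hemisphere" if int(suffix) % 2 == 1 else "right hemisphere"
    return REGIONS[prefix], side

print(decode_electrode("C3"))    # ('central', 'left hemisphere')
print(decode_electrode("Fp2"))   # ('frontal polar', 'right hemisphere')
```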
Figure 2.14: Electrode placement over scalp.[11]
BCIs fall into five groups based on the electrophysiological signals they use. Four are
recorded from the scalp: visual evoked potentials, slow cortical potentials, P300 evoked
potentials, and µ- or β-rhythms; the fifth, cortical neuronal activity, is recorded by
implanted electrodes. [10, 26]
1. Visual evoked potentials (VEPs):
Since VEP-based communication systems depend on the user's ability to control
gaze direction, they are considered dependent BCI systems. They record the VEP
from the scalp over the visual cortex to determine the direction of the eye gaze,
which can then be used to move a cursor, write a letter, or perform other functions
explored in several studies. [10]
2. Slow cortical potentials (SCPs):
SCPs are among the lowest-frequency features of the scalp-recorded EEG signals.
Negative SCPs are associated with movement and cortical activation, and positive
ones with reduced cortical activation. Users can learn to control these signals, for
example to move a cursor between two targets (top and bottom), or to write a document
by selecting letters through a series of two-choice selections. [10]
Figure 2.15: SCP BCI. [10]
3. P300 evoked potentials:
The P300 wave is a measurable, direct reaction of the brain to a certain sensory,
cognitive or mechanical stimulus, typically evoked in the EEG over the parietal cortex
as a positive peak at about 300 ms after the stimulus. It does not require initial
training of the user, and can be used, for example, to determine a choice by flashing
different rows and columns and counting how many times the P300 appears [10].
Figure 2.16: P300 BCI.[10]
4. Mu/Beta rhythms and other activity from the sensorimotor cortex:
The most prominent electrophysiological features associated with the brain's normal motor
output channels are the µ- and β-rhythms. These rhythms are synchronized when no
sensory inputs or motor outputs are being processed. Movement or movement
preparation results in a desynchronisation of the µ- and β-rhythms, referred to as ERD.
ERS then occurs after the movement, when the rhythms synchronize again. This ERD
and ERS phenomenon in the µ- and β-rhythms occurs during imagined movements as
well, making them suitable for paralysed individuals [31].
Figure 2.17: Upper panel: superimposed band-power time courses computed for three different frequency bands (10–12 Hz, 14–18 Hz, and 36–40 Hz) from EEG trials recorded from electrode position C3 during right index finger lifting. EEG data are triggered with respect to movement offset (vertical line at t = 0 s). Lower panel: examples of ongoing EEG recorded during right finger movement. Movement onset at t = 0 s. [12]
5. Cortical neuronal activity:
Metal microelectrodes can be used to record the action potentials of single neurons in
the cerebral cortex during movements.
Related to this are intracortical BCIs, an invasive method that uses action
potential firing rates or local field potential activity recorded from individual neurons
or small populations of neurons within the brain. Signals recorded within the cortex
may encode more information and might support BCI systems that require less
training. [32]
Figure 2.18: Cortical neuronal activity [10]
2.2.3 BCI operation
Any BCI consists of five essential elements: signal acquisition, feature extraction, feature
translation, device output and the operating protocol. [26]
1. Signal acquisition:
This is the measurement of the neurophysiological state of the brain, for example the
EEG recorded from the scalp or the surface of the brain, or neuronal activity
recorded within the brain. The recording interface tracks neural information
reflecting the person's intent embedded in the ongoing brain activity.
2. Feature extraction:
This is where the signal processing of the BCI starts. In this step, the system
extracts signal features that encode the intent of the user. The features can be in the
time domain, the frequency domain, or both. An algorithm filters the digitized data
and extracts the features that will be used to control the BCI.
3. Feature translation:
In this stage, an algorithm translates the signal features into device commands.
These commands can produce output such as a letter, a muscle movement, or a
cursor movement. The translation algorithm may use linear or non-linear methods.
4. The output device:
The output device can be a computer screen, a Functional Electrical Stimulation
(FES) device, or another assistive device, and the output can be, for example, the
selection of targets or letters, or the movement of a muscle or a robotic arm.
5. The operating protocol:
The protocol defines how the system is turned on and off, the details and
sequence of the steps in the operation of the BCI, and the timing of the operation.
[10, 26]
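The five elements can be strung together in a minimal, purely illustrative loop; all function names, the synthetic signal, and the decision threshold below are our own placeholders, not part of any real BCI implementation.

```python
import numpy as np

def acquire(fs=250, seconds=1):
    """1. Signal acquisition: here, a synthetic one-channel EEG segment."""
    rng = np.random.default_rng(0)
    return rng.standard_normal(fs * seconds), fs

def extract_features(x, fs):
    """2. Feature extraction: log mu-band (8-13 Hz) power via the FFT."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return np.log(spectrum[(freqs >= 8) & (freqs <= 13)].mean())

def translate(feature, threshold=0.0):
    """3. Feature translation: a linear rule mapping the feature to a command."""
    return "move" if feature > threshold else "rest"

def output_device(command):
    """4. Device output: e.g. trigger a FES pulse or move a cursor."""
    return f"device executes: {command}"

# 5. Operating protocol: run one acquisition-to-output cycle.
x, fs = acquire()
command = translate(extract_features(x, fs))
print(output_device(command))
```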
2.2.4 BCI in stroke
The main interest in BCI technology is to help people who suffer from neurological
disorders such as amyotrophic lateral sclerosis, stroke or other traumatic brain disorders.
One approach is to substitute for the lost neuromuscular functions by using stroke
survivors' brain signals to interact with the environment instead of their impaired muscles.
Another, more recent approach is to have the patient attend to a motor task that requires
the activation or deactivation of specific brain signals, in order to restore an impaired
motor function. This second approach uses thoughts of moving the impaired limb instead
of physically moving it; it is called motor imagery (MI), the mental rehearsal of physical
movement tasks, and can be used to exercise an impaired muscle without any physical demands.
Several studies have proven that it is possible to detect this MI, as well as motor execution
(ME), from EEG signals through the ERD and ERS of the µ- and β-rhythms. [27]
Functional electrical stimulation (FES) and motor imagery have been extensively applied
in the rehabilitation training of stroke patients.
Studies examining the use of neuromuscular electrical stimulation by individuals following
stroke report improved force production, selective activation of muscles, passive range
of motion, and reduction of abnormally high muscle tone.
Recent studies have described participants who practiced electrical stimulation-assisted
tasks involving object manipulation, and reported improved selective movement and
function in the arm and hand following stroke. [33, 34]
The next chapter reviews several studies about BCI in stroke patients, from the first
studies of this technology up to the present day, as well as what is expected in the future.
2.2.5 Functional Electrical Stimulation (FES)
Neurons are electrically active cells. [35] In neurons, information is transmitted by
electrical impulses called action potentials (see section 2.1.2). Nerve signals
are frequency modulated, with a frequency typically between 4 and 12 Hz. Electrical
stimulation can artificially substitute for these action potentials by changing the electric
potential across a nerve cell membrane (including the nerve axon), inducing an
electrical charge in the immediate vicinity of the outer membrane of the cell. [36]
Functional electrical stimulation (FES) delivers current pulses to the muscle that activate
nerves and make the muscle move. FES can be used to generate muscle contractions in
paralyzed limbs to restore muscle function. [37]
FES devices take advantage of this property to electrically activate nerve cells, which
may then go on to activate muscles or other nerves.
Figure 2.19: Implantable FES system for upper limbs [13]
Nerves can be stimulated using either surface (transcutaneous) or subcutaneous (percutaneous
or implanted) electrodes. The surface electrodes are placed on the skin surface
above the nerve or muscle that needs to be "activated". They are noninvasive, easy to
apply, and generally inexpensive. [37]
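As a purely illustrative sketch of the kind of stimulation waveform such devices generate, the snippet below builds a charge-balanced biphasic pulse train, a common FES waveform shape; every parameter value (pulse rate, phase width, amplitude) is our own placeholder, not a specification from this thesis.

```python
import numpy as np

def biphasic_train(fs=10_000, rate_hz=30, width_us=300, amp_ma=20, seconds=0.1):
    """Charge-balanced biphasic pulse train: each pulse has a cathodic phase
    immediately followed by an equal and opposite anodic phase, so the net
    injected charge is zero."""
    n = int(fs * seconds)
    width = max(1, int(fs * width_us * 1e-6))   # samples per phase
    period = int(fs / rate_hz)                  # samples between pulse onsets
    train = np.zeros(n)
    for start in range(0, n - 2 * width, period):
        train[start:start + width] = amp_ma               # cathodic phase (mA)
        train[start + width:start + 2 * width] = -amp_ma  # anodic phase (mA)
    return train

train = biphasic_train()
print(abs(train.sum()))   # 0.0 -> the train is charge-balanced
```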
Chapter 3
State of the art
This chapter reviews a number of relevant studies related to BCI: first, work on BCI
for stroke patients; then, work on BCI for imagined finger movements; and finally, a
review of several CSP-based algorithms for feature extraction and translation.
3.1 BCI on stroke patients
The first known attempts to use BCI in motor rehabilitation, although not yet in stroke
patients, were performed in the early 2000s.
In 2003, Pfurtscheller et al. [38] presented the case of a subject who was able to learn,
via MI, how to properly trigger electrical stimulation of hand and arm muscles, resulting
in the performance of basic hand tasks. The subject was not a stroke patient but suffered
from tetraparesis as a result of a spinal cord injury (SCI), and the outcome was achieved
using SMR modulation.
Using neuronal spike activation of cortical cells, Hochberg et al. [39] reported in 2006
the case of a patient moving a cursor on a screen and controlling a robotic limb.
It was also in 2006 that the first study with stroke patients was reported. Mohapp et
al. [40] performed a study of the cortical activity in both hemispheres in a sample of
10 stroke patients using ME and MI. They found that cortical activity was more
pronounced in the contralesional hemisphere than in the affected one, regardless of whether
the hand movement was executed or imagined with the unaffected or affected hand.
In 2008, Bai et al. [41] investigated a β-rhythm-based BCI with visual feedback in
subjects without previous BCI training. They concluded that the sensorimotor
β-rhythm of the EEG associated with human motor behavior could be a useful BCI signal
for clinical applications, and that it would not require exhaustive training of the subjects
or patients to provide reliable results.
It was also in 2008 that the first clinical study of BCI application in stroke patients
was published. Buch et al. [42] reported the first application of a MEG-based BCI
in a sample of eight chronic stroke patients, who used the modulation of
their µ-rhythm ERD to control a screen cursor. Patients also received visual and
kinesthetic feedback upon successful completion of any MI and/or ME task. Although
no significant improvements in motor outcome were achieved, the study
revealed that six out of eight patients significantly improved their performance in terms of
classification accuracy (CA) with this technique.
During 2009 and 2010, Ang et al. [43, 44] carried out several studies of stroke patient
rehabilitation, comparing the results of simple robotic rehabilitation with those of
MI-BCI-driven robotic rehabilitation. After a number of one-hour sessions over
several weeks, no significant differences between the groups were observed; both groups of
hemiparetic stroke patients achieved a significant improvement in rehabilitation.
Regarding the introduction of FES techniques, in 2008 Fei Meng et al. [45] reported the
design of a training platform for chronic stroke patients to train their upper-limb motor
functions. It was based on a BCI-FES combination, in which the electrical stimulation was
driven by the EEG signals resulting from the patients' intention to move the wrist and/or
hand, applying the common spatial pattern (CSP) algorithm. A pilot study with
two chronic stroke patients was conducted to determine both the feasibility of
BCI-FES training for stroke rehabilitation and possible functional improvements after
such training. The pilot study resulted in an error rate of the BCI control of less than
20% after 10 training sessions.
In 2009, Daly et al. [46] also experimented with a FES device, in this case delivering
electrical stimuli to the index finger extensor muscles of a stroke patient, again for motor
learning purposes. The reported outcome was recovery of index finger extension
after 9 sessions, carried out 3 times per week for 3 weeks.
Other studies related to imagined finger movements are reviewed in the next section of
this state of the art.
Also in 2009, Takahashi et al. [47] carried out another study on FES, this time on healthy
subjects, examining how ERD is affected by functional electrical stimulation
(FES) of both feet. They detected a direct relationship between increased FES stimulus
and larger extracted ERD, suggesting that the induced muscular and articular sensations
elicit ERD over the foot motor area (Cz).
In 2010, Tan et al. [48] worked on a study of the ability of stroke patients
to learn to modulate their own sensorimotor rhythms to activate FES of their muscles.
They reported that four out of six patients managed to activate FES of the
wrist muscles in this way. Just as importantly, they did so less than three months after
the lesion, paving the way for early-stage therapies, possibly including a BCI.
The following year, Tam et al. [49] carried out another study to determine the minimal
set of electrodes required by an individual stroke subject to control a FES-based
assistive device via motor imagery. After 20 sessions with five chronic stroke patients, and
reaching accuracies higher than 90%, this study showed that a single training day with 12
electrodes selected using the SVM-RFE method achieved the best balance between the number
of electrodes and the accuracy over the 20-session data. They also concluded that, in
general, 8-36 channels were required to maintain an accuracy higher than 90% across the
20 BCI training sessions.
In 2010, the study by Prasad et al. [50] with five chronic stroke patients confirmed
the usefulness of combining MI-based BCI with physical practice. The study
evaluated not only BCI performance but also other clinical indices such as the patients'
upper-limb movement control, fatigue and mood.
Also in 2010, but again only with healthy subjects, Tavella et al. [51] proposed a
sensorimotor-rhythm-based BCI-FES to manipulate objects and carry out daily living tasks,
obtaining good timing performance and a low error rate.
Returning to studies of combined therapies for motor recovery and brain reorganization,
in 2010 and 2011 Broetz et al. [52] and Caria et al. [53] investigated the effect of
combining BCI training with physical therapy in chronic stroke patients and reported a
significant recovery of hand motor function. The authors highlighted the significant
brain plasticity and recovery effect obtained with the BCI, as shown by functional
neuroimaging.
Overall, BCI benefits are applicable to the majority of stroke patients, as shown by the
large clinical study performed by Ang et al. [54] in 2011 with 54 patients. They
demonstrated that an EEG-based MI BCI could be used by the majority of stroke
patients, and suggested reinforcing and spreading the practice of BCI
screening in any stroke intervention. Other studies analyzing the effect of combining
BCI with transcranial direct current stimulation (tDCS) were carried out in 2012 by Ang
et al. [55] and Kasashima et al. [56]. tDCS resulted in larger ERD, so its use is advisable
for modulating MI in stroke patients; these studies also stressed the importance of
BCI feedback for stroke rehabilitation.
Pointing in the same direction, Mihara et al. [57] showed in 2013 greater motor
improvements in patients using a BCI with visual feedback.
[27, 28]
In conclusion, there have been several studies of BCI in stroke patients, and their
number has grown sharply in recent years. Motor imagery EEG has been widely
used recently because of its discriminative properties and inexpensive acquisition.
ME and MI tasks based on EEG signals have given good results, and the
sensorimotor β-rhythm is a truly useful BCI signal for clinical applications. Furthermore, the
use of a FES system as a post-stroke recovery tool has been demonstrated with good
results in studies such as those of Daly et al. [46] and Tam et al. [49].
Most of the articles remark that feedback is an important element of a BCI system for
stroke rehabilitation.
3.2 BCI and imagined finger movements
We can start the review with a study from 2005 by Liu et al. [58], in which finger
movement was used as the basic and typical task to be identified in BCI experiments.
The concepts of BP and ERD were introduced and discussed in this study. After the two
kinds of phenomena were expounded in detail, the CSSD (common spatial subspace
decomposition) algorithm was used for classifying single-trial EEG during the preparation
of left or right finger movements. Experimental and simulation results reached an
average classification accuracy of up to 75.6%.
Also in 2005, Bai et al. [59] studied ERD and its spatiotemporal patterns preceding
voluntary sequential finger movements. The analyzed movements were performed with
the dominant right hand and the non-dominant left hand. Nine subjects performed
self-paced movements consisting of three key strokes with either hand, randomizing the
laterality and timing of the movements. The electroencephalogram (EEG) was recorded
from 122 channels, and reference-free EEG power measurements in the β-band were
calculated off-line. The results showed that, for right-handers, activation of the left
hemisphere during left-hand movements is greater than that of the right hemisphere
during right-hand movements.
A study by Lehtonen et al. [60] in 2008 tried to determine the BCI control accuracy
achievable by inexperienced subjects. After 20 minutes of training, ten subjects tried to
move a circle from the center to a target location on the left or right side of a computer
screen by moving their left or right index finger. Seven of the ten subjects were able to
control the BCI well, choosing the correct target in 84%-100% of the cases, 3.5-7.7 times
per minute. Their mean single-trial classification rate was 80% and the bit rate 10
bits/min. These results encourage the development of BCIs for paralyzed persons based
on the detection of single-trial movement attempts.
Xian et al. [61] went into depth in 2007 on feature extraction during finger movement
tasks. They introduced discriminative spatial patterns (DSP) for better extraction of
the differences in the amplitudes of MRPs, integrated with CSP to extract the features
from the EEG signals, and they also designed a support vector machine (SVM) based
framework as the classifier for the features. The results showed that the combined
spatial filters achieve better single-trial EEG classification than either DSP or CSP
alone. Based on this outcome, they recommended an EEG-based BCI system with the
two feature sets, one based on CSP (ERD) and the other based on DSP (MRPs),
classified by an SVM.
Other neuroimaging methods have also been studied for finger movements. In 2009,
Kubanek et al. [62] concluded that ECoG is a reliable method for studying the cortical
dynamics associated with motor functions, and that it is very appropriate and accurate
for powerful, clinically practical BCI systems. They obtained highly specific finger
flexion time courses from the ECoG signals.
Mohamed et al. [63] investigated the EEG in order to find ways to discriminate between
wrist and finger movements. They used the Bhattacharyya distance (BD) for
feature reduction, artificial neural networks (ANNs) together with the Mahalanobis distance
(MD) as classifiers, and independent component analysis (ICA) and time-frequency
techniques to extract spectral features based on event-related (de)synchronisation (ERD/ERS).
With this combination of techniques they demonstrated the feasibility of discriminating
between wrist and finger movements with high accuracy.
A further step was taken by Xiao R and Ding L [64, 65] in 2013, when they evaluated
the feasibility of discriminating individual fingers of one hand using noninvasive EEG
instead of the previously used invasive ECoG methods. The study succeeded in
decoding individual fingers and thus cleared a path for developing noninvasive BCI
applications with rich complexity and intuitive, flexible controls. It demonstrated a
much higher accuracy for finger movement detection using features from identified
spatial and spectral patterns than with the classic µ/β rhythms.
Last year, Lee et al. [66] evaluated possible adaptations of fNIRS methods for BCI
systems. Their final conclusion was that fNIRS methods allow finger and thumb
movements to be clearly differentiated, opening the door to their usage within BCI systems.
In conclusion, as Bai et al. [59] showed, for right-handers the activation of the left
hemisphere during left-hand movement is greater than that of the right hemisphere
during right-hand movement. For that reason, we are going to perform the tasks with
the left hand, in order to obtain better results. Spatial filtering algorithms have proven
to be a good solution for finger imagery tasks. It has also been demonstrated, in the
study by Xiao R and Ding L [65], that it is possible to discriminate between wrist and
finger movements. Applying PCA to EEG data in a finger movement task improved
finger movement detection, performing much better than the classic µ/β rhythms, and
it also revealed discriminative information about the movement of different fingers.
3.3 Algorithms based on CSP
Ramoser et al. [67] demonstrated that spatial filters for multichannel EEG can
effectively extract discriminatory information from two populations of single-trial EEG,
recorded during left- and right-hand movement imagery, achieving classification
accuracies of 90.8%, 92.7%, and 99.7%. The spatial filters were estimated from a set of
data by the method of common spatial patterns and reflect the specific activation of
cortical areas. They concluded that the high recognition rates and computational
simplicity make it a promising method for an EEG-based BCI.
Blanchard et al. [68] presented a method that estimates subject-specific spatial
filters which allow a robust extraction of the rhythm modulations. The effectiveness
of the method was proven by achieving the minimum prediction error on data set IIa of
the BCI Competition 2003, which consisted of data from three subjects recorded in ten
sessions.
To address the problem of manually selecting the operational frequency band and
channel group, several approaches have been proposed. Yang et al. [69] proposed a novel
channel selection method that measures the inconsistencies in the outputs of multiple
classifiers, while Chin et al. [70] proposed the DCA and DCR approaches, which
select subject-specific discriminative channels by iteratively adding or removing channels
based on the classification accuracies.
For the optimization of the spectral filter, several novel approaches were developed,
notably the common spatio-spectral pattern (CSSP) [71].
The common sparse spectral spatial pattern (CSSSP) method was proposed later [72]; it
allows the simultaneous optimization of an arbitrary FIR filter within the CSP analysis.
However, due to the inherent nature of the optimization problem, the solution for the
filter coefficients depends greatly on the initial points.
Later, iterative spatio-spectral pattern learning (ISSPL) [73] and the filter bank common
spatial pattern (FBCSP) [74] were proposed.
Novi et al. [1] noted that the change in the rhythmic patterns varies from one subject
to another, which forces a time-consuming fine-tuning process when building a BCI for
each subject. To address this issue, they proposed a new method called Sub-band Common
Spatial Pattern (SBCSP), testing it on a standard database from BCI Competition III
and comparing it with previously used methods. The results showed that it achieves
results similar to the best one in the literature, which had been obtained through a
time-consuming fine-tuning process.
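Structurally, the SBCSP front end is a bank of band-pass filters followed by one score per sub-band and a final fusion step. For brevity, the per-band score in the sketch below is just the log band variance; in SBCSP proper each band would get its own CSP projection and classifier score before fusion. The band edges and function names are our own choices for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def sub_band_scores(trial, fs, bands):
    """Filter the trial into sub-bands and return one score per band.
    Simplified stand-in: log band variance instead of CSP + classifier."""
    scores = []
    for lo, hi in bands:
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        scores.append(np.log(np.var(filtfilt(b, a, trial))))
    return np.array(scores)

def score_fusion(scores, weights=None):
    """Fuse the per-band scores into one decision value (weighted sum)."""
    weights = np.ones_like(scores) if weights is None else weights
    return float(weights @ scores)

# Toy single-channel trial split into six 4 Hz-wide sub-bands.
fs = 250
bands = [(4, 8), (8, 12), (12, 16), (16, 20), (20, 24), (24, 28)]
rng = np.random.default_rng(1)
trial = rng.standard_normal(4 * fs)
fused = score_fusion(sub_band_scores(trial, fs, bands))
```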
To use the CSP algorithm, one needs either to set a relatively broad frequency range
and channel set, or to try to find subject-specific frequency bands and channels. To
solve this problem, Liu et al. [2] proposed an adaptive boosting algorithm that performs
an autonomous selection of the key channels and frequency bands. Several comparisons
were performed on three datasets, and the results showed that the algorithm yields
relatively higher classification accuracies than seven state-of-the-art approaches.
Finally, the resulting spatial patterns (spatial weights) and spectral patterns (bandpass
filters) can also be used for further analysis of the data.
In conclusion, among the various approaches developed for EEG signals, common spatial
patterns (CSP) has proven to be one of the most effective algorithms.
Both algorithms that will be used in this project address issues found in other CSP
algorithms: the SBCSP method solves the problem of the time-consuming fine-tuning
process, and the CSSBP method solves the problem of having to set a relatively broad
frequency range and channel set, or having to find subject-specific frequency bands and
channels.
Chapter 4
Signal processing
This chapter describes the different methods and processes involved in the signal
processing.
As explained in the background (see section 2.2.3), a BCI consists of five essential
elements: signal acquisition, feature extraction, feature translation, device output and
the operating protocol. The signal processing part is the one divided into:
preprocessing, feature extraction, feature selection and classification.
The processing of the EEG data is the central and most important part of every BCI
system.
The preprocessing is divided into data loading and data handling.
For the feature extraction, selection and classification, two algorithms will be
implemented to process the data:
1. SBCSP (Sub-band Common Spatial Pattern)
2. CSSBP (Common Spatial-Spectral Boosting Pattern)
Both algorithms are based on CSP (Common Spatial Pattern). The next sections explain
the preprocessing used, give an overview of the CSP algorithm, and describe the SBCSP
and CSSBP methods that will be used.
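Since both methods build on CSP, a minimal sketch of the core CSP computation may be useful: average the normalized class covariances, whiten their sum, and take the eigenvectors that make one class's variance extreme. The toy two-channel data and the function name below are our own illustration, not code from this project.

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_pairs=1):
    """Common Spatial Patterns: spatial filters that maximise the variance of
    one class while minimising that of the other. Each trial is a
    (channels x samples) array."""
    def mean_cov(trials):
        return sum(x @ x.T / np.trace(x @ x.T) for x in trials) / len(trials)
    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Whiten the composite covariance, then diagonalise class a in that space.
    evals, evecs = np.linalg.eigh(ca + cb)
    whiten = evecs @ np.diag(evals ** -0.5) @ evecs.T
    s, u = np.linalg.eigh(whiten @ ca @ whiten.T)
    w = u.T @ whiten                # rows are spatial filters
    order = np.argsort(s)          # extreme eigenvalues discriminate best
    pick = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return w[pick]

# Toy 2-channel data: class a is strong on channel 0, class b on channel 1.
rng = np.random.default_rng(0)
a = [np.diag([3.0, 0.5]) @ rng.standard_normal((2, 500)) for _ in range(20)]
b = [np.diag([0.5, 3.0]) @ rng.standard_normal((2, 500)) for _ in range(20)]
W = csp_filters(a, b)  # first row minimises class-a variance, last maximises it
```

The variance of each trial projected through these filters is what later feeds the classifier: it is small for one class and large for the other along each filter.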
A brief overview of the signal processing is shown in the next figure:
Figure 4.1: Signal processing overview
4.1 Preprocessing
EEG signals can be contaminated by artifacts and noise, which need to be dealt with
using different techniques.
4.1.1 Data loading
The EEG data is read and loaded for processing and handling so that it can be used by
the CSSBP and SBCSP algorithms.
The EEG data has first been digitized with an antialiasing filter, band-pass pre-filtered
between 0.5 and 100 Hz, and filtered with a 50 Hz notch filter to suppress power-line
interference.
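This filtering chain can be reproduced, for example, with SciPy; only the cut-off frequencies come from the text, while the filter orders and the notch quality factor are our own choices for the sketch.

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

def preprocess(x, fs=250):
    """Band-pass 0.5-100 Hz plus a 50 Hz power-line notch, as described in
    the text (filter orders and the Q factor are our own choices)."""
    b, a = butter(4, [0.5 / (fs / 2), 100 / (fs / 2)], btype="band")
    x = filtfilt(b, a, x)                 # zero-phase band-pass
    bn, an = iirnotch(50, Q=30, fs=fs)    # narrow notch at 50 Hz
    return filtfilt(bn, an, x)

# A 10 Hz "EEG" component plus 50 Hz mains hum.
fs = 250
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 50 * t)
y = preprocess(x, fs)

spec = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(y.size, 1 / fs)
# The 50 Hz component is strongly attenuated while the 10 Hz one survives.
print(spec[freqs == 50][0] < 0.1 * spec[freqs == 10][0])
```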
Depending on the data structure and metadata being loaded, some parameters have to
be changed and the loading script adapted to the dataset used.
The next chapter (Method) describes all the datasets used in this project.
4.1.2 Data handling
At first, data handling is performed to ensure that the research data is conveniently stored and available for the experiment. After exploring the research data with the SBCSP-based program, the poor accuracy results suggested that the stored data from the Swartz Center for Computational Neuroscience (BCI Competition IV: dataset IIa) might be incomplete.
At this step, the data is analyzed with specific EEG viewers such as SigViewer and EEGLAB. The inspection confirms the presence of incomplete data points, marked by the event "start of a new segment (after a break)", which in some cases affect all channels:
Figure 4.2: Screen capture of frame between 316.1 and 316.5 seconds
Most classification algorithms assign class labels through linear combinations and therefore depend on feature functions:
f(x) = (f1(x), ..., fm(x)) (4.1)
and feature weights:
w = (w1, ..., wm) (4.2)
Added to the artifacts that may exist in the signals, this means that incomplete data seriously degrades the accuracy results, because missing points turn into NaN weights.
The existence of incomplete data forces corrections to the algorithm. Fortunately, there are some MATLAB-based BCI packages such as BioSig, which include basic functions to correct these issues. An effective solution to deal with EOG issues includes:
1. Removal of artifact-laden channels.
2. A data sequence of intentional artifacts, used to calculate regression parameters in the time domain, in order to overwrite lost data points from a subject or from the whole dataset.
3. This task is performed per epoch, i.e. per time interval extracted from the trial records.
These functions are implemented based on the BioSig regressor packages EOGi, EOGii and EOGiii (see Appendix B).
In the case of linear regression techniques, a proportion of the recorded EOG activity is subtracted from the affected channel; this proportion is a linear combination of the EOG channels. Equation (4.3) is solved by estimating b, the regression coefficient that minimizes the noise in the corrected signal:
eeg_{ch,cl}(t) = eeg_{ch,c}(t) − eog(t) · b (4.3)
where ch,cl denotes the clean channel and ch,c the contaminated channel.
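The regression of equation (4.3) can be sketched outside MATLAB as well; below is a minimal NumPy version (the function name and array shapes are illustrative, not the BioSig API):

```python
import numpy as np

def remove_eog(eeg, eog):
    """Regression-based EOG correction (illustrative sketch of Eq. 4.3).

    eeg : (T,) contaminated EEG channel.
    eog : (T, E) EOG reference channels.
    Estimates b by least squares (minimizing the residual power)
    and returns the cleaned channel eeg - eog @ b.
    """
    b, *_ = np.linalg.lstsq(eog, eeg, rcond=None)
    return eeg - eog @ b
```

With a channel driven purely by ocular activity, the estimated coefficient removes it entirely; with real data, only the EOG-correlated part is subtracted.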
4.2 Algorithms
After this extensive preprocessing, the data is ready to be processed by the implemented algorithms.
4.2.1 CSP
As described before, both algorithms used in this project are based on the CSP method. This method has been used in several BCI systems (see section 3.3) to extract topographic patterns of brain rhythm modulations, and is one of the most effective algorithms for this purpose.
The goal of this method is to design spatial filters that produce new time series whose variances are optimal for discriminating between two classes of EEG.
For a single trial, let E be an N × T matrix representing an N-channel spatio-temporal EEG signal, where T is the number of samples per channel. The normalized covariance matrix is obtained as:
C = E E^T / trace(E E^T) (4.4)
Let C1 and C2 be the covariance matrices of the two classes, computed by averaging over the trials of each class.
The composite covariance matrix and its eigenvalue decomposition, with Fc the matrix of normalized eigenvectors and ψ the corresponding diagonal matrix of eigenvalues, are:
Cc = C1 + C2 = Fc ψ Fc^T (4.5)
The whitening transformation, which equalizes the variances in the space spanned by the eigenvectors in Fc, is:
P = ψ^{−1/2} Fc^T (4.6)
The simultaneous diagonalization of the whitened covariance matrices:
C̄1 = P C1 P^T (4.7)
C̄2 = P C2 P^T (4.8)
is the basis of the extraction of the CSP. The orthogonal matrix U and the diagonal matrix λ are then calculated:
C̄1 = U λ U^T (4.9)
C̄2 = U (I − λ) U^T (4.10)
in order to maximize the differentiation between the two groups of data. Finally, the CSP projection matrix is:
Wcsp = U^T P (4.11)
[1]
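Equations (4.4)-(4.11) can be condensed into a short sketch. The project itself is implemented in MATLAB; this NumPy version is only illustrative:

```python
import numpy as np

def csp(trials1, trials2):
    """CSP projection matrix W (sketch of Eqs. 4.4-4.11).

    trials1, trials2 : lists of (N, T) single-trial EEG arrays,
    one list per class. Returns W whose rows are spatial filters.
    """
    def avg_cov(trials):
        # Normalized covariance of each trial (Eq. 4.4), averaged
        covs = [E @ E.T / np.trace(E @ E.T) for E in trials]
        return np.mean(covs, axis=0)

    C1, C2 = avg_cov(trials1), avg_cov(trials2)
    # Eigendecomposition of the composite covariance (Eq. 4.5)
    vals, Fc = np.linalg.eigh(C1 + C2)
    # Whitening transform (Eq. 4.6)
    P = np.diag(vals ** -0.5) @ Fc.T
    # Diagonalize the whitened class-1 covariance (Eqs. 4.7-4.9)
    lam, U = np.linalg.eigh(P @ C1 @ P.T)
    # Projection matrix (Eq. 4.11)
    return U.T @ P
```

By construction, W diagonalizes both class covariances, and W Cc W^T is the identity, which is a quick way to verify an implementation.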
4.2.2 SBCSP [1]
CSP should only be applied to the subject-specific informative frequency bands µ and β; a poor selection of these bands therefore ends in poor recognition accuracy. Instead of performing an exhaustive search for each subject, several extensions of CSP were implemented, but their solutions for the filter coefficients depend on the initial points, which is not an appropriate approach. To overcome this limitation, a method based on Sub-band CSP (SBCSP) was proposed in [1].
The EEG signals are decomposed into sub-bands with a filter bank. CSP is then applied to each sub-band, producing a sub-band score. The final decision is made by fusing the scores from all the sub-bands.
The system flowchart is shown here:
Figure 4.3: SBCSP: System flowchart
4.2.2.1 Filtering
First, the EEG data is decomposed into sub-bands using a Butterworth filter bank in order to extract the SBCSP features. The lower and upper filter cut-offs are set according to the distribution of the brain-wave frequencies:
Band    Decomposed signal    Frequency range (Hz)
Gamma   D1                   30-100
Beta    D2                   14-30
Alpha   D3                   8-14
Theta   D4                   3.5-8
Delta   A4                   0-3.5
Table 4.1: Bandpass frequencies SBCSP
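As a sketch, the decomposition of Table 4.1 can be realized with a Butterworth filter bank (SciPy here, although the project uses MATLAB). The sampling rate, the filter order, and the 0.5 Hz lower edge of the delta band are assumptions, since a 0 Hz edge is not valid for a band-pass design:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Band edges following Table 4.1 (delta starts at 0.5 Hz here,
# because a zero-frequency edge is not valid for a band-pass filter).
BANDS = {"delta": (0.5, 3.5), "theta": (3.5, 8.0), "alpha": (8.0, 14.0),
         "beta": (14.0, 30.0), "gamma": (30.0, 100.0)}

def filter_bank(eeg, fs=250.0, order=4):
    """Decompose EEG (last axis = time) into the sub-bands of Table 4.1."""
    nyq = fs / 2.0
    out = {}
    for name, (lo, hi) in BANDS.items():
        # Second-order sections are numerically safer than (b, a)
        # coefficients for narrow low-frequency bands.
        sos = butter(order, [lo / nyq, hi / nyq], btype="band", output="sos")
        out[name] = sosfiltfilt(sos, eeg, axis=-1)  # zero-phase filtering
    return out
```

A 10 Hz tone, for example, should survive almost unchanged in the alpha output and be strongly attenuated in the beta output.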
4.2.2.2 Sub-band CSP
As explained before, CSP is a very useful method to extract topographic patterns of brain rhythms. In this case, the pattern extraction is performed on each of the sub-bands, so the transformed signal at the k-th sub-band is:
Z^(k) = W^(k)_csp E^(k) (4.12)
The rows of Z^(k) that maximize the difference in variance between the two classes correspond to the largest and smallest eigenvalues of the simultaneous diagonalization, and the SBCSP features are defined as:
f^(k)_p = log( var(Z^(k)_p) / Σ_{p'=1}^{2r} var(Z^(k)_{p'}) ) (4.13)
4.2.2.3 Sub-band Score (LDA)
Linear Discriminant Analysis (LDA) is used as a dimensionality-reduction technique for pattern classification: it projects the EEG features onto a lower-dimensional space with good class separability, which prevents overfitting. LDA computes the directions, i.e. the axes, that maximize the separation between the classes:
Figure 4.4: LDA. Maximizing the component axes for class-separation [14]
LDA finds the projection matrix Wlda that guarantees this maximal separation by maximizing the ratio of the between-class scatter SB to the within-class scatter SW. The cost function G^(k) at the k-th sub-band is:
G^(k) = ( W^(k)T_lda S^(k)_B W^(k)_lda ) / ( W^(k)T_lda S^(k)_W W^(k)_lda ) (4.14)
S^(k)_B = (m^k_2 − m^k_1)(m^k_2 − m^k_1)^T (4.15)
S^(k)_W = Σ_{f^(k)_p ∈ c1} (f^k_p − m^k_1)^2 + Σ_{f^(k)_p ∈ c2} (f^k_p − m^k_2)^2 (4.16)
where c1 and c2 denote class 1 and class 2, and m^k_1 and m^k_2 are the empirical class means computed from the training set. With two classes, the data is projected onto a 1-D representation, and the score is:
s_k = W^(k)T_lda f^(k)_p (4.17)
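For two classes, the Fisher criterion of equation (4.14) has the closed-form maximizer w ∝ S_W^{-1}(m_2 − m_1). A minimal sketch (function names are illustrative):

```python
import numpy as np

def lda_score(F1, F2):
    """One-dimensional LDA projection for one sub-band (sketch).

    F1, F2 : (n_i, d) feature matrices of the two classes.
    Returns (w, score_fn); w maximizes the Fisher ratio of Eq. (4.14),
    and score_fn maps a feature vector to the scalar score of Eq. (4.17).
    """
    m1, m2 = F1.mean(axis=0), F2.mean(axis=0)
    # Within-class scatter (Eq. 4.16)
    Sw = (F1 - m1).T @ (F1 - m1) + (F2 - m2).T @ (F2 - m2)
    # Closed-form maximizer of the Fisher criterion
    w = np.linalg.solve(Sw, m2 - m1)
    return w, lambda f: w @ f
```

The projection pushes the two class means apart on the score axis, which is the property exploited by the fusion stage that follows.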
4.2.2.4 Sub-band Score Fusion
One way to fuse the score features is with Bayesian classifiers, under the assumption that the class-conditional distributions of the scores are normal:
p(s_k|w_i) = (2π σ^(k)2_i)^{−1/2} exp( −(s_k − µ^(k)_i)^2 / (2σ^(k)2_i) ) (4.18)
where µ^(k)_i and σ^(k)_i, the mean and standard deviation of the score features, are estimated from the training set.
The output of the Bayesian classifiers is a meta-score expressed as the k-vector [X1, X2, X3, ..., Xk]^T, with X_k = log( p(s_k|w1) / p(s_k|w2) ).
Since this fusion is not optimal when the covariance matrices of the classes differ, an SVM is proposed at the output of the Bayesian classifier to compensate for the errors it can produce. The resulting meta-scores form more dispersed linear series, which are better segmented by linear discriminant methods. At this point, both SVM and Bayes fusion are tested; however, the best approximation is obtained with logistic regression, which is the method finally applied in the results chapter.
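As a sketch of the fusion finally adopted, a plain logistic regression over the vector of sub-band scores can look as follows (NumPy only; the learning rate and iteration count are arbitrary choices, not values from the project):

```python
import numpy as np

def fuse_scores(S, y, lr=0.1, iters=2000):
    """Fuse per-sub-band LDA scores with logistic regression (sketch).

    S : (n_trials, K) matrix of sub-band scores, y : {0, 1} labels.
    Plain gradient ascent on the log-likelihood; returns a predict
    function giving the fused probability of class 1.
    """
    X = np.hstack([S, np.ones((len(S), 1))])   # add a bias column
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)       # gradient of the likelihood
    return lambda s: 1.0 / (1.0 + np.exp(-(np.append(s, 1.0) @ w)))
```

Each sub-band's score receives its own weight, so uninformative bands are naturally down-weighted by the training data.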
4.2.3 CSSBP [2]
The CSSBP algorithm comprises four distinct steps of EEG processing: first, multiple spatial-filtering and band-pass-filtering tasks are performed; after the filtering, feature extraction is applied using CSP techniques; finally, weak classifiers are trained and pattern recognition is performed.
Figure 4.5: Model training
Figure 4.6: EEG Classification
4.2.3.1 Objective
Given the dataset Etrain = {x_n, y_n}_{n=1}^N, where x_n is the n-th sample and y_n its label, the objective is, starting from a set of preconditions ν, to obtain a combination model F produced by a subset W ⊆ ν, combining all the sub-models learned under the conditions W_k (W_k ∈ W) so as to minimize the classification error on Etrain:
W* = argmin_W (1/N) |{E_n : F(x_n, W) ≠ y_n}_{n=1}^N| (4.19)
4.2.3.2 Spatial Channel selection
In the implementation, a set of C channels is considered, which satisfies the condition that every channel set µ_k is included in C, mathematically expressed as |µ_k| ≤ |C|. Each channel set is represented by a binary 1 × C vector, where a 1 indicates the location of a selected channel within the channel array. The goal is to obtain the optimal combination classifier F on the training data by combining different channel-set preconditions:
F(Etrain; S) = Σ_{S_k ∈ S} α_k f_k(Etrain; S_k) (4.20)
where Etrain = {x_n, y_n}_{n=1}^N is the training dataset, f_k the k-th weak classifier trained with the channel-set precondition S_k, and α_k the combination parameter.
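The binary channel precondition and its application to a trial can be sketched as follows (names are illustrative):

```python
import numpy as np

def channel_mask(selected, C):
    """Binary 1 x C channel-precondition vector (illustrative sketch).

    selected : iterable of channel indices kept by the precondition.
    """
    mask = np.zeros(C, dtype=int)
    mask[list(selected)] = 1
    return mask

def apply_mask(eeg, mask):
    """Keep only the rows (channels) flagged by the mask."""
    return eeg[mask.astype(bool)]
```

Each weak classifier then only ever sees the channels its precondition selects.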
4.2.3.3 Frequency Band Selection
Although the EEG data had already been digitized with an antialiasing filter, pre-filtered between 0.5 and 100 Hz and notch-filtered at 50 Hz, according to the data recording specifications, a zero-phase filter is applied to obtain the right closed pass-band interval, between 8 and 32 Hz (an open stop-band interval between 7 and 32 Hz). The closed interval G is then split into distributed sub-bands B_i. The splitting procedure is given under specific constraints:
Algorithm 1 Frequency band selection: splitting criteria
1: cover: ∪_{B∈D} B = G. That is, the union of all the sub-bands B contained in the universal set D covers the closed interval G.
2: length: ∀B = (l, h) ∈ D, Lmin ≤ h − l ≤ Lmax, where Lmin and Lmax are two constants that bound the length of B.
3: overlap: ∀Bmin = (l, l + 1) ⊂ G, ∃B1, B2 ∈ D such that Bmin ⊆ B1 ∪ B2.
4: equal: ∀Bmin = (l, l + 1) ⊂ G, |{B : Bmin ⊂ B, B ∈ D}| = C, where C is a constant.
An important task when applying the CSSBP and CSP-based algorithms is to determine the right time window, that is, the start time and the end time. The start time lies between 3 and 6 seconds. Sometimes these time milestones are exceeded by a fraction of a second, between 0.1 and 0.5 s; the effect on the results is minimal when the time variations are constrained to this range.
In the next step, CSP feature extraction is implemented, based on the distribution of the frequency bands. The sub-bands are selected using a slider (a window-based strategy) that determines the sub-bands B within the universal set S, fixed within the constrained closed interval [8, 32] Hz. The slider is given by four parameters (L, S, W, T), where L is the step length, S the start offset, W the sliding-window width and T the terminal offset.
Figure 4.7: Slider
This process can be implemented by a basic algorithm:
Algorithm 2 Window-based strategy
1: for L = Lmin : Lmax
2:   for T = Tmin : Tmax
3:     if L > T
4:       RUN BAND WINDOW SLIDE
5:     end
6:   end
7: end
A band window of width W_i slides with a step length L_i from the start offset S_i until it reaches the terminal offset T_i, outputting the sliding windows as sub-bands:
BandSet_i = Slide(L_i, S_i, W_i, T_i) (4.21)
By varying the four parameters, one universal set is produced:
D = ∪_{i=1}^{I} BandSet_i (4.22)
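The slider of equation (4.21) and the construction of the universal set D of equation (4.22) can be sketched as follows; the concrete parameter settings are illustrative:

```python
def slide(L, S, W, T, lo=8, hi=32):
    """Window-based sub-band generator of Eq. (4.21) (sketch).

    Slides a window of width W from lo + S towards hi - T with
    step L and returns the resulting (low, high) sub-band list.
    """
    bands, start = [], lo + S
    while start + W <= hi - T:
        bands.append((start, start + W))
        start += L
    return bands

# Union over several parameter settings gives the universal set D (Eq. 4.22)
D = set()
for (L, S, W, T) in [(2, 0, 4, 0), (4, 0, 8, 0)]:
    D.update(slide(L, S, W, T))
```

Every generated band stays inside the constrained interval [8, 32] Hz, and overlapping settings enlarge D without duplicating bands.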
To select the optimal frequency band, we seek an optimal band set B that produces an optimal combination classifier F:
F(Etrain; B) = Σ_{B_k ∈ B} α_k f_k(Etrain; B_k) (4.23)
4.2.3.4 Combination
A two-tuple υ_k = (S_k, B_k), representing a spatial-spectral precondition, is used to combine the channel selection and the frequency-band selection. The combination function is then:
F(Etrain; ν) = Σ_{υ_k ∈ ν} α_k f_k(Etrain; υ_k) (4.24)
4.2.3.5 Training Step
An adaptive boosting algorithm is proposed to learn the model. The first step is the training step, where weak classifiers are produced under different preconditions.
The training dataset Etrain is filtered under each spatial-spectral precondition υ_k. Then, with γ the model parameter, a weak classifier f_k(Etrain; γ(υ_k)) is trained on the filtered training dataset, from which the CSP features are obtained; this establishes a one-to-one relationship between the precondition υ_k and the learner f_k. The classification error to minimize is:
{α, υ}_0^K = argmin_{{α,υ}_0^K} Σ_{n=1}^N L(y_n, Σ_{k=0}^K α_k f_k(x_n; γ(υ_k))) (4.25)
where K is the number of weak learners and L the loss function.
4.2.3.6 Greedy optimization step
A second step is proposed to learn the model. To solve the problems explained above, a greedy approach can be used:
F(Etrain; γ, {α, υ}_0^K) = Σ_{k=0}^{K−1} α_k f_k(Etrain; γ(υ_k)) + α_K f_K(Etrain; γ(υ_K)) (4.26)
Supposing that F_{k−1}(Etrain) is known:
F_k(Etrain) = F_{k−1}(Etrain) + argmin_f Σ_{n=1}^N L(y_n, F_{k−1}(x_n) + α_k f_k(x_n; γ(υ_k))) (4.27)
A steepest gradient descent is then used to solve the last equation, given the pseudo-residuals:
r_{π(n)k} = −∇_F L(y_{π(n)}, F(x_{π(n)})) = −∂L(y_{π(n)}, F(x_{π(n)})) / ∂F(x_{π(n)}) (4.28)
where {π(n)}_{n=1}^N are the first N members of a random permutation of {n}_{n=1}^N. A new set {x_{π(n)}, r_{π(n)k}}_{n=1}^N is created and used to learn γ(υ_K):
γ_k = argmin_{γ,ρ} Σ_{n=1}^N (r_{π(n)k} − ρ f(x_{π(n)}; γ_k(υ_k)))^2 (4.29)
A "resample" heuristic is used to generate the stochastic sequences, as shown in the next algorithm:
Algorithm 3 Resample Heuristic Algorithm for Stochastic Subset Selection
1: Initialize the training data pool ρ_0 = Etrain = {x_n, y_n}_{n=1}^N
2: for k = 1 to K
3:   Generate a random permutation {π(n)}_{n=1}^{|ρ_{k−1}|} = randperm({n}_{n=1}^{|ρ_{k−1}|})
4:   Select the first N elements {π(n)}_{n=1}^N as {x_n, y_n}_{n=1}^N from ρ_{k−1}
5:   Use these {π(n)}_{n=1}^N elements to optimize the new learner f_k and its related parameters as in Algorithm 2
6:   Use the current local optimal classifier F_k to split the original training set Etrain = {x_n, y_n}_{n=1}^N into two parts, T_true = {x_n, y_n}_{n: y_n = F_k(x_n)} and T_false = {x_n, y_n}_{n: y_n ≠ F_k(x_n)}, and re-adjust the training data pool:
7:   for each (x_n, y_n) ∈ T_false do
8:     Select out all (x_n, y_n) ∈ ρ_{k−1} as {x_{n(m)}, y_{n(m)}}_{m=1}^M
9:     Copy {x_{n(m)}, y_{n(m)}}_{m=1}^M d (d ≥ 1) times so that a total of (d + 1)M duplicated samples is obtained
10:    Return these (d + 1)M samples to ρ_{k−1}, obtaining the new adjusted pool ρ_k
11:   end for
12: end for
With γ(υ_K), the combination coefficient is obtained as:
α_k = argmin_α Σ_{n=1}^N L(y_n, F_{k−1}(x_n) + α f_k(x_n; γ_k(υ_k))) (4.30)
4.2.3.7 Whole process
The whole process of the method is detailed here:
Algorithm 4 The Framework of the Common Spatial-Spectral Boosting Pattern (CSSBP) Algorithm
1: Input: {x_n, y_n}_{n=1}^N: EEG training set; L(y, x): the loss function; K: the capacity of the optimal precondition set (number of weak learners); ν: a universal set including all possible preconditions;
2: Output: F: the optimal combination classifier; {f_k}_{k=1}^K: the weak learners; {α_k}_{k=1}^K: the weights of the weak learners; {υ_k}_{k=1}^K: the preconditions under which the weak learners are trained.
3: Feed {x_n, y_n}_{n=1}^N and ν into a classifier using CSP to extract features and produce a family of weak learners F, so that a one-to-one mapping F ↔ ν is established
4: Initialize ρ_0 and F_0(Etrain) = argmin_α Σ_{n=1}^N L(y_n, α)
5: for k = 1 to K do
6:   Optimize f_k(Etrain; γ(υ_k)) as described in (4.29)
7:   Optimize α_k as described in (4.30)
8:   Update ρ_k as in Algorithm 3 and F_k(Etrain) = F_{k−1}(Etrain) + α_k f_k(Etrain; γ(υ_k))
9: end for
10: for each f_k(Etrain; γ(υ_k)), use the mapping F ↔ ν to find its corresponding precondition υ_k
11: return F, {f_k}_{k=1}^K, {α_k}_{k=1}^K, {υ_k}_{k=1}^K
4.2.3.8 Parameter estimation
Here, there will be explained different estimations of some parameters:
• K is determined by using the early stopping strategy. [75]
• N if we decrease NN more randomness will be incorporated, and if we increase it,
more samples are brought into the model. a good rate could be NN=0.7.
• Being d the copies of incorrect classified samples when adjusting ρ, is determined
by:
d = max(1, b1− ee+ e
c) (4.31)
Being e = |Tfalse/N the classification error.
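As a quick numeric check of equation (4.31); the guard for e = 0 is an addition here, since the formula divides by the error:

```python
from math import floor

def duplication_factor(n_false, n_total):
    """Number of extra copies d of misclassified samples (Eq. 4.31)."""
    e = n_false / n_total        # classification error e = |T_false| / N
    if e == 0:                   # perfect classifier: nothing to duplicate
        return 1
    return max(1, floor((1 - e) / e))
```

A weak classifier that misclassifies a quarter of the pool triples its errors' presence, while an error rate of one half (or worse) adds only a single copy.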
4.3 Conclusion
CSP admits many possible variations that improve the results in extracting topographic patterns. Several approaches have been proposed to address the problem of manually selecting the operational subject-specific frequency band and channel group.
In this project, SBCSP and CSSBP were used. SBCSP is based on dividing the EEG signal with different filters and applying a sub-band CSP and LDA to each band separately. CSSBP, on the other hand, uses a simultaneous optimization of the frequency filter and the spatial filter.
Some modifications were needed in order to use CSP with more than two classes, and further modifications were made to improve the results of both algorithms.
The results chapter explains which changes were made and why, justifying them with the appropriate results.
The next chapter describes the different datasets that are used to test the algorithms described above.
Chapter 5
Method
This chapter explains the experimental set-up of the project.
Three different datasets are used to test the algorithms:
1. BCI Competition III: dataset IVa.
2. BCI Competition IV: dataset IIa.
3. Self-acquired dataset.
They are described in the next sections.
5.1 BCI competition III: Dataset IVa
The effectiveness of the designed methods is first assessed on the widely used benchmark two-class dataset IVa from BCI Competition III [76].
This dataset was recorded from five healthy subjects, who sat in a comfortable chair with their arms resting on armrests. It contains only data from the 4 initial sessions without feedback. Visual cues indicated for 3.5 s which of the following 3 motor imageries the subject should perform: (L) left hand, (R) right hand, (F) right foot.
The presentation of target cues was interleaved with periods of random length, 1.75 to 2.25 s, in which the subject could relax.
There were two types of visual stimulation:
1. where targets were indicated by letters appearing behind a fixation cross (which
might nevertheless induce little target-correlated eye movements).
2. where a randomly moving object indicated targets (inducing target-uncorrelated
eye movements).
From subjects al and aw 2 sessions of both types were recorded, while from the other
subjects 3 sessions of type (2) and 1 session of type (1) were recorded.
The recording was made using BrainAmp amplifiers and a 128-channel Ag/AgCl electrode cap from ECI. 118 EEG channels were measured at positions of the extended international 10/20 system. Signals were band-pass filtered between 0.05 and 200 Hz and then digitized at 1000 Hz with 16-bit (0.1 µV) accuracy. A version of the data down-sampled to 100 Hz (by picking every 10th sample) is provided and is typically used for analysis.
Only cues for the classes ’right’ and ’foot’ are provided for the competition.
5.2 BCI Competition IV: dataset IIa.
After developing the algorithms on the two-class dataset, we adapted them to work with more than two classes; the algorithms are then assessed on the 4-class dataset IIa from BCI Competition IV [15].
This dataset consists of EEG data from 9 subjects. The cue-based BCI paradigm consisted of four different MI tasks: the left hand (class 1), right hand (class 2), both feet (class 3), and tongue (class 4). Two sessions on different days were recorded for each subject. Each session contains 6 runs separated by short breaks, and each run contains 48 trials (12 for each of the four possible classes), giving a total of 288 trials per session.
The paradigm for each task is the following:
Figure 5.1: Paradigm of each task [15]
No feedback was provided.
Twenty-two Ag/AgCl electrodes (with inter-electrode distances of 3.5 cm) were used to record the EEG monopolarly, with the following montage:
Figure 5.2: Left: Electrode montage corresponding to the international 10-20 system.Right: Electrode montage of the three monopolar EOG channels. [15]
The reference was the left mastoid and the ground the right mastoid.
HDR is the structure containing the information needed to interpret the EEG signal, and has the following fields:
• TYPE, VERSION - are fields which indicate the data format and version.
• Patient – this field contains user information such as birthdate, sex, handedness, etc.
• SampleRate – is the data sampling rate.
• NRec – is the number of samples per channel (denoted by s).
• Filter – information about the signals, which were sampled with 250 Hz and
bandpass-filtered between 0.5 Hz and 100 Hz.
• PhysDim - gives information about the physical dimension units (µV).
• CHANTYP- determines which columns of ” s ” belong to EEG or EOG channels.
• Label – gives certain information about the position of the sensors on the skull,
related to the columns of ” s ”. The relationship with the 10-20 system can be
seen in the previous figure.
• Classlabel - is a vector with as many elements as trials performed. It can take four values: 1 if the subject is asked to imagine the movement of the left hand, 2 the right hand, 3 both feet, and 4 the tongue.
• ArtifactSelection - indicates which of the trials have been contaminated by artifacts.
• TRIG - gives the index of the trial rows.
• EVENT – gives information about events, occurred by signal capturing.
5.3 Self-acquired dataset
Finally, once good results were obtained on the previous datasets, the algorithms are tested on a self-acquired 5-class dataset, built from MI and real movements of the fingers of the left hand.
Figure 5.3: Self-acquired dataset procedure
Each part of the system created to acquire the data is explained in its own section:
1. Materials: the materials used during the process.
2. Subjects: details of the different subjects that took part in the data acquisition.
3. Visual stimulus: the interface used as visual stimulus.
4. Data acquisition: the data acquisition process.
5. Experimental set-up: how the whole system works.
5.3.1 Materials
The materials used were provided by the Technical University of Denmark. Here is a list of the materials used in the trials:
Material Description
Electrode cap: g.GAMMAcap Electrode cap with the extended 10-20 system.
Electrodes: g.LADYbird 16 active ring electrode used with
g.GAMMAcap (EEG).
GND electrode: g.LADYbirdGND Passive ground ring electrode used with
g.GAMMAcap (EEG).
Ag/AgCl passive earclip Passive earclip electrode (reference).
Conductive gel: g.GAMMAgel Special highly-conductive, high-viscosity elec-
trode gel for the electrodes.
g.GAMMAbox for 16 channels, AC coupled Power supply and driver/interface box for 16 ac-
tive electrodes for use with g.USBamp.
g.USBampGAMMAconnector Connector cable between the g.USBamp and the
g.GAMMAbox.
g.USBamp High-performance and high-accuracy biosignal
amplifier and acquisition/processing system.
With USB cable to connect to PC.
PC Windows 7 PC equipped with the g.tec driver-
s/software and MATLAB.
LCD screen Screen connected to the PC to show the interface
to the subject.
Table 5.1: Materials.
5.3.2 Subjects
Information about the subjects is provided here. A questionnaire was given to each subject before the trial (please see Figure A.1).
The next table summarizes the details of the subjects that participated in the experiment:
Subject Nr. Gender Age First BCI Optical defects Handedness
1 Male 21 Yes No Right-handed
2 Male 23 Yes No Right-handed
3 Female 24 Yes Yes Right-handed
4 Female 24 No Yes Right-handed
5 Female 25 No No Right-handed
6 Male 27 Yes No Right-handed
7 Male 26 No Yes Right-handed
8 Male 21 Yes Yes Right-handed
9 Male 21 Yes Yes Right-handed
10 Female 26 Yes Yes Right-handed
11 Male 26 Yes Yes Right-handed
12 Male 24 No Yes Right-handed
Table 5.2: Subjects.
None of them suffers from any neurological disorder that could affect the experiment, and all of them participated voluntarily, signing a participation agreement that confirms it.
5.3.3 Visual stimulus
For the visual stimulus, a dynamic and realistic interface was created, trying to imitate finger movement as realistically as possible.
A black background with a grey circle is used to give a 3D impression. Then, figures of a left hand, a right hand and fingers (different frames of extension and flexion), and a green box with the instructions are included (see the figures in Appendix C). Figures from [77].
A frame-by-frame animation changes the contents of the stage in every frame and is best suited to a complex animation in which an image changes in every frame instead of simply moving across the stage. Using this method, it is possible to create a dynamic and realistic sequence that imitates the movement of the fingers of one hand.
For the purpose of the study, only the left hand is included in the interface. The left hand is placed on the right part of the screen, in order to prevent the subject from watching his own hand during the process and to achieve the highest possible concentration on the task.
The interface is created using JavaScript and HTML, and the animation is generated with Flash. JavaScript animations are done by programming gradual frame transitions in an element's style. The frames are called by a timer; when the timer interval is small, the animation looks continuous. The code can be checked in Appendix D.
The hand animation is composed of frames, which change on a left mouse click or by MATLAB-script-based timing. Only the introduction frames are automatic; the following ones (finger movements) are activated by left mouse clicks.
Finally, it is compiled as an SWF object. SWFObject is a small JavaScript file used for embedding Adobe Flash content. The script can detect the Flash plug-in in all major web browsers (on Mac and PC) and is designed to make embedding Flash movies as easy as possible.
The interface is controlled by the scripts given in Appendix B: one called "Interface", which starts and controls the interface timing, and the "Mover" function, which, by combining MATLAB with Java packages, works as a virtual mouse, moving to and clicking on each finger to extend or flex it, depending on what the user wants.
In the case of this study, as explained later in the timing scheme, one finger is moved randomly every two seconds, with a total of 250 movements per session.
Finally, the trial flow is as follows: the HTML link with the interface is opened in the MATLAB browser. A green window with some information about the trial is shown, and after 6 seconds the trial starts. Every 2 seconds, the interface script randomly chooses one of the five fingers to be moved. With the "Mover" function, the mouse is moved to the position of the chosen finger and clicks on it. When the finger is clicked, it is flexed (successive frames are shown from the extended finger to a flexed one, giving the impression of a flexion of the finger). After half a second, the mouse clicks again and the finger is extended (again through different frames, giving the impression of an extension of the finger). The finger movement (flexion-extension) is shown in the figure:
Figure 5.4: Index finger movement
Then, a new task with another finger starts after a 1.5-second break. In this case, the interface ends after a total of 250 movements (50 movements per finger), but this can easily be changed to the number of movements and the timing that the user wants.
5.3.4 Data acquisition
To acquire the data properly, a g.tec system is used. The components of this system are listed before (please see Table 5.1).
The next figure shows the set-up of the EEG system used: the electrodes placed on the cap collect the EEG signals (with some artifacts included), and the g.tec system receives the signal and processes it, finally sending it to the computer for storage and signal processing.
Figure 5.5: G.tec system
5.3.4.1 Electrodes
To acquire EEG signals from the brain, electrodes mounted on the scalp are needed. 16 electrodes are placed for the experiment with the help of the g.GAMMAcap (Table 5.1), in the positions shown in the picture:
Figure 5.6: Electrode positioning. Modified from [16]
The choice of these positions is based on the location of the primary motor cortex and on several articles mentioned in the state of the art that use similar MI in the finger area, such as Daly et al. [46] and Lehtonen et al. [60].
In order to improve the quality and stability of the signals, the g.GAMMAgel (Table 5.1) is used to obtain higher conductivity. The electrodes chosen for this experiment are the g.LADYbird electrodes (Table 5.1).
Active electrodes are electrodes with built-in circuitry which amplifies the electrical signal. This greatly improves the signal quality received by the EEG amplifier.
5.3.4.2 Artefacts
Although EEG is designed to record cerebral activity, it also records electrical activity arising from sites other than the brain. The recorded activity that is not of cerebral origin is termed artifact and can be divided into:
1. Physiologic artifacts: generated by the patient, they arise from sources other than the brain. Examples: other muscle activity, eye movements, heart muscle or respiration activity.
2. Extraphysiologic artifacts: arise from outside the body. For example: cable movements, interference, equipment, environment.
To avoid some of these artifacts, the subject must be told not to blink too much or move any part of the body during the tasks. Also, the experiments are done
in a soundproof room, to avoid interference. Finally, the subject must be in a relaxed position and feel comfortable.
5.3.4.3 G.tec system
The electrodes are connected to the amplifier through the g.GAMMAbox (Table 5.1), which works as power supply and driver/interface for the electrodes, and the g.USBampGAMMAconnector (Table 5.1), which connects the box to the amplifier.
The amplifier used is the g.USBamp (Table 5.1). The sampling rate is set to 256 Hz, typical for EEG measurements, and the signal is filtered between 0.5 and 100 Hz, with a 50 Hz notch filter to remove power-line noise. EEG signals from the brain have an amplitude of 10-20 µV and therefore need amplification. The amplifier has an input range of ±250 mV, which allows recording of DC signals without saturation, and 24-bit resolution with simultaneous sampling of all channels at up to 38.4 kHz.
Finally the amplifier is connected via USB 2.0 to the PC for data storage and signal
processing.
Figure 5.7: Amplifier used in the recording system. [17]
5.3.4.4 Data storage
For the data storage, a PC with Windows 7, all the needed drivers installed and MATLAB R2013a is used.
During data storage, a .mat file is created with the EEG signals:
"data" is the structure containing the EEG signal, with the following fields:
• Filename - File where the data is stored.
• Subject – Name of the subject.
• SamplingFrequency – is the data sampling rate (256Hz).
• Duration – Total duration of the data.
• data – the EEG signals.
In parallel, connected via UDP, another .mat file with the target-cue information for the subsequent signal processing is created by the interface script:
"Targets" is the structure containing the target-cue information:
• Name - File where the data is stored.
• Subject – Name of the subject.
• Starttarget – Time when the target is starting.
• Targets – Number of the class of each target.
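The two structures can be reproduced with SciPy's MATLAB I/O routines; the field values below are dummies and the file names are illustrative, not the project's actual recordings:

```python
import numpy as np
from scipy.io import savemat, loadmat

# Dummy contents mirroring the fields listed above.
data = {"Filename": "S01_eeg.mat", "Subject": "S01",
        "SamplingFrequency": 256.0, "Duration": 500.0,
        "data": np.zeros((16, 1280))}                  # 16 channels x samples
targets = {"Name": "S01_targets.mat", "Subject": "S01",
           "Starttarget": np.arange(6.0, 26.0, 2.0),   # cue start times (s)
           "Targets": np.array([1, 3, 5, 2, 4, 1, 2, 3, 4, 5])}
savemat("S01_eeg.mat", {"data": data})
savemat("S01_targets.mat", {"Targets": targets})
```

Saving the signal and the cue timeline in matching files keeps the two streams aligned for the later epoch extraction.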
5.3.5 Experimental set-up
The experiments were carried out with the 12 healthy subjects described before (see
Table 5.2). The whole process was explained to the subjects, with a short 25-second
example to show how the interface works. Afterwards, the questionnaire described before
was given to the subjects in order to collect some personal details. They were then
seated in a chair, in front of the screen where the interface was to be projected, and
the cap and electrodes were carefully set up with conductive gel. The subjects were
asked to stay relaxed and comfortable, and to try not to blink too much during the trial.
Finally, the whole system was connected and prepared for the trial.
The total experiment consisted of two different trials:
1. Real movement of the fingers: the subject was cued to follow the finger movements
of the interface. There were 50 movements per finger, giving a total of 250 finger
movements, with the order of the 5 fingers chosen randomly.
2. MI of the fingers: the subject had to imagine the movement of each finger as it
appeared on the screen, trying to imitate the finger movements of the interface.
Again, 50 randomly ordered trials per finger were shown, for a total of 250. As the
subjects had performed the real movements beforehand, it should be easier to imagine
the same movements.
Both trials were performed with the left hand, for the reasons given in the state-of-the-art
conclusions (see page 27).
The same timing was used for the paradigm in both tasks. Several timing schemes were
designed and tested before settling on the final one, looking for one that allowed
enough time to complete the real movement or MI, kept the subject concentrated
throughout, gave a realistic finger-movement timing, and produced enough data for the
subsequent training.
The scheme finally selected, based on the results and on the opinions of some subjects
and of the supervisor, was the following:
Figure 5.8: Paradigm’s timing scheme.
Before the trial starts, a green window appears with some information about the interface.
After 6 seconds, the trial starts. A new movement occurs every 2 seconds, with a cue
of the finger going down and up lasting half a second. With a total of 250 movements,
the task therefore finishes after around 8 minutes.
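The timing above can be checked with a quick calculation (the 6-second information window plus 250 cues at one every 2 seconds):

```python
# Quick check of the paradigm timing described above.
intro_s = 6          # green information window before the trial starts
movements = 250      # 50 cues x 5 fingers
period_s = 2         # one new movement every 2 s (the cue itself lasts 0.5 s)

total_s = intro_s + movements * period_s
total_min = total_s / 60
```

This gives 506 seconds, i.e. just under eight and a half minutes, consistent with the "around 8 minutes" stated above.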
Between the two trials there was a break of around 2 or 3 minutes so that the subject
could rest and be prepared without losing concentration.
After the experiment, the subjects were asked to fill in a satisfaction survey (see
Figure A.2) to gather their opinion of the trials.
5.4 Conclusions
Three datasets were used to develop and validate the algorithms.
The first, with two classes, served as a first test of the algorithms.
The second, with four classes, was used to adapt the processing system to more than two
classes and to improve it before using the final dataset.
Finally, the self-acquired dataset is the aim of the project: 12 subjects, 2 tasks (MI
and real movement) and 250 movements per task provide enough data to train, test,
validate and compare all the methods used. In general, the subjects were very happy
with the experiment; the average results of the satisfaction surveys given after the
trials are shown in the next table:
Question Satisfaction (1 very poor - 5 very good)
Explanation and understanding 4.92
Each finger trial timing 4.58
Total duration 3.66
Interface 4.75
Comfort 4.50
Overall 4.66
Table 5.3: Satisfaction survey results.
This demonstrates that the timing scheme was appropriate and that the subjects felt
comfortable with the whole process, so it could be used with post-stroke patients with
perhaps some small changes.
The worst grade in the survey was for the total duration: most subjects agreed that it
was too long. However, the purpose of this experiment was to gather enough data for
good signal processing; with a shorter duration the data might not be sufficient for
training and testing. For use with post-stroke patients, the total duration should be
shortened for the patients' comfort.
The next chapter presents the results for each of the datasets used.
Chapter 6
Results and discussion
6.1 Results
This section presents the different results obtained from the datasets defined before,
using each of the two proposed algorithms (SBCSP and CSSBP), and compares and discusses
them to reach a final conclusion.
6.1.1 BCI competition III: Dataset IVa
As a first approximation for designing the algorithms, Dataset IVa from BCI Competition
III, detailed in the last chapter, was used.
This 2-class dataset is well suited for a first evaluation of the algorithm because, as
explained in the signal-processing part, CSP is originally designed for two classes.
First, an initial CSP design was produced, and the SBCSP was then developed on top
of it.
In a first attempt, the wrong frequency bands were chosen, resulting in poor accuracy.
Decomposed signal Frequency range (Hz)
D1 [11-14]
D2 [9-11]
D3 [7-9]
D4 [5-7]
D5 [3-5]
A5 [0-3]
Table 6.1: Bandpass frequencies: first attempt
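The sub-band split of Table 6.1 can be sketched as a band-pass filter bank (an assumed Butterworth approximation for illustration; the D/A naming in the table suggests a wavelet decomposition was used in the thesis, and the sampling rate passed below is only an example value):

```python
import numpy as np
from scipy.signal import butter, filtfilt

# (low, high) edges in Hz taken from Table 6.1; the A5 band starts at DC,
# so it is realised as a low-pass filter.
BANDS = [(11, 14), (9, 11), (7, 9), (5, 7), (3, 5), (0, 3)]

def sub_band_signals(eeg, fs):
    """Split (n_samples, n_channels) EEG into the Table 6.1 sub-bands."""
    out = []
    for lo, hi in BANDS:
        if lo == 0:
            b, a = butter(4, hi, btype="lowpass", fs=fs)
        else:
            b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
        out.append(filtfilt(b, a, eeg, axis=0))  # zero-phase filtering
    return out

# Example on 4 s of 3-channel noise at an assumed 256 Hz rate.
x = np.random.randn(1024, 3)
bands = sub_band_signals(x, fs=256)
```

In SBCSP, each of these sub-band signals would then be passed through its own CSP stage before the scores are fused.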
Figure 6.1: Accuracy in a first approximation in the SBCSP algorithm
Afterwards, to improve the accuracy, it was decided to change the frequency bands to
the typical values (see Section 2.2.2), obtaining the following results.
Band Decomposed signal Frequency range (Hz)
Gamma D1 [30-100]
Beta D2 [14-30]
Alpha D3 [8-14]
Theta D4 [3.5-8]
Delta A4 [0-3.5]
Table 6.2: Bandpass frequencies: second attempt
Figure 6.2: Accuracy in a second approximation in the SBCSP algorithm
Finally, a third approximation was obtained using logistic regression instead of SVM
and Bayes, which was applied to obtain the classification shown next:
Figure 6.3: Accuracy in a third approximation in the SBCSP algorithm
The algorithm code can be checked in Appendix B.2.
6.1.2 BCI Competition IV: dataset IIa
In order to prepare the algorithms for the 5-class self-acquired data, they were tested
and improved using the 4-class dataset defined in the last chapter.
Since a 4-class dataset is involved, the SBCSP algorithm had to be adapted to more than
two classes. An iteration criterion, as in CSSBP, was chosen to make it work with more
than 2 classes. The accuracy obtained is shown in the next figure:
Figure 6.4: Accuracy SBCSP for 4-class dataset
The CSSBP was also implemented, as it was explained in the signal processing chapter.
The accuracy obtained is:
Figure 6.5: Accuracy CSSBP for 4-class dataset
This analysis can be complemented with a concluding measure, the confusion matrix, here
for left-hand selections made instead of right-hand ones:
Recognised as
Right hand Left hand
Right hand (true) 22 4
Left hand (true) 0 30
Table 6.3: Confusion matrix of BCI Competition IV: dataset IIa
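From a confusion matrix like Table 6.3, the overall accuracy and the per-class recognition rates follow directly (a small sketch; the thesis computations are in MATLAB):

```python
import numpy as np

# Table 6.3, rows = true class, columns = recognised class.
cm = np.array([[22, 4],     # true right hand
               [0, 30]])    # true left hand

accuracy = np.trace(cm) / cm.sum()          # correctly recognised trials
recall = np.diag(cm) / cm.sum(axis=1)       # per-class recognition rate
```

This reproduces the figures discussed later: 22/26, roughly 85% recognition for the right hand, and 100% for the left hand.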
The MATLAB code can be checked in Appendix B.
6.1.3 Self-acquired dataset
After testing the signal-processing system with external data to validate it, it was
ready to be used on the self-acquired data: a 5-class dataset with MI and real movements
of the fingers of the left hand.
In order to compare the results, and so that the subject had a reference from one trial
to the next, a real-movement trial was performed first, followed by an imagery-movement
trial. All the information about this data can be found in the Methods chapter.
6.1.3.1 REAL MOVEMENT
In this case, the subjects had to perform an actual movement, similar to that of the
hand in the interface. With the SBCSP and CSSBP algorithms ready from the 4-class
database, they were applied to this data.
Applying the SBCSP algorithm, the following accuracy was obtained:
Figure 6.6: Accuracy SBCSP for 5-class dataset: Actual movement
Then, doing the same with the CSSBP algorithm, the results were:
Figure 6.7: Accuracy CSSBP for 5-class dataset: Actual movement
As with the last dataset, the CSSBP algorithm gave better results than SBCSP.
Finally, the analysis is completed with the confusion matrix, shown here to distinguish
between the different finger movements. It is an important part of the results, because
it shows how many classes were predicted correctly and with which classes the wrong
ones were confused.
True Recognised as
Thumb Index Middle Ring Pinky
Thumb 30 7 6 4 3
Index 8 27 6 4 5
Middle 12 13 11 6 8
Ring 14 11 7 9 9
Pinky 14 11 5 7 13
Table 6.4: Confusion matrix: Actual movement
6.1.3.2 IM
Here, the subjects had to imagine the movement at the same time as the hand in the
interface was moving its fingers. Again, the results of the SBCSP and CSSBP algorithms
are shown.
For the SBCSP algorithm, the following accuracy was obtained:
Figure 6.8: Accuracy SBCSP for 5-class dataset: IM
Then, using the CSSBP algorithm, the accuracy was:
Figure 6.9: Accuracy CSSBP for 5-class dataset: IM
As in the last case, a confusion matrix is shown:
True Recognised as
Thumb Index Middle Ring Pinky
Thumb 24 10 7 5 4
Index 13 18 8 6 5
Middle 12 13 11 6 8
Ring 13 11 8 10 8
Pinky 10 11 6 9 14
Table 6.5: Confusion matrix: IM
6.2 Discussion
The results obtained above are now discussed for each of the datasets analysed.
6.2.1 BCI competition III: Dataset IVa
As shown, a substantial improvement was achieved thanks to this 2-class dataset. To get
a first impression of how the SBCSP algorithm works, knowing that CSP is defined for
2 classes, a dataset like this one is the best starting point.
First, a first approach to the SBCSP algorithm was made, refining the design until it
worked with this data. Then, decisions about the filter cutoff frequencies had to be
made. With these two different choices, we wanted to show that a correct pre-selection
of the cutoff frequencies is really important for the whole process.
First, an average accuracy of roughly 52% was obtained, with only one subject above 60%
(see Figure 6.1). Then, with the correct frequency bands selected, the results improved:
the average was close to 60%, and 2 subjects were above 70% accuracy (see Figure 6.2).
The CSP algorithm was implemented in order to obtain an optimal spatial filter. The idea
was to produce new time series whose variances are well suited for EEG classification.
However, this objective becomes harder to reach when the method has to be applied to
different subjects.
In addition to neurophysiological variability, the lack of standardization in acquiring
EEG signals free of contaminating noise complicates the estimation of a suitable
composite covariance, and the eigenvalue decomposition can then lead to critical errors
when projecting with the CSP matrix. The experiments with the SBCSP method showed a
higher computational cost and poor accuracy. Rather than directly improving
classification accuracy, the SBCSP method helped to explore possible improvements, such
as minimizing the classification covariance as an alternative to the whitening
transformation. The resulting matrices yield more widely dispersed linear series, which
are easier to separate with linear discriminant methods.
At this point, SVM and Bayes classifiers were tested; however, a better approximation
was obtained by logistic regression, which was finally applied to obtain the
classification shown (see Figure 6.3).
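The CSP step at the core of both methods can be summarised in a short sketch (the standard two-class formulation, assumed here, not the exact thesis code): the spatial filters come from a generalized eigendecomposition of the per-class average covariance matrices, and the log-variances of the projected signals are the features passed to the classifier.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=2):
    """trials_*: lists of (n_channels, n_samples) arrays.
    Returns W with the 2*n_pairs most discriminative spatial filters as rows."""
    def avg_cov(trials):
        # trace-normalised covariance, averaged over trials
        covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
        return np.mean(covs, axis=0)
    Ca, Cb = avg_cov(trials_a), avg_cov(trials_b)
    # Solve Ca w = lambda (Ca + Cb) w; extreme eigenvalues discriminate best.
    vals, vecs = eigh(Ca, Ca + Cb)
    order = np.argsort(vals)
    keep = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, keep].T

def log_var_features(trial, W):
    """Normalised log-variance of the spatially filtered trial."""
    z = W @ trial
    v = np.var(z, axis=1)
    return np.log(v / v.sum())

# Toy data: 8 channels whose variance pattern differs between the classes.
rng = np.random.default_rng(0)
a = [rng.standard_normal((8, 256)) * np.r_[3, np.ones(7)][:, None] for _ in range(20)]
b = [rng.standard_normal((8, 256)) * np.r_[np.ones(7), 3][:, None] for _ in range(20)]
W = csp_filters(a, b)
```

In SBCSP, one such filter pair is learned per sub-band and the resulting scores are fused before the final classifier.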
The MATLAB code used is shown in Appendix B.
Finally, it is clear that this dataset was really useful for improving the initial SBCSP
algorithm. From this last implementation, it now needs to be prepared for more than
2 classes, as explained in the Signal Processing chapter, and it will be tested, together
with the CSSBP, on the next 4-class database.
6.2.2 BCI Competition IV: dataset IIa
After testing the SBCSP algorithm on the 2-class dataset, it had to be prepared to work
with more than two classes, and this 4-class dataset was perfect for that purpose. To
work with more than 2 classes, an iteration method was implemented, as in the CSSBP
method. CSSBP was also implemented, to test it before moving on to the 5-class
finger-MI dataset.
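The iteration criterion mentioned above is not spelled out in detail; one common way to extend a binary method to several classes is a one-vs-rest scheme, sketched here with a deliberately toy binary scorer (in the thesis the binary model would be the CSP pipeline itself; the functions and data below are hypothetical):

```python
import numpy as np

def fit_binary(X, y01):
    # Toy binary model: the positive and negative class means (a hypothetical
    # stand-in for CSP features plus a classifier).
    return X[y01 == 1].mean(axis=0), X[y01 == 0].mean(axis=0)

def score(model, X):
    mu1, mu0 = model
    # Larger score = more like the positive ("one") class.
    return np.linalg.norm(X - mu0, axis=1) - np.linalg.norm(X - mu1, axis=1)

def ovr_train(X, y):
    # One binary model per class, trained class-vs-rest.
    return {c: fit_binary(X, (y == c).astype(int)) for c in np.unique(y)}

def ovr_predict(models, X):
    classes = sorted(models)
    S = np.column_stack([score(models[c], X) for c in classes])
    return np.array(classes)[np.argmax(S, axis=1)]  # highest score wins

# Toy 3-class example with well-separated feature vectors.
X = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]] * 10)
y = np.array([0, 1, 2] * 10)
models = ovr_train(X, y)
pred = ovr_predict(models, X)
```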
SBCSP accuracies averaged around 72%, approaching 90% for 3 of the subjects and falling
below 60% for only 2.
Then CSSBP was tested, obtaining an average accuracy of 79%. For subjects 3 and 8 the
accuracy is over 90%, and in some other cases it is around 85%; only for one subject is
it under 60%. This is a good accuracy for a 4-class dataset.
Compared with SBCSP, the CSSBP accuracy is greater, as mentioned in [2]. The
computational complexity of CSSBP is also lower, with SBCSP consuming much more time
and memory.
Another point of the analysis is the structure of the errors: in other words, how many
failures are caused by a misinterpretation, i.e. a left-hand selection instead of a
right-hand one. This analysis is complemented by the confusion matrix, a statistics-based
method for obtaining a reference about the classifications. The results show a good
relation between the true task and the recognised one: the left hand is recognised
perfectly and the right hand reaches an 85% recognition rate, the remainder being
confused with the left hand. This means that both were dominantly and correctly
identified by the algorithm.
6.2.3 Self-acquired dataset
On the self-acquired 5-class dataset of imagery and actual finger movements, the
algorithms were tested with different results.
First, for the actual movement, the SBCSP method obtained 54% average accuracy; in 2
cases it was over 70% and in another it was close to it. The accuracy decreased with
respect to the 4-class dataset, as a consequence of the difficulty of differentiating
between the fingers and of the difference in quality of the dataset acquisition.
With CSSBP, one accuracy above 80% was obtained, with an average of around 59%.
As expected, the CSSBP method gave better results than SBCSP, as happened with the
4-class dataset and as compared in [2].
A confusion matrix was then computed. With this matrix it can be seen how many failures
are caused by a misinterpretation for each of the 5 fingers. As can be seen, the results
are not uniform, but the thumb and index fingers clearly give the best results, showing
a good relation between the true task and the recognised one and that both were
dominantly and correctly identified by the algorithm.
These two fingers also dominate all the other classes: even when they have not been
activated, the other fingers are confused with them by the system.
On the other hand, for the imagery-movement tasks, both algorithms were also tested in
order to compare the results. Again, the CSSBP results are better than the SBCSP ones.
Compared with the actual-movement accuracies, there is a decrease, probably because of
the difficulty the subjects had in imagining the movement of the fingers during the
tasks.
A confusion matrix was also provided. As with the accuracy, the results are somewhat
lower than with the actual movement, in part because of the subjects' difficulty in
imagining the finger movements. Again, the thumb and index fingers dominate, although
the index finger now has lower results.
In conclusion, the CSSBP algorithm again showed better results than SBCSP on this
dataset. Furthermore, SBCSP has a higher computational and time cost than CSSBP.
Comparing imagery and actual movement, the imagery movement showed lower accuracy and
more confusion. The thumb and index fingers are dominant in both movements, but in the
IM part the index finger is confused with the rest more often, due to the subjects'
difficulty in distinguishing the movements; only the thumb and pinky have results
similar to the real movement. A possible reason is their position in the hand: being
situated at the extremes, they are easier to imagine without confusion.
6.2.4 Future work
In this project we have seen several studies and approaches on how to create and improve
a BCI system for MI and actual movement of the fingers of one hand. All these approaches
and studies can be used in a future neurorehabilitation system for stroke patients.
First, the way the interface was built in this project can easily be adapted to other
BCI systems. The frame-by-frame technique is easy to use and produces a very attractive
and realistic visual effect, interacting with the subject more easily and pleasantly
than a basic interface, so that the subject can stay much more concentrated or even
have fun with it. Thanks to the realistic movement, the subject can easily follow it,
improving the features of the acquired data. As the subjects said after the trial, they
improved their actual movement or MI after a few tasks, partly thanks to how the
interface was made.
Furthermore, the interface can easily be adapted for an online system with post-stroke
patients and used as visual feedback. As stated in the state-of-the-art chapter, visual
feedback improves performance significantly, training the subject and making it
possible to take part in more sessions.
One example of how to implement it as feedback could be a game. A balloon could appear
in the palm of the hand, just below one of the fingers, and the subject would have to
imagine the movement of that finger in order to touch the balloon and pop it. If the
classification is correct, the balloon pops and some points are added to a scoreboard
as a reward, encouraging the subject to do their best and score as many points as
possible. This method could be really useful because it keeps the subject concentrated,
having fun and feeling comfortable with the system, which is one of the most important
things when a post-stroke patient takes part in such experiments.
In addition to other therapies, a BCI-FES system could be integrated to achieve an
effective rehabilitation of the patients. The background and state-of-the-art chapters
explained how these systems are achieving very good results in post-stroke
rehabilitation. When the subject achieves the task (feedback achieved), the electrical
stimulation of that finger is activated, making it move.
Regarding the algorithms, with the accuracies and results obtained they could be used
in most MI tasks. These algorithms are now able to work with more than 2 classes and
have demonstrated good results in this project; they are ready to be used in other BCI
systems after being adapted to the aim of each system. CSP has been consolidated as one
of the best methods on which to base such experiments.
Some improvements could be made. The first would be to standardize the data handling,
for example to avoid eye-movement contamination; that is, to work from a shared base
algorithm such as CSP, so that all systems build on it (changing and improving it)
while remaining CSP-based. The second is to establish standard formats that simplify
reading and managing the data; many resources are currently lost just reading data. The
third could be to implement control algorithms in the sliders (see the sliders in
Section 4.2.3.3). Finally, all the processes are very manual; it would be better to set
control criteria such as response times.
It is true, though, that this is a new field and improvements appear all the time.
Chapter 7
Conclusion
The objective of this project was to compare the SBCSP and CSSBP algorithms, both based
on the CSP algorithm, on imaginary finger movements.
After an extensive review of how the system works, the CSP algorithm was designed. The
SBCSP method was then implemented and tested with a 2-class dataset (right hand and
foot). After improving the algorithm on 2 classes, it was extended to work with more
than 2 classes and tested on a 4-class dataset (left hand, right hand, feet and tongue),
preparing it for the goal of 5-class finger MI.
In parallel, a CSSBP was designed and tested directly on the same 4-class dataset.
Both algorithms were compared; CSSBP obtained better results, with an average of 79%
and around 90% for some subjects.
Preprocessing was needed because of some NaN values present in the channels, which
seriously affected the results.
Then an interface was developed to acquire a self-acquired dataset from 12 healthy
subjects, consisting of two trials: one of actual movement and one of imagery movement.
The interface consisted of a 3D hand with finger movements. It was very well received
by the subjects, with very good reviews. All the recording facilities were provided
by DTU.
As with the 4-class dataset, SBCSP and CSSBP were compared on the actual finger
movements and on the IM ones. In both cases, as with the 4-class dataset, CSSBP gave
better results than SBCSP and also required a lower degree of computational complexity.
However, the accuracies were lower than for the 4-class dataset, with an average of
around 57%.
Actual and imagery movements were compared in order to learn more about the differences
between them. The actual-movement results were somewhat better than the IM ones. The
thumb and index fingers were dominant in the actual movement, but in the motor imagery
tasks the index-finger movement was confused more often, while the thumb and little
finger results were maintained with respect to the actual movement; one possible reason
is their position at the extremes of the hand, which makes them easier to imagine
without confusion.
The results showed that it could be possible to apply the algorithms in a finger MI-BCI
system for one hand, and that the interface could be applied in some online BCI systems
as feedback. In the last chapter, some suggestions for future work were made, along with
possible improvements to this study and to the field of BCI.
Bibliography
[1] Q. Novi, C. Guan, T. H. Dat, and P. Xue. Sub-band common spatial pattern
(sbcsp) for brain-computer interface. proc. 3rd int. ieee/embs conf. neural engineer-
ing, cne'07, 2007, ieee. page 204–207.
[2] Ye Liu, Hao Zhang, Min Chen, and Liqing Zhang. A boosting-based spatial-spectral
model for stroke patients’ eeg analysis in rehabilitation training. ieee transactions
on neural systems and rehabilitation engineering, vol. 24, no. 1, january 2016. pages
169–179, .
[3] Anatomy and physiology: Midbrain hindbrain anatomy and physiology
of the brain forebrain cerebral cortex. URL http://www.lnpress.com/
anatomy-and-physiology-of-the-brain-human-of-nervous-system-pdf-powerpoint-quiz-spinal-cord-function/
midbrain-hindbrain-anatomy-and-physiology-of-the-brain-forebrain-cerebral-cortex/.
[4] Central nervous system. URL http://www.highlands.edu/academics/
divisions/scipe/biology/faculty/harnden/2121/notes/cns.htm.
[5] British Neuroscience Association European Dana Alliance for the Brain. Science of
the brain. An introduction for young students.
[6] Allen and Barres. Nature. 2009 Feb 5;457(7230):675-7. doi: 10.1038/457675a.
[7] Nucleus Medical Media. Medical illustration. URL www.nucleuscatalog.com.
[8] G. Pfurtscheller, Lopes da Silva, and F. H. Summarized by Yatin Mahajan. Event-
related eeg/meg synchronization and desynchronization: Basic principles. (1999).
clinical neurophysiology. pages 110 1842–1857, .
[9] Veronika Bartosova, Oldrich Vysata, and Ales Prochazka. Graphical user inter-
face for eeg signal segmentation. institute of chemical technology, department of
computing and control engineering. pages 110 1842–1857.
[10] Wolpaw JR1, Birbaumer N, McFarland DJ, Pfurtscheller G, and Vaughan TM.
Brain-computer interface for communication and control. clin neurophysiol. 2002
jun. pages 113(6):767–91.
[11] L. F. Nicolas-Alonso and J. Gomez-Gil. Brain computer interfaces: a review. sen-
sors, vol. 12, no. 2. 2012. pages 1211–1279.
[12] Motor imagery and direct brain–computer communication. proceedings of the ieee,
vol. 89, no. 7, july 2001.
[13] Case Western University PH Peckham, Cleveland FES Center. Second-generation
implantable fes system for upper limbs. URL http://www.nature.com/sc/
journal/v46/n4/images/3102091f4.jpg.
[14] Sebastian Raschka. Linear discriminant analysis. aug 3, 2014. URL http:
//sebastianraschka.com/Articles/2014_python_lda.html.
[15] C. Brunner, R. Leeb, G.R. Muller-Putz, A.Schlogl, and G-Pfurtscheller. Bci com-
petition iv 2008 – dataset iia.
[16] Suwicha Jirayucharoensak, Setha Pan-Ngum, and Pasin Israsena. Eeg-based emo-
tion recognition using deep learning network with principal component base covari-
ate shift adaptation .hindawi publishing corporation. the scientific world journal
volume 2014, article id 627892, 10 pages, http://dx.doi.org/10.1155/2014/627892.
[17] G.usbamp brochure. URL https://www.neurospec.com/Content/Uploads/1002/
NEUROSPEC%20-%20g.USBamp.pdf.
[18] Stroke Association. State of the nation, stroke statistics. .
[19] Society for neuroscience. Brain facts: A primer on the brain and ner-
vous system. URL http://www.brainfacts.org/~/media/Brainfacts/Article%
20Multimedia/About%20Neuroscience/Brain%20Facts%20book.ashx.
[20] Mayfield clinic and spine institute. Anatomy of the brain. URL
http://www.uni-heidelberg.de/md/izn/teaching/neuroscience/img/
neuroscience-of-the-brain-english.pdf.
[21] Lauren Reed-Guy. Medically reviewed by Deborah Weatherspoon. Brain disorders.
[22] Stroke Association. Stroke for employers. march 2015. .
[23] Mayo Clinic Staff. Stroke rehabilitation: What to expect as you recover. URL
www.mayoclinic.org.
[24] Northstar Neurscience. Muscle weakness after stroke: Hemiparesis. nsa hemiparesis
brochure.
[25] Bruce H. and Dobkin. J. Bci technology as a tool to augment plasticity and out-
comes for neurological rehabilitation. physiol. 2007. pages 579;637–642.
[26] Joseph N. Mak and Jonathan R. Wolpaw. Clinical applications of
brain-computer interfaces: current state and future prospects. ieee rev biomed eng.
2009. page 2: 187–199.
[27] Kai Keng Ang* and Cuntai Guan. Brain-computer interface in stroke rehabilitation.
journal of computing science and engineering, vol. 7, no. 2, june 2013. pages pp.
139–146.
[28] S. Silvoni, A. Ramos-Murguialday, M. Cavinato, C. Volpato, G. Cisotto, A. Turolla,
F. Piccione, and N. Birbaumer. Brain-computer interface in stroke: a review of
progress. clinical eeg and neuroscience, vol. 42, no. 4. 2011. pages 245–252.
[29] G. Pfurtscheller, Lopes da Silva, and F. H. Event-related eeg/meg synchronization
and desynchronization: Basic principles. (1999). clinical neurophysiology. pages 110
1842–1857, .
[30] Hong-Gi Yeom and Kwee-Bo Sim. Ers and erd analysis during the imaginary move-
ment of arms. international conference on control, automation and systems 2008 oct.
2008 in coex, seoul, korea. pages 14–17.
[31] A.K. Mohamed, T. Marwala, and L.R. John. Single-trial eeg discrimination between
wrist and finger movement imagery and execution in a sensorimotor bci. 33rd annual
international conference of the ieee embs boston, massachusetts usa, august 30 -
september 3, 2011. .
[32] J. Kubanek, K.J. Miller, J.G. Ojemann, J.R. Wolpaw, and G. Schalk. Decoding
flexion of individual fingers using electrocorticographic signals in humans. j neural
eng. 2009 december. page 6(6): 066001, .
[33] Fei Meng, Kai yu Tong, Suk tak Chan, Wan wa Wong, Ka him Lui, Kwok wing
Tang, Xiaorong Gao, and Shangkai Gao. Bci-fes training system design and
implementation for rehabilitation of stroke patients. 2008 ieee.
[34] J. E. Sullivan and L. D. Hedman. Effects of home-based sensory and motor ampli-
tude electrical stimulation on arm dysfunction in chronic stroke. clin. neurophysiol.,
vol. 21. 2007. pages pp. 142–150.
[35] Guyton and John Hall. Textbook of Medical Physiology. elsevier health sci-
ences. 13th edition, may 31, 2015.
[36] M.R. Popovic and T.A. Thrasher. Neuroprostheses. encyclopedia of biomaterials
and biomedical engineering, g.e. wnek and g.l. bowlin, eds.: Marcel dekker, inc.,
vol. 2. 2004. pages 1056–1065.
[37] Multiple sclerosis Trust. Functional electrical stimulation (fes). URL https://
www.mstrust.org.uk/a-z/functional-electrical-stimulation-fes.
[38] Pfurtscheller G, Muller GR, Pfurtscheller J, Gerner HJ, and Rupp R. ‘thought’-
control of functional electrical stimulation to restore hand grasp in a patient with
tetraplegia. neurosci lett 2003. pages 351(1): 33–36, .
[39] Hochberg LR, Serruya MD, Friehs GM, Mukand JA, Saleh M, and Caplan AH
et al. Neuronal ensemble control of prosthetic devices by a human with tetraplegia.
nature 2006;. pages 442(7099): 164–171.
[40] A. Mohapp, R. Scherer, C. Keinrath, P. Grieshofer, G. Pfurtscheller, and C. Ne-
uper. Single-trial eeg classification of executed and imagined hand movements in
hemiparetic stroke patients. international brain-computer interface workshop and
training course, graz, austria, 2006. pages 80–81.
[41] Bai O, Lin P, Vorbach S, Floeter MK, Hattori N, and Hallett M. A high performance
sensorimotor beta rhythm-based brain-computer interface associated with human
natural motor behavior. j neural eng 2008. pages 5(1): 24–35.
[42] E. Buch, C. Weber, L. G. Cohen, C. Braun, M. A. Dimyan, T. Ard, J. Mellinger,
A. Caria, S. Soekadar, A. Fourkas, and N. Birbaumer. Think to move: a neuro-
magnetic brain-computer interface (bci) system for chronic stroke. stroke, vol. 39,
no. 3. 2008. pages pp. 910–917.
[43] K. K. Ang, C. Guan, K. S. G. Chua, B. T. Ang, C. W. K. Kuah, C. Wang, K. S.
Phua, Z. Y. Chin, and H. Zhang. A clinical study of motor imagery-based brain-
computer interface for upper limb robotic rehabilitation. in proceedings of the 31st
annual international conference of the ieee engineeringin medicine and biology so-
ciety, minneapolis, mn, 2009. pages 5981–5984, .
[44] Ang KK, Guan C, Chua KS, Ang BT, Kuah C, and Wang C et al. Clinical study of
neurorehabilitation in stroke using eeg-based motor imagery braincomputer inter-
face with robotic feedback. conf proc ieee eng med biol soc 2010. pages 1: 5549–5552.
[45] Meng F, Tong K, Chan S, Wong W, Lui K, Tang K, and Gao S. Bci-fes training
system design and implementation for rehabilitation of stroke patients. proceedings
of the ijcnn. 2008. pages 4102–4105.
[46] J. J. Daly, R. Cheng, J. Rogers, K. Litinas, K. Hrovat, and M. Dohring. Feasibility of a new application of noninvasive brain computer interface (BCI): a case study of training for recovery of volitional motor control after stroke. Journal of Neurologic Physical Therapy, vol. 33, no. 4, 2009, pages 203-211.
[47] M. Takahashi, M. Gouko, and K. Ito. Functional electrical stimulation (FES) effects for event-related desynchronization (ERD) on foot motor area. ICME International Conference on Complex Medical Engineering, 2009, pages 1-6.
[48] H. G. Tan, K. H. Kong, C. Y. Shee, C. C. Wang, C. T. Guan, and W. T. Ang. Post-acute stroke patients use brain-computer interface to activate electrical stimulation. Engineering in Medicine and Biology Society (EMBC), 2010 Annual International Conference of the IEEE, 2010, pages 4234-4237.
[49] W. K. Tam, K. Y. Tong, F. Meng, and S. K. Gao. A minimal set of electrodes for motor imagery BCI to control an assistive device in chronic stroke subjects: a multi-session study. IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 19, 2011, pages 617-627.
[50] G. Prasad, P. Herman, D. Coyle, S. McDonough, and J. Crosbie. Applying a brain-computer interface to support motor imagery practice in people with stroke for upper limb recovery: a feasibility study. Journal of NeuroEngineering and Rehabilitation, 7(1):60, 2010.
[51] M. Tavella, R. Leeb, R. Rupp, and J. del R. Millan. Towards natural non-invasive hand neuroprostheses for daily living. Conf. Proc. IEEE Eng. Med. Biol. Soc., 2010, pages 126-129.
[52] D. Broetz, C. Braun, C. Weber, S. R. Soekadar, A. Caria, and N. Birbaumer. Combination of brain-computer interface training and goal-directed physical therapy in chronic stroke: a case report. Neurorehabilitation and Neural Repair, 24(7):674-679, 2010.
[53] A. Caria, C. Weber, D. Brotz, A. Ramos-Murguialday, L. F. Ticini, A. Gharabaghi, C. Braun, and N. Birbaumer. Chronic stroke recovery after combined BCI training and physiotherapy: a case report. Psychophysiology, 48(4):578-582, 2011.
[54] K. K. Ang, C. Guan, K. S. G. Chua, B. T. Ang, C. W. K. Kuah, C. Wang, K. S. Phua, Z. Y. Chin, and H. Zhang. A large clinical study on the ability of stroke patients to use EEG-based motor imagery brain-computer interface. Clinical EEG and Neuroscience, vol. 42, no. 4, 2011, pages 253-258.
[55] K. K. Ang, C. Guan, K. S. Phua, C. Wang, I. Teh, C. W. Chen, and E. Chew. Transcranial direct current stimulation and EEG-based motor imagery BCI for upper limb stroke rehabilitation. In Proceedings of the 34th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, 2012, pages 4128-4131.
[56] Y. Kasashima, T. Fujiwara, Y. Matsushika, T. Tsuji, K. Hase, J. Ushiyama, J. Ushiba, and M. G. Liu. Modulation of event-related desynchronization during motor imagery with transcranial direct current stimulation (tDCS) in patients with chronic hemiparetic stroke. Experimental Brain Research, vol. 221, no. 3, 2012, pages 263-268.
[57] M. Mihara, N. Hattori, M. Hatakenaka, H. Yagura, T. Kawano, T. Hino, and I. Miyai. Near-infrared spectroscopy-mediated neurofeedback enhances efficacy of motor imagery-based training in poststroke victims: a pilot study. Stroke, vol. 44, no. 4, 2013, pages 1091-1098.
[58] B. Liu, M. Wang, T. Li, and Z. Liu. Identification and classification for finger movement based on EEG. Proceedings of the 27th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2005, pages 5408-5411.
[59] O. Bai, Z. Mari, S. Vorbach, and M. Hallett. Asymmetric spatiotemporal patterns of event-related desynchronization preceding voluntary sequential finger movements: a high-resolution EEG study. Clinical Neurophysiology, 116, 2005, pages 1213-1221.
[60] J. Lehtonen, P. Jylanki, L. Kauhanen, and M. Sams. Online classification of single EEG trials during finger movements. IEEE Transactions on Biomedical Engineering, 55(2), 2008, pages 713-720.
[61] L. Xiang, Y. Dezhong, D. Wu, and L. Chaoyi. Combining spatial filters for the classification of single-trial EEG in a finger movement task. IEEE Transactions on Biomedical Engineering, 54, 2007, pages 821-831.
[62] J. Kubanek, K. J. Miller, J. G. Ojemann, J. R. Wolpaw, and G. Schalk. Decoding flexion of individual fingers using electrocorticographic signals in humans. Journal of Neural Engineering, vol. 6, no. 6, 2009, page 066001.
[63] A. K. Mohamed, T. Marwala, and L. R. John. Single-trial EEG discrimination between wrist and finger movement imagery and execution in a sensorimotor BCI. 33rd Annual International Conference of the IEEE EMBS, pages 6289-6293.
[64] R. Xiao and L. Ding. Evaluation of EEG features in decoding individual finger movements from one hand. Computational and Mathematical Methods in Medicine, 2013.
[65] R. Xiao and L. Ding. EEG resolutions in detecting and decoding finger movements from spectral analysis. Frontiers in Neuroscience, vol. 9, 2015, page 308.
[66] S. H. Lee, S. H. Jin, G. Jang, Y. J. Lee, J. An, and H. K. Shik. Cortical activation pattern for finger movement: a feasibility study towards an fNIRS-based BCI. In Control Conference (ASCC), 2015 10th Asian, pages 1-4.
[67] H. Ramoser, J. Muller-Gerking, and G. Pfurtscheller. Optimal spatial filtering of single trial EEG during imagined hand movement. IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 3, May 2000, pages 441-446.
[68] G. Blanchard and B. Blankertz. BCI competition 2003 - data set IIa: spatial patterns of self-controlled brain rhythm modulations. IEEE Transactions on Biomedical Engineering, vol. 51, no. 6, June 2004, pages 1062-1066.
[69] H. Yang, C. Guan, K. K. Ang, K. S. Phua, and C. Wang. Selection of effective EEG channels in brain computer interfaces based on inconsistencies of classifiers. Proc. EMBC 2014, IEEE, 2014, pages 672-675.
[70] Z. Y. Chin, K. K. Ang, C. Wang, and C. Guan. Discriminative channel addition and reduction for filter bank common spatial pattern in motor imagery BCI. Proc. EMBC 2014, 2014, pages 1310-1313.
[71] S. Lemm, B. Blankertz, G. Curio, and K.-R. Muller. Spatio-spectral filters for improving the classification of single trial EEG. IEEE Transactions on Biomedical Engineering, vol. 52, no. 9, Sep. 2005, pages 1541-1548.
[72] G. Dornhege, B. Blankertz, M. Krauledat, F. Losch, G. Curio, and K.-R. Muller. Combined optimization of spatial and temporal filters for improving brain-computer interfacing. IEEE Transactions on Biomedical Engineering, vol. 53, no. 11, Nov. 2006, pages 2274-2281.
[73] W. Wu, X. Gao, B. Hong, and S. Gao. Classifying single-trial EEG during motor imagery by iterative spatio-spectral patterns learning (ISSPL). IEEE Transactions on Biomedical Engineering, vol. 55, no. 6, Jun. 2008, pages 1733-1743.
[74] K. K. Ang, Z. Y. Chin, H. Zhang, and C. Guan. Mutual information-based selection of optimal spatial-temporal patterns for single-trial EEG-based BCIs. Pattern Recognition, vol. 45, no. 6, 2012, pages 2137-2144.
[75] T. Zhang and B. Yu. Boosting with early stopping: convergence and consistency. Annals of Statistics, 2005, pages 1538-1579.
[76] Berlin BCI group: Fraunhofer FIRST, Intelligent Data Analysis Group (Klaus-Robert Muller, Benjamin Blankertz), and Neurophysics Group (Gabriel Curio), Campus Benjamin Franklin of the Charite University Medicine Berlin, Department of Neurology. Data set IVa: motor imagery, small training sets. BCI Competition III, 2005.
[77] ICT Games. Hand images. URL http://www.ictgames.com/resources.html.
Appendix A
Questionnaire
Figure A.1: Questionnaire: Before the trial
Figure A.2: Questionnaire: After the trial
Appendix B
MATLAB Code
B.1 Interface
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% Interface.m starts and controls the interface timing. It also saves a
% .mat file with the targets information.
%
% INPUT:  "where"  folder where the output file is saved
%         "who"    name of the subject
%         "what"   trial information: real or imagery movement
%
% OUTPUT: "info"   file with all the targets information
%
% Author: Angel Post Vinuesa
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
clear; clc

%% File information
where = 'recordings/';   % folder
who   = 'Angel Post';    % name of the subject
what  = 'TargetsMI';     % MI or real

% Create folder and prepare file name
space_location = find(who == ' ');
folder = strcat(where, who, '/', what, '/');
if ~exist(folder, 'dir')
    mkdir(folder);
end
file = strcat(folder, who, what, '.mat');   % name of the file

markers = [];   % initialize markers
classes = [];   % initialize classes
dedo    = 0;    % initialize dedo (finger)

% Send the order to start to the recording script
fprintf('Sending the order')
judp('send', 12000, '127.0.0.1', int8('Sip'))

% Address of the interface
url = 'C:/Users/User/Desktop/scripts/Scripts/mios/funny_fingers_v2.htm';
web(url);        % open the interface in a browser

% This is to start the task
pause(6.089);    % wait about 6 seconds
S = 0;           % initialize timer

for i = 1:2                          % blocks of trials (each block shows every finger once)
    finger_position = randperm(5);   % random order of finger appearance
    for k = 1:5                      % for each finger
        tic
        dedo = finger_position(k);   % finger to move
        Mover(dedo);                 % finger movement
        markers = [markers S];       % save the time of the task
        classes = [classes dedo];    % save the finger number
        S = S + toc;                 % update the timer
    end
end

%% Save file
info = struct('Name', file, ...
              'Subject', who, ...
              'Start_target', markers, ...
              'Targets', classes);

save(file, 'info');
fprintf('Finished\n')
Code B.1: interface
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% Mover.m controls the interface by driving a virtual mouse.
%
% INPUT: "dedo"  finger to move
%
% Author: Angel Post Vinuesa
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [] = Mover(dedo)

import java.awt.Robot;
import java.awt.event.*;

robot = Robot;   % initialize robot

% Finger goes down
switch dedo
    case 1
        robot.mouseMove(2500, 800);   % mouse moves to this part of the screen
    case 2
        robot.mouseMove(2600, 680);
    case 3
        robot.mouseMove(2680, 680);
    case 4
        robot.mouseMove(2750, 720);
    case 5
        robot.mouseMove(2840, 730);
    otherwise
        fprintf('error moving');
end

% Mouse click on the screen
robot.mousePress(InputEvent.BUTTON1_MASK);
robot.mouseRelease(InputEvent.BUTTON1_MASK);
robot.keyPress(java.awt.event.KeyEvent.VK_SHIFT)
robot.keyPress(java.awt.event.KeyEvent.VK_W)
robot.keyRelease(java.awt.event.KeyEvent.VK_W)
robot.keyRelease(java.awt.event.KeyEvent.VK_SHIFT)

% Finger goes up
pause(0.5);

switch dedo
    case 1
        robot.mouseMove(2500, 800);
    case 2
        robot.mouseMove(2600, 680);
    case 3
        robot.mouseMove(2680, 680);
    case 4
        robot.mouseMove(2750, 720);
    case 5
        robot.mouseMove(2840, 730);
    otherwise
        fprintf('error moving');
end

% Mouse click on the screen
robot.mousePress(InputEvent.BUTTON1_MASK);
robot.mouseRelease(InputEvent.BUTTON1_MASK);
robot.keyPress(java.awt.event.KeyEvent.VK_SHIFT)
robot.keyPress(java.awt.event.KeyEvent.VK_W)
robot.keyRelease(java.awt.event.KeyEvent.VK_W)
robot.keyRelease(java.awt.event.KeyEvent.VK_SHIFT)
pause(1.5);

end
Code B.2: Mover
B.2 Signal processing
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% modela.m is the root of the SBCSP algorithm for 2 classes.
% Some functions are based on open source libraries from the Toyota
% Technological Institute at Chicago. These functions, used in SBCSP, are:
% propertylist2struct.m, cutoutTrials.m, ispropertystruct.m,
% orthogonalize.m, lbfgs.m, logreg.m, losses.m and set_default.m.
% They can be found at www.ttic.edu
%
% Author: Angel Post
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

clc; clear all;
addpath('BCIII');
data     = 'data_set_IVa_%s.mat';
datatrue = 'true_labels_%s.mat';
subjects = {'aa', 'al', 'av', 'aw', 'ay'};
screen   = [500 3 500];

%% Reduced set of 168 channels
lambda = exp(linspace(log(0.01), log(100), 20));
for i = 1:length(subjects)
    fprintf('Subject: %s\n', subjects{i});
    %% Load a subject dataset and preprocess
    load(sprintf(data, subjects{i}));
    [numT, numCh, nTrl] = size(cnt);
    % Reduced set of channels (49)
    channels = [(14:22), (33:39), (50:58), (68:76), (87:95), (104:2:114)];
    %% Select channels and convert cnt into double
    cnt  = 0.1*double(cnt(:, channels));
    clab = nfo.clab(channels);
    cab  = length(clab);
    %% Apply a subband filter for distributing subbands
    [cn1, cn2, cn3, cn4] = filterbank(cnt);
    % Reduce the set of samples by windowing
    cn1sb = cutoutTrials(cn1, mrk.pos, screen, nfo.fs);
    cn2sb = cutoutTrials(cn2, mrk.pos, screen, nfo.fs);
    cn3sb = cutoutTrials(cn3, mrk.pos, screen, nfo.fs);
    cn4sb = cutoutTrials(cn4, mrk.pos, screen, nfo.fs);
    %% Apply CSP within subbands
    X1 = covariance(cn1sb);
    X2 = covariance(cn2sb);
    X3 = covariance(cn3sb);
    X4 = covariance(cn4sb);
    Xi = max(X1, X4);
    % Convert {1,2} -> {-1, 1}
    Y = (mrk.y - 1.5)*2;
    % LDA process
    % Find indices of the training and test sets
    idx_train = find(~isnan(Y));
    idx_test  = find(isnan(Y));
    % Whiten the training data
    Xtr = Xi(:, :, idx_train);
    Ytr = Y(idx_train);
    load(sprintf(datatrue, subjects{i}));
    true_y = (true_y(idx_test) - 1.5)*2;
    Xte = Xi(:, :, idx_test);

    for ii = 1:length(lambda)
        fprintf('lambda=%g\n', lambda(ii));
        [W, bias, stat] = logreg(Xtr, Ytr, lambda(ii));
        repo(i, ii).lambda = lambda(ii);
        repo(i, ii).cls    = struct('W', W, 'bias', bias, 'Ww', eye(cab));
        repo(i, ii).out    = applying_logreg(Xte, repo(i, ii).cls);
        repo(i, ii).loss   = losses(true_y, repo(i, ii).out);
        fprintf('accuracy=%g\n', 1 - repo(i, ii).loss);
    end
    loss = zeros(size(repo));
    for ii = 1:prod(size(repo))
        loss(ii) = repo(ii).loss;
    end
end

figure,
plot(log(lambda), 100*(1 - loss)', 'linewidth', 1)
set(gca, 'fontsize', 14)
set(gca, 'xtick', log(0.01):log(10):log(100))
set(gca, 'xticklabel', {'0.01', '0.1', '1.0', '10', '100'})
grid on;
hold on;
plot(log(lambda), 100*(1 - mean(loss)), 'color', [.7 .7 .7], 'linewidth', 2);
leg = [subjects, {'average'}];
legend(leg);
xlabel('Regularization constant \lambda')
ylabel('Classification accuracy')
hold off;

%%%%%%%%%%%%%

function [data_1, data_2, data_3, data_4] = filterbank(data)
%% Filter bank implementation in MATLAB. The code passes the given signal
% through four band-pass Butterworth filters.
% fnorm should be between (0, 1): band edges normalized by the Nyquist frequency
fnorm1 = [0.1 3.5]/30;   % lower and upper cutoffs of the first band
% [b, a] = butter(n, Wn)  % n = order of the filter, Wn the normalized cutoff
[b1, a1] = butter(10, fnorm1);     % here the order of the filter is 10
data_1 = filtfilt(b1, a1, data);   % band-pass filter

fnorm2 = [3.5 8]/30;
[b2, a2] = butter(10, fnorm2);
data_2 = filtfilt(b2, a2, data);

fnorm3 = [8 14]/30;
[b3, a3] = butter(10, fnorm3);
data_3 = filtfilt(b3, a3, data);

fnorm4 = [14 29]/30;
[b4, a4] = butter(10, fnorm4);
data_4 = filtfilt(b4, a4, data);
end

%%%%%%%%%%%%%
Code B.3: modela (SBCSP algorithm for two classes)
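The core of the per-sub-band processing above is the CSP step: a generalized eigenvalue problem on the two class covariance matrices, followed by log-variance (band-power) features. As a cross-check outside MATLAB, a minimal NumPy sketch of that computation could look as follows; the function names `csp_filters` and `logvar_features` are illustrative and not part of the thesis code:

```python
import numpy as np

def csp_filters(C_left, C_right, n_pairs=3):
    """CSP via whitening: solve the generalized eigenproblem for C_left
    against the composite covariance C_left + C_right, and keep the
    n_pairs most and n_pairs least discriminative spatial filters."""
    Cc = C_left + C_right
    # Whitening transform P so that P @ Cc @ P.T = I
    d, U = np.linalg.eigh(Cc)
    P = U @ np.diag(1.0 / np.sqrt(d)) @ U.T
    # Eigen-decomposition of the whitened left-class covariance
    e, V = np.linalg.eigh(P @ C_left @ P.T)
    order = np.argsort(e)[::-1]          # sort filters by decreasing eigenvalue
    W = V[:, order].T @ P                # rows of W are spatial filters
    keep = np.r_[0:n_pairs, W.shape[0] - n_pairs:W.shape[0]]
    return W[keep]

def logvar_features(W, X):
    """Log-variance (band-power) features of a spatially filtered trial.
    X is channels x samples, as in the MATLAB listings."""
    Z = W @ X
    return np.log(np.var(Z, axis=1))
```

By construction the full filter set jointly diagonalizes both class covariances, which is the property the MATLAB `eig(cleft, cleft + cright)` call relies on.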
Appendix B: MATLAB code 100
2 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
4 % i n i t a l g .m i s the root o f the SBCSP and CSSBP algor i thms implemented .
%Some func t i on s were used based on open source l i b r a r i e s from the Toyota
Techno log i ca l I n s t i t u t e at Chicago
6 %These f unc t i on s are used in SBCSP and are : p r o p e r t y l i s t 2 s t r u c t .m,
cu tou tTr i a l s .m, i s p r op e r t y s t r u c t .m, o r thogona l i z e .m, l b f g s .m, l o g r e g .m
%There can be found in www. t t i c . edu
8 %Other f unc t i on s o f CSSBP are based on B i o s s i g . These f unc t i on s are : eogi ,
e o g i i , e o g i i i and segment . %There can be found in http :// b i o s i g .
s ou r c e f o r g e . net /
% Author : Angel Post
10 %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
12 c l c
c l e a r a l l
14 c l o s e a l l
d i sp ( ’ ’ ) ;
16 di sp ( ’ ∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗ ’ ) ;d i sp ( ’ ∗ CSSBP and SBCSP Algorithms ∗ ’ ) ;
18 di sp ( ’ ∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗ ’ ) ;
20 %% DATA READ AND ACQUISITION
% Read data f i l e and automat i ca l l y c r e a t e param s t ru c tu r e .
22 data
varsworkspace=whos ;
24 num var iables=length ( varsworkspace ) ;
f o r i =1: num var iables
26 eva l ( [ ’ param . ’ , varsworkspace ( i ) . name , ’= ’ , varsworkspace ( i ) . name , ’ ; ’ ] ) ;
end
28 f o r subj =1:1:9
[ accuracy ( subj ) ]= s e t t i n g s ( subj , param)
30 end
r=mean( accuracy ) ;
32 f p r i n t f ( ’The mean o f the accuracy i s %f \n ’ , r ) ;
34 f i g u r e (1 )
bar ( accuracy )
36 x l ab e l ( ’ Sub jec t s ’ )
y l ab e l ( ’ C l a s s i f i c a t i o n accuracy ’ )
38
40 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
func t i on [ accuracy ] = s e t t i n g s ( sub ject , param)
42 %% s e t t i n g s takes parameters from i n i t a l g
% and re tu rn s the a c cu r a c i e s o f sub j e c t image r i e s
Appendix B: MATLAB code 101
44 % Bandpass F i l t e r S e l e c t i o n
v a r i a b l e s=f i e ldnames (param) ;
46 longparam=length ( v a r i a b l e s ) ;
f o r i =1: longparam
48 eva l ( [ v a r i a b l e s { i } , ’=param . ’ , v a r i a b l e s { i } , ’ ; ’ ] ) ;end
50 switch F
case 0
52 case 1
bpFi l t = d e s i g n f i l t ( ’ bandpas s f i r ’ , ’ StopbandFrequency1 ’ , 7 , . . .
54 ’ PassbandFrequency1 ’ , 8 , ’ PassbandFrequency2 ’ , 30 , . . .
’ StopbandFrequency2 ’ , 32 , ’ StopbandAttenuation1 ’ , 60 , . . .
56 ’ PassbandRipple ’ , 1 , ’ StopbandAttenuation2 ’ , 40 , . . .
’ SampleRate ’ , 250 , ’ DesignMethod ’ , ’ ka i s e rw in ’ ) ;
58 case 2
bpFi l t = d e s i g n f i l t ( ’ bandpas s i i r ’ , ’ StopbandFrequency1 ’ , 7 , . . .
60 ’ PassbandFrequency1 ’ , 8 , ’ PassbandFrequency2 ’ , 30 , . . .
’ StopbandFrequency2 ’ , 32 , ’ StopbandAttenuation1 ’ , 60 , . . .
62 ’ PassbandRipple ’ , 1 , ’ StopbandAttenuation2 ’ , 40 , . . .
’ SampleRate ’ , 250 , ’ MatchExactly ’ , ’ passband ’ ) ;
64 case 3
N = 6 ; % Order
66 Fpass1 = 8 ; % F i r s t Passband Frequency
Fpass2 = 30 ; % Second Passband Frequency
68 Apass = 1 ; % Passband Ripple (dB)
Fs = 250 ; % Sampling Frequency
70 h = fd e s i gn . bandpass ( ’n , fp1 , fp2 , ap ’ , N, Fpass1 , Fpass2 , . . .
Apass , Fs ) ;
72 Hd = des ign (h , ’ cheby1 ’ , ’ SOSScaleNorm ’ , ’ L in f ’ ) ;
74
end
76 % STEP 1 − Data Loading ++++++++++++++++++++++++++++++++++++++++++++++++
% Loading the data
78 [ s ,HDR] = dataload ( subject , 1 ) ;
[ s2 ,HDR2] = dataload ( subject , 0 ) ;
80 load ( s t r c a t ( ’ .\ t r u e l a b e l s \ ’ , ’A0 ’ , num2str ( sub j e c t ) , ’E .mat ’ ) ) ;
% Creat ing the c l a s s l a b e l composite from a l l data
82 c l a s s t o t a l = ve r t ca t (HDR. C la s s l abe l , c l a s s l a b e l ) ;
C l a s s l a b e l=c l a s s t o t a l ( c l a s s t o t a l == 1 | c l a s s t o t a l == 2) ;
84 % STEP 2 − EOG Correc t ion ++++++++++++++++++++++++++++++++++++++++++++++
% EOG co r r e c t i o n i s performed accord ing to the s e l e c t e d mode , and EEG
86 % s i g n a l i s returned
switch EOG
88 case 0
% EOG co r r e c t i o n i s not performed
90 eegchan = HDR.CHANTYP==’E ’ ;
eegchan2 = HDR2.CHANTYP==’E ’ ;
Appendix B: MATLAB code 102
92 s3=s ( : , eegchan ) ;
s4=s2 ( : , eegchan2 ) ;
94 case 1
s3 = EOGi( s ,HDR ) ;
96 s4 = EOGi( s2 ,HDR ) ;
case 2
98 s3 = EOGii ( s ,HDR ) ;
s4 = EOGii ( s2 ,HDR ) ;
100 case 3
s3 = s ;
102 s4 = s2 ;
end
104 % Nan replacement , with 0 va lue s ( other rep lacements can be explore ,
% however , replacement f o r 0 i s the one that performed best in
106 % s imu la t i on s )
s3 ( i snan ( s3 ) )=0;
108 s4 ( i snan ( s4 ) )=0;
% Segmentation
110 [ X MI3 , szMI3 , ˜ , ˜ ] = segment (HDR. C la s s l abe l ,HDR, s3 , MInit , . . .
MIend ) ;
112 [ X MI4 , szMI4 , ˜ , ˜ ] = segment ( c l a s s l a b e l ,HDR2, s4 , MInit , MIend ) ;
% number o f samples in an epoched
114 MIlong=1+(MIend−MInit ) ∗HDR. SampleRate ;
i f EOG == 3 %Here , EOG = 3 , r e g r e s s i o n f o r i nd i v i dua l epochs , i s performed
116 X MI3=EOGiii ( szMI3 (3 ) ,X MI3 , MIlong ,HDR) ’ ;
X MI4=EOGiii ( szMI4 (3 ) ,X MI4 , MIlong ,HDR2) ’ ;
118 end
X MI=horzcat (X MI3 , X MI4) ; %Fina l ly , the s i g n a l i s cons t ructed mergin
120 % the data from both s t r u c t u r e s .
% STEP 4 − F i l t e r i n g +++++++++++++++++++++++++++++++++++++++++++++++++++
122 switch F
case {1 ,2}124 X MI= f i l t f i l t ( bpFi l t , X MI ’ ) ;
case 0
126 X MI=X MI ’ ;
case 3
128 X MI=f i l t e r (Hd,X MI ’ ) ;
case 4
130 [ cn1 , cn2 , cn3 , cn4 ]= f i l t e r b a n k (X MI ’ ) ;
X MI=cn2 ;
132 end
% Feature ex t r a c t i on and c l a s s i f i c a t i o n are computed .
134 ndata = length ( C l a s s l a b e l ) ;
nva l s = f l o o r ( ndata∗ nva l s ) ;136 nte s t = f l o o r ( ndata∗ nte s t ) ;
n t ra in = ndata−nvals−nte s t ;138 rng (42) ;
r1 = randperm ( ndata , n t e s t ) ;
Appendix B: MATLAB code 103
140 r1 = so r t ( r1 ) ;
Fu l l Idx=1:ndata ;
142 raux=s e t d i f f ( Ful l Idx , r1 ) ;
144 c l a s s t e s t=C l a s s l a b e l ( r1 ) ;
z=1; i i =1; X tes t= 0 ;
146 whi le z<= length ( C l a s s l a b e l )
148 x mi=X MI( i i : i i+MIlong −1 , : ) ;
i f ismember ( z , r1 )
150 i f ( X tes t == 0)
X tes t = x mi ;
152 e l s e
X tes t = ve r t ca t ( X test , x mi ) ;
154 end
end
156 z=z+1;
i i= i i+MIlong ;
158 end
160 i i i =1;
whi l e i i i <= rounds
162 rng (42+ i i i ) ;
raux2 = randperm ( length ( raux ) , nva l s ) ;
164 r2=raux ( raux2 ) ;
r2=so r t ( r2 ) ;
166 r3=s e t d i f f ( Ful l Idx , horzcat ( r1 , r2 ) ) ;
r3=so r t ( r3 ) ;
168 c l a s s t r a i n=C l a s s l a b e l ( r3 ) ;
c l a s s e v a l=C l a s s l a b e l ( r2 ) ;
170 z=1; i i =1; X tra in=0; X val=0; X train R=0; X tra in L=0;
whi l e z<= length ( C l a s s l a b e l )
172
x mi=X MI( i i : i i+MIlong −1 , : ) ;
174 i f ismember ( z , r2 )
i f ( X val == 0)
176 X val = x mi ;
e l s e
178 X val = ve r t ca t ( X val , x mi ) ;
end
180 e l s e i f ismember ( z , r3 )
i f ( X tra in == 0)
182 X tra in = x mi ;
i f C l a s s l a b e l ( z )==1
184 X tra in L = x mi ;
e l s e
186 X train R = x mi ;
end
Appendix B: MATLAB code 104
188 e l s e
X tra in = ve r t ca t ( X train , x mi ) ;
190 i f C l a s s l a b e l ( z )==1
i f ( X tra in L == 0)
192 X tra in L = x mi ;
e l s e
194 X tra in L = ve r t ca t ( X train L , x mi ) ;
end
196 e l s e
i f ( X train R == 0)
198 X train R = x mi ;
e l s e
200 X train R = ve r t ca t ( X train R , x mi ) ;
end
202 end
end
204 end
z=z+1;
206 i i= i i+MIlong ;
end
208 % Option s e l e c t i o n
switch opt ion
210 case 1
% CSP
212 [ f e a tu r e s , ˜ ,W, ˜ ] = CSP3feautex ( c l a s s t r a i n , . . .
c s p f i l t , X train , X train L , X train R , MIlong ) ;
214 [ b , theta ] = l d a t r a i n ( f e a tu r e s , c l a s s t r a i n , c s p f i l t ) ;
% Va l idat i on
216 CSP val=W∗X val ’ ;
[ f e a t u r e s ] = feautex ( c l a s s e v a l , c s p f i l t , . . .
218 CSP val , MIlong ) ;
[ accus ] = l d a t e s t (b , theta ’ , f e a tu r e s , c l a s s e v a l ) ;
220 W cel l { i i i }=W;
B( i i i )=b ;
222 THETA( i i i , : )=theta ;
ACCUS( i i i )=accus ;
224 case 2
% CSP
226 [ f e a tu r e s , ˜ ,W, ˜ ] = CSPfeautex ( c l a s s t r a i n , . . .
c s p f i l t , X train , X train L , X train R , MIlong ) ;
228 [ b , theta ] = l d a t r a i n ( f e a tu r e s , c l a s s t r a i n , c s p f i l t ) ;
% Va l idat i on
230 CSP val=W∗X val ’ ;
[ f e a t u r e s ] = feautex ( c l a s s e v a l , c s p f i l t , . . .
232 CSP val , MIlong ) ;
[ accus ] = l d a t e s t (b , theta ’ , f e a tu r e s , c l a s s e v a l ) ;
234 W cel l { i i i }=W;
B( i i i )=b ;
Appendix B: MATLAB code 105
236 THETA( i i i , : )=theta ;
ACCUS( i i i )=accus ;
238 case 4
% CSP
240 [ f e a tu r e s , ˜ ,W, ˜ ] = CSP3featex ( c l a s s t r a i n , . . .
c s p f i l t s , X train , X train L , X train R , MIlong ) ;
242 [ b , theta ] = l d a t r a i n ( f e a tu r e s , c l a s s t r a i n , c s p f i l t s ) ;
% Va l idat i on
244 CSP val=W∗X val ’ ;
[ f e a t u r e s ] = f eau t ex t ( c l a s s e v a l , c s p f i l t s , . . .
246 CSP val , MIlong ) ;
[ c spaccus ] = l d a t e s t (b , theta ’ , f e a tu r e s , c l a s s e v a l ) ;
248 W cel l { i i i }=W;
B( i i i )=b ;
250 THETA( i i i , : )=theta ;
end
252 i i i= i i i +1;
end
254 %% Test ing
[ ˜ , I ]=max(ACCUS) ;
256 theta max=THETA( I , : ) ;
b max=B( I ) ;
258 D=cat (3 , W cel l { :} ) ;W max=D( : , : , I ) ;
260 CSP test=W max∗X test ’ ;
[ f e a t u r e s ] = feautex ( c l a s s t e s t , c s p f i l t , . . .
262 CSP test , MIlong ) ;
[ accuracy ] = l d a t e s t (b max , theta max , f e a tu r e s , . . .
264 c l a s s t e s t ) ;
end
266 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
268 f unc t i on [ b , theta ] = l d a t r a i n ( f e a tu r e s , c l a s s l a b e l s , n f e a tu r e s )
% Bui ld ing LDA model
270 i =1; u1=ze ro s ( n f ea ture s , 1 ) ; u2=ze ro s ( n f ea ture s , 1 ) ; cnu=0; cnt=0;
whi l e i<=length ( f e a t u r e s )
272 i f ( c l a s s l a b e l s ( i ) == 1)
u1=u1+f e a t u r e s ( : , i ) ;
274 cnt=cnt+1;
e l s e
276 u2=u2+f e a t u r e s ( : , i ) ;
cnu=cnu+1;
278 end
i = i +1;
280 end
u1=u1/ cnt ;
282 u2=u2/cnu ;
i =1; C1=0; C2=0;
Appendix B: MATLAB code 106
284 whi le i<=length ( f e a t u r e s )
i f ( c l a s s l a b e l s ( i ) == 1)
286 C1=C1+( f e a t u r e s ( : , i )−u1 ) ∗ ( f e a t u r e s ( : , i )−u1 ) ’ ;
e l s e
288 C2=C2+( f e a t u r e s ( : , i )−u2 ) ∗ ( f e a t u r e s ( : , i )−u2 ) ’ ;
end
290 i = i +1;
end
292 theta = (C1+C2) \(u2−u1 ) ;b = ((− theta ’ ) ∗ ( u1+u2 ) ) /2 ;
294 end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
296 f unc t i on [ accuracy ] = l d a t e s t (b , theta , f e a tu r e s , c l a s s l a b e l s )
z=1; accuracy=0;
298 n t r i a l s=s i z e ( f e a tu r e s , 2 ) ;
whi l e z<=n t r i a l s
300 y=s ign ( theta ∗ f e a t u r e s ( : , z ) + b) ;
p r ed i c t ed ( z )=(y>0)+1;
302 i f c l a s s l a b e l s ( z )==pred i c t ed ( z )
accuracy=accuracy+1;
304 end
z=z+1;
306 end
accuracy=accuracy ∗100/ n t r i a l s ;
308 end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
310 f unc t i on [ f e a t u r e s ]= feautex ( c l a s s t r a i n , option , . . .
CSPMI, MIleng )
312 i 1 =1; j =1; z=1;
whi l e z<= length ( c l a s s t r a i n )
314 whi le j<=opt ion % ex t r a c t s epochs
x MI ( j , : )=CSPMI( j , i 1 : i 1+MIleng−1) ;316 j=j +1;
end
318 j =1;
i 1=i 1+MIleng ;
320 % Computes the bandpower ( i t has been p r ev i ou s l y f i l t e r e d ) f o r each
% channel
322 BP MI=log ( var ( x MI ’ ) ) ’ ;
f e a t u r e s ( : , z )=BP MI ;
324 z=z+1;
end
326 end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
328 f unc t i on [ s ,HDR] = dataload ( User , s t a r t )
%I n i t i a l i z a t i o n
330 i f s t a r t %depends on i f i s c a l l e d f o r t r a i n i n g (1 ) or v a l i d a t i o n (0 )
fname = s t r c a t ( ’A0 ’ , num2str ( User ) , ’T. gdf ’ ) ;
Appendix B: MATLAB code 107
332 [ s ,HDR] = s load ( fname , 0 ) ;
e l s e
334 fname = s t r c a t ( ’A0 ’ , num2str ( User ) , ’E . gdf ’ ) ;
[ s ,HDR] = s load ( fname , 0 ) ;
336 end
end
338 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% data .m
340 %% In t h i s s c r i p t s i t i s p o s s i b l e to change the opt ions o f p r ep ro c e s s i ng
and pro c e s s i ng the s i g n a l .
%% This i s a b r i e f exp lanat ion o f the a lgor i thm
342 % At f i r s t , the EOG data i s preproce s s ed :
% EOG = 0 ; % a zero means , that the re i s not c o r r e c t i o n
344 EOG = 1 ; % EOG data i s handled by r e g r e s s i o n con s i d e r i ng the EOG
t r a i n i n g s e t
% EOG = 2 ; % EOG data i s handled by r e g r e s s i o n con s i d e r i ng the whole
EOG
346 % EOG = 3 ; % EOG data i s handled by r e g r e s s i o n s i n g l e epoched
F = 3 ;
348 % F = 0 ; % The s i g n a l i s not f i l t e r e d
% F = 1 ; % The s i g n a l i s f i l t e r e d with a high order k a i s e r window
350 % F = 2 ; % The s i g n a l i s f i l t e r e d with a 66 order k a i s e r window
% F = 3 ; % The s i g n a l i s f i l t e r e d with a IIR Chebyshev f i l t e r
352 opt ion = 1 ; % CSP+LDA approach i s app l i ed
% opt ion = 4 ; % Trains both CSP+LDA new approach and MDRM,
354 c s p f i l t= 6 ; % Number o f CSP f i l t e r s .
%% data s p l i t t i n g ( t ra in , t e s t and va l i d a t i o n s e t s )
356 % datase t f o r t r a i n i n g (%)
nt ra in= 0 . 7 ;
358 % datase t f o r t e s t i n g (%)
nt e s t = 0 . 2 ;
360 % datase t f o r v a l i d a t i n g (%)
nva l s = 0 . 1 ;
362 rounds= 10 ;
% time in seconds
364 MInit = 3 ;
MIend = 5 . 5 ;
366 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
func t i on [ f e a tu r e s , CSP MI ,W,D]=CSPfeautex ( c l a s s t r a i n , . . .
368 option , X train , Xtrain L , Xtrain R , MIlong )
c l e f t = cov ( Xtrain L ) ;
370 c r i g h t= cov ( Xtrain R ) ;
[W,D] = e i g ( c l e f t , c l e f t+c r i g h t ) ;
372 W=W’ ;
% Sor t ing i s not necessary , as the returned data i s a l r eady so r t ed
374 i f (mod( option , 2 )==0)
W=W( [ 1 : ( opt ion /2) ( end−(( opt ion /2)−1) ) : end ] , : ) ;
376 e l s e
Appendix B: MATLAB code 108
W=W( [ 1 : ( ( opt ion /2) +0.5) ( end−((( opt ion /2) +0.5)−1) ) : end ] , : ) ;
378 end
CSP MI=W∗X train ’ ;
380 % BandPower ex t r a c t i on loop
f e a t u r e s = feautex ( c l a s s t r a i n , option ,CSP MI , MIlong ) ;
382 end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function [features, CSP_MI, W, D] = CSP3feautex(class_train, ...
    option, X_train, X_left, X_right, MIlong)
i1 = 1; j = 1; z = 1; Cl = 0; Cleft = 0;
while z <= (length(X_left)/MIlong)
    % extract an epoch
    while j <= size(X_left, 2)
        x_left(:, j) = X_left(i1:i1+MIlong-1, j);
        j = j + 1;
    end
    j = 1;
    i1 = i1 + MIlong;
    Cl = cov(x_left);
    Cl = Cl/trace(Cl);
    Cleft = Cleft + Cl;
    z = z + 1;
end
Cleft = Cleft/(z-1);
i1 = 1; j = 1; z = 1; Cr = 0; Cright = 0;
while z <= (length(X_right)/MIlong)
    % extract an epoch
    while j <= size(X_right, 2)
        x_right(:, j) = X_right(i1:i1+MIlong-1, j);
        j = j + 1;
    end
    j = 1;
    i1 = i1 + MIlong;
    Cr = cov(x_right);
    Cr = Cr/trace(Cr);
    Cright = Cright + Cr;
    z = z + 1;
end
Cright = Cright/(z-1);
[W, D] = eig(Cleft, Cleft + Cright);
W = W';
% Sorting is not necessary, as the returned data is already sorted
if (mod(option, 2) == 0)  % even number
    W = W([1:(option/2) (end-((option/2)-1)):end], :);
else
    W = W([1:((option/2)+0.5) (end-(((option/2)+0.5)-1)):end], :);
end
CSP_MI = W * X_train';
% BandPower extraction loop
features = feautex(class_train, option, CSP_MI, MIlong);
end
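The while-loops in CSP3feautex implement epoch-wise, trace-normalised covariance averaging before the eigendecomposition. A compact Python equivalent is sketched below; it is illustrative only, and the function name `mean_normalised_cov` is an assumption:

```python
import numpy as np

def mean_normalised_cov(X, epoch_len):
    """Average the trace-normalised covariance over consecutive epochs,
    as the while-loops in CSP3feautex do (input: samples x channels)."""
    n_epochs = len(X) // epoch_len
    C = np.zeros((X.shape[1], X.shape[1]))
    for z in range(n_epochs):
        epoch = X[z*epoch_len:(z+1)*epoch_len]
        Ce = np.cov(epoch, rowvar=False)
        C += Ce / np.trace(Ce)      # normalise so every epoch weighs equally
    return C / n_epochs
```

Dividing each epoch covariance by its trace prevents high-power epochs (e.g. artefact-contaminated trials) from dominating the class covariance estimate; note the averaged matrix still has unit trace.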
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function [data_1, data_2, data_3, data_4] = filterbank(data)
%% Filter bank implementation in MATLAB. The code passes the given signal
% fnorm should be between (0, 1)
fnorm1 = [0.1 3.5]/30;  % for bandpass; 0.1 and 3.5 are the lower and
                        % upper cutoffs respectively
% [b, a] = butter(n, Wn)  % n = order of the filter, Wn is the normalized
% cutoff freq
[b1, a1] = butter(10, fnorm1);  % here the order of the filter is 10
data_1 = filtfilt(b1, a1, data);  % band-pass filter

fnorm2 = [3.5 8]/30;  % for bandpass
[b2, a2] = butter(10, fnorm2);
data_2 = filtfilt(b2, a2, data);  % band-pass filter

fnorm3 = [8 14]/30;  % for bandpass
[b3, a3] = butter(10, fnorm3);
data_3 = filtfilt(b3, a3, data);  % band-pass filter

fnorm4 = [14 29]/30;  % for bandpass
[b4, a4] = butter(10, fnorm4);
data_4 = filtfilt(b4, a4, data);  % band-pass filter
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Code B.4: initialg.m (CSSBP and SBCSP)
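The four-band filter bank in `filterbank` can be reproduced with SciPy. The sketch below is illustrative, not part of the thesis code: it assumes the same normalisation as the MATLAB listing (cutoffs divided by 30 imply a Nyquist frequency of 30 Hz, i.e. an assumed 60 Hz sampling rate) and uses a second-order-sections design, which is numerically safer than the (b, a) form for a 10th-order band-pass:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 60.0  # assumed sampling rate: the MATLAB code divides cutoffs by 30 = fs/2
BANDS = [(0.1, 3.5), (3.5, 8.0), (8.0, 14.0), (14.0, 29.0)]  # Hz

def filter_bank(data, fs=FS, order=10):
    """Zero-phase band-pass `data` through each sub-band (like filtfilt)."""
    out = []
    for lo, hi in BANDS:
        sos = butter(order, [lo, hi], btype='bandpass', fs=fs, output='sos')
        out.append(sosfiltfilt(sos, data))
    return out
```

The 8-14 Hz and 14-29 Hz bands cover the mu and beta rhythms whose event-related (de)synchronisation carries the motor-imagery information.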
Appendix C
Interface
Figure C.1: Background of the Interface
Figure C.2: Part of the hand (1)
Figure C.3: Part of the hand (2)
Figure C.4: Part of the hand (3)
Figure C.5: Part of the hand (4)
Figure C.6: Part of the hand (5)
Figure C.7: Part of the hand (6)
Figure C.8: Part of the hand (7)
Figure C.9: Part of the hand (8)
Figure C.10: Part of the hand (9)
Figure C.11: Part of the hand (10)
Figure C.12: Part of the hand (11)
Figure C.13: Part of the hand (12)
Figure C.14: Part of the hand (13)
Figure C.15: Part of the hand (14)
Appendix D
Interface JS Code
// Action script...

// [Action in Frame 1]
exit_btn.onRelease = function ()
{
    unloadMovieNum(0);
    getURL("http://www.ictgames.com/resources.html");
};
_root.left_mc.stop();
_root.right_mc.stop();
init = function ()
{
    _root.left_mc.Value = 5;
    _root.leftText.text = _root.left_mc.Value;
    _root.right_mc.Value = 5;
    _root.rightText.text = _root.right_mc.Value;
};
warning = 1;
rightWarning = 1;
init();
_root.left_mc.middle_mc.onEnterFrame = function ()
{
    if (_root.left_mc.indexUp == false && _root.left_mc.thirdUp == false &&
        _root.left_mc.littleUp == false && _root.left_mc.middleUp == true)
    {
        this.gotoAndStop("down");
        _root.left_mc.Value = _root.left_mc.Value - _root.warning;
        _root.warning = 0;
        _root.leftText.text = _root.left_mc.Value;
    } // end if
};
_root.right_mc.middle_mc.onEnterFrame = function ()
{
    if (_root.right_mc.indexUp == false && _root.right_mc.thirdUp == false &&
        _root.right_mc.littleUp == false && _root.right_mc.middleUp == true)
    {
        this.gotoAndStop("down");
        _root.right_mc.Value = _root.right_mc.Value - _root.rightWarning;
        _root.rightWarning = 0;
        _root.rightText.text = _root.right_mc.Value;
    } // end if
};
_root.left_mc.index_mc.onRelease = function ()
{
    this.play();
    _root.leftText.text = _root.left_mc.Value;
};
_root.left_mc.middle_mc.onRelease = function ()
{
    if (_root.left_mc.indexUp == false && _root.left_mc.thirdUp == false &&
        _root.left_mc.littleUp == false)
    {
    }
    else
    {
        this.play();
        warning = 1;
    } // end else if
};
_root.left_mc.third_mc.onRelease = function ()
{
    this.play();
};
_root.left_mc.little_mc.onRelease = function ()
{
    this.play();
};
_root.left_mc.thumb_mc.onRelease = function ()
{
    this.play();
};
_root.right_mc.index_mc.onRelease = function ()
{
    this.play();
};
_root.right_mc.middle_mc.onRelease = function ()
{
    if (_root.right_mc.indexUp == false && _root.right_mc.thirdUp == false &&
        _root.right_mc.littleUp == false)
    {
    }
    else
    {
        this.play();
        rightWarning = 1;
    } // end else if
};
_root.right_mc.third_mc.onRelease = function ()
{
    this.play();
};
_root.right_mc.little_mc.onRelease = function ()
{
    this.play();
};
_root.right_mc.thumb_mc.onRelease = function ()
{
    this.play();
};
Code D.1: JS Interface (main)
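Across Code D.1 and the per-finger clips (D.2-D.11), each hand keeps a `Value` counter that starts at 5, decremented when a finger clip animates down and incremented when it comes back up. That counting logic can be modelled in a few lines of Python; this is an illustrative toy model (the class name and method are invented here), and it deliberately omits the middle-finger "warning" special case from Code D.1:

```python
class Hand:
    """Toy model of one hand in the interface: Value starts at 5
    (all fingers up) and tracks how many fingers are raised."""
    FINGERS = ("thumb", "index", "middle", "third", "little")

    def __init__(self):
        self.up = {f: True for f in self.FINGERS}
        self.value = 5

    def toggle(self, finger):
        # Mirrors the frame actions: lowering a finger decrements
        # Value, raising it again increments Value.
        if self.up[finger]:
            self.up[finger] = False
            self.value -= 1
        else:
            self.up[finger] = True
            self.value += 1
        return self.value
```

In the Flash interface the same bookkeeping is split between the clip frames (`--_parent.Value` / `++_parent.Value`) and the `*Up` flags that the main script reads.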
// Action script...

// [Action in Frame 1]
stop();
_parent.indexUp = true;

// [Action in Frame 2]
_parent.indexUp = false;
--_parent.Value;
_root.rightText.text = _parent.Value;

// [Action in Frame 3]
stop();
_parent.indexUp = false;

// [Action in Frame 4]
_parent.indexUp = true;
++_parent.Value;
_root.rightText.text = _parent.Value;
Code D.2: JS Interface (finger1)
// Action script...

// [Action in Frame 1]
stop();
_parent.middleUp = true;

// [Action in Frame 2]
--_parent.Value;
_parent.middleUp = false;
_root.rightText.text = _parent.Value;

// [Action in Frame 3]
stop();
_root.rightText.text = _parent.Value;

// [Action in Frame 4]
++_parent.Value;
_parent.middleUp = true;
_root.rightText.text = _parent.Value;
Code D.3: JS Interface (finger2)
// Action script...

// [Action in Frame 1]
stop();

// [Action in Frame 2]
--_parent.Value;
_parent.littleUp = false;
_root.rightText.text = _parent.Value;

// [Action in Frame 3]
stop();

// [Action in Frame 4]
++_parent.Value;
_parent.littleUp = true;
_root.rightText.text = _parent.Value;
Code D.4: JS Interface (finger3)
// Action script...

// [Action in Frame 1]
stop();

// [Action in Frame 2]
--_parent.Value;
_parent.thirdUp = false;
_root.rightText.text = _parent.Value;

// [Action in Frame 3]
stop();

// [Action in Frame 4]
++_parent.Value;
_parent.thirdUp = true;
_root.rightText.text = _parent.Value;
Code D.5: JS Interface (finger4)
// Action script...

// [Action in Frame 1]
stop();

// [Action in Frame 2]
--_parent.Value;
_root.rightText.text = _parent.Value;

// [Action in Frame 3]
stop();

// [Action in Frame 4]
++_parent.Value;
_root.rightText.text = _parent.Value;
Code D.6: JS Interface (finger5)
// Action script...

// [Action in Frame 1]
stop();
_parent.indexUp = true;

// [Action in Frame 2]
_parent.indexUp = false;
--_parent.Value;
_root.leftText.text = _parent.Value;

// [Action in Frame 3]
stop();
_parent.indexUp = false;

// [Action in Frame 4]
_parent.indexUp = true;
++_parent.Value;
_root.leftText.text = _parent.Value;
Code D.7: JS Interface (finger6)
// Action script...

// [Action in Frame 1]
stop();
_parent.middleUp = true;

// [Action in Frame 2]
--_parent.Value;
_parent.middleUp = false;
_root.leftText.text = _parent.Value;

// [Action in Frame 3]
stop();

// [Action in Frame 4]
++_parent.Value;
_parent.middleUp = true;
_root.leftText.text = _parent.Value;
Code D.8: JS Interface (finger7)
// Action script...

// [Action in Frame 1]
stop();

// [Action in Frame 2]
--_parent.Value;
_parent.littleUp = false;
_root.leftText.text = _parent.Value;

// [Action in Frame 3]
stop();

// [Action in Frame 4]
++_parent.Value;
_parent.littleUp = true;
_root.leftText.text = _parent.Value;
Code D.9: JS Interface (finger8)
// Action script...

// [Action in Frame 1]
stop();

// [Action in Frame 2]
--_parent.Value;
_parent.thirdUp = false;
_root.leftText.text = _parent.Value;

// [Action in Frame 3]
stop();

// [Action in Frame 4]
++_parent.Value;
_parent.thirdUp = true;
_root.leftText.text = _parent.Value;
Code D.10: JS Interface (finger9)
// Action script...

// [Action in Frame 1]
stop();

// [Action in Frame 2]
--_parent.Value;
_root.leftText.text = _parent.Value;

// [Action in Frame 3]
stop();

// [Action in Frame 4]
++_parent.Value;
_root.leftText.text = _parent.Value;
Code D.11: JS Interface (finger10)
// Action script...

// [Action in Frame 55]
instructions._visible = false;
Code D.12: JS Interface (Instructions)