
Neurophysiological Heat Maps for Human-Robot Interaction Evaluation

Chris S. Crawford
Department of Computer Science
University of Alabama
Tuscaloosa, Alabama 35487

Marvin Andujar
Department of Computer Science and Engineering
University of South Florida
Tampa, Florida 33620

Juan E. Gilbert
Department of Computer and Information Science and Engineering
University of Florida
Gainesville, Florida 32611

Abstract

Human-Robot Interaction (HRI) effectiveness is often evaluated subjectively to gauge the user experience. These methods may result in responses that are inaccurate or inconsistent due to participants' false responses or inability to communicate their affective or cognitive states. In this paper, we discuss an approach featuring heat maps that link users' cognitive state to a robot's location. Users' cognitive state was interpreted from electroencephalography (EEG) signals while completing a robot navigation task. We tested this method during an experiment featuring solo and cooperative control. Our goal is to present a step towards visualizing position-based measurements of users' cognitive state.

Introduction

Recent advances in HRI have resulted in metrics and standards that enable more efficient assessment of systems. These advances have assisted researchers with gaining a better understanding of human behavior in areas such as industry, home, and education. Currently, the most commonly used method of evaluation in HRI studies is self-assessment (Bethel and Murphy 2009). This method usually involves gathering information from participants via physical or digital questionnaires. Although self-assessments can provide useful information, they often cause problems with validity and consistency (Bethel and Murphy 2010). Other methods used in HRI include behavior measures and task performance. Behavior measures enable researchers to observe the actual behaviors of participants. Currently, this is often done by video recording participants and coding the visual and auditory information.

Although behavior measures gathered through observation can be less tedious than other forms of subjective measurement, they are susceptible to the Hawthorne effect, which is known to cause bias (Sim and Loo 2015). Task performance measures are useful for measuring how well users are able to complete tasks while interacting with robots, but they should be combined with one of the methods mentioned previously for efficient HRI evaluation.


Novel methods and approaches that allow researchers to gather information that reflects the user experience (UX) without the limitations of traditional evaluation methods may enhance HRI evaluation. Neurophysiological measures can be used to supplement traditional evaluation methods and address some of the issues mentioned earlier.

This work presents an exploratory investigation of mapping neurophysiological measures of engagement to a robot's position via heat maps as a supplementary approach to evaluating users' cognitive state. To explore this concept, two conditions were observed. The first condition consisted of evaluating a user's engagement level while they cognitively controlled a robot using two commands. In the second condition, two participants controlled the robot cooperatively using one command each. In this paper, we briefly discuss previous HRI research that features Brain-Computer Interfaces (BCI) and describe how heat map visualizations can provide useful details about users' cognitive state while performing robot navigation tasks.

Brain-Computer Interfaces in HRI

The use of BCI in Human-Computer Interaction (HCI) has recently become of interest to researchers (Tan and Nijholt 2010). Furthermore, HRI researchers have also begun to investigate BCI systems. Early BCI research featuring robots commonly investigated active BCI robot architectures featuring direct one-way control. However, HRI researchers often utilize passive BCI systems to monitor users' mental state while interacting with robots or completing robot manipulation tasks. A variety of BCI sensors exist. These include functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), positron emission tomography (PET), functional near-infrared (fNIR) spectroscopy, electrocorticography (ECoG), and electroencephalography (EEG). However, due to the low temporal resolution, limited portability, and high cost of the alternatives, EEG and fNIR sensors are more common in HRI research.

Much of this work involves BCI systems that collect information about users' cognitive states during robot navigation and manipulation tasks (Solovey et al. 2012; Bozinovski and Bozinovski 2015; Crawford and Gilbert 2015). BCI systems are also commonly used to evaluate and influence human-agent interaction (Szafir and Mutlu 2012; Strait, Canning, and Scheutz 2014).


These investigations often feature measurements of user states such as engagement, relaxation, and attention. Visualizations of the user state information collected during HRI research featuring BCI play a significant role in understanding the relationship between users' cognitive states and their interactions with the robot. These visualizations are often presented as a line graph reflecting users' dynamic states. Although this information is useful, it can be difficult to connect line graph data to the robot's varying position. To address this issue, we implemented a process that combines information about a user's cognitive state with the robot's positions.

Design

Figure 1: Neurophysiological-based heat map process.

Previous literature suggests that cooperative brain-robot interaction could decrease fatigue (Wang et al. 2011), increase engagement (Nijholt and Gurkok 2013), and improve control (Poli et al. 2013) in comparison to solo brain-robot interaction. The investigation presented in the following sections builds on these findings by examining whether cognitive state heat map visualizations reflect them. A robot simulation application was used during the experiment. The simulation featured a top-down view of a robot in a simple 2D maze environment developed using HTML5 and JavaScript. The yellow background served as the main stage for the simulation. The black lines shown in figure 1 represent the environment's walls. During the task, a collision with these objects was logged as an error; a minimal sketch of this logging appears below. The red object with two smaller black objects attached is the robot users controlled. The green object in the simulation provided visual feedback of the target for users.
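The paper does not include the simulation's source, but the collision logging it describes can be sketched roughly as follows, treating walls as line segments and the robot as a circle. All names (walls, robotRadius, checkCollisions) and the geometry are illustrative assumptions, not the authors' implementation.

    // Hypothetical sketch of collision logging in the 2D maze simulation.
    // Walls are line segments; the robot is approximated as a circle.
    const walls = [
      { x1: 50, y1: 50, x2: 450, y2: 50 },  // top wall (coordinates made up)
      { x1: 50, y1: 50, x2: 50, y2: 350 },  // left wall
    ];

    const robotRadius = 10;
    let errorCount = 0;

    // Distance from point (px, py) to the segment (x1, y1)-(x2, y2).
    function pointSegmentDistance(px, py, { x1, y1, x2, y2 }) {
      const dx = x2 - x1, dy = y2 - y1;
      const lenSq = dx * dx + dy * dy;
      const t = lenSq === 0 ? 0 :
        Math.max(0, Math.min(1, ((px - x1) * dx + (py - y1) * dy) / lenSq));
      return Math.hypot(px - (x1 + t * dx), py - (y1 + t * dy));
    }

    // Called each simulation tick; logs a timestamped error on wall contact.
    function checkCollisions(robot) {
      for (const wall of walls) {
        if (pointSegmentDistance(robot.x, robot.y, wall) < robotRadius) {
          errorCount++;
          console.log(`${Date.now()},collision,${robot.x},${robot.y}`);
          break; // count at most one error per tick
        }
      }
    }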

During a preliminary experiment, neurophysiological-based heat maps were generated using EEG data collected from users navigating a simulated robot in a simple maze environment. Both solo control and cooperative control were tested during the experiment. During solo control, users performed two types of cognitive commands with the BCI: forward and clockwise rotate. The clockwise rotate command updated the robot's current angle; each clockwise rotate command received from the BCI rotated the robot clockwise by 2 degrees. During cooperative control, each user was responsible for only one command. Task completion time and errors were recorded. As shown in Fig. 1, the key components of the system used to create the neurophysiological-based heat maps are EEG data acquisition, processing, and visualization generation.

EEG Data Acquisition and Processing

Laptops and an Emotiv EPOC (a non-invasive wireless EEG headset) were the main hardware components of the system used during this preliminary experiment. The BCI device consists of 14 channels (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4) placed according to the international 10-20 system. It performs at a sampling rate of 2048 Hz. Using EmoKey, a component of the Emotiv application program interface (API), cognitive commands were converted into keyboard events. These commands were triggered at the system level, allowing them to be passed to the robot simulation application.
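The paper does not specify which keystrokes EmoKey emitted, so the sketch below assumes hypothetical bindings ("w" for forward, "d" for clockwise rotate). It shows how the browser-based simulation could consume the system-level keystrokes, apply the 2-degree rotation described above, and log timestamped positions for the heat map step later on.

    // Hypothetical key bindings for the EmoKey-injected cognitive commands.
    const robot = { x: 100, y: 100, angle: 0 }; // angle in degrees
    const STEP = 2;        // assumed forward step, in pixels
    const ROTATE_DEG = 2;  // 2-degree clockwise rotation per command (from the paper)

    const positionLog = []; // timestamped positions, later written to CSV

    document.addEventListener('keydown', (event) => {
      if (event.key === 'w') {          // "forward" cognitive command
        const rad = robot.angle * Math.PI / 180;
        robot.x += STEP * Math.cos(rad);
        robot.y += STEP * Math.sin(rad);
      } else if (event.key === 'd') {   // "clockwise rotate" cognitive command
        robot.angle = (robot.angle + ROTATE_DEG) % 360;
      } else {
        return; // ignore keys unrelated to the BCI commands
      }
      positionLog.push({ t: Date.now(), x: robot.x, y: robot.y });
    });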

The Emotiv was also used to passively collect raw EEG data that reflects users' cognitive states. The raw EEG data was logged and stored in an EDF file that was used during the post-processing phase. EEGLAB, a MATLAB toolbox for analyzing EEG data, was used to calculate the power of the EEG frequency bands. These bands include alpha (8-13 Hz), beta (13-30 Hz), and theta (4-8 Hz). Alpha is related to relaxation, beta is related to alertness, and theta is associated with creative inspiration and deep meditation. The band power values were applied to the index E = β / (α + θ), which has been shown to be highly correlated with task engagement (Pope, Bogart, and Bartolome 1995). These engagement values were used to generate heat maps that reflect users' cognitive states.
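Although the authors computed band powers in EEGLAB, the index itself is simple. A minimal sketch, assuming per-epoch band power values are already available, could look like this (the epoch values shown are made up for illustration):

    // Engagement index E = beta / (alpha + theta), following
    // Pope, Bogart, and Bartolome (1995).
    function engagementIndex(alpha, beta, theta) {
      return beta / (alpha + theta);
    }

    // Example with made-up band power values for three epochs:
    const epochs = [
      { alpha: 5.1, beta: 7.3, theta: 4.2 },
      { alpha: 4.8, beta: 8.0, theta: 4.5 },
      { alpha: 5.5, beta: 6.9, theta: 4.0 },
    ];
    console.log(epochs.map(e => engagementIndex(e.alpha, e.beta, e.theta)));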

Heat Map Generation

The neurophysiological-based heat maps presented in this work provide visualizations that reflect position-based user engagement levels. As illustrated in figure 1, the first step of this process consists of processing the engagement data. This data is timestamped and saved in a comma-separated values (CSV) file. Afterwards, the data is normalized to fit in a range from 0 to 1. Each line graph shown in figures 2-4 reflects engagement values smoothed using an exponentially-weighted moving average (EWMA); this approach was previously used to smooth engagement values (Szafir and Mutlu 2012). The smoothed value was mapped to the robot's location throughout the tasks in a new CSV file with columns holding information about the robot's position, time, and user engagement level. This is accomplished by parsing the engagement CSV file and a CSV file containing timestamped robot position information. The new CSV file is used to generate heat map visualizations over the simulation environment using the heatmaps.js library.
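A hedged sketch of this pipeline follows: normalization, EWMA smoothing, a nearest-timestamp join between the engagement and position records, and rendering. The field names, sample data, and smoothing factor are assumptions, and the h337.create / setData calls follow heatmap.js's published API, which may differ from the library version the authors used.

    // 1. Normalize engagement values to the 0-1 range.
    function normalize(values) {
      const min = Math.min(...values), max = Math.max(...values);
      return values.map(v => (max === min ? 0 : (v - min) / (max - min)));
    }

    // 2. Exponentially-weighted moving average:
    //    s[t] = a * x[t] + (1 - a) * s[t - 1].
    function ewma(values, a = 0.2) {
      const out = [];
      values.forEach((v, i) =>
        out.push(i === 0 ? v : a * v + (1 - a) * out[i - 1]));
      return out;
    }

    // Records as parsed from the two timestamped CSV files (values made up).
    const engagementSamples = [
      { t: 0, value: 1.21 }, { t: 500, value: 1.48 }, { t: 1000, value: 1.35 },
    ];
    const positionLog = [
      { t: 0, x: 100, y: 100 }, { t: 450, x: 104, y: 100 }, { t: 950, x: 108, y: 100 },
    ];

    // 3. Join each engagement sample with the nearest-in-time robot position.
    function joinByTime(samples, positions) {
      return samples.map(({ t, value }) => {
        let best = positions[0];
        for (const p of positions) {
          if (Math.abs(p.t - t) < Math.abs(best.t - t)) best = p;
        }
        return { x: best.x, y: best.y, value };
      });
    }

    const smoothed = ewma(normalize(engagementSamples.map(s => s.value)));
    const points = joinByTime(
      engagementSamples.map((s, i) => ({ t: s.t, value: smoothed[i] })),
      positionLog
    );

    // 4. Render over the simulation stage (requires heatmap.js in the page).
    const heatmap = h337.create({ container: document.getElementById('stage') });
    heatmap.setData({ min: 0, max: 1, data: points });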


Preliminary Experiment

This section discusses results from a preliminary experiment featuring a single user during a solo condition and two users during a cooperative condition. The average engagement for the solo participant was 0.5347. The participant completed the task in 464.9 seconds and made 76 errors while navigating the robot through the simulation environment. During the cooperative control case, participant 1 was responsible for the forward command and participant 2 controlled the clockwise rotate command. Participant 1 had an average engagement level of 0.5270, and participant 2's average level of engagement was 0.5333. These participants completed the task in 416 seconds with 191 errors. This information is provided to give the reader context for the experiment. However, the main goal of this paper is to share how neurophysiological-based heat maps may provide supplementary information for HRI evaluation.

Figure 2: Solo condition neurophysiological-based heat map.

Discussion

In each heat map, red areas correspond to high levels of engagement. The green, yellow, blue, and uncolored areas refer to decreasing levels of engagement, respectively. Similarly to what was observed in the solo condition, the corners that required collaboration during the cooperative condition show increased levels of engagement. This is especially visible in figure 3. The robot path during the cooperative condition took an early turn at the start of the task. This resulted in the robot heading towards an obstacle instead of continuing towards the completion point. The neurophysiological heat maps not only showed this robot path behavior but also provide information about how participant 1's engagement increased during this event (figure 3).

Traditional measures would require recall from participants during an interview at the end of a task to obtain this kind of information.

Figure 3: Cooperative condition neurophysiological-based heat map (Participant 1: Forward).

Cooperative participant 2 also seemed to have higher levels of engagement as the robot approached corners (figure 4). Using neurophysiological measures combined with position data could provide these kinds of insights about users' cognitive state during robot navigation tasks. Such data could later be used to supplement conversations during user interviews or focus groups. Heat map visualizations have also been used in non-BCI HRI experiments to evaluate engagement (Anzalone et al. 2015). Leveraging previous approaches in combination with the neurophysiological approach presented in this work may provide convergent validity in the future.

Limitations

Though this approach presents a first step toward visualizing neurophysiological heat maps, there are shortcomings that require further investigation. This work does not provide statistical analysis of the EEG data, which limits claims related to engagement comparisons across conditions. This is due to the small number of participants in this preliminary work. Although this paper discusses the creation of heat maps using EEG, multiple other signals, such as heart rate variability (HRV/EKG), electromyography (EMG), skin temperature (ST), and skin conductance activity (SCA) / galvanic skin response (GSR), could also be tested in the future. There is a chance that the engagement values are correlated with the cognitive commands due to the underlying command classification approach. However, due to the proprietary nature of the EmoKey software, this work does not confirm this assumption, which limits the insights that can be drawn from this work. This work investigates solo and cooperative control from a BCI perspective. Future studies that use BCI solely for evaluation alongside traditional interaction methods may provide additional insights on the applicability of neurophysiological heat maps for HRI research.


Figure 4: Cooperative condition neurophysiological-based heat map (Participant 2: Rotate).

Conclusion

In this paper we discuss a novel use of heat maps to inform HRI researchers of users' cognitive state. We present a discussion of how neurophysiological-based heat maps could assist with discovering relations between users' mental states and a robot's position. Although engagement derived from EEG signals is used in the provided example, this concept could be applied with various physiological signals. We plan to extend this work to see if the results hold with a larger number of participants in more complex environments. We also intend to expand this project by evaluating researchers' perception of cognitive data graphed on line plots alongside neurophysiological heat maps.

References

Anzalone, S. M.; Boucenna, S.; Ivaldi, S.; and Chetouani, M. 2015. Evaluating the engagement with social robots. International Journal of Social Robotics 7(4):465-478.

Bethel, C. L., and Murphy, R. R. 2009. Use of large sample sizes and multiple evaluation methods in human-robot interaction experimentation. In AAAI Spring Symposium: Experimental Design for Real-World Systems, 9-16.

Bethel, C. L., and Murphy, R. R. 2010. Review of human studies methods in HRI and recommendations. International Journal of Social Robotics 2(4):347-359.

Bozinovski, S., and Bozinovski, A. 2015. Mental states, EEG manifestations, and mentally emulated digital circuits for brain-robot interaction. IEEE Transactions on Autonomous Mental Development 7(1):39-51.

Crawford, C. S., and Gilbert, J. E. 2015. Towards analyzing cooperative brain-robot interfaces through affective and subjective data. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts, 231-232. ACM.

Nijholt, A., and Gurkok, H. 2013. Multi-brain games: cooperation and competition. In International Conference on Universal Access in Human-Computer Interaction, 652-661. Springer.

Poli, R.; Cinel, C.; Matran-Fernandez, A.; Sepulveda, F.; and Stoica, A. 2013. Towards cooperative brain-computer interfaces for space navigation. In Proceedings of the 2013 International Conference on Intelligent User Interfaces, 149-160. ACM.

Pope, A. T.; Bogart, E. H.; and Bartolome, D. S. 1995. Biocybernetic system evaluates indices of operator engagement in automated task. Biological Psychology 40(1):187-195.

Sim, D. Y. Y., and Loo, C. K. 2015. Extensive assessment and evaluation methodologies on assistive social robots for modelling human-robot interaction: a review. Information Sciences 301:305-344.

Solovey, E.; Schermerhorn, P.; Scheutz, M.; Sassaroli, A.; Fantini, S.; and Jacob, R. 2012. Brainput: enhancing interactive systems with streaming fNIRS brain input. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2193-2202. ACM.

Strait, M.; Canning, C.; and Scheutz, M. 2014. Let me tell you! Investigating the effects of robot communication strategies in advice-giving situations based on robot appearance, interaction modality and distance. In Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, 479-486. ACM.

Szafir, D., and Mutlu, B. 2012. Pay attention! Designing adaptive agents that monitor and improve user engagement. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 11-20. ACM.

Tan, D., and Nijholt, A. 2010. Brain-Computer Interaction: Applying Our Minds to Human-Computer Interaction. Springer-Verlag London.

Wang, Y.; Wang, Y.-T.; Jung, T.-P.; Gao, X.; and Gao, S. 2011. A collaborative brain-computer interface. In Biomedical Engineering and Informatics (BMEI), 2011 4th International Conference on, volume 1, 580-583. IEEE.
