ComPlexUs 2003
• Spread of crime
• Traffic jams
• Marriage
Complex Systems Theory and Methods
• Agent-based modeling (including cellular automata)
• Chaos, fractals, and dynamical systems theory
• Complex network science
• Game theory
• Multi-scale modeling
• Distributed control
• Nonlinear pattern recognition and data mining
• Evolutionary, adaptive, and developmental approaches
• Bio-inspired computing (ANNs, EC)
• Etc.
Data Collection Technology
• High-throughput microarrays
• Molecular imaging
• Distributed sensor networks
• Web mining
• Etc.
Computational Capabilities
• Massive data storage
• Rapid data access
• Processor speed
• Distributed and parallel computing
• Etc.
Increasing recognition of the limits of reductionism
Complex systems science is at an exciting time of maturation
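As a toy illustration of the first method listed above (agent-based modeling via cellular automata), the sketch below runs a one-dimensional elementary cellular automaton; the rule number and grid size are arbitrary choices for demonstration, not taken from the slides. The point it makes is the one the slides make: simple local rules can produce complex global patterns.

```python
# A one-dimensional elementary cellular automaton (Wolfram rule 110).
# Each cell updates from its own state and its two neighbours' states.

def step(cells, rule=110):
    """Apply one synchronous update of an elementary cellular automaton."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (centre << 1) | right  # neighbourhood as 0..7
        out.append((rule >> index) & 1)              # look up the rule bit
    return out

cells = [0] * 31
cells[15] = 1  # single seed cell
for _ in range(10):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

The grid wraps around at the edges (periodic boundary), which keeps the update rule uniform for every cell.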
12.1.2011, Älytekniikan ja konenäön uusia mahdollisuuksia (New possibilities of intelligent technology and machine vision) / Pekka Toivanen
Emergence
Applications
• In the security sector, affective behavioural cues play a crucial role in establishing or detracting from credibility
• In the medical sector, affective behavioural cues are a direct means to identify when specific mental processes are occurring
• Neurology (in studies on the dependence between emotion dysfunction or impairment and brain lesions)
• Psychiatry (in studies on schizophrenia and mood disorders)
• Dialog systems / automatic call-center environments, to reduce user/customer frustration
• Academic learning
• Human–Computer Interaction (HCI)
Zeng, Z.; Pantic, M.; Roisman, G.I.; Huang, T.S., "A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions," IEEE Transactions on Pattern Analysis and Machine Intelligence [12]
Future Research Directions
• So far, context has been overlooked in most affective computing research
• Collaboration among affective computing researchers from different disciplines
• Fast real-time processing
• Multimodal detection and recognition to achieve higher accuracy
• On/off focus
• Systems that can model conscious and subconscious user behaviour
Rafael A. Calvo, Sidney D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications" [11]
Context-Aware Multimodal Affect Analysis Based Smart Learning Environment
Application System Architecture
[Diagram: application system architecture of the smart learning environment]
• Analysis modules: Face Analysis, Voice Analysis, Posture Analysis, Physiology Analysis
• System Controller and Hardware Calibration Manager
• Multimodal affect input parameter adjustment
• Decision Support System
• Output: Multimedia Note, Reading Behavior Report, Lesson Length Suggestion, Class Efficiency Report
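The data flow of the architecture above can be sketched in a few lines. Everything here is an illustrative assumption, not the authors' actual design: the class names mirror the diagram's boxes, and the score-averaging "late fusion" stands in for whatever the real System Controller does.

```python
# Hypothetical sketch: modality analyzers feed a System Controller, which
# passes fused affect estimates to a Decision Support System.

class SystemController:
    def __init__(self, analyzers):
        self.analyzers = analyzers  # callables: sample -> {emotion: score}

    def fuse(self, sample):
        # Simple late fusion: average each emotion's score across analyzers.
        scores = [analyze(sample) for analyze in self.analyzers]
        labels = scores[0].keys()
        return {lbl: sum(s[lbl] for s in scores) / len(scores) for lbl in labels}

class DecisionSupportSystem:
    def report(self, fused):
        # A report could be driven by the dominant affect state.
        return max(fused, key=fused.get)

# Stand-in analyzers returning per-emotion confidences for one sample.
face = lambda s: {"engaged": 0.8, "bored": 0.2}
voice = lambda s: {"engaged": 0.6, "bored": 0.4}

controller = SystemController([face, voice])
fused = controller.fuse(sample=None)
print(DecisionSupportSystem().report(fused))  # engaged
```

Late fusion (combining per-modality scores) is only one option; the diagram is equally compatible with feature-level fusion inside the controller.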
Driver Emotion Aware Multiple Agent Controlled Automatic Vehicle
[Diagram: multi-agent architecture]
• Agents: Navigation Agent, Safety Agent, Driving Aid Agent, Affective Multimedia Agent
• Inputs: audio (linguistic / non-linguistic), facial expression, seat pressure, actions (steering movement, interaction with gas/brake pedal), bio-signals
• Feature detectors feed a feature estimator, which outputs stress level, basic emotions, and complex emotions
• Inter-agent communication to aid decision making
• Agent actions: route selection, notification in case of emergency, speed/ABS/traction control, music and climate control, alerting the driver
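The inter-agent communication idea above can be sketched as a broadcast of the estimated driver state. The agent names, thresholds, and responses below are hypothetical stand-ins, not taken from any actual vehicle implementation.

```python
# Toy sketch: when the feature estimator reports high driver stress, the
# state is broadcast to all agents, and each adapts its own behaviour.

class Agent:
    def __init__(self, name):
        self.name = name
        self.log = []

    def on_emotion(self, emotion, level):
        raise NotImplementedError

class MultimediaAgent(Agent):
    def on_emotion(self, emotion, level):
        if emotion == "stress" and level > 0.7:  # threshold is an assumption
            self.log.append("play calming music")

class DrivingAidAgent(Agent):
    def on_emotion(self, emotion, level):
        if emotion == "stress" and level > 0.7:
            self.log.append("alert the driver")

def broadcast(agents, emotion, level):
    """Inter-agent communication: share the estimated driver state."""
    for agent in agents:
        agent.on_emotion(emotion, level)

agents = [MultimediaAgent("multimedia"), DrivingAidAgent("driving aid")]
broadcast(agents, "stress", 0.9)
print([a.log for a in agents])  # [['play calming music'], ['alert the driver']]
```

A real system would replace the broadcast with the message-passing scheme of whatever agent framework is used, but the division of responsibilities stays the same.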
Affective Computing: Concerns
• Privacy concerns [4] [5]: "I do not want the outside world to know what goes through my mind… Twitter is the limit"
• Ethical concerns [5]: robot nurses or caregivers capable of affective feedback
• Risk of misuse of the technology in the hands of impostors
• Computers start to make emotionally distorted, harmful decisions [18]
• Complex technology: effectiveness is still questionable, with a risk of false interpretation
Conclusion
Strategic Business Insight (SBI):
"Ultimately, affective-computing technology could eliminate the need for devices that today stymie and frustrate users… Affective computing is an important development in computing, because as pervasive or ubiquitous computing becomes mainstream, computers will be far more invisible and natural in their interactions with humans." [4]
Toyota's thought-controlled wheelchair [19]
References
[1] Picard, R. 1995. Affective Computing. M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 321.
[2] Picard, R. 1997. Affective Computing. The MIT Press. ISBN-10: 0-262-66115-2.
[3] Picard, R. & Klein, J. 2002. Computers that recognize and respond to user emotion: Theoretical and practical implications. Interacting with Computers, 14, 141–169.
[4] http://www.sric-bi.com/
[5] Bullington, J. 2005. 'Affective' computing and emotion recognition systems: The future of biometric surveillance? Information Security Curriculum Development (InfoSecCD) Conference '05, September 23–24, 2005, Kennesaw, GA, USA.
[6] Boehner, K.; DePaula, R.; Dourish, P. & Sengers, P. 2005. Affect: From Information to Interaction. AARHUS'05, August 21–25, 2005, Århus, Denmark.
[7] Zeng, Z. et al. 2004. Bimodal HCI-related Affect Recognition. ICMI'04, October 13–15, 2004, State College, Pennsylvania, USA.
[8] Taleb, T.; Bottazzi, D. & Nasser, N. "A Novel Middleware Solution to Improve Ubiquitous Healthcare Systems Aided by Affective Information," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 335–349, March 2010.
[9] Khosrowabadi, R. et al. 2010. EEG-based emotion recognition using self-organizing map for boundary detection. International Conference on Pattern Recognition, 2010.
[10] Cowie, R.; Douglas, E.; Tsapatsoulis, N.; Vostis, G.; Kollias, S.; Fellenz, W. & Taylor, J. G. Emotion Recognition in Human-Computer Interaction. IEEE Signal Processing Magazine, vol. 18, pp. 32–80, 2001.
[11] Calvo, R. A. & D'Mello, S. "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications," IEEE Transactions on Affective Computing, pp. 18–37, January–June 2010.
[12] Zeng, Z.; Pantic, M.; Roisman, G. I. & Huang, T. S. "A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 1, pp. 39–58, Jan. 2009.
[13] Norman, D. A. 1981. 'Twelve issues for cognitive science', Perspectives on Cognitive Science, Hillsdale, NJ: Erlbaum, pp. 265–295.
[14] Sharma, R.; Pavlovic, V. & Huang, T. Toward multimodal human-computer interface. Proceedings of the IEEE, 1998.
[15] Vesterinen, E. 2001. Affective Computing. Digital media research seminar, spring 2001: "Space Odyssey 2001".
[16] Burkhardt, F.; van Ballegooy, M.; Engelbrecht, K.-P.; Polzehl, T. & Stegmann, J. "Emotion detection in dialog systems: Applications, strategies and challenges," 3rd International Conference on Affective Computing and Intelligent Interaction (ACII 2009), pp. 1–6, 10–12 Sept. 2009.
[17] Leon, E.; Clarke, G.; Sepulveda, F. & Callaghan, V. "Optimised attribute selection for emotion classification using physiological signals," 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04), vol. 1, pp. 184–187, 1–5 Sept. 2004.
[19] http://www.engadget.com/2009/06/30/toyotas-mind-controlled-wheelchair-boast-fastest-brainwave-anal/
[20] http://www.w3.org/TR/2009/WD-emotionml-20091029/
[21] Pantic, M.; Sebe, N.; Cohn, J. F. & Huang, T. Affective multimodal human-computer interaction. ACM International Conference on Multimedia (MM), 2005.
[22] Gong, L.; Wang, T.; Wang, C.; Liu, F.; Zhang, F. & Yu, X. 2010. Recognizing affect from non-stylized body motion using shape of Gaussian descriptors. ACM Symposium on Applied Computing (SAC '10), Sierre, Switzerland, March 22–26, 2010. ACM, New York, NY, pp. 1203–1206.
[23] Khalili, Z. & Moradi, M. H. "Emotion recognition system using brain and peripheral signals: Using correlation dimension to improve the results of EEG," International Joint Conference on Neural Networks (IJCNN 2009), pp. 1571–1575, 14–19 June 2009.
[24] Li, H. & Tan, J. 2007. Heartbeat driven medium access control for body sensor networks. 1st ACM SIGMOBILE International Workshop on Systems and Networking Support for Healthcare and Assisted Living Environments (HealthNet '07). ACM, New York, NY, pp. 25–30.
[25] Ghandi, B. M.; Nagarajan, R. & Desa, H. "Facial emotion detection using GPSO and Lucas-Kanade algorithms," 2010 International Conference on Computer and Communication Engineering (ICCCE), pp. 1–6, 11–12 May 2010.
[26] Lucey, P.; Cohn, J. F.; Kanade, T.; Saragih, J.; Ambadar, Z. & Matthews, I. "The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression," IEEE CVPR Workshops (CVPRW 2010), pp. 94–101, 13–18 June 2010.
[27] Wang, R. & Fang, B. "Affective Computing and Biometrics Based HCI Surveillance System," International Symposium on Information Science and Engineering (ISISE '08), vol. 1, pp. 192–195, 20–22 Dec. 2008.
• Pepper is a humanoid robot by Aldebaran Robotics and SoftBank Mobile designed with the ability to read emotions.
• It was introduced at a conference on 5 June 2014 and has been showcased to the public at SoftBank Mobile stores in Japan since 6 June.
• It will be available in February 2015 at a base price of JPY 198,000 ($1,931) at SoftBank Mobile stores.
• Pepper's emotion recognition comes from its ability to analyze facial expressions and voice tones.
• The robot's head has four microphones, two HD cameras (in the mouth and forehead), and a 3-D depth sensor (behind the eyes).
• There is a gyroscope in the torso and touch sensors in the head and hands.
• The mobile base has two sonars, six lasers, three bumper sensors, and a gyroscope.
https://www.youtube.com/watch?v=kCFYw8mIqcc
RoboCup
• Robots must cooperate in…
– Strategy acquisition
– Real-time reasoning
– Multi-agent collaboration
– Competition against another team of robots
RoboCup is an international research effort to promote autonomous robots.
RoboCup
• Each robot has…
– Pentium 233 MHz
– Linux OS
– Video camera and frame grabber
– Sensor system
– Kicker
How do the robots make decisions?
• Control is based on a set of behaviors
• Each behavior has a set of preconditions that either…
– Must be satisfied
– Are desired
• A behavior becomes eligible when all of its "musts" are true
• Among eligible behaviors, one is selected based on how many desired conditions are true
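The selection rule described above can be sketched directly. The behavior names, condition sets, and the tuple representation below are illustrative assumptions, not the team's actual code.

```python
# Behavior-based control: a behavior is eligible when all of its "must"
# preconditions hold in the current world state; among eligible behaviors,
# the one with the most satisfied "desired" conditions is chosen.

def select_behavior(behaviors, world):
    """behaviors: list of (name, musts, desireds); world: set of true facts."""
    eligible = [
        (name, musts, desireds)
        for name, musts, desireds in behaviors
        if all(m in world for m in musts)
    ]
    if not eligible:
        return None
    # Rank eligible behaviors by how many desired conditions are true.
    return max(eligible, key=lambda b: sum(d in world for d in b[2]))[0]

behaviors = [
    ("kick", {"has_ball"}, {"facing_goal", "goal_clear"}),
    ("dribble", {"has_ball"}, {"open_space"}),
    ("chase_ball", set(), {"ball_visible"}),
]
world = {"has_ball", "facing_goal", "goal_clear"}
print(select_behavior(behaviors, world))  # kick
```

Note that "musts" act as a hard filter while "desireds" only rank the survivors, which is what lets a robot fall back to a default behavior (here, chasing the ball) when nothing better applies.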
Applications of AI and Robotics
• Industrial automation
• Services for the disabled
• Vision systems
• Planetary exploration
• Mine site clearing
• Law enforcement
• And many others…
https://www.youtube.com/watch?v=3IFuv1AVouM
https://www.youtube.com/watch?v=hlHrvQ7D5OU
http://time.com/2917394/japan-human-robots/#2917394/japan-human-robots/
https://wtvox.com/robotics/top-10-humanoid-robots/
https://www.youtube.com/watch?v=IhVu2hxm07E