autonomous robot dancing

AUTONOMOUS ROBOT DANCING DRIVEN BY BEATS AND EMOTIONS OF MUSIC ((in the name of the best creator)) Distributed Artificial Intelligence Course

Upload: sahar-seifzadeh

Posted on 30-Jun-2015

421 views

Category:

Education


2 downloads

DESCRIPTION

Autonomous Robot Dancing Driven by Beats and Emotions of Music- Nao

TRANSCRIPT

Page 1: Autonomous Robot Dancing

AUTONOMOUS ROBOT DANCING DRIVEN BY BEATS AND EMOTIONS OF MUSIC

((in the name of the best creator))

Distributed Artificial Intelligence Course

Page 2: Autonomous Robot Dancing

INTRODUCTION

Many robot dances are preprogrammed by choreographers.

choreography:


Page 3: Autonomous Robot Dancing

primitive emotions

key frame (static poses)

Our work is made up of two parts:

(1) The first algorithm plans a sequence of dance movements that is driven by the beats and the emotions detected through the preprocessing of selected dance music. (2) We also contribute a real-time synchronizing algorithm to minimize the error between the execution of the motions and the plan.
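The synchronizing idea in part (2) can be sketched as drift correction: compare where the plan says the dance should be against where execution actually is, and adjust the speed of the next motion to absorb the difference. This is a minimal illustrative sketch with an assumed interface, not the paper's exact algorithm.

```python
# Hypothetical sketch of real-time synchronization: speed up or slow down
# the next motion so accumulated timing drift against the plan is cancelled.
def adjust_speed(planned_end, actual_now, motion_duration):
    """Return a playback-speed factor for the next motion.

    planned_end     -- time (s) the plan expected the previous motion to end
    actual_now      -- time (s) the previous motion actually ended
    motion_duration -- nominal duration (s) of the next motion
    """
    drift = actual_now - planned_end  # > 0 means execution is running late
    # Compress the next motion just enough to cancel the drift, clamped to a
    # safe range so joints are not commanded to move unreasonably fast.
    factor = motion_duration / max(motion_duration - drift, 1e-6)
    return min(max(factor, 0.5), 2.0)
```

For example, a motion of 2 s nominal duration that starts 0.5 s late would be played at roughly 1.33x speed to land back on the planned beat.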


Page 4: Autonomous Robot Dancing

WE CREATE A LARGE LIBRARY OF MOTION PRIMITIVES BY DIVIDING THE JOINTS OF THE NAO HUMANOID ROBOT INTO 4 CATEGORIES, WHERE EACH CATEGORY OF JOINTS CAN ACTUATE INDEPENDENTLY.


Page 5: Autonomous Robot Dancing

PAUL EKMAN PROPOSED 6 PRIMARY EMOTIONS:

1-happy 2-sad 3-surprised 4-angry 5-fear 6-disgust


Page 6: Autonomous Robot Dancing

JOINTS:

1. Head (Head): HeadYaw, HeadPitch

2. Left Arm (LArm): LShoulderPitch, LShoulderRoll, LElbowYaw, LElbowRoll

3. Right Arm (RArm): RShoulderPitch, RShoulderRoll, RElbowYaw, RElbowRoll

4. Legs (Legs): LHipYawPitch, LHipRoll, LHipPitch, LKneePitch, LAnklePitch, LAnkleRoll, RHipRoll, RHipPitch, RKneePitch, RAnklePitch, RAnkleRoll
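The four joint categories above can be written down as a simple mapping (joint names exactly as listed on the slide):

```python
# The four independently actuated joint categories of the NAO robot.
NAO_JOINT_CATEGORIES = {
    "Head": ["HeadYaw", "HeadPitch"],
    "LArm": ["LShoulderPitch", "LShoulderRoll", "LElbowYaw", "LElbowRoll"],
    "RArm": ["RShoulderPitch", "RShoulderRoll", "RElbowYaw", "RElbowRoll"],
    "Legs": ["LHipYawPitch", "LHipRoll", "LHipPitch", "LKneePitch",
             "LAnklePitch", "LAnkleRoll", "RHipRoll", "RHipPitch",
             "RKneePitch", "RAnklePitch", "RAnkleRoll"],
}
# A motion primitive only commands the joints of one category, so primitives
# from different categories can execute in parallel without conflicts.
```

Because each category actuates independently, combining one primitive per category multiplies the size of the effective motion library.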


Page 7: Autonomous Robot Dancing

emotion extraction:

emotion representation:

(a, v)

a and v are each between -1 and 1
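An emotion is thus a point in a 2-D activation-valence square. A tiny sketch of this representation (the clamping helper is an assumed convenience, not from the slides):

```python
# Emotion represented as an (activation, valence) pair, each in [-1, 1].
def clamp(x):
    """Restrict a coordinate to the valid activation-valence range."""
    return max(-1.0, min(1.0, x))

def make_emotion(a, v):
    """Build a valid (a, v) emotion point, clamping out-of-range inputs."""
    return (clamp(a), clamp(v))
```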


SMERS (SVR: support vector regression)

94% agreement

Page 8: Autonomous Robot Dancing

beat tracking

beat tracking + autocorrelation analysis + neural network

amplitude (the best candidate)
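The autocorrelation step can be sketched as follows: take an amplitude (onset-strength) envelope, autocorrelate it, and pick the lag with the strongest peak inside a plausible tempo range. This is a minimal illustration of the idea, with assumed parameter names, not the full beat tracker on the slide.

```python
import numpy as np

# Estimate the beat period of an amplitude envelope by autocorrelation.
def estimate_beat_period(envelope, fps, min_bpm=60, max_bpm=180):
    """envelope: 1-D amplitude signal sampled at `fps` frames per second."""
    env = envelope - np.mean(envelope)            # remove DC offset
    ac = np.correlate(env, env, mode="full")[len(env) - 1:]  # lags >= 0
    lo = int(fps * 60 / max_bpm)                  # shortest allowed period
    hi = int(fps * 60 / min_bpm)                  # longest allowed period
    lag = lo + int(np.argmax(ac[lo:hi]))          # best lag in tempo range
    return lag / fps                              # beat period in seconds
```

On a synthetic envelope with an impulse every 0.5 s, this recovers a 0.5 s beat period (120 BPM).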


Page 9: Autonomous Robot Dancing

MAPPING MOTION PRIMITIVE TO ACTIVATION-VALENCE SPACE FROM STATIC POSTURES DATA

We collected 4 static postures of the NAO humanoid robot

for each of Ekman's 6 basic emotions:

Happy, Sad, Angry, Surprised, Fear and Disgust.

We have a total of 24 emotional static postures.

6 × 4 = 24
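One simple way to use these 24 labeled postures is to place each motion primitive in activation-valence space by looking at which labeled postures its key frames resemble. The nearest-posture averaging below is an assumed illustrative scheme, not necessarily the exact mapping method of the original work.

```python
import numpy as np

# Place a motion primitive in (a, v) space from labeled static postures:
# for each key frame, find the nearest labeled posture (in joint-angle
# space) and average the (a, v) labels over all key frames.
def primitive_av(key_frames, postures, labels):
    """key_frames: (k, d) joint-angle rows for the primitive
    postures:   (24, d) joint-angle rows of the labeled static postures
    labels:     (24, 2) corresponding (a, v) labels
    """
    av = np.zeros(2)
    for frame in key_frames:
        nearest = np.argmin(np.linalg.norm(postures - frame, axis=1))
        av += labels[nearest]
    return av / len(key_frames)
```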


Page 10: Autonomous Robot Dancing


Page 11: Autonomous Robot Dancing

EMOTION FOR NEXT MOTION PRIMITIVE

Motion primitives are selected sequentially and stretched to a whole number of beat times. To choose the next motion primitive, we need the emotion at the end of the previous motion primitive. We estimate the emotion at each beat time by linearly interpolating the (a, v) values.


Page 12: Autonomous Robot Dancing

THE MARKOV DANCER MODEL


During execution, we use an adaptive real-time synchronizing algorithm.

The sequence of primitive motions should:

(i) be continuous

(ii) reflect the musical emotion

(iii) be interestingly non-deterministic
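Requirements (ii) and (iii) can both be met by a Markov-style selection rule: draw the next primitive at random, but weight primitives whose (a, v) position lies close to the current musical emotion more heavily. The exponential weighting below is an illustrative assumption, not the exact transition model of the slides.

```python
import math
import random

# Markov-dancer sketch: emotion-weighted random choice of the next primitive.
def next_primitive(primitives, emotion, temperature=0.3):
    """primitives: list of (name, (a, v)) pairs; emotion: target (a, v).

    Primitives near the target emotion get exponentially larger weights;
    the random draw keeps the dance non-deterministic.
    """
    weights = [
        math.exp(-math.dist(av, emotion) / temperature)
        for _, av in primitives
    ]
    return random.choices(primitives, weights=weights, k=1)[0]
```

A small `temperature` makes the dancer almost always pick the emotionally closest primitive; a large one makes it more adventurous.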

Page 13: Autonomous Robot Dancing


Page 14: Autonomous Robot Dancing

CONCLUSION

We show that we can automate robot dancing by forming schedules of motion primitives that are driven by the emotions and the beats of any music on a NAO humanoid robot. The algorithms are general and can be used on any robot. From emotion labels given for static postures, we can estimate the activation-valence space locations of the motion primitives and select the appropriate motion primitives for emotions detected in music.


Page 15: Autonomous Robot Dancing

THE END
