


Humanoid Robot Head

May 09-11

Team Members: Dan Potratz (CprE), Tim Meer (EE), Cody Genkinger (CprE), Jason Pollard (EE), Andrew Taylor (EE)
Client/Faculty Advisor: Dr. Alex Stoytchev

Abstract/Problem Statement

The purpose of this project is to create a functional, human-like robot head for Dr. Alex Stoytchev. Currently there is a robotic frame with two mobile robotic arms, but only a static shell for the head. The head needs to be capable of showing human-like facial emotions and movements. The finished head, along with the rest of the robot, will be used to research and test autonomous tool use and the learning of tool affordances by robots.

Design Requirements

Functional Requirements:
- The head shall move front to back, left to right, and rotate through a 90 degree arc of motion within one second, i.e. at a velocity of 90 degrees per second.
- The mouth shall be driven by a single servo with a 180 degree arc of motion.
- The eyebrows shall be driven by two servos, each with a 180 degree arc of motion.
- The eyes shall move on two axes, with a 30 degree arc in each direction.
- A camera shall be mounted in the head or body to provide visual feedback for processing by the operator.
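As an illustration of the neck-motion requirement, the sketch below (hypothetical helper, not part of the project code) interpolates a 90 degree move into evenly spaced waypoints delivered over one second, which yields the required 90 degrees-per-second average velocity at a 50 Hz update rate:

```cpp
#include <vector>

// Generate evenly spaced waypoint angles for a move of `arcDeg` degrees
// completed in `durationSec` seconds, one waypoint per control tick.
// At 50 Hz (matching a typical 20 ms servo frame), a 90 degree move in
// one second advances 1.8 degrees per tick.
std::vector<double> planMove(double startDeg, double arcDeg,
                             double durationSec, double tickHz) {
    int ticks = static_cast<int>(durationSec * tickHz);
    std::vector<double> waypoints;
    waypoints.reserve(ticks + 1);
    for (int i = 0; i <= ticks; ++i) {
        waypoints.push_back(startDeg + arcDeg * i / ticks);
    }
    return waypoints;
}
```

Feeding the servo one waypoint per frame, rather than the final target all at once, is also what makes the motion "smooth and well transitioned" as the nonfunctional requirements ask.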

Nonfunctional Requirements:
- The head shall look clean and nonthreatening while retaining human-like attributes.
- The API shall be written in C/C++; the user interface shall be written in C#.
- Movement of the head shall be smooth and well transitioned.
- Motors shall be quiet and not distracting.
- The head API shall follow the format of the existing arm API.
- The microcontroller board shall be connected to the PC via serial or USB.
- Servo wiring shall be twisted pair to keep noise emission low.

System Analysis:
- Two RS-232 servo controllers will handle all pulse-width control signals to the servos.
- Power supplies will provide enough power for all servos and controllers.
- Software will provide user communication with the controllers.
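The pulse-width control signals mentioned above follow the standard RC-servo convention: a pulse of roughly 1.0 ms commands one end of travel and 2.0 ms the other, repeated every 20 ms. A minimal sketch of that mapping (the 180 degree range and the 1000-2000 microsecond endpoints are typical values, not measurements from our servos):

```cpp
// Map a servo angle in [0, 180] degrees to a pulse width in microseconds,
// assuming the common 1000 us .. 2000 us range over a 20 ms (50 Hz) frame.
double angleToPulseUs(double angleDeg) {
    const double minUs = 1000.0, maxUs = 2000.0, rangeDeg = 180.0;
    if (angleDeg < 0.0) angleDeg = 0.0;           // clamp out-of-range input
    if (angleDeg > rangeDeg) angleDeg = rangeDeg;
    return minUs + (maxUs - minUs) * angleDeg / rangeDeg;
}
```

During hardware bring-up this is exactly the signal a function generator can stand in for, which is how the servos were tested before the controller board was ready.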

Concept Sketches

Functional Decomposition

This chart deconstructs how the robot head functions: the user sends a script through the user interface to the program library, which sends signals to the microcontrollers. From there, each servo is operated independently.
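The library-to-microcontroller step in that flow can be sketched as a command formatter. The "BDx SVy Mz" text syntax below is a placeholder to show the shape of the layering, not the documented PV203 protocol, and the function name is hypothetical:

```cpp
#include <string>
#include <sstream>

// Illustrative command formatter: the control library turns a
// (board, servo, position) request into a text command for the serial
// link to the servo controller. The command syntax here is assumed,
// not taken from the PV203 manual.
std::string makeServoCommand(int board, int servo, int positionUs) {
    std::ostringstream cmd;
    cmd << "BD" << board << " SV" << servo << " M" << positionUs << "\r";
    return cmd.str();
}
```

Keeping this formatting in one place means the user interface never touches the wire format directly, which is what lets the head API mirror the existing arm API.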

Technical Details

Testing

Hardware Testing: We connected the microcontroller and servos to a DC power supply and monitored their response to commands. We sent commands to the microcontroller over a serial connection with a computer and viewed the board's output on an oscilloscope. The servos were driven by a function generator to simulate the pulse-width-modulated signal that will eventually come from the microcontroller board.

Software Testing: Testing took a multi-layered approach. First, unit tests exercised the lowest-level modules, with automated tests for each method. The second layer tested the control library as a whole, verifying the integration of the modules by driving the robot head manually and inspecting the output signals. Finally, we tested the GUI on top of the control library and manually verified that the output from the robot head matched what we expected. We used oscilloscope readings and software positioning to confirm that the servos stayed within the required range, and collected step sizes and out-of-range signals to set restrictions on the servo bounds.
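The out-of-range restriction described above reduces to a per-servo clamp that the unit tests can assert against. The bound values below are illustrative examples, not the calibrated limits collected during testing:

```cpp
#include <algorithm>

// Per-servo travel limits, in microseconds of pulse width, as collected
// during bench testing. The numbers used in any given test are examples.
struct ServoBounds {
    int minUs;
    int maxUs;
};

// Clamp a requested position to the servo's safe range so that
// out-of-range commands can never reach the hardware.
int clampPosition(int requestedUs, const ServoBounds& b) {
    return std::min(std::max(requestedUs, b.minUs), b.maxUs);
}
```

A unit test then simply asserts that requests below, inside, and above the range come back pinned to the bounds, which is the lowest layer of the multi-layered test plan.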

Budget

Summary

The development of a humanoid robot head will add a realistic look to Dr. Stoytchev's robot. Without a humanoid head, the robot cannot function as a truly interactive robot, and this interaction is imperative as Dr. Stoytchev continues his research in robot learning. The facial aspects of the head also give it more appeal for demonstrations and the general public. With careful planning and development, we will deliver a humanoid head that is visually appealing, functional, and responsive.

Hardware Specification:

Microcontroller: The PV203 Servo Motor Controller Board, developed by Pontech, was selected based on its performance against similar models and on recommendations from other customers who have used the board without complications.

Facial Servomotors: The Futaba S3114 is the smallest and fastest servomotor that meets the needs of our project.

Neck Servomotors: The HiTec HS-645MG is the superior servomotor that fits in the required chassis; as such, it is the servo we selected for the neck.

Eye Frame:

Software Specification: The software specification covers the functions contained in our control library. The library serves as a general building block for the user interface and as a base for expansion by future users. It is built on a set of functions that talk directly to the microcontroller and monitor the servo positions. The user interface is the highest level, providing direct interaction between user and robot.

User Interface Specification: The user interface is the topmost layer and the one most users will experience when interacting with the robot. It expands on the control library's functions and introduces some new functions of its own. This layer interacts with the user through an Xbox 360 controller, which allows the user to move the head freely.
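Free movement from a controller stick is commonly implemented as velocity control: the stick deflection sets an angular rate that is integrated each frame. A sketch under assumed values (the deadzone, the rate cap taken from the 90 deg/s neck requirement, and the +/-45 degree pan limits are illustrative, and the function name is hypothetical):

```cpp
#include <algorithm>

// Integrate a joystick axis (normalized to [-1, 1]) into a head pan angle.
// A small deadzone ignores stick drift; maxRateDegPerSec caps the speed at
// the 90 deg/s neck requirement; the result is clamped to a 90 degree arc.
double updatePan(double panDeg, double axis, double dtSec,
                 double maxRateDegPerSec = 90.0, double deadzone = 0.1) {
    if (axis > -deadzone && axis < deadzone) axis = 0.0;  // ignore drift
    panDeg += axis * maxRateDegPerSec * dtSec;
    return std::min(std::max(panDeg, -45.0), 45.0);       // stay in the arc
}
```

Velocity control, rather than mapping stick position directly to head angle, keeps the motion smooth and lets the head hold its pose when the stick is released.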