TRANSCRIPT
Robot Middleware and its Standardization in OMG
October 10, 2006, International Convention Center
Beijing, China
Tetsuo KOTOKU
Introduction: Robot Society in the 21st century
• how to support the elderly in their daily lives
• how to keep enough labour force in industrial and social activities
Increase of the elderly population
Expanding Robot Application: from industry to non-industry
Manufacturing, Automation, Maintenance, Medical, Home Service, Security, Communication, Entertainment, etc.
Introduction
With the rapid progress in computer and communication technology, robot systems are fast becoming larger and more complicated. There is therefore a real need for software technologies that support efficient development. Various software technologies have been proposed and implemented, but each independently of the others.
Rapid progress:
• Computer Technology
• Network Technology
Robot Systems:
• larger
• more complicated
Single robot → Networked robot
Efficient Development
Robot Society in the 21st century
How will new robotic products be produced?
Made-to-Order Business
Needs → Integrating Components → New Robots
Components Market (RT Components)
Motors, Sensors, Robot arm
Technology Strategy (JARA)
Customer / System Integrator / Component Companies
Solution Business
Order / Design Information
Order to make (manufacturer)
Supply
Supplying Component
Academia
Technical seeds
21st Century Business Model
from robots to RT
Made-to-Order Business
RT Middleware Project
Conventional Robot Systems
Robot A, Robot B, Robot C
Each robot maker builds everything in its own robots.
The interfaces of the modules in each robot are not well defined, so it is difficult to re-use them in other robot systems.
The cost of development is high, and it is difficult to create a variety of robots.
Motor / Force Sensor / Servo Control
Network
Application of Robot A
Application of Robot B
Application of Robot C
Component Based Robot Systems
It will be easy to create new robots by re-using existing modules.
The cost of developing a new robot will be low. Module suppliers, software module suppliers and system integrators can join the new robot business. It will be easy to develop a variety of robots.
Important Issues
• Preparing the technological infrastructure for the system integration industry
• Robotic components with open-architecture controllers should be supplied to the market.
• Middleware, a kind of software that standardizes robotic component connection, should be considered.
• A specially designed processor for the open controller of robotic systems should be developed.
Object Management Group
• Worldwide software consortium
  – Distributed Object Middleware (CORBA)
  – Unified Modeling Language (UML)
  – Model Driven Architecture (MDA)
• Application-field-specific standardization (Business Enterprise Integration, C4I, Finance, Healthcare, Life Science Research, Manufacturing, Software-Based Communications, Space, Robotics) → Domain Technology Committee
http://www.omg.org/
Standardization Activity in OMG
• Yun-Koo Chung (ETRI, Korea)
• Tetsuo Kotoku (AIST, Japan)
• Hung Pham (RTI, USA)
http://robotics.omg.org/
(Since Dec. 9, 2005)
Standardization Activities
Robotics standards
Interoperability
Design D
Unfortunately, most pioneering initiatives have been developed independently of one another, driven by specific applications and objectives. To settle this state of chaos, we would like to contribute to the promotion of standardization in the field of robotics, based on mutual understanding between the relevant parties.
Design E
Design F
Design A
Design C
Design B
for application A
for application B
for application C
for objective D
for objective E
for objective F
Integration of robot systems based on modular components
Standardization Process
• Agreement based on discussion
• Strict process for fairness
• Leadership by volunteers
Initial Survey
RFI issue
Priority
Roadmap
Volunteer
WG
Scope
RFP Issue
Agreement
Initial and Revised submission
Review
AB endorsement → TC recommendation → BoD adoption
Documentation
5 meetings/year
• Recruiting discussion members
• Gathering information on various activities
• Presentation and discussion
• Setting up working groups
http://robotics.omg.org/
Request For Information (RFI)
Initial Survey:
Robotics Domain Task Force: Initial Survey → Priority → Volunteer → Scope → Agreement → Review
Robotic Systems RFI
Large variation of physical characteristics
mobile robots, humanoid robots, pet robots, manipulator robots, autonomous vehicles, robot house, etc.
Scope of Robotic Systems :
Broad span of applications: communication and entertainment robots, lifestyle support robots, rescue robots, transportation robots, medical robots, etc.
“Systems that provide intelligent services and information by interacting with their environment,
including human beings, via the use of various sensors, actuators and human interfaces.”
Robotic Systems RFI
• An RFI may be issued to gather industry requirements and comments at the beginning of a standardization phase
• Any person or company may respond
• OMG members decide how to proceed, based on input from both inside and outside the organization
• Access the RFI documents at http://robotics.omg.org/robotic_systems_rfi.htm
• Response deadline was April 4, 2006
RFI Responses
• 1st Batch: 9 presentations in Burlingame (RTI, Systronix, SNU, ETRI*2, NEC, NTT, ATR, Toshiba)
• 2nd Batch: 14 presentations in Tampa (Hitachi, ADA Software, SEC, Mayekawa MFG, ETRI*3, Tsukuba Univ., AIST, Coroware, IHI, PrismTech, THALES, Toshiba)
• 3rd Batch: 6 presentations in St. Louis (Samsung*2, Fujitsu, ETRI, SAIT, SEC)
Total: 29 presentations
Organization
Robotics-DTF
Yun-Koo Chung (ETRI, Korea), Tetsuo Kotoku (AIST, Japan), Hung Pham (RTI, USA)
Publicity Sub-Committee
Steering CommitteeAll volunteers
Robotic Services WG
Profile WG
Infrastructure WG
Abheek Bose (ADA Software, India), Masayoshi Yokomachi (NEDO, Japan), Olivier Lemaire (AIST, Japan), Yun-Koo Chung (ETRI, Korea)
Noriaki Ando (AIST, Japan), Rick Warren (RTI, USA), Saehwa Kim (SNU, Korea)
Bruce Boyes (Systronix, USA), Seung-Ik Lee (ETRI, Korea)
Soo-Young Chi (ETRI, Korea), Olivier Lemaire (AIST, Japan)
Contacts Sub-Committee
Technical WGs
Makoto Mizukawa (Shibaura-IT, Japan), Yun-Koo Chung (ETRI, Korea)
Robotic Functional Services WG
Co-chairs: Soo-Young Chi (ETRI), Olivier Lemaire (JARA/AIST)
Robotics Services WG Mission Statement
• The goals of the Robotics Services WG are to:
  – Establish a clear definition of robotic service
  – Identify and categorize services commonly used in robotic applications and the technologies involved
  – Define standard interfaces that expose these technologies to robotic application developers
  – Coordinate with other groups within the OMG Robotics Task Force to keep specifications consistent
Robotic Device and Data Profile WG
Co-chairs: Bruce Boyes (Systronix), Seung-Ik Lee (ETRI)
Profile WG Mission Statement
Application Programmer's View
1. Define scope and model of API
2. Define typical devices
3. Device hierarchies (like class hierarchies)
4. Define interfaces & data structures
   – Consider standards such as JAUS
5. Device profiles
   – Enumeration of available resources
   – Resource configuration and capabilities
Physical Resource View
1. Apply relevant standards (IEEE, etc.) to robotics
   – Smart sensors: IEEE 1451
   – Precision networked clocks: IEEE 1588
   – Arrange presentations on the above at OMG meetings (1451 in Anaheim? 1588 in Washington DC, near NIST)
2. I/O point tagging, which provides:
   – Enumeration of available resources
   – Storage of configuration and capabilities, on the actual device or as close to it as possible
Infrastructure WG
Co-chairs: Saehwa Kim (Seoul National Univ.), Rick Warren (RTI), Noriaki Ando (AIST)
Infrastructure WG Mission Statement
• The purpose of the Infrastructure Working Group ofthe Robotics Domain Task Force is to standardize fundamental models, common facilities, and middleware to support the development and integration of a broad range of robotics applications.
• This working group should collaborate with other groups within OMG.
– Common facilities
  • Fundamental services general to a wide range of robotics applications.
Robotic Technology Component (RTC)
• Adopted in the Anaheim Meeting(September 29, 2006)
• Component model for robotics
  – Basis for software modularization and integration at the infrastructure/middleware level in this domain
  – Builds on – does not replace – general-purpose component models
National Institute of Advanced Industrial Science and Technology (AIST)
Real-Time Innovations (RTI)
Call for Participation
OMG Technical Meeting in Washington DC, December 4-8, 2006
Robotics-DTF meeting
• WG meetings [Mon., Tue., Thu.]
• Plenary meeting [Tue., Wed.]
• Steering committee [Monday]
(any volunteers are welcome!)
http://www.omg.org/registration/
Contacts:
Home Page: http://robotics.omg.org/
Mailing List:
– [email protected]– [email protected]– [email protected]– [email protected]
Conclusions
< Key Technology of RT >
Module-based Open Architecture
– interoperability
– reusability
– portability
– development tools
Customer / System Integrator / Component Companies
Solution Business
Order / Design Information
Order to make (manufacturer)
Supply
Supplying Component
Academia
< meet the market needs >
New RT Industry
Standardization
<Development and diffusion of RT middleware>
Alexei Makarenko, IROS'06 Workshop on Software Standardization
Orca: Components for Robotics
Alexei Makarenko,Alex Brooks,
Tobias Kaupp
ARC Centre of Excellence for Autonomous Systems
The University of Sydney
Uni. of Sydney: where we are coming from
• Distributed applications
  • Sensor networks, decentralized information fusion
  • Naturally leads to components
• Industry-funded projects
  • Need for robust implementations
• Complexity of robotic software
  • Robotic software is the bottleneck of what is possible today (not computing hardware, sensors, algorithms)
  • "The PhD problem"
Component-Based Software Engineering (CBSE)
• Advantages
  • Modularization: "a collection of small problems is easier to solve than one big problem"
  • Minimizing source-code cross-dependencies
  • Maximizing software reuse
  • Parallel development by large distributed teams
Component Model
• Fundamentals
  • Interface definition
  • Communication mechanism
• Options
  • .NET (Microsoft)
  • Enterprise Java Beans (Sun)
  • CORBA (OMG and implementers)
  • Ice (ZeroC)
  • Custom (e.g. Player, Orca-1)
Real Choices for Robotics
• Middleware: off-the-shelf vs. custom
  • Player uses custom middleware and is very successful
  • We believe there is a limit to what can be done with custom middleware (reliability, features, performance)
• Off-the-shelf middleware: CORBA vs. Ice
  • The concepts are similar if not identical
  • For a comparison of design, features, performance, etc., see ZeroC
  • We've used ACE/TAO and found it difficult
  • We find Ice much more straightforward
Ice – Internet Communication Engine
• Background
  • Company: ZeroC, since 2001
  • http://www.zeroc.com
  • Open source, with a commercial license option
• Support
  • OS: Linux, Windows, MacOS X, Solaris
  • Language: C++, Java, C#, Visual Basic, Python, PHP
  • Transport: TCP/IP, UDP/IP, SSL
• Licensing
  • GPL for open-source projects
  • LGPL for Orca
  • Commercial license otherwise
Orca Project: Components for Robotics
• Main objective: software reuse
  • Enable: by defining a set of commonly-used interfaces
  • Simplify: by providing libraries with a convenient high-level API
  • Encourage: by maintaining a repository of components
• What makes it different?
  • adopts a Component-Based Software Engineering approach
  • uses an industrial-strength library (Ice) for communication and interface definition
  • aims to be general, flexible and extensible
  • provides optional tools to assist in the development of individual components and the management of large systems
  • maintains a repository of free, reusable components
“An object-oriented middleware platform”
• Developer
  • Defines interfaces in an Interface Definition Language (IDL): a bunch of remote procedure calls
  • Writes server application: implements the RPCs
  • Writes client application: calls the RPCs
• Ice
  • Takes care of communication, serialization, etc.
Software Architecture
Component 1 / libOrcaIce / Ice C++ / Linux OS
Component 2 / Ice Java / Win OS
transport, e.g. TCP/IP
Ice Services
• IceGrid – grid computing
  • Registry (centralized)
  • On-demand activation, reactivation
  • Replication, etc.
• IceBox – application server
  • Internal comms are optimized to a function call (optional)
• IceStorm – event service (an IceBox service)
  • Publish/subscribe of any interface
  • Federation
• IcePatch2 – file distribution
• Freeze – persistence service
• Glacier2 – firewall
• Ice-E – "embedded Ice"
  • Light-weight version for PDAs
Orca Project
• Hosted on SourceForge
• Distribution includes
  • Common interface definitions (Slice IDL)
  • Utility libraries
  • Component repository
Project History and Size
Interface Definitions
• Slice: an Interface Definition Language (IDL)

// Slice
class Position2dData extends OrcaObject
{
    Frame2d pose;
    Twist2d motion;
};

interface Position2d
{
    nonmutating Position2dData getData()
        throws DataNotExistException, HardwareFailedException;

    void subscribe( Position2dConsumer* subscriber )
        throws SubscriptionFailedException;

    idempotent void unsubscribe( Position2dConsumer* subscriber );
};
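The Slice definition above declares a data type plus three RPCs. As a rough, plain-Python sketch (illustrative only; the real Ice runtime generates language-specific proxies and servants from the Slice file, and a `push` helper like the one below is an assumption, not part of the interface):

```python
# Plain-Python sketch of the Position2d contract declared in Slice.

class DataNotExistException(Exception):
    """Raised when no pose estimate is available yet."""

class Position2dData:
    def __init__(self, pose, motion):
        self.pose = pose      # Frame2d: (x, y, heading)
        self.motion = motion  # Twist2d: (vx, vy, w)

class Position2d:
    """Servant side: implements the RPCs declared in the interface."""
    def __init__(self):
        self._data = None
        self._subscribers = []

    def get_data(self):
        # 'nonmutating' in Slice: a read-only call
        if self._data is None:
            raise DataNotExistException("no pose estimate yet")
        return self._data

    def subscribe(self, consumer):
        self._subscribers.append(consumer)

    def unsubscribe(self, consumer):
        # 'idempotent' in Slice: safe to call repeatedly
        if consumer in self._subscribers:
            self._subscribers.remove(consumer)

    def push(self, data):
        # Hypothetical driver-side update: store and notify subscribers
        self._data = data
        for consumer in self._subscribers:
            consumer(data)

# Client side: subscribe to updates, then query the latest data
servant = Position2d()
received = []
servant.subscribe(received.append)
servant.push(Position2dData(pose=(1.0, 2.0, 0.5), motion=(0.1, 0.0, 0.0)))
print(servant.get_data().pose)  # (1.0, 2.0, 0.5)
```

In the real system, Ice serializes these calls over the network; the client holds a proxy rather than the servant object itself.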
libOrcaIce
• Aims
  • To set naming conventions
  • To simplify component development
• Provides
  • Classes to derive from
  • Classes to use
  • Helper functions
• Strictly optional
  • Can choose not to use it and call libIce directly
Stand-Alone Component
Component as an IceBox Service
Component Repository
• FaithLocaliser
• FeatureMapLoader
• LaserFeatureExtractor
• LocalNav
• Logger
• LogPlayer
• OgMapLoader
• OrcaView
• RegistryView
• SegwayRmp
• SickLaser
• SimLocaliser
• Teleop
Performance
Example: Simple 2-D Demo with Stage
Current work : ACFR Segway Project
Current work: ACFR/Berkeley Team for 2007 Urban Grand Challenge
Potential Issues to Be Aware of
• Developer skills
  • Harder than monolithic applications
  • Requires a certain discipline in structuring the project and writing individual components
  • But … for large projects, CBSE is a solution, not a problem
• Testing
  • Harder than monolithic applications
  • But … using a middleware package removes many testing issues
http://orca-robotics.sf.net
Human-Robot Interaction (HRI) is a core technology that enables intelligent service robots to interact naturally with humans through the robot's camera, microphone, and various other sensors.
HRI technology differs from Human-Computer Interaction (HCI) in that robots move autonomously, interaction is bidirectional, and control occurs at diverse levels.
To develop effective HRI, a system with a modular architecture is needed to support convenient, cooperative and friendly interaction between humans and robots.
Vision-based HRI : face recognition, gesture recognition, behavior recognition, facial expression recognition
Audio-based HRI : speech recognition, speaker recognition, sound localization, emotional recognition
Others : PDA-based HRI interface, emotion generation and expression
Face detection/tracking/recognition: face detection/tracking allows a robot to detect a human face in images obtained through its camera and to track the movement of that face during natural conversation with the robot. Face recognition/verification allows the robot to recognize a family member and to verify the identity of the face.
User identification using semi-biometrics: face recognition/verification furthermore allows the robot to know the direction, distance, and identity of the user using semi-biometric information such as the user's clothing color and height.
Gesture recognition: meaningful gestures can be represented by both temporal hand movements and static hand postures. Gesture recognition can be used efficiently in noisy environments and also lets intelligent robots provide active services by recognizing the user's intention.
Facial expression recognition: robotic systems for natural user interaction perform facial expression recognition, since facial expressiveness is regarded as a key component in developing personal attachment. The six basic facial expressions correspond to the six basic human emotions: anger, disgust, fear, happiness, sadness, and surprise.
Caller identification: the robot recognizes the face image of a caller making a hand gesture. The hand gesture is detected from the hand shape around the caller's face image. After recognizing the gesture, the robot moves toward the caller.
Human following: the robot follows a specific person using the color histogram of their clothing and depth-map information obtained from the moving robot's stereo camera. Human following obviously requires the real-time ability to respond to the changing position of the person being followed.
Human detection and tracking, object recognition, human following, behavior recognition, posture recognition
Speaker recognition: text-independent speaker recognition allows the robot to recognize the identity of a speaker during natural conversation. Furthermore, this technique allows a speaker to communicate with the robot through spontaneous speech.
Sound localization: the main goal of this technique is to find the direction of a calling voice, such as the robot's name being uttered or a hand clap. The robot then moves toward the position where the sound was generated.
Speaker and speech recognition: this technique allows a speaker to communicate with the robot through spontaneous speech. With text-independent speaker recognition, the robot can provide useful information such as a daily schedule or TV programs suitable for that speaker.
Emotion recognition, sound source separation, speech understanding, speech synthesis
PDA-based HRI interface: PDAs can be used to interact with robots. Furthermore, they provide a suitable interaction device for teleoperation and a touch-screen interaction capability.
Emotion generation and expression: a technique by which robots generate and express emotion in response to human behavior.
Our proposed standard for HRI consists of one model and two interfaces:
– an information model, and recognition and expression interfaces.
The information model defines the types and structures of information objects as well as their management methods.
The objects are of two types: recognition and expression.
First of all, the information model consists of the types, structures and protocols of information from the robot's sensors and applications.
The sense information model describes When, Who, What and Where information flowing from human to robot; it consists of a 4W model.
Who information: face recognition, speaker recognition, etc.
Where information: vision-based location information, landmark-based location information, etc.
What information: speech recognition, gesture recognition, etc.
When information: scheduler, clock, etc.
Who information model: generally obtained by face recognition, speaker recognition, etc.
– We define whoinfo as path + userId information.
• Ex) "//korea/south/taejon/etri/mrsong": "//korea/south/taejon/etri/" is the path, "mrsong" is the id.
– If the robot does not know the correct user id, it sends a WhoHint (face image or voice sound) to the server.
• Ex) WhoHint = { face image or voice sound }
• If the server does not know the location: WhoHint = { (image or sound), location path }
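The path + userId split described above can be sketched as follows (a minimal illustration of the convention; the function name is hypothetical, not part of the proposed standard):

```python
# Sketch of the whoinfo convention: a slash-delimited location path
# followed by a user id, as in "//korea/south/taejon/etri/mrsong".

def parse_whoinfo(whoinfo):
    """Split whoinfo into (path, user_id); the path keeps its trailing slash."""
    path, _, user_id = whoinfo.rpartition("/")
    return path + "/", user_id

path, user_id = parse_whoinfo("//korea/south/taejon/etri/mrsong")
print(path)     # //korea/south/taejon/etri/
print(user_id)  # mrsong
```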
…
Where information model: generally obtained by a location-finding algorithm.
– We define whereinfo as address (or path) + position information.
• Ex) Can be direct type or near type
• direct type: "//korea/south/taejon/etri/7thbuilding/L864/x=10;y=13"
• near type: has to be translated to a direct position, e.g. "//korea/south/taejon/etri/robotdivion/hriteam/song"
…
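The direct/near distinction above can be sketched by checking whether the final path segment carries explicit coordinates (an illustrative parser under that assumption; the function name and dict layout are not from the proposal):

```python
# Sketch of the whereinfo convention: a 'direct' entry ends in explicit
# coordinates ("x=10;y=13"); a 'near' entry names a person or place and
# must be translated to a direct position in a later step.

def parse_whereinfo(whereinfo):
    path, _, last = whereinfo.rpartition("/")
    if "=" in last:  # direct type: trailing segment is a coordinate list
        coords = dict(pair.split("=") for pair in last.split(";"))
        return {"type": "direct", "path": path,
                "position": {k: int(v) for k, v in coords.items()}}
    return {"type": "near", "path": whereinfo}

direct = parse_whereinfo("//korea/south/taejon/etri/7thbuilding/L864/x=10;y=13")
print(direct["type"], direct["position"])  # direct {'x': 10, 'y': 13}
near = parse_whereinfo("//korea/south/taejon/etri/robotdivion/hriteam/song")
print(near["type"])  # near
```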
• What information model
– Generally obtained by speech recognition, gesture recognition, or any other command recognition method.
– We define whatinfo as a natural string or a path command.
• Ex) Natural string: "change the TV channel", or "show today's schedule"
• Command format with path information: TV("//korea/south/taejon/etri/7thbuilding/L864/38TV").channel.changeUp();
– If the robot does not understand a command correctly, it sends a WhatHint (gesture image, voice or any other command data) to the server.
• Ex) WhatHint = { image, voice or other command data (IR remote controller info, etc.) }
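The two whatinfo forms above, natural string versus path-addressed command, can be distinguished with a simple pattern match (a sketch; the regex and result layout are illustrative assumptions, not part of the proposal):

```python
# Sketch of the whatinfo convention: either a natural-language string,
# or a path command like TV("//.../38TV").channel.changeUp().

import re

def parse_whatinfo(whatinfo):
    m = re.match(r'(\w+)\("([^"]+)"\)\.(.+)\(\)', whatinfo)
    if m:
        device, path, method = m.groups()
        return {"type": "command", "device": device,
                "path": path, "method": method}
    return {"type": "natural", "text": whatinfo}

cmd = parse_whatinfo(
    'TV("//korea/south/taejon/etri/7thbuilding/L864/38TV").channel.changeUp()')
print(cmd["type"], cmd["method"])  # command channel.changeUp
nat = parse_whatinfo("change the TV channel")
print(nat["type"])  # natural
```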
…
• When information model
– Generally obtained from a clock, scheduler, etc.
– WhenInfo shows the current time; we define wheninfo as time information.
• Ex) Date-time from clock: "2006-09-13-14:33:23"
• If a command uses when information (e.g. the user says "respond by tomorrow"), the robot AI can process it together with the whatinfo and wheninfo.
– If the robot does not send wheninfo, servers use their internal clock.
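The timestamp form shown above parses directly with the standard library (a sketch of the stated "YYYY-MM-DD-HH:MM:SS" convention; the function name is illustrative):

```python
# Sketch of the wheninfo convention: a date-time string in the
# "2006-09-13-14:33:23" form shown above.

from datetime import datetime

def parse_wheninfo(wheninfo):
    return datetime.strptime(wheninfo, "%Y-%m-%d-%H:%M:%S")

t = parse_wheninfo("2006-09-13-14:33:23")
print(t.year, t.hour)  # 2006 14
```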
…
• Sense interfaces use the sense information model for their arguments and results.
• They include interfaces for the 4W info model and the hint model.
• They include sensor-type and processor-type interfaces:
– Sensor type: active, event style
– Processor type: passive; it can be used in an application.
• Sensor (event) type interface: On… interfaces
• OnWho(WhoInfo), OnWho(WhoHint)
• OnWhere(WhereInfo), OnWhere(WhereHint)
• OnWhat(WhatInfo), OnWhat(WhatHint)
• OnWhen(WhenInfo)
• OnSee(ViewImage), OnSound(SoundData)
– If the robot cannot sense anything, in some cases it sends all images or sounds to the server for processing.
• Processor (query) type interface: Get… interfaces
• GetWho(), GetWhoHint()
• GetWhere(), GetWhereHint()
• GetWhat(), GetWhatHint()
• GetWhen()
• GetSee(), GetSound()
– Get… interfaces have blocking and non-blocking modes.
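The two interface styles, event-style On… callbacks pushed by the sensor side versus query-style Get… calls pulled by the application (with blocking and non-blocking modes), can be sketched in one small class. The class and method names are illustrative; the real interfaces would be middleware RPCs:

```python
# Sketch contrasting the two sense-interface styles for 'Who' data.

import queue

class WhoSensor:
    def __init__(self):
        self._events = queue.Queue()
        self.on_who = None  # event-style callback, i.e. OnWho(WhoInfo)

    def sense(self, who_info):
        # Sensor side: buffer for later queries and push to the callback
        self._events.put(who_info)
        if self.on_who:
            self.on_who(who_info)

    def get_who(self, block=True, timeout=None):
        # Query style (GetWho) in blocking or non-blocking mode
        try:
            return self._events.get(block=block, timeout=timeout)
        except queue.Empty:
            return None

sensor = WhoSensor()
events = []
sensor.on_who = events.append
sensor.sense("//korea/south/taejon/etri/mrsong")
print(events[0])                    # event style: pushed to the callback
print(sensor.get_who(block=False))  # query style: pulled from the buffer
print(sensor.get_who(block=False))  # None: nothing left buffered
```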
• The expression information model consists of How information:
• First step: just say abstract expression commands.
• Second step: these are translated into concrete methods for each robot.
• Therefore, the expression information model requires a special architecture.
• Expression model
– It can be produced by the robot AI or a service application on the server.
– It can be processed on the robot side or the server side:
• on the robot side, the robot processes the expression words;
• on the server side, the server processes the words by calling robot control interfaces.
– It has to be defined with standard words, so we have to define a robot expression vocabulary.
• Ex) "say(<text>)", "approach(<user>)", etc.
• A word may map to a same-named interface: "say(hello)" can map to robot.say("hello").
– In some cases, an expression word may not map to any expression interface.
• Ex) the expression may be "greeting", but a robot may not perform "greeting" because it does not know the meaning of "greeting".
– In this case, the robot uses an expression hint.
• Expression hint model
– If a robot cannot process an expression, it requests an expression hint.
– An ExpressionHint describes an action flow instead of abstract words.
– It can be implemented differently by each robot maker or robot service application developer.
• ExpressionModel: Greeting
– Maker A: Expression(Greeting) = ExpressionHint( TTS("Hello") )
– Maker B: Expression(Greeting) = ExpressionHint( robot.arm.shakeHands() )
• Expression interfaces use the expression information model or expression hint model as their arguments and results.
– Expression interface examples:
• ExpressionRenderer.Expression("some expression")
• ExpressionRenderer.Expression("expression", ExpressionHint hint)
• The expression renderer processes Expression and ExpressionHint objects.
• If the renderer cannot process an expression, it tries to get an ExpressionHint from the robot maker.
• The renderer is a processing or task engine; it can be placed on the application server or on the robot side.