Composant logiciel standard pour le calibrage et la fusion de données structurées issues de capteurs de profondeur 3D
(Standard software component for the calibration and fusion of structured data from 3D depth sensors)

Master's Thesis in Robotics

Hamed SAMIE

Institut de Systèmes Intelligents et de Robotique
Campus Jussieu, 75005 Paris

Master's Thesis 2014:2




Abstract

Given the rapid increase in the use of industrial robots, the safety of human-robot shared workspaces is becoming more and more important.

In this project we work on part of a system that improves safety in a human-robot shared workspace, based on the real-time computation of the distance between the human body and the robot's joints.

Our work consists of developing a system that simplifies the robot calibration procedure and that, by increasing the number of cameras installed in the workspace, widens the angle of view of the safety system's detection and motion-tracking part, in order to minimize the cameras' blind spots.


Contents

1 Introduction
  1.1 Employed Platforms
    1.1.1 KUKA LWR
    1.1.2 Microsoft KINECT
    1.1.3 ROS
    1.1.4 OROCOS Project
  1.2 An abstract of previous work

2 Robot Calibration
  2.1 Previous Calibration Procedure
  2.2 Proposed Calibration Procedure
  2.3 Realized Work
  2.4 Remaining Work

Bibliography


1 Introduction

Industrial robots are found throughout industry wherever high productivity demands must be met. The use of robots, however, requires design, application and implementation of the appropriate safety controls in order to avoid creating hazards to production personnel, programmers, maintenance specialists and system engineers.

Figure 1.1: Example of a human-robot workspace

One of the most important risks of a human-robot shared workspace is the possibility of collision between the robot's joints and the human body.

Karama SRITI GUIMBAL and Prof. Vincent PADOIS have worked to propose a system that improves the safety of the human-robot workspace. The proposed solution is based on the computation of the distance between a human operator and a robotic manipulator sharing a workspace.

The work of this internship focuses on a part of the solution proposed by Karama SRITI GUIMBAL, in order to improve the robustness of its results. We therefore first give an abstract of the previous work.

1.1 Employed Platforms

In this section, the software and hardware elements used to implement human detection and motion tracking are presented.

1.1.1 KUKA LWR

The KUKA Light Weight Robot (LWR) is a robot made of aluminium. It contains all the necessary components, such as motors, wires and sensors. The arm has 7 degrees of freedom, which gives the LWR greater flexibility and precision than the more common 6-axis robots. It also has a payload capacity of 7 kg.

Figure 1.2: KUKA LWR

Key Features:

• Kuka highlights that the LWR is very responsive due to its integrated sensors. Since these are situated not in the end effector but in the robot itself, simple tools can be used for any task.


• Its weight (about 16 kg) makes the robot portable. The LWR also has a low power consumption rate, which allows it to be powered by batteries.

• The robot can also be programmed by moving it by hand. This, along with its 7 axes, means that it can easily reach any point within its work zone. It also retains its position during programming, so Kuka affirms that programming the LWR is very intuitive and efficient.

• Its rounded shape has no sharp edges that could harm a human working beside it. Since it is a collaborative robot, its sensors detect external forces exerted by an obstacle or a human, and these sensors are independent from each other. These characteristics make the LWR a safe working partner, whether it is handing parts to human workers or holding parts while they work on them.

1.1.2 Microsoft KINECT

Kinect (codenamed Project Natal during development) is a line of motion-sensing input devices by Microsoft for the Xbox 360 and Xbox One video game consoles and for Windows PCs. Based around a webcam-style add-on peripheral, it enables users to control and interact with their console or computer without the need for a game controller, through a natural user interface using gestures and spoken commands.

Microsoft released the Kinect software development kit for Windows 7 on June 16, 2011. This SDK allows developers to write Kinect applications in C++/CLI, C#, or Visual Basic .NET.

Kinect builds on software technology developed internally by Rare, a subsidiary of Microsoft Game Studios, and on range-camera technology by PrimeSense, which developed a system that can interpret specific gestures, making completely hands-free control of electronic devices possible by using an infrared projector, a camera and a special microchip to track the movement of objects and individuals in three dimensions. This 3D scanner system, called Light Coding, employs a variant of image-based 3D reconstruction.

The Kinect sensor is a horizontal bar connected to a small base with a motorized pivot, designed to be positioned lengthwise above or below the video display. The device features an "RGB camera, depth sensor and multi-array microphone running proprietary software", which provide full-body 3D motion capture, facial recognition and voice recognition capabilities.

The depth sensor consists of an infrared laser projector combined with a monochrome CMOS sensor, which captures video data in 3D under any ambient light conditions. The sensing range of the depth sensor is adjustable, and the Kinect software is capable of automatically calibrating the sensor based on the player's physical environment.


Figure 1.3: Microsoft KINECT

Figure 1.4: Body Calibration example by KINECT

1.1.3 ROS

ROS (Robot Operating System) is an open-source, meta-operating system for robots. It provides the services you would expect from an operating system, including hardware abstraction, low-level device control, implementation of commonly used functionality, message passing between processes, and package management. It also provides tools and libraries for obtaining, building, writing, and running code across multiple computers. ROS is similar in some respects to "robot frameworks" such as Player, YARP, Orocos (1.1.4), CARMEN, Orca, MOOS, and Microsoft Robotics Studio.


Figure 1.5: ROS logo

ROS is not a real-time framework, though it is possible to integrate ROS with real-time code. ROS also has seamless integration with the Orocos (1.1.4) Real-Time Toolkit.
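To make this message-passing model concrete, here is a minimal sketch of a rospy node; it is a generic example, not one of our components, and the topic name '/chatter' and the 10 Hz rate are arbitrary illustrative choices.

```python
#!/usr/bin/env python
# Minimal ROS publisher sketch: one process publishes messages on a
# topic; any other process subscribing to the same topic receives them.
import rospy
from std_msgs.msg import String

def talker():
    pub = rospy.Publisher('/chatter', String, queue_size=10)
    rospy.init_node('talker')
    rate = rospy.Rate(10)  # loop at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello'))  # delivered to every subscriber of /chatter
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```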

1.1.4 OROCOS Project

"Orocos" is the acronym of the Open Robot Control Software project. The project's aim is to develop a general-purpose, free-software, modular framework for robot and machine control. The Orocos project supports four C++ libraries: the Real-Time Toolkit, the Kinematics and Dynamics Library, the Bayesian Filtering Library and the Orocos Component Library.

Figure 1.6: OROCOS Project

• The Orocos Real-Time Toolkit (RTT) is not an application in itself, but it provides the infrastructure and the functionalities to build robotics applications in C++. The emphasis is on real-time, on-line, interactive and component-based applications.

• The Orocos Components Library (OCL) provides some ready-to-use control components. Both component management and components for control and hardware access are available.


• The Orocos Kinematics and Dynamics Library (KDL) is a C++ library for computing kinematic chains in real time.

• The Orocos Bayesian Filtering Library (BFL) provides an application-independent framework for inference in Dynamic Bayesian Networks, i.e., recursive information processing and estimation algorithms based on Bayes' rule, such as (Extended) Kalman Filters, Particle Filters (Sequential Monte Carlo methods), etc.

Orocos is a free-software project; hence its code and documentation are released under free-software licenses [1].

1.2 An abstract of previous work

In this section, the proposed approach and the strategies used to implement human detection and motion tracking are presented. For more information, the reader should refer to Karama SRITI GUIMBAL's master's thesis [2].

A. SEELEUTHNER [3] has proposed a new safety criterion which tends to limit the energy dissipated at the zone of a collision, and which depends, among other things, on the distances between all robot segments and the considered obstacles.

Figure 1.7: Implemented Controller by A. SEELEUTHNER [3]

To reach the desired position X*, the proposed controller (Figure 1.7) calculates the torque τ from the robot state (computed from the joint positions q and velocities q̇) and from the safety criterion, which is evaluated from the distances d_x between the robot and static obstacles and d_human between the robot and the human limbs.

Based on A. SEELEUTHNER's criterion [3] and on the controller implemented in the previous project, a solution has been proposed. The proposed solution for intuitive and safe human-robot interaction is composed of several modules. Such a modular architecture offers many advantages: applications are independent and interchangeable, and their implementation is faster and easier. Furthermore, a modular architecture allows components to be built on different APIs (Application Programming Interfaces). For example, the optical system, the robot and the controller are implemented differently, and are then linked together like pieces of a puzzle (Figure 1.8).


Figure 1.8: Architecture of the system

Based on this architecture we can present the proposed system:

Figure 1.9: All the different parts of the system


You can see all three parts of the system:

• FRI: all communication with the robot goes through this part.

• Optic part: this part is designed in ROS, and contains different components whose duty is, once the Kuka is calibrated, to send TH_Human/Robot (the transform matrix between the human body and the robot's joints) to the Controller part (OROCOS).

• Controller part: this part is designed in OROCOS; it computes in real time the minimum distance between the human body and the robot's joints, and sends this distance to the controller (a minimal sketch of this distance computation is given below).
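As an illustration only (the real Controller part is an OROCOS component; the function names and sample coordinates below are invented), one standard way to compute such a minimum distance is to treat the tracked human keypoints as points and the robot's links as line segments:

```python
# Sketch of the real-time minimum human-robot distance: human keypoints
# are points, consecutive robot joints define the link segments. All
# positions are assumed to be expressed in the same (robot base) frame.
import numpy as np

def point_segment_distance(p, a, b):
    """Distance from point p to the segment [a, b]."""
    ab = b - a
    # Projection parameter of p onto the line (a, b), clamped to [0, 1]
    # so that the closest point stays on the segment.
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def min_human_robot_distance(human_points, robot_joints):
    """robot_joints: (N, 3) array; consecutive joints define N-1 links."""
    return min(point_segment_distance(p, robot_joints[i], robot_joints[i + 1])
               for p in human_points
               for i in range(len(robot_joints) - 1))

# Invented example data: two human keypoints, a 3-joint robot chain.
human = np.array([[0.6, 0.2, 1.0], [0.7, 0.1, 0.8]])
robot = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.5], [0.3, 0.0, 0.9]])
print(min_human_robot_distance(human, robot))  # the d_human fed to the controller
```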

For more information, refer to [2].

In this internship we focus on the Calibration package. We propose a new calibration procedure and then improve the robustness of the Optic part by installing more than one camera, in order to widen the Optic part's angle of view.


2 Robot Calibration

In this chapter, the idea of robot calibration and the proposed system to improve the robustness of the result will be introduced.

First we should answer two important questions about Robot Calibration:

• What is robot calibration?
Robot calibration is the process of determining the actual values of the kinematic and dynamic parameters of an industrial robot (IR). Kinematic parameters describe the relative position and orientation of the links and joints of the robot, while the dynamic parameters describe arm and joint masses and internal friction.

Or, in other words:

Robot calibration is a process by which robot positioning accuracy can be improved by modifying the robot positioning software instead of changing or altering the design of the robot or its control system. To do this, the procedure involves developing a model whose parameters accurately represent the real robot. Next, specifically chosen features of the robot are accurately measured. This step is followed by an identification procedure that computes the parameter values which, when fed into the robot model, accurately reflect the measurements made (a minimal sketch of this loop is given after this list).

• Why calibration?
A calibrated robot has a higher absolute positioning accuracy than an uncalibrated one, i.e., the real position of the robot's end effector (Tool Center Point) corresponds better to the position calculated from the mathematical model of the robot. Absolute positioning accuracy is particularly relevant in connection with robot exchangeability and off-line programming of precision applications. Robot calibration is typically used to reduce manual programming time, to improve process quality, or to enable standard robots to achieve the accuracy of much more expensive equipment (e.g. CNC machines, CMMs, etc.).
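To make the model/measure/identify loop concrete, here is a minimal sketch, not taken from any real calibration package, that identifies a single kinematic parameter (one link length of a planar 2-link arm) by least squares from synthetic measurements; all values are invented.

```python
# Model / measure / identify in miniature: the nominal model has an
# unknown second link length l2; "measurements" of the end effector are
# synthesized with the true value, then l2 is recovered by least squares.
import numpy as np
from scipy.optimize import least_squares

def forward(l2, q):
    """End-effector position of a planar 2-link arm, l1 = 0.4 m fixed."""
    l1 = 0.4
    x = l1 * np.cos(q[:, 0]) + l2 * np.cos(q[:, 0] + q[:, 1])
    y = l1 * np.sin(q[:, 0]) + l2 * np.sin(q[:, 0] + q[:, 1])
    return np.column_stack([x, y])

# Synthetic "measured" data, generated with the true value l2 = 0.35 m.
rng = np.random.default_rng(0)
q = rng.uniform(-1.5, 1.5, size=(20, 2))                     # visited configurations
measured = forward(0.35, q) + rng.normal(0, 1e-4, (20, 2))   # plus sensor noise

# Identification: find the l2 that best explains the measurements.
res = least_squares(lambda p: (forward(p[0], q) - measured).ravel(), x0=[0.30])
print(res.x)  # ~0.35: the identified value is fed back into the robot model
```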

2.1 Previous Calibration Procedure

Let us look at the proposed solution and focus on the Calibration component (Figure 2.1).

Figure 2.1: Calibration component in previous work

Calibration is a ROS (1.1.3) component which has two inputs:

• Image: the Kinect's image information, from ROS's OpenNi component.

• XPosition: the Kuka's joint positions, from OROCOS's DirectKinematics component (Figure 2.2). OROCOS data is not compatible with ROS, so a Typekit has been used to adapt ROS messages to the OROCOS data types, and vice versa.

Figure 2.2: Calibration component’s inputs and outputs


The output of this component is the transform matrix TH_Kinect/Robot:

P_R = TH_Kinect/Robot · P_K

where P_R is the vector of the robot's joint positions with respect to the Kuka's base, and P_K is the vector of the robot's joint positions with respect to the Kinect's base.
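As a quick numerical illustration of this relation, the snippet below applies an invented 4×4 homogeneous transform (a 90° rotation about z plus a translation, standing in for a real calibrated TH_Kinect/Robot) to a point expressed in the Kinect frame.

```python
# Applying a homogeneous transform: P_R = TH · P_K. The matrix values
# here are invented for illustration, not a real calibration result.
import numpy as np

TH = np.array([[0.0, -1.0, 0.0, 0.5],   # 3x3 rotation part and
               [1.0,  0.0, 0.0, 0.2],   # 3x1 translation part
               [0.0,  0.0, 1.0, 0.0],
               [0.0,  0.0, 0.0, 1.0]])

P_K = np.array([0.3, 0.1, 1.2, 1.0])    # a point in the Kinect frame,
                                        # in homogeneous coordinates
P_R = TH @ P_K                          # the same point in the robot base frame
print(P_R[:3])
```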

Working with the Calibration component presents some difficulties:

• Each time the robot is calibrated, a checkerboard must be used in the calibration procedure. The choice of the pattern is also important for a successful calibration: if the checkerboard is too small, the extrinsic parameters of the camera are not well estimated.

Figure 2.3: Calibration checkerboard

• Each time the robot is calibrated, a specific tool has to be added to the robot's end effector. Indeed, the procedure requires placing the robot's gripper on certain specific points of the checkerboard, and if this is not done precisely the calibration will not succeed. Adding the tool also changes the kinematic chain defined in DirectKinematics [4].


Figure 2.4: The added end effector for the calibration

Because of these difficulties, we decided to propose another calibration procedure, in order to get rid of the need for the checkerboard and the added end-effector tool.

2.2 Proposed Calibration Procedure

In this section, we introduce the proposed calibration method and the components that were used and designed.

The basic idea of the new calibration method is to calibrate the robot without using the checkerboard, while keeping the rest of the hardware.

We proposed that, instead of the checkerboard, a coloured object could be placed on the robot's body; dedicated ROS components then detect its position with respect to the Kinect camera and to the robot's base. The least-squares method is then used to compute TH_Kinect/Robot, which is the main task of the Calibration component in the previous safe human-robot shared-workspace system (1.2).

The proposed procedure has a modular architecture, with one part in ROS and one part in OROCOS:


Figure 2.5: Proposed Calibration Procedure

It contains several components:

• OpenNi: an open-source framework, and a ROS package, which enables us to get the image information from the Kinect.

• CMVISION: a ROS package that listens to the image from OpenNi, detects coloured object(s) and outputs the 2D image position (X_pixel, Y_pixel).


Figure 2.6: CMVISION output 2D image position

• Detector1: a ROS package which, using the Point Cloud Library (PCL), converts the 2D image position into a 3D position (with respect to the Kinect's base), and outputs the x, y and z coordinates of the coloured object with respect to the Kinect.

Figure 2.7: Depth detecting by Point Cloud Library

• lwr-fri: an OROCOS package that contains the OROCOS drivers for the Kuka LWR. This package is used for all communication with the robot (the Kuka). Among its outputs, it gives us the position of the Kuka's TCP (end effector).

• fri-example: an OROCOS package designed for manipulating the Kuka (for example, point-to-point motion), which contains an interpolator (designed by G. HAMON). This package always communicates with lwr-fri, because lwr-fri is the gateway for communication with the robot.

• Detector2: an OROCOS package that listens to lwr-fri and outputs the 3D position of the coloured object with respect to the robot's base.


• Typekit: since OROCOS data types are not compatible with ROS, a Typekit is used to integrate ROS and OROCOS.

• TH-Matrix Finder: a ROS package that listens to Detector1 and Detector2 (through the Typekit) and, using the least-squares method, finds the transform matrix between the Kinect and the robot [6] (see the sketch after this list).
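The least-squares step cited above is the SVD-based method of ARUN, HUANG and BLOSTEIN [6]. A minimal sketch of that method, with invented test data rather than real Detector1/Detector2 outputs, could look as follows.

```python
# Least-squares fitting of two 3D point sets (Arun et al. [6]): given
# matched positions of the coloured object seen from the Kinect and
# expressed in the robot base frame, recover the rigid transform TH.
import numpy as np

def fit_rigid_transform(P_kinect, P_robot):
    """Return the 4x4 TH such that P_robot ~ TH @ P_kinect (SVD method)."""
    c_k, c_r = P_kinect.mean(axis=0), P_robot.mean(axis=0)   # centroids
    H = (P_kinect - c_k).T @ (P_robot - c_r)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_r - R @ c_k
    TH = np.eye(4)
    TH[:3, :3], TH[:3, 3] = R, t
    return TH

# Invented test data: random points under a known ground-truth transform.
rng = np.random.default_rng(1)
P_k = rng.uniform(-1, 1, size=(10, 3))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
P_r = P_k @ R_true.T + np.array([0.5, 0.2, 0.0])
print(fit_rigid_transform(P_k, P_r))  # recovers R_true and the translation
```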

Each component repeatedly performs its duty at its own frequency, so to prevent the different tasks from interfering with each other we applied two signals. The first goes from "fri-example" to "CMVISION": when the robot has reached its new position, "fri-example" sends this signal to "CMVISION" to let it start detecting the coloured object. The second signal goes from "Detector1" (it could also come from "CMVISION") back to "fri-example", indicating that the detection duty is finished and the robot can move to its next position.
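The handshake logic can be sketched as follows on the ROS side; the topic names and message type are invented for illustration, and the real fri-example side lives in OROCOS rather than ROS, so this only mirrors the logic, not the actual wiring.

```python
#!/usr/bin/env python
# Sketch of the two-signal handshake: detect only after the robot has
# stopped, then signal back that it may move to the next pose.
import rospy
from std_msgs.msg import Empty

class DetectorSync(object):
    """CMVISION/Detector1 side of the handshake."""
    def __init__(self):
        self.done_pub = rospy.Publisher('/detection_done', Empty, queue_size=1)
        rospy.Subscriber('/motion_done', Empty, self.on_motion_done)

    def on_motion_done(self, _msg):
        self.detect_coloured_object()     # placeholder for the real detection
        self.done_pub.publish(Empty())    # tell fri-example it may move again

    def detect_coloured_object(self):
        rospy.loginfo('detecting the coloured object at the new robot pose')

if __name__ == '__main__':
    rospy.init_node('detector_sync')
    DetectorSync()
    rospy.spin()
```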

With this procedure we can calibrate our robot without any part added to the end effector and without a checkerboard.

One of the advantages of this system is that the whole calibration can be performed with a single click, giving us an automatic calibration.

2.3 Realized Work

At this point of the internship, I have designed this solution, implemented all the components independently, and achieved the integration between ROS and OROCOS. For the calibration, it only remains to make all the components work together.

2.4 Remaining Work

After the automatic calibration, we will focus on improving the robustness of the calibration and of the human-body detection.

At this point the system works well, but since we are working with only one camera, there are some blind spots in the workspace for our camera: for example behind the robot, when the robot hides the human body. In that case the Kinect cannot detect the human body and the system no longer works correctly.


Figure 2.8: Example of a work space with Blind Spots

The idea for improving the results is to extend the angle of view of the Optic part of the system (1.2). This is only possible by increasing the number of cameras and installing them at different angles.

The challenge of this step is dealing with the different pieces of information received from the different cameras. For example, the output of the CMVISION package for three cameras will be three different (X_pixel, Y_pixel) pairs, so before a decision can be made a data fusion step is needed. But with which method, and how? These questions will have to wait until the end of the internship and the final report.
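Purely to make the problem concrete, and without prejudging the method the final report will settle on, the sketch below shows the most naive fusion imaginable: map each camera's 3D estimate of the same point into the robot base frame with that camera's own calibration matrix, and average the results. Everything in it is invented.

```python
# Naive multi-camera fusion baseline (illustrative only): average the
# per-camera 3D estimates after mapping them into a common frame.
import numpy as np

def naive_fusion(estimates_and_transforms):
    """estimates_and_transforms: list of (P_cam (3,), TH_cam (4,4)) pairs."""
    points = []
    for P_cam, TH in estimates_and_transforms:
        P_h = np.append(P_cam, 1.0)      # homogeneous coordinates
        points.append((TH @ P_h)[:3])    # into the common robot base frame
    return np.mean(points, axis=0)       # plain average over the cameras

# Three cameras, here all sharing an identity calibration for brevity.
obs = [(np.array([0.30, 0.10, 1.20]), np.eye(4)),
       (np.array([0.31, 0.09, 1.19]), np.eye(4)),
       (np.array([0.29, 0.11, 1.21]), np.eye(4))]
print(naive_fusion(obs))
```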


Bibliography

[1] Orocos project official web page, http://www.orocos.org/.

[2] K. S. GUIMBAL, Real-time computation of the distance between a human operator and a robotic manipulator for shared workspace applications, Master's thesis, ISIR (2012-2013).

[3] A. SEELEUTHNER, Commande sûre de robots manipulateurs collaboratifs, Ph.D. thesis, Université Pierre et Marie Curie (2013).

[4] Sovannara HAK's GitHub page, DirectKinematics OROCOS component, https://github.com/sovannarahak/directKinematics/blob/master/README.md.

[5] Calibrating a Kinect to the TurtleBot arm for manipulation, http://wiki.ros.org/turtlebot_kinect_arm_calibration/Tutorials/CalibratingKinectToTurtleBotArm.

[6] K. S. ARUN, T. S. HUANG, S. D. BLOSTEIN, Least-squares fitting of two 3-D point sets, IEEE Transactions on Pattern Analysis and Machine Intelligence (September 1987).