Overview of Our Sensors for Robotics



TRANSCRIPT

Page 1: Overview of Our Sensors  For Robotics

Overview of Our Sensors for Robotics

Page 2: Overview of Our Sensors  For Robotics

Machine Vision

• Computer vision
  – To recover useful information about a scene from its 2-D projections.
  – To take images as inputs and produce other types of outputs (object shape, object contour, etc.)
  – Geometry + Measurement + Interpretation
  – To create a model of the real world from images.

Page 3: Overview of Our Sensors  For Robotics

Topics

• Computer vision system

• Image enhancement

• Image analysis

• Pattern Classification

Page 4: Overview of Our Sensors  For Robotics

Related fields
• Image processing
  – Transformation of images into other images
  – Image compression, image enhancement
  – Useful in early stages of a machine vision system
• Computer graphics
• Pattern recognition
• Artificial intelligence
• Psychophysics

Page 5: Overview of Our Sensors  For Robotics

Vision system hardware

Page 6: Overview of Our Sensors  For Robotics

Image Processing System

Page 7: Overview of Our Sensors  For Robotics
Page 8: Overview of Our Sensors  For Robotics

Image Representation

Page 9: Overview of Our Sensors  For Robotics

Image
• Image: a two-dimensional array of pixels
• The indices [i, j] of a pixel are integer values that specify its row and column in the array of pixel values
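As a minimal sketch of this representation (using NumPy; the 4 x 6 size and the pixel values below are made-up, not from the slides), a grayscale image is just a 2-D array indexed by [i, j]:

```python
import numpy as np

# A small 8-bit grayscale image: rows x columns of intensity values (0..255).
# The 4 x 6 size and the values are arbitrary, for illustration only.
image = np.zeros((4, 6), dtype=np.uint8)

image[2, 3] = 200          # pixel at row i = 2, column j = 3
print(image.shape)         # (rows, columns) -> (4, 6)
print(image[2, 3])         # 200
```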

Page 10: Overview of Our Sensors  For Robotics
Page 11: Overview of Our Sensors  For Robotics

Sampling, pixeling and quantization

• Sampling
  – The real image is sampled at a finite number of points.
  – Sampling rate: image resolution
    • how many pixels the digital image will have
    • e.g. 640 x 480, 320 x 240, etc.
• Pixel
  – Each image sample
  – At the sample point, an integer value of the image intensity

Page 12: Overview of Our Sensors  For Robotics

• Quantization
  – Each sample is represented with the finite word size of the computer.
  – How many intensity levels can be used to represent the intensity value at each sample point.
  – e.g. 2^8 = 256, 2^5 = 32, etc. (see the sketch below)
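The sketch below illustrates quantization with NumPy; the 320 x 240 resolution, the random "analog" intensities, and the 8-bit / 5-bit word sizes are example values, not a prescribed implementation.

```python
import numpy as np

def quantize(intensity, bits):
    """Map intensities in [0.0, 1.0] to 2**bits discrete levels."""
    levels = 2 ** bits
    q = np.floor(intensity * levels).astype(int)
    return np.clip(q, 0, levels - 1)

# A synthetic "analog" image sampled at 320 x 240 points (example resolution).
analog = np.random.rand(240, 320)

img_8bit = quantize(analog, 8)   # 256 intensity levels
img_5bit = quantize(analog, 5)   # 32 intensity levels
print(img_8bit.max(), img_5bit.max())   # at most 255 and 31
```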

Page 13: Overview of Our Sensors  For Robotics

Color models
• Color models for images:
  – RGB, CMY
• Color models for video:
  – YIQ, YUV (YCbCr)
• Relationship between color models (an example conversion is sketched below)
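As one concrete example of such a relationship, the sketch below converts RGB to YCbCr using the ITU-R BT.601 weights (full-range, JPEG-style offsets); the helper name and the test pixel are assumptions for illustration.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an 8-bit RGB image (H x W x 3) to YCbCr using BT.601 weights."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

pixel = np.array([[[255, 0, 0]]], dtype=np.uint8)   # pure red
print(rgb_to_ycbcr(pixel))                          # ~[76.2, 85.0, 255.5]; clip to 0..255 in practice
```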

Page 14: Overview of Our Sensors  For Robotics
Page 15: Overview of Our Sensors  For Robotics

6.7 Digital Cameras

Page 16: Overview of Our Sensors  For Robotics

Digital Cameras

• Technology
  – CCD (charge coupled device)
  – CMOS (complementary metal oxide semiconductor)
• Resolution
  – from 60 x 80 black/white up to several megapixels in 32-bit color

However: the embedded system has to have the computing power to deal with this large amount of data! (A rough estimate follows below.)
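The back-of-the-envelope calculation below shows why; the resolution, frame rate, and 3 bytes per pixel are assumed example values, but even modest settings already produce tens of megabytes of raw data per second.

```python
# Rough data-rate estimate for uncompressed video.
# 640 x 480 pixels, 3 bytes per pixel (24-bit RGB), 30 frames/s are example values.
width, height = 640, 480
bytes_per_pixel = 3
fps = 30

bytes_per_frame = width * height * bytes_per_pixel       # 921,600 bytes
bytes_per_second = bytes_per_frame * fps                  # ~27.6 MB/s
print(f"{bytes_per_frame} bytes/frame, {bytes_per_second / 1e6:.1f} MB/s")
```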

Page 17: Overview of Our Sensors  For Robotics

Vision (camera + framegrabber)

Page 18: Overview of Our Sensors  For Robotics

Digital Cameras

• Performance of embedded system: 10% - 50% of standard PC

Page 19: Overview of Our Sensors  For Robotics

Interfacing Digital Cameras to CPU

• Interfacing to CPU:
  – Completely depends on sensor chip specs
  – Many sensors provide several different interfacing protocols
    • versatile in hardware design
    • software gets very complicated
  – Typically: 8-bit parallel (or 4-bit, 16-bit, or serial)
  – Numerous control signals required

Page 20: Overview of Our Sensors  For Robotics

Interfacing Digital Cameras to CPU
• Digital camera sensors are very complex units.
  – In many respects they are themselves similar to an embedded controller chip.
• Some sensors buffer camera data and allow slow reading via handshake (ideal for slow microprocessors)
• Most sensors send the full image as a stream after a start signal (the CPU must be fast enough to read it, or use a hardware buffer or DMA)
• We will not go into further details in this course; however, we will consider camera access routines.

Page 21: Overview of Our Sensors  For Robotics

Simplified diagram of camera to CPU interface

Page 22: Overview of Our Sensors  For Robotics

Problem with Digital Cameras
• Problem
  – Every pixel from the camera causes an interrupt
  – Interrupt service routines take a long time, since they need to store register contents on the stack
  – Everything is slowed down
• Solution
  – Use a RAM buffer for the image and read the full image with a single interrupt

Page 23: Overview of Our Sensors  For Robotics

• Idea
  – Use a FIFO as the image data buffer
  – A FIFO is similar to dual-ported RAM; it is required since there is no synchronization between the camera and the CPU
  – When the FIFO is half full, an interrupt is generated
  – The interrupt service routine then reads the FIFO until it is empty (assume the delay is small enough to avoid FIFO overrun); a sketch of the idea follows below
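The following is a small host-side simulation of that idea (not firmware): the FIFO depth, image size, and half-full threshold are arbitrary assumptions, and a Python function stands in for the interrupt service routine.

```python
from collections import deque

FIFO_SIZE = 512                      # assumed hardware FIFO depth
HALF_FULL = FIFO_SIZE // 2
IMAGE_PIXELS = 160 * 120             # assumed image size

fifo = deque()
image = []

def isr_drain_fifo():
    """Stand-in for the interrupt service routine: read the FIFO until empty."""
    while fifo:
        image.append(fifo.popleft())

# The "camera" streams pixels; the half-full condition triggers the "interrupt".
for pixel in range(IMAGE_PIXELS):
    fifo.append(pixel)               # camera writes one pixel
    if len(fifo) >= HALF_FULL:       # half-full flag -> interrupt
        isr_drain_fifo()

isr_drain_fifo()                     # pick up any remaining pixels
print(len(image))                    # 19200 pixels read with far fewer "interrupts" than pixels
```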

Page 24: Overview of Our Sensors  For Robotics
Page 25: Overview of Our Sensors  For Robotics
Page 26: Overview of Our Sensors  For Robotics

Bayer Pattern

Page 27: Overview of Our Sensors  For Robotics

De-Mosaic

Page 28: Overview of Our Sensors  For Robotics
Page 29: Overview of Our Sensors  For Robotics

Conversion in Digital Cameras

• Bayer Pattern
  – Output format of most digital cameras
  – Note: the 2x2 pattern is not spatially located in a single point!
  – Can be simply converted to RGB (drop one green byte): 160x120 Bayer → 80x60 RGB
  – Can be better converted using a "demosaicing" technique: 160x120 Bayer → 160x120 RGB (both conversions are sketched below)
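A minimal sketch of both conversions, assuming an RGGB cell layout (one common ordering) and only the crudest pixel-replication "demosaic" rather than any particular camera's algorithm:

```python
import numpy as np

def bayer_to_rgb_simple(bayer):
    """Collapse each 2x2 RGGB cell into one RGB pixel (drop one green): H x W -> H/2 x W/2 x 3."""
    r = bayer[0::2, 0::2]
    g = bayer[0::2, 1::2]           # keep one of the two greens
    b = bayer[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

def bayer_to_rgb_demosaic(bayer):
    """Very crude full-resolution demosaic: fill each channel by repeating the 2x2 cell values."""
    h, w = bayer.shape
    rgb = np.zeros((h, w, 3), dtype=bayer.dtype)
    rgb[..., 0] = np.repeat(np.repeat(bayer[0::2, 0::2], 2, axis=0), 2, axis=1)
    rgb[..., 1] = np.repeat(np.repeat(bayer[0::2, 1::2], 2, axis=0), 2, axis=1)
    rgb[..., 2] = np.repeat(np.repeat(bayer[1::2, 1::2], 2, axis=0), 2, axis=1)
    return rgb

bayer = np.random.randint(0, 256, (120, 160), dtype=np.uint8)   # 160x120 Bayer image
print(bayer_to_rgb_simple(bayer).shape)     # (60, 80, 3)  -> 80x60 RGB
print(bayer_to_rgb_demosaic(bayer).shape)   # (120, 160, 3) -> 160x120 RGB
```

Real demosaicing interpolates neighbouring samples (e.g. bilinearly) instead of simply repeating them.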

Page 30: Overview of Our Sensors  For Robotics
Page 31: Overview of Our Sensors  For Robotics

CMUCAM2+ CAMERA www.seattlerobotics.com

• The camera can track user defined color blobs at up to 50 fps (frames per second)

• Track motion using frame differencing at 26 fps

• Find the centroid of any tracking data

• Gather mean color and variance data

• Gather a 28 bin histogram of each color channel

• Manipulate horizontal pixel differenced images

• Arbitrary image windowing

• Adjust the camera’s image properties

This camera can do a lot of processing.

Page 32: Overview of Our Sensors  For Robotics

• Dump a raw image

• Up to 160 x 255 resolution

• Support multiple baud rates

• Control 5 servos outputs

• Slave parallel image processing mode off of single camera bus

• Automatically use servos to do two axis color tracking

• B/W analog video output (PAL or NTSC)

• Flexible output packet customization

• Multiple pass image processing on a buffered image

This camera can do a lot of processing.

Page 33: Overview of Our Sensors  For Robotics
Page 34: Overview of Our Sensors  For Robotics

Vision Guided Robotics

and Applications in Industry and Medicine

Page 35: Overview of Our Sensors  For Robotics

Contents
• Robotics in General
• Industrial Robotics
• Medical Robotics
• What can Computer Vision do for Robotics?
• Vision Sensors
• Issues / Problems
• Visual Servoing
• Application Examples
• Summary

Page 36: Overview of Our Sensors  For Robotics

Industrial Robot vs Human

• Robot advantages:
  – Strength
  – Accuracy
  – Speed
  – Does not tire
  – Does repetitive tasks
  – Can measure

• Human advantages:
  – Intelligence
  – Flexibility
  – Adaptability
  – Skill
  – Can learn
  – Can estimate

Robot needs vision

Page 37: Overview of Our Sensors  For Robotics

Industrial Robot

• Requirements:
  – Accuracy
  – Tool Quality
  – Robustness
  – Strength
  – Speed
  – Price / Production Cost
  – Maintenance
  – Production Quality

Page 38: Overview of Our Sensors  For Robotics

Medical (Surgical) Robot

• Requirements

– Safety

– Accuracy

– Reliability

– Tool Quality

– Price

– Maintenance

– Man-Machine Interface

Page 39: Overview of Our Sensors  For Robotics

What can Computer Vision do for (industrial and medical) Robotics?

• Accurate Robot-Object Positioning

• Keeping Relative Position under Movement

• Visualization / Teaching / Telerobotics

• Performing measurements
• Object Recognition
• Registration

Visual Servoing

Page 40: Overview of Our Sensors  For Robotics

Vision Sensors

• Single Perspective Camera

• Multiple Perspective Cameras (e.g. Stereo Camera Pair)

• Laser Scanner

• Omnidirectional Camera

• Structured Light Sensor

Page 41: Overview of Our Sensors  For Robotics

Vision Sensors
• Single Perspective Camera

Single projection: x = P X, where x is the 2-D image point, X the 3-D scene point, and P the 3 × 4 projection matrix.
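A small numeric sketch of this projection; the intrinsic parameters, the identity pose, and the world point are made-up values:

```python
import numpy as np

# Pinhole camera: 3x4 projection matrix P = K [R | t].
# Focal length, principal point, and pose are arbitrary example values.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
Rt = np.hstack([np.eye(3), np.zeros((3, 1))])   # camera at the world origin
P = K @ Rt                                      # 3 x 4 projection matrix

X = np.array([0.2, -0.1, 2.0, 1.0])             # homogeneous 3-D world point
x = P @ X                                       # homogeneous 2-D image point
u, v = x[0] / x[2], x[1] / x[2]
print(u, v)                                     # pixel coordinates, here (370.0, 215.0)
```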

Page 42: Overview of Our Sensors  For Robotics

Vision Sensors

• Multiple Perspective Cameras (e.g. Stereo Camera Pair)

Page 43: Overview of Our Sensors  For Robotics

Vision Sensors
• Multiple Perspective Cameras (e.g. Stereo Camera Pair)

Page 44: Overview of Our Sensors  For Robotics

Vision Sensors

• Laser Scanner

Page 45: Overview of Our Sensors  For Robotics

Vision Sensors• Laser Scanner

Page 46: Overview of Our Sensors  For Robotics

Vision Sensors• Omnidirectional Camera

Page 47: Overview of Our Sensors  For Robotics

Vision Sensors

• Omnidirectional Camera

Page 48: Overview of Our Sensors  For Robotics

Vision Sensors

• Structured Light Sensor

Figures from PRIP, TU Vienna

Page 49: Overview of Our Sensors  For Robotics

Issues/Problems of Vision Guided Robotics

• Measurement Frequency

• Measurement Uncertainty

• Occlusion, Camera Positioning

• Sensor dimensions

Page 50: Overview of Our Sensors  For Robotics

Visual Servoing
• The vision system operates in a closed control loop.
• Better accuracy than "look and move" systems.

Figures from S.Hutchinson: A Tutorial on Visual Servo Control

Page 51: Overview of Our Sensors  For Robotics

Visual Servoing

• Example: Maintaining relative Object Position

Figures from P. Wunsch and G. Hirzinger. Real-Time Visual Tracking of 3-D Objects with Dynamic Handling of Occlusion

Page 52: Overview of Our Sensors  For Robotics

Camera Configurations for Visual Servoing

• End-Effector Mounted
• Fixed

Figures from S.Hutchinson: A Tutorial on Visual Servo Control

Page 53: Overview of Our Sensors  For Robotics

Visual Servoing Architectures

Figures from S.Hutchinson: A Tutorial on Visual Servo Control

Page 54: Overview of Our Sensors  For Robotics

Position-based vs. Image-based Control in Visual Servoing

– Position based:
  • Alignment in target coordinate system
  • The 3-D structure of the target is reconstructed

• The end-effector is tracked

• Sensitive to calibration errors

• Sensitive to reconstruction errors

– Image based:
  • Alignment in image coordinates

• No explicit reconstruction necessary

• Insensitive to calibration errors

• Only special problems solvable

• Depends on initial pose

• Depends on selected features

(Figure labels: target, end-effector, image of target, image of end-effector.)

Page 55: Overview of Our Sensors  For Robotics

EOL and ECL control in Visual Servoing
– EOL: endpoint open-loop; only the target is observed by the camera
– ECL: endpoint closed-loop; the target as well as the end-effector are observed by the camera

(Figures: EOL and ECL configurations.)

Page 56: Overview of Our Sensors  For Robotics

Visual Servoing
• Position Based Algorithm:

1. Estimation of relative pose

2. Computation of error between current pose and target pose

3. Movement of robot

• Example: point alignment

(Figure: points p1 and p2.)

Page 57: Overview of Our Sensors  For Robotics

Visual Servoing
• Position based point alignment

• Goal: bring e to 0 by moving p1

e = |p2m – p1m|

u = k*(p2m – p1m)

• pxm is subject to the following measurement errors: sensor position, sensor calibration, sensor measurement error

• pxm is independent of the following errors: end effector position, target position

(Figure labels: p1m, p2m, d.)
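A toy simulation of this control law (the gain k, the measured points, and the convergence threshold are assumptions; in a real system p1m and p2m would be re-measured by the vision sensor on every cycle):

```python
import numpy as np

k = 0.5                                   # assumed proportional gain
p1m = np.array([0.10, 0.40, 0.05])        # measured end-effector point (example values)
p2m = np.array([0.30, 0.25, 0.20])        # measured target point (example values)

for step in range(20):
    e = np.linalg.norm(p2m - p1m)         # e = |p2m - p1m|
    if e < 1e-3:                          # assumed convergence threshold
        break
    u = k * (p2m - p1m)                   # u = k * (p2m - p1m)
    p1m = p1m + u                         # the commanded motion moves p1 toward p2
print(step, e)                            # converges in a few iterations
```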

Page 58: Overview of Our Sensors  For Robotics

Visual Servoing
• Image based point alignment

• Goal: bring e to 0 by moving p1

e = |u1m – v1m| + |u2m – v2m|

• uxm, vxm is subject only to sensor measurement error

• uxm, vxm is independent of the following measurement errors: sensor position, end effector position, sensor calibration, target position

(Figure labels: p1, p2, c1, c2, u1, u2, v1, v2, d1, d2.)
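And a corresponding toy sketch for the image-based error; the pixel coordinates and the gain are made-up, and the correction is applied directly in image space for simplicity, whereas a real image-based servo would map it through an image Jacobian to robot motion:

```python
import numpy as np

k = 0.4                                     # assumed gain
u1 = np.array([120.0,  80.0])               # end-effector feature 1 (pixels, example)
u2 = np.array([200.0,  90.0])               # end-effector feature 2 (pixels, example)
v1 = np.array([150.0, 100.0])               # target feature 1 (pixels, example)
v2 = np.array([230.0, 110.0])               # target feature 2 (pixels, example)

for step in range(50):
    e = np.linalg.norm(u1 - v1) + np.linalg.norm(u2 - v2)   # e = |u1 - v1| + |u2 - v2|
    if e < 0.5:                                             # assumed threshold (pixels)
        break
    # Correction applied directly in image space for this sketch;
    # a real image-based servo maps it through the image Jacobian to robot motion.
    u1 = u1 + k * (v1 - u1)
    u2 = u2 + k * (v2 - u2)
print(step, e)
```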

Page 59: Overview of Our Sensors  For Robotics

Visual Servoing• Example Laparoscopy

Figures from A.Krupa: Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing

Page 60: Overview of Our Sensors  For Robotics

Visual Servoing• Example Laparoscopy

Figures from A.Krupa: Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing

Page 61: Overview of Our Sensors  For Robotics

Registration• Registration of CAD models to scene features:

Figures from P.Wunsch: Registration of CAD-Models to Images by Iterative Inverse Perspective Matching

Page 62: Overview of Our Sensors  For Robotics

Registration• Registration of CAD models to scene features:

Figures from P.Wunsch: Registration of CAD-Models to Images by Iterative Inverse Perspective Matching

Page 63: Overview of Our Sensors  For Robotics

Summary on tracking and servoing

• Computer Vision provides accurate and versatile measurements for robotic manipulators

• With current general purpose hardware, depth and pose measurements can be performed in real time

• In industrial robotics, vision systems are deployed in a fully automated way.

• In medicine, computer vision can make more intelligent „surgical assistants“ possible.

Page 64: Overview of Our Sensors  For Robotics

Omnidirectional Vision Systems

CABOTO
• Robot's task: building a topological map of an unknown environment
• Sensor: omnidirectional vision system
• Work's aim: prove the effectiveness of omnidirectional sensors for the Spatial Semantic Hierarchy (SSH)

Page 65: Overview of Our Sensors  For Robotics

Spatial Semantic Hierarchy...

... a model of the human knowledge of large spaces

Layers:
– Sensory Level: interface with the robot's sensory system
– Control Level: control laws, transitions of state, distinctiveness measure
– Causal Level: view, action, distinct place; abstracts discrete from continuous
– Topological Level: minimal set of places, paths and regions
– Metrical Level: distance, direction, shape; useful, but seldom essential

Page 66: Overview of Our Sensors  For Robotics

Tracking• Instrument tracking in laparoscopy

Figures from Wei: A Real-time Visual Servoing System for Laparoscopic Surgery

Page 67: Overview of Our Sensors  For Robotics

Omnidirectional Camera

Composed of:
• Standard color camera
• Convex mirror
• Perspex cylinder

Page 68: Overview of Our Sensors  For Robotics

Pros and Cons

Advantages

• Wide vision field

• High speed

• Vertical Lines

• Rotational Invariance

Disadvantages

• Low Resolution

• Distortions

• Low readability

Page 69: Overview of Our Sensors  For Robotics

Omnidirectional Vision and SSH

• View: the omnidirectional image
• Exploring around the block (figure: places P1 to P5)

The robot should discriminate between "turns" and "travels".
We need an effective distinctiveness measure.

Page 70: Overview of Our Sensors  For Robotics

Assumptions for vision system

• Man-made environment

• Floor flat and horizontal

• Wall and objects surfaces are vertical

• Static objects

• Constant Lighting

• Robot translates or rotates

• No encoders

Page 71: Overview of Our Sensors  For Robotics

Features and Events

Feature:
– Vertical edges

Events:
– A new edge
– An edge disappears
– Two edges 180° apart
– Two pairs of edges 180° apart

Page 72: Overview of Our Sensors  For Robotics

Experiments

Tasks of the Caboto robot:
• Navigation
• Map building

Techniques:
• Edge detection (see the sketch below)
• Colour marking
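A minimal vertical-edge detector in plain NumPy (the threshold and the synthetic test strip are assumptions; this only illustrates the technique and is not Caboto's actual implementation):

```python
import numpy as np

def vertical_edges(gray, threshold=40):
    """Mark pixels where the horizontal intensity gradient is strong (vertical edges)."""
    g = gray.astype(float)
    dx = np.abs(g[:, 2:] - g[:, :-2]) / 2.0      # central difference along each row
    edges = np.zeros_like(g, dtype=bool)
    edges[:, 1:-1] = dx > threshold
    return edges

# A synthetic panoramic strip with one bright vertical stripe.
strip = np.zeros((60, 360), dtype=np.uint8)
strip[:, 180:200] = 255
print(np.where(vertical_edges(strip).any(axis=0))[0])   # columns near 180 and 200
```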

Page 73: Overview of Our Sensors  For Robotics

Caboto’s Images

Page 74: Overview of Our Sensors  For Robotics
Page 75: Overview of Our Sensors  For Robotics

Results
• Correct tracking of edges

• Recognition of actions

• Calculation of the turn angle

The path segmentation

Page 76: Overview of Our Sensors  For Robotics

Mirror Design

• Design custom mirror profile

• Maximise resolution in ROIs

Mirror Profile

Mirror shape should depend on the robot's task!

Page 77: Overview of Our Sensors  For Robotics

The new mirror

Page 78: Overview of Our Sensors  For Robotics

Conclusion on Omnivision Camera

• Omnidirectional vision sensor is a good sensor for map building with SSH

• Motion of the robot was estimated without active vision

• The use of a mirror designed for this application will improve the system

Page 79: Overview of Our Sensors  For Robotics

Omnidirectional Cameras
• Compound-eye camera (from Univ. of Maryland, College Park)
• Panoramic cameras (from Apple)
• Omnidirectional cameras (from University of Picardie, France)

Page 80: Overview of Our Sensors  For Robotics

Student info.

• A percentage of lab marks can be deducted if rules and regulations are not followed, e.g. by not cleaning up your bench or not sliding your chair back underneath the bench top.
• For more technical information on boards, devices and sensors, check out my web page at www.site.uottawa.ca/~alan
• Students are responsible for their own extra parts, e.g. if you want to add a sensor or device that the dept. doesn't have, you are responsible for the purchase and delivery of that part; only on rare occasions did the school purchase those parts.
• Backpacks off bench tops.
• TAs will have student numbers based on station numbers.
• An important issue regarding the design of a new project is to do a current analysis before the start of your design.
• Set up a leader among your team so that you are better organized.
• Do not wait; start your project now!
• Prepare yourself before coming to the lab.
• It doesn't work? Ask yourself whether it is software or hardware; use the scope to troubleshoot.
• Fuses keep blowing? Stop and do some investigation.
• Do not cut any servo, battery or other device wire connectors. If you must, please come and see me.
• No design may exceed 50 volts, e.g. do not work with 120 volts AC.
• I can give you what I have regarding metal, wood and plastic recycled pieces and do some cuts or holes with my band saw and drill press for you.
• PLEASE DO NOT ask me to borrow my tools. If you need to do a task with a special tool that I have, then I shall do it for you.

Page 81: Overview of Our Sensors  For Robotics

Problems for students
1. Hardware and software components of a vision system for a mobile robot
2. Image representation for intelligent processing
3. Sampling, pixeling and quantization
4. Color models
5. Types of digital cameras
6. Interfacing digital cameras to CPU
7. Problems with cameras
8. Bayer patterns and conversion
9. What is good about the CMUcam?
10. Use of vision in industrial robots
11. Use of multiple-perspective cameras
12. Use of omnivision cameras
13. Types of visual servoing
14. Applications of visual servoing
15. Visual servoing in surgery
16. Explain tracking applications of vision

Page 82: Overview of Our Sensors  For Robotics

References

• Photos, text and schematics information:

• www.acroname.com

• www.lynxmotion.com

• www.drrobot.com

• Alan Stewart

• Dr. Gaurav Sukhatme

• Thomas Braunl

• Students 2002, class 479
• E. Menegatti, M. Wright, E. Pagello