Development and Implementation of a High-Level Command System and
Compact User Interface for Nonholonomic Robots
Hani M. Sallum
Master's Thesis Defense
May 4, 2005
Outline
• Overview and Goals
• Development
– Control System
– Data Analysis and Mapping
– Graphical User Interface
• Results
Overview
This work details the design and development of a goal-based user interface for unmanned ground vehicles (UGVs) that is simple to operate, yet passes ample information (data and commands) between the operator and the UGV.
Typical UGV Usage
Other UGV Issues
• Multi-person crews
• Proprietary Operator Control Units (OCUs) for each UGV
Is there a way for local users to control UGVs without the operational overhead currently required?
Current State of the Art
Why UA/GVs?
• Over the last two decades there has been a dramatic increase in the complexity and availability of manufactured electronics.
• As a result, the capital cost of robotic systems in general has decreased, making them more feasible to implement.
Example: NASA's Mars Pathfinder/Sojourner system was built largely out of commercially available, off-the-shelf (OTS) parts (sensors, motors, radios, etc.)1.
• Additionally, the capacity and functionality of devices such as PDAs and cellular phones has increased as well.
1. N.A.S.A., Mech. Eng. Magazine, Kodak
Motivation:
Considering the ubiquity of PDAs, smartphones, etc., is it possible to develop a method of using these devices as a form of common OCU?
Q: Do custom OCUs need to be developed when commercial technology is evolving so rapidly?
Goals:
1. Develop a control system for a UGV that automates low-level control tasks
2. Develop a method of rendering sensor data into maps of the UGV's environment
3. Design a GUI which runs on a commercially available PDA
Hardware: Robot
iRobot B21R Mobile Research Robot (nonholonomic)
• Camera
• Sonar
• Laser Rangefinder
Definition of Nonholonomic
Unable to move independently in all possible degrees of freedom.
Example: Cars have 3 degrees of freedom (x, y, θ), but cannot move in y or θ alone.
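The constraint can be made concrete with a minimal kinematic sketch (not from the thesis): a unicycle/car model whose state is (x, y, θ) but whose only inputs are forward speed v and turn rate ω, so sideways motion cannot be commanded directly.

```python
import math

# Minimal sketch of nonholonomic (unicycle/car) kinematics.
# State: (x, y, theta). Inputs: forward speed v, turn rate w.
# Lateral motion is slaved to the heading: the robot cannot
# change y independently of theta.
def step(x, y, theta, v, w, dt):
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta

# Facing along +x (theta = 0), a pure forward command changes only x:
print(step(0.0, 0.0, 0.0, 1.0, 0.0, 1.0))  # (1.0, 0.0, 0.0)
```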
Hardware: PDA
Hewlett-Packard iPAQ
• 802.11/Bluetooth antenna
• 240x320 color touch-sensitive screen
• Windows CE operating system
Navigation Control System
Two aspects of the navigation process:
• Target Approach
• Obstacle Avoidance
Multimodal Controller: separate control laws depending on the desired operation of the robot.
Proven Method
• Schema Architecture [Chang et al.]
• Discrete shifts between control modes
• Straightforward to implement
• "Chattering" between modes
Proposed Method
• Fuzzy Control [Wang, Tanaka, Griffin]
• Gradual shifts between control modes
• More complicated controller
• Smoother trajectory through state space
Fuzzy Controller
[Block diagram] Sensor data feeds two parallel control modes, Target Approach and Obstacle Avoidance, which produce the control signals {K_APP, ω_APP} and {K_AVOID, ω_AVOID}; fuzzy blending combines these into the commanded signals {v_FUZZY, ω_FUZZY}.
Target Approach: Control of Turning Velocity
• Final orientation unconstrained
• Implement a proportional controller driving the robot heading to a setpoint equal to the current bearing of the target (i.e. θ_DEV → 0)
• Produce ω_APP
• Saturate the controller at the maximum allowable turning speed
• Use a high proportional gain to approximate an arc-line path
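A minimal sketch of this turning law, assuming illustrative gain and limit names (k_turn, w_max) that are not from the thesis:

```python
# P-controller on heading deviation, saturated at the max turn speed.
def turn_rate_approach(bearing_dev, k_turn=4.0, w_max=1.0):
    """Drive the heading deviation theta_DEV toward zero."""
    w = k_turn * bearing_dev           # proportional term
    return max(-w_max, min(w_max, w))  # saturate at max turning speed
```

With a high gain, even small deviations saturate the command, so the robot mostly turns in place first and then drives a near-straight path to the target.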
Target Approach: Control of Forward Velocity
• Final position close to target
• Implement a proportional controller to scale the forward velocity based on the robot's distance to the target coordinates (i.e. d_TAR → 0)
• Produce K_APP
• Saturate the controller at K_APP = 1 (scale to the maximum allowable forward speed)
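A minimal sketch of the forward-velocity scaling; the gain name k_fwd and the numeric values are illustrative, not from the thesis:

```python
# P-controller on distance to target, saturated at a scale of K_APP = 1.
def speed_scale_approach(dist_to_target, k_fwd=0.5):
    return min(1.0, k_fwd * dist_to_target)

v_max = 0.9                                # max forward speed (illustrative)
v_cmd = speed_scale_approach(4.0) * v_max  # far from target -> full speed
```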
Obstacle Avoidance: Control of Turning Velocity
• Implement a proportional controller driving the robot heading to a setpoint 90° away from the nearest obstacle (i.e. θ_OBS → ±90°)
• Produce ω_AVOID
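A sketch of this avoidance turning law: a P-controller pushing the nearest obstacle's relative bearing toward whichever ±90° setpoint is closer. The gain name k_avoid is illustrative, not from the thesis.

```python
import math

# Steer so the nearest obstacle sits at +/-90 deg relative bearing.
def turn_rate_avoid(obstacle_bearing, k_avoid=2.0):
    # Pick the nearer of the two setpoints by matching the bearing's sign.
    setpoint = math.copysign(math.pi / 2, obstacle_bearing)
    return k_avoid * (obstacle_bearing - setpoint)
```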
Obstacle Avoidance: Control of Forward Velocity
• Implement a proportional controller to reduce (scale down) the forward velocity when nearing an obstacle
• Produce K_AVOID
Obstacle Avoidance: Forward Control
• The inner threshold is elliptical, to avoid the robot getting stuck against obstacles:
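A sketch of the avoidance speed scaling with an elliptical inner threshold. The axis lengths a, b and the outer distance are illustrative; stretching the ellipse along the direction of travel (a > b) makes the robot slow earlier for obstacles ahead than for ones beside it, so it can slide past walls instead of sticking to them.

```python
import math

def speed_scale_avoid(d_obs, bearing_obs, a=1.0, b=0.5, d_outer=2.0):
    """Scale factor K_AVOID in [0, 1] from obstacle distance and bearing."""
    # Ellipse radius at this bearing: the distance at which K_AVOID hits 0.
    c, s = math.cos(bearing_obs), math.sin(bearing_obs)
    d_inner = (a * b) / math.sqrt((b * c) ** 2 + (a * s) ** 2)
    if d_obs <= d_inner:
        return 0.0          # inside the inner threshold: stop
    if d_obs >= d_outer:
        return 1.0          # far from obstacles: full speed
    return (d_obs - d_inner) / (d_outer - d_inner)  # linear ramp between
```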
Final Control Law: Turning Control
• Blend the target approach and obstacle avoidance control signals using a weighted sum:
W_APP·ω_APP + W_AVOID·ω_AVOID = ω_FUZZY
• Determine the weights using membership functions based on the robot's distance to the nearest obstacle.
Final Control Law: Forward Control
• Blend the target approach and obstacle avoidance control signals by multiplying the maximum forward velocity by the scaling factors produced by each control mode:
K_APP·K_AVOID·v_MAX = v_FUZZY
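The two blending rules can be sketched together; the membership ramp endpoints d_near and d_far are illustrative values, not from the thesis.

```python
# Membership weights from distance to the nearest obstacle.
def memberships(d_obs, d_near=0.5, d_far=2.0):
    """Return (W_APP, W_AVOID), which sum to 1."""
    if d_obs <= d_near:
        w_app = 0.0
    elif d_obs >= d_far:
        w_app = 1.0
    else:
        w_app = (d_obs - d_near) / (d_far - d_near)  # linear ramp
    return w_app, 1.0 - w_app

def blend(omega_app, omega_avoid, k_app, k_avoid, d_obs, v_max):
    """Combine both modes into the commanded (v_FUZZY, omega_FUZZY)."""
    w_app, w_avoid = memberships(d_obs)
    omega_fuzzy = w_app * omega_app + w_avoid * omega_avoid  # weighted sum
    v_fuzzy = k_app * k_avoid * v_max                        # multiplicative
    return v_fuzzy, omega_fuzzy
```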
Outline
• Overview and Goals
• Development
– Control System
– Data Analysis and Mapping
– Graphical User Interface
• Results
Data Analysis and Mapping
• Render data from the laser rangefinder into significant features of the environment: "fiducial points", e.g. corners, ends of walls, etc.
• Use these fiducial points to generate primitive geometries (line segments) which represent the robot's environment.
Distillation Process
RAW DATA → Object Detection → Segment Detection → Line Fitting → Finding Intersections → Categorizing Points → FIDUCIAL POINTS
Why Find Fiducial Points?
Laser Rangefinder Data
Object Detection: Range vs. Bearing (used by Crowley [1985])
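One common way to realize this step (a sketch, not necessarily the thesis's exact rule) is to split the range-vs-bearing scan into objects wherever consecutive returns jump by more than a threshold:

```python
# Split a laser scan into objects at range discontinuities.
def split_objects(ranges, jump=0.5):
    objects, current = [], [0]
    for i in range(1, len(ranges)):
        if abs(ranges[i] - ranges[i - 1]) > jump:
            objects.append(current)  # large jump: new object starts here
            current = []
        current.append(i)
    objects.append(current)
    return objects  # lists of scan indices, one list per detected object

scan = [2.0, 2.1, 2.05, 5.0, 5.1, 5.05]
print(split_objects(scan))  # [[0, 1, 2], [3, 4, 5]]
```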
Segment Detection
• Recursive line splitting method, used by Crowley [1985] and B.K. Ghosh et al. [2000]
• Proposed threshold function: C_REL (relative threshold) and C_ABS (absolute maximum threshold)
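A sketch of recursive line splitting. The threshold here combines a relative term that scales with segment length, capped by an absolute maximum, in the spirit of the C_REL/C_ABS scheme above; the exact functional form is illustrative.

```python
import math

def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    return num / math.hypot(bx - ax, by - ay)

def split(points, c_rel=0.05, c_abs=0.2):
    """Recursively split a point chain into line segments (endpoint pairs)."""
    a, b = points[0], points[-1]
    length = math.hypot(b[0] - a[0], b[1] - a[1])
    thresh = min(c_rel * length, c_abs)      # relative term, absolute cap
    dists = [point_line_dist(p, a, b) for p in points[1:-1]]
    if not dists or max(dists) <= thresh:
        return [(a, b)]                      # all points fit one segment
    k = 1 + dists.index(max(dists))          # split at the worst offender
    return split(points[:k + 1]) + split(points[k:])
```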
Line Fitting
Use perpendicular-offset least-squares line fitting
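A sketch of perpendicular-offset (total) least-squares fitting: minimizing squared perpendicular distances rather than vertical ones handles vertical walls as well as horizontal ones. The line is represented in normal form; θ is defined modulo π.

```python
import math

def fit_line(points):
    """Fit x*cos(theta) + y*sin(theta) = r, minimizing perpendicular offsets."""
    n = len(points)
    mx = sum(p[0] for p in points) / n   # centroid (always on the fit line)
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    syy = sum((p[1] - my) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    # Closed-form minimizer of the perpendicular residual sum.
    theta = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)
    r = mx * math.cos(theta) + my * math.sin(theta)
    return theta, r
```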
Finding Intersections
Categorization
• Each fiducial point is either interior to an object, or at the end of an object.
• Fiducial points at the ends of objects are either occluded or unoccluded.
Distillation: Finding Fiducial Points
Mapping
• Fiducial points provide a clear interpretation of what is currently visible to the robot
• A global map provides a way to add qualitative information about previously observed data to local maps
Creating a Global Map
Global Map: Occupancy evidence grid [Martin, Moravec, 1996] based on laser rangefinder data collection.
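A hedged sketch of an occupancy evidence grid update in log-odds form, in the spirit of Martin and Moravec (the evidence increments and grid representation are illustrative, not the thesis's values): cells a laser beam passes through accumulate "free" evidence, and the cell at the return accumulates "occupied" evidence.

```python
L_FREE, L_OCC = -0.4, 0.85  # illustrative log-odds evidence increments

def update_beam(grid, cells_along_beam, hit_cell):
    """Accumulate evidence for one laser beam into a sparse grid (dict)."""
    for c in cells_along_beam:
        grid[c] = grid.get(c, 0.0) + L_FREE        # beam passed through
    grid[hit_cell] = grid.get(hit_cell, 0.0) + L_OCC  # beam returned here
    return grid

grid = {}
update_beam(grid, [(0, 0), (0, 1)], (0, 2))
update_beam(grid, [(0, 0), (0, 1)], (0, 2))  # repeated hits accumulate
```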
Local Mapping
Map Image:
• Sample a section of the global map for qualitative a priori information about the local area.
• Overlay map primitives.
Local Mapping
Example: Local maps with and without a priori information.
Vision Mapping
Vision Map:
• Transform map primitives to the perspective frame and overlay on the camera image of the local area.
Vision Mapping
Find common geometries for defining vertical and horizontal sight lines.
GUI
• Serve web content from the robot to the iPAQ
• Use image-based linking (HTML standard) to make map images interactive on the iPAQ
• Use web content to call CGI scripts onboard the robot, which run navigation programs on the robot
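A sketch of this plumbing using HTML's standard server-side image maps: the robot serves the map image inside an `ismap` link, so a tap on the iPAQ screen requests the CGI URL with the tapped pixel appended as "?x,y", which the script can turn into a navigation target. The script and parameter names here are illustrative, not from the thesis.

```python
# Robot side: wrap the map image in an ISMAP link. On click/tap, the
# browser appends "?x,y" (pixel coordinates) to the href automatically.
def make_map_page(image_url, cgi_url):
    return f'<a href="{cgi_url}"><img src="{image_url}" ismap></a>'

# CGI side: recover the tapped pixel from the query string.
def parse_click(query_string):
    """'120,200' -> (120, 200)"""
    x, y = query_string.split(",")
    return int(x), int(y)

page = make_map_page("/map.png", "/cgi-bin/goto.cgi")
assert parse_click("120,200") == (120, 200)
```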
GUI
Main Map/Command Screen
GUI
Long-Range Map/Command Screen
Close-Range Map/Command Screen
GUI
Rotation Map/Command Screen
Vision Map/Command Screen
Results
• GUI: Main Map/Command Screen
Results
• GUI: Rotation Map/Command Screen
Results
• GUI: Vision/Command Screen
Results
• Obstacle Avoidance
Conclusions
• The infrastructure exists for implementing an OCU on a PDA
– OTS devices
– Networking/web standards
• Fuzzy logic methods can be applied to mobile robot control
– Obstacle avoidance
– Economic path generation
• Variable thresholds can be used for more robust range data interpretation
– Object detection based on incident angle
– Segment detection based on two parameters
• Fusion of data can impart more information to the operator
– Occupancy information and fiducial points
– Fiducial points and visual data
Future Work
• Use fiducial points to implement Simultaneous Localization and Mapping
• Address control system limitations
• Streamline/upgrade web content and programming for the GUI
Thanks To:
• My Committee:
– Professor Baillieul
– Professor Wang
– Professor Dupont
• ARL/MURI
• Colleagues in IML
• Family and Friends
• AME Staff