
Hardware and Software Development of a

Wireless Imaging System for Under Vehicle

Inspection Robot

PILOT

(Project In Lieu Of Thesis)

Presented for the

Master of Science

Degree

The University of Tennessee, Knoxville

Balaji Ramadoss

December 2003


Acknowledgements

Many people shape our lives and mold our paths at various stages of our careers. I am deeply indebted to my family, especially my mother Kalavathy, my father Ramadoss, my uncle Kalyaperumal, and my sister Vidubala. There is one great person in my life so far, whom I respect as much as my parents, for his support throughout my Masters, without which I could not have imagined my Masters career: my professor, Dr. Mongi A. Abidi. Thank you, Sir. I admire his personality and his smiling way of encouraging students.

Secondly, I would like to thank Dr. David Page for his sincere support of my project in lieu of thesis and my research work. He is the one who shaped my research work and presentation skills, and he spent his valuable time on the making of this report. I would like to thank Dr. Besma Abidi for her guidance and help in developing the localized enhancement algorithm for X-ray images. I would like to acknowledge Dr. Andreas Koschan for his moral support in my research work.

I would like to thank Tak Motoyama for his sincere help with the hardware part of my Masters work; I admire his hard-working nature. My sincere thanks to Mark Mitches for helping me with the drafting work. I would also like to acknowledge Vicky Courtney Smith and Justin Acuff for their moral support.


Abstract

Under vehicle inspection for security threats poses serious challenges due to low illumination and difficult-to-access areas under the vehicle. This project develops the hardware and software for an integrated robotic imaging system for real-time under vehicle inspection. The integrated hardware design consists of infrared, near infrared, and high resolution video sensors, which are used to visualize real-time data from under the vehicle on a handheld computer such as a TabletPC. The robot captures the real-time data under the vehicle and transmits it to a remote computer using TCP/IP protocols. The system has networked controls with electronic control devices that can manipulate the illumination, power controls, and channel selections. The graphical user interface (GUI) software has been developed using Visual Studio.NET and is capable of running on a TabletPC. The main application of this software is data acquisition and network control of video imaging data. Real-time data transmitted from the IRIS under vehicle robot is uploaded to a network using TCP/IP protocols. The software captures the real-time data using ActiveX controls and has features to view and store a video sequence with a time and date stamp. Additionally, the software has been integrated and tested with the U.S. Army's Omni-Directional Inspection System (ODIS). This report presents results from a series of experimental tests that demonstrate the functionality of the robot and concludes with a discussion of accomplishments and future directions for this project.


Contents

Abstract
1 Introduction
   1.1 Background
   1.2 Proposed Approach in Developing the Hardware
   1.3 Proposed Approach in Developing the Software
   1.4 Document Organization
2 Survey on Existing Systems
   2.1 Hardware Systems
      2.1.1 Search Systems Incorporated
      2.1.2 Wintron Technologies
      2.1.3 Militech International
      2.1.4 Law Enforcement Associates, Inc.
      2.1.5 Lumenyte International Corporation
      2.1.6 Perceptics
      2.1.7 Professional Search System
      2.1.8 Vehicle Inspection Technologies
      2.1.9 Prolite Armor Systems
   2.2 Software Interfaces
      2.2.1 Multimodal User Interfaces for Mobile Robots
      2.2.2 Internet Robots and User Interface
      2.2.3 Control Strategies for Teleoperated Internet Assembly
      2.2.4 Real-Time Haptic Feedback in Internet-Based Telerobotic Operation
      2.2.5 An Advanced Telereflexive Tactical Response Robot
      2.2.6 An Advanced Supervisory Control of Mobile Robots
      2.2.7 Mobile Robot Control Interfaces
      2.2.8 Sharing Control Presenting a Framework
      2.2.9 Web Top Robotics
3 Hardware Development
   3.1 Phase1-IRIS1 Design
   3.2 Phase2-IRIS2 Design
      3.2.1 Initial Design
      3.2.2 Design Improvements
      3.2.3 Components of IRIS2
      3.2.4 ODIS Robot
      3.2.5 Raytheon Thermal Camera
      3.2.6 Polaris WP-300C Lipstick Camera
      3.2.7 Ever Focus Quad Splitter
      3.2.8 Axis Video Server
      3.2.9 Axis Device Point
   3.3 Network Controls
      3.3.1 National Output Controller
      3.3.2 Input Controls
      3.3.3 Battery Supply Monitor
      3.3.4 Overload Current Monitor
      3.3.5 Power Status Monitor
      3.3.6 Output Controls Monitor
      3.3.7 Illumination Controls Monitor
      3.3.8 Quad Splitter Controls
4 Software Development
   4.1 Software Overview
   4.2 GUI Design
      4.2.1 Section 1-Hardware Selection
      4.2.2 Section 2-Time Stamp and Robot Status
      4.2.3 Section 3-Robot Control and Image Capture
      4.2.4 Viewer1-Real Time Video Display
      4.2.5 Viewer2-Video Snap Shot Display
      4.2.6 Viewer3-Future Display (Reserved)
   4.3 Input/Output Control Software
      4.3.1 Input Port
      4.3.2 Output Port
      4.3.3 Serial Port
      4.3.4 User Parameters Recording
   4.4 Data Processing Software
      4.4.1 Segmentation
      4.4.2 Segmentation Algorithm-Method 2
      4.4.3 Segmentation Using Edge Detection and the Radon Transform
   4.5 Image Enhancement Algorithm
5 Experimental Results
   5.1 Data Acquisition
   5.2 Robot Controls
6 Conclusion
Bibliography
Vita


List of Tables

Table 2.1: Important design parameters of the robot
Table 3.1: Hardware settings to program the National controller
Table 3.2: Hardware settings to program baud rates
Table 3.3: Hardware settings to program the National controller for relay selection
Table 3.4: Hardware settings to program the National controller for control operation
Table 3.5: Hardware settings to set the device number for the controller
Table 4.1: Resolution, file size, and frame transfer rate of the Axis video server
Table 4.2: Hexadecimal values and their corresponding input operations
Table 5.1: IRIS2 robot progress comparison


List of Figures

Figure 1.1: Robot that captures real time data with a TabletPC
Figure 1.2: Data acquisition software that captures real time data with a TabletPC
Figure 2.1: Search Systems under vehicle inspection system
Figure 2.2: Wintron Technologies under vehicle inspection system
Figure 2.3: Militech International inspection system inspecting a truck
Figure 2.4: Law Enforcement Associates under vehicle inspection system
Figure 2.5: Lumenyte International Corporation system
Figure 2.6: Perceptics Under Vehicle Surveillance System (a) & (b)
Figure 2.7: Professional Search System (a) & (b)
Figure 2.8: Vehicle Inspection Technologies Und-aware system
Figure 2.9: Prolite Armor Systems under vehicle inspection system
Figure 2.10: GUI for multimodal robots - robot navigation control panel
Figure 2.11: GUI for video acquisition and control
Figure 2.12: Block diagram of the internet based feedback control system
Figure 2.13: Reflexive robots (a) & (b)
Figure 2.14: Reflexive robots (a) & (b)
Figure 2.15: Mobile Robot Control Interface displaying a path map
Figure 3.1: IRIS1 Imaging System ball transfer wheels
Figure 3.2: IRIS1 Imaging System
Figure 3.3: IRIS1 Imaging System - inside view
Figure 3.4: IRIS2 Imaging System
Figure 3.5: IRIS2 robot layout showing the position of cameras and sensors
Figure 3.6: Picture of IRIS2 robot - front view
Figure 3.7: Picture of IRIS2 robot with two compartments
Figure 3.8: Inside components of IRIS2 robot (a) & (b)
Figure 3.9: ODIS robot before modification
Figure 3.10: Raytheon PalmIR PRO thermal camera
Figure 3.11: Polaris WP-300C lipstick camera
Figure 3.12: Quad splitter
Figure 3.13: Axis video server
Figure 3.14: Axis Device Point
Figure 3.15: National Output controller with connection details
Figure 3.16: National Output controller connection with the Axis server
Figure 3.17: Hardware system design for input control operations
Figure 3.18: Hardware system design to monitor the battery voltage
Figure 3.19: Hardware system design to monitor overload current
Figure 3.20: Hardware system design to monitor system status
Figure 3.21: Hardware system design for networked output controls selection
Figure 3.22: Block diagram of channel selection in the splitter
Figure 3.23: Hardware system design for networked channel selection
Figure 4.1: Architecture of the GUI for data control operations
Figure 4.2: GUI - four sections
Figure 4.3: Data capture with excess reflection
Figure 4.4: Data capture with illumination control from the GUI
Figure 4.5: Time, control OFF, and alarm status of the robot
Figure 4.6: GUI Section 3 with start capture, stop capture, and options dialog box
Figure 4.7: GUI Section 3 with button controls
Figure 4.8: Under vehicle data captured with the GUI
Figure 4.9: Under vehicle data cropped by the GUI using software channel selection
Figure 4.10: Original resolution of the data cropped by the GUI using software channel selection
Figure 4.11: Under vehicle data with time and date stamp
Figure 4.12: Future systems (a) & (b)
Figure 4.13: Under vehicle image captured using the GUI (a) & (b)
Figure 4.14: Algorithm for short circuit protection
Figure 4.15: Algorithm to monitor the data at the input port
Figure 4.16: Algorithm for output operations in the robot
Figure 4.17: Algorithm for output operations through the serial port
Figure 4.18: The .doc file created by the GUI for system status monitoring
Figure 4.19: Block diagram of the segmentation algorithm
Figure 4.20: Segmented object obtained with the software for remote diagnostics
Figure 4.21: Segmented object using segmentation algorithm method 2
Figure 4.22: Segmented object with red edges
Figure 4.23: GUI for the remote diagnostics software
Figure 4.24: Block diagram of the edge detection system applying the radon transform
Figure 4.25: Block diagram of edge detection (a) & (b)
Figure 4.26: Edge detection image produced at a discontinuity
Figure 4.27: Edge detection image input to the algorithm - level 2
Figure 4.28: Resultant image of algorithm level 2 - first set of 20 pixels read by level 2
Figure 4.29: Line detected by the radon transform with 20 pixels as input
Figure 4.30: Line detected by the radon transform, localized
Figure 4.31: Localized image dilated to create a marker (a) & (b)
Figure 4.32: Localized pixels - some results (a) & (b)
Figure 4.33: Resultant automatic edge linking implemented in the algorithm
Figure 4.34: Block diagram for local enhancement
Figure 4.35: Local enhancement results (a) & (b)
Figure 4.36: GUI using VC++ with local enhancement results
Figure 5.1: Under vehicle image acquired using the TabletPC (a) & (b)
Figure 5.2: IRIS2 robot in a parking garage before inspection
Figure 5.3: IRIS2 robot in a parking garage before inspecting a van
Figure 5.4: IRIS2 robot inspecting a van in the parking garage
Figure 6.1: Future robot design controlled from a remote PC
Figure 6.2: Block diagram to navigate the robot from a computer


1 Introduction

1.1 Background

Recent acts of terrorism have created many concerns regarding the safety of people and property that could be the targets of terrorist attacks. In particular, the inspection of the underside of vehicles for explosives and other dangerous elements presents a serious challenge. The narrow, hard-to-reach areas and low illumination make the undersides of vehicles ideal areas for the concealment of these elements. Until now, under vehicle inspection of cars entering public and federal buildings has been achieved with rudimentary tools, the "mirror-on-a-stick" method being the most commonly known example. The inspection results from such a system are unreliable, and the method itself is unsafe for the person, or persons, performing the inspection. Poor lighting conditions make it difficult to see the various components of the scene, and it is difficult to reach and manually inspect the entire space under the vehicle. In addition, there are safety risks associated with the inspection personnel's direct handling of the threat, i.e., an accidental detonation.

In view of the above challenges, this report documents the development of a wireless video imaging system for an under vehicle inspection robot. Two versions of the robot were designed by the IRIS lab for under vehicle inspection. Various issues such as mobility, lighting conditions, and sensor requirements were analyzed in detail during the development of the under vehicle imaging system in spring 2002. This development is divided into two design areas: hardware and software.

1.2 Proposed Approach in Developing the Hardware

The hardware development is split into two phases.

Phase 1:

An initial prototype robot is designed with thermal, near infrared and color sensors. The system is used to acquire high resolution under vehicle images.

Phase 2:

In the second phase, a new robot was designed, equipped with sensors that can directly load video images across a wireless network connection. This robot uses TCP/IP protocols to view the real time video and execute input/output operations on the robot. Figure 1.1 shows the robot that captures real time data with a TabletPC.


Figure 1.1: Robot that captures real time data with a TabletPC.

1.3 Proposed Approach in Developing the Software

The software development involves a graphical user interface (GUI) that processes video from the hardware and presents the results to a user. The design of this data acquisition software includes the basic operations required to capture data in real time and save the data in the required format. The architecture of the software includes networked control commands so that control operations and monitoring do not affect the data acquisition routines. Two separate network sessions are used in the GUI: one session exclusively captures video from the robot, and the other executes the control operations for the robot.

The first network session uses ActiveX controls to read and display images in the GUI, and the second network session is built using the MFC library in Visual Studio.NET. Figure 1.2 shows the GUI and its features.
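A minimal sketch of this two-session split is given below, with a control command sent over its own TCP connection while a separate thread stands in for the video viewer. The host address, port, and command string are placeholder assumptions, and plain POSIX sockets are used here instead of the ActiveX and MFC classes of the actual GUI.

// Sketch of the two-session architecture: one thread stands in for the video
// viewer, while control commands travel over their own TCP connection.
// The address, port, and command text are illustrative assumptions.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <atomic>
#include <chrono>
#include <iostream>
#include <string>
#include <thread>

// Open a TCP connection, send one command string to the robot, and close.
static bool sendCommand(const std::string& host, int port, const std::string& cmd) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) return false;
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    inet_pton(AF_INET, host.c_str(), &addr.sin_addr);
    bool ok = connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) == 0 &&
              send(fd, cmd.c_str(), cmd.size(), 0) == static_cast<ssize_t>(cmd.size());
    close(fd);
    return ok;
}

int main() {
    std::atomic<bool> running{true};

    // Session 1: video acquisition loop (stands in for the ActiveX viewer).
    std::thread video([&] {
        while (running) {
            // ...fetch and display the next frame here...
            std::this_thread::sleep_for(std::chrono::milliseconds(33));
        }
    });

    // Session 2: control channel, independent of the video loop.
    if (!sendCommand("192.168.0.90", 80, "ILLUMINATION ON\r\n"))
        std::cerr << "control command failed\n";

    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    video.join();
    return 0;
}

Keeping the two sessions on separate sockets and threads means a slow or stalled control exchange never blocks the image stream.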



Figure 1.2: Data acquisition software that captures real time data with a TabletPC.

1.4 Document Organization

This first chapter gives an overview of the software and hardware requirements of the robot for under vehicle inspection. Chapter 2 surveys the existing hardware and software systems available for under vehicle inspection. Chapters 3 and 4 present the development of the hardware and software designs, respectively. Chapter 5 provides experimental results from testing the robot design. Finally, Chapter 6 concludes with closing remarks.


2. Survey on Existing Under Vehicle Inspection Systems

This chapter gives a survey of the existing systems available for under vehicle inspection. The following sections divide the current research into hardware and software systems.

2.1 Hardware Systems

2.1.1 Search Systems Incorporated

The Search Systems Incorporated camera [Search] is mounted on a probe with a 15 degree sweep angle. The probe can be extended to allow placement under the vehicle, and the camera is remotely controlled by a push button and can be articulated to 120 degrees. The system has a video camera and a detachable color LCD display. Figure 2.1 shows the Search Cam system in operation.

Figure 2.1: Search systems under vehicle inspection systems.

2.1.2 Wintron Technologies

Wintron Technologies [Wintron] uses a pole with a pan/tilt camera head at the end, as shown in Figure 2.2. The system also has zoom capability. The pole can be articulated to a maximum of 160 degrees.

Figure 2.2: Wintron Technologies-Under Vehicle Inspection System.


2.1.3 Militech International

Militech International [Militech] uses a static, fixed camera that is placed in the path of the vehicle, as shown in Figure 2.3. The camera is oriented horizontally and views the vehicle directly as it crosses the system.

Figure 2.3: Militech International inspection system inspecting the truck.

2.1.4 Law Enforcement Associates, Inc.

Law Enforcement Associates [Law] offers a portable system with a waterproof camera and lighting modules. The system can be carried in a small suitcase and assembled quickly; it can be placed anywhere in the vehicle path and is able to capture high resolution images. A picture of this system is shown in Figure 2.4.

Figure 2.4: Law Enforcement Associates Inc., under vehicle inspection system.

2.1.5 Lumenyte International Corporation

The Lumenyte International Corporation system [Lumenyte] is not a video based system but rather a direct visual system with a mirror at one end. A bulb at the mirror end glows continuously for 4000 hours to illuminate the vehicle undercarriage. This system is called the Security Illumination Mat System (SIMS™). A picture of this system is shown in Figure 2.5.


Figure 2.5: Lumenyte International Corporation system used for under vehicle inspection of a truck.

2.1.6 Perceptics

The Perceptics under vehicle surveillance system [Perceptics] uses digital line scan technology for under vehicle inspection. Two versions of this system are available, mobile and static. The system scans under the vehicle in a few seconds and provides a touch screen for visualization and zooming. A picture of this system is shown in Figure 2.6.

(a) (b)

Figure 2.6: Perceptics under Vehicle Surveillance System (a) Under vehicle system inspecting a car (b) Graphical user interface to process the system.

2.1.7 Professional Search System

The Professional Search System [Professional] has a mirror on a probe and weighs five pounds. The system has a battery-powered LED for illumination. The picture of this system is shown in Figure 2.7.


(a) (b)

Figure 2.7: Professional Search System (a) Model EFIS3 (b) Model EFIS2.

2.1.8 Vehicle Inspection Technologies

The Vehicle Inspection Technologies system [Vehicle] is a static inspection system with recording capability. The system consists of cameras, lights, and a multiplexer, and it provides nine high resolution color images that give a full view of an entire vehicle. A picture of this system is shown in Figure 2.8.

Figure 2.8: Vehicle Inspection Technologies-Und-aware systems are shown inspecting a car.

2.1.9 Prolite Armor Systems

The Prolite Armor Systems unit [Prolite] is a portable under vehicle video inspection system for static or mobile applications. The system is packed in two compartments and weighs approximately 65 pounds. The software for this system has features such as object recognition and alarm generation via a local area network. A picture of this system is shown in Figure 2.9.


Figure 2.9: Prolite Armor Systems under vehicle inspection system inspecting a vehicle.

2.2 Software Interfaces

This section reviews both research and commercial software interfaces suitable for an under vehicle inspection robot.

2.2.1 Multimodal User Interfaces for Mobile Robots

This paper [Myers] discusses a surveillance robot with a camera and sensors that the user controls from a GUI. The author describes a multimodal interface in which the user navigates the robot and issues commands simultaneously. Figure 2.10 shows the robot navigation control panel and Figure 2.11 shows the GUI for video acquisition and control. The user places the mouse on the center of the red mesh shown in the control panel to navigate the robot.

Figure 2.10: GUI for multimodal robots -Robot navigation control panel.


Figure 2.11: GUI for video acquisition and control.

2.2.2 Internet Robots and User Interface

This paper [Taylor, 2000] discusses the strategies used to control a robot through a network and display the real time video. Two separate servers are used in the robot: a video server and a robot server. The video server captures real time video, and the robot server controls the robot.

2.2.3 Control Strategies for Teleoperated Internet Assembly

This paper [Kress, 2001] discusses internet based teleoperations. The following factors are considered in the design of the teleoperated robotic system:

• Data transmission time delay,
• Noise,
• Signal dropout likelihood, and
• Signal dropout recovery.

Delay and packet losses are essential factors in the speed and reliability of such a system [Kress, 2001]. The minimum delay values represent the delay of noncongested paths, giving an indication of the baseline propagation and transmission delay; higher values reflect congestion. The round-trip delay, called the response, can also be calculated from this data. The system controls are executed by first sending and receiving a packet of information over the network to measure this delay and then executing the control operations accordingly.
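As a rough illustration of measuring that response, the sketch below times a single send/receive exchange with a peer; the address, the use of the classic echo port, and the probe payload are assumptions for the example.

// Round-trip delay probe: send a small packet, wait for the reply, time it.
// Assumes the peer echoes the payload back; address and port are placeholders.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <chrono>
#include <iostream>

int main() {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(7);                       // echo service (assumed available)
    inet_pton(AF_INET, "192.168.0.90", &addr.sin_addr);
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) != 0) {
        std::cerr << "connect failed\n";
        return 1;
    }

    const char probe[] = "ping";
    char reply[sizeof(probe)] = {};

    auto t0 = std::chrono::steady_clock::now();
    send(fd, probe, sizeof(probe), 0);
    recv(fd, reply, sizeof(reply), 0);              // blocks until the echo returns
    auto t1 = std::chrono::steady_clock::now();

    auto rtt = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0);
    std::cout << "round-trip delay: " << rtt.count() << " ms\n";
    close(fd);
    return 0;
}

Repeating the probe and keeping the minimum gives the noncongested baseline described above, while larger samples indicate congestion.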

2.2.4 Real-Time Haptic Feedback in Internet-Based Telerobotic Operation

This paper [Elhajj] discusses the time delay incurred in sending a command to the robot through the internet. The author proposes to overcome this problem through feedback control. The block diagram of the system is shown in Figure 2.12.


Figure 2.12: Block diagram of the internet based feedback control system.

2.2.5 An Advanced Telereflexive Tactical Response Robot

This paper [Gilbreath, 2001] discusses reflexive robots and autonomous decision making through training. The robot and the GUI used to control it are shown in Figure 2.13. The robot has collision avoidance sensors to guide it in autonomous mode.

(a) (b) Figure 2.13: Reflexive robots (a) Robot with camera and sensors (b) GUI to control the robot



The man-machine interface is programmed using Microsoft Visual Basic, and the GUI is divided into the following three categories:

• Mobility control,
• Camera control, and
• Non-lethal weapon control.

Each time the operator clicks on the navigation button, the robot velocity is increased by one increment. The camera control provides pan and tilt.

2.2.6 An Advanced Supervisory Control of Mobile Robots

This paper [Kawamura, 2001] discusses the Intelligent Machine Architecture (IMA), which executes concurrent commands on a separate machine while establishing inter-machine communication with another machine. This software architecture has a provision to load different architectures into robots according to the sensor feedback from the robot. To facilitate remote control of a robot, a supervisory control system enables the user to view the current sensory information.

(a) (b) Figure 2.14: Reflexive robots (a) GUI to control the robot (b) Robot with camera and sensors

The robot is equipped with visual imagery, sonar and laser signals, gyroscopic vestibular data, the speed of each motor, compass heading, GPS position, camera pan and tilt angles, and odometry. Figures 2.14(a) and (b) show the GUI and the robot.

2.2.7 Mobile Robot Control Interfaces

This paper [Gold] discusses the architecture of a GUI developed to control a robot. The robot is equipped with sonars, CCD cameras, GPS, odometry sensors, and a pan-tilt camera mount. The GUI has a control interface and a video display window; the GUI used to control the robot is shown in Figure 2.15. The GUI also has provision for a path planning map.


Figure 2.15: Mobile Robot Control Interface displaying path map.

2.2.8 Sharing Control Presenting a Framework

This paper [Rybski, 2002] discusses direct human control of the resources in a robotic imaging system as provided through the UI subsystem. The system allows a human operator to connect to a robot imaging system directly and command it from a graphical console. A UI console can be started on any machine on the local network, allowing multiple users to request control of the resources simultaneously. Like a behavior, a UI component gets access to the robot by sending a scheduling request to the GUI manager. User interface commands are given priority when they are executed; this priority makes the system execute a command and then wait for the response of the system.

2.2.9 Web Top Robotics

This paper [Hirukawa, 2000] discusses teleoperation systems and groups them into two types: the direct type, in which an operator at a remote site directly controls the robot using its real time data, and the indirect type, in which the operator programs the computer to issue teletype commands that automatically operate the robot based on signals from the robot. Different strategies are considered in developing robots of each type.


3. Hardware Development

3.1 Phase1-IRIS1 Design

Surface hardness directly affects the traction of a mobile robot traversing challenging terrain. The ground material, whether sand, gravel, or soil, contributes to the mobile imaging system's ability to travel effectively on the surface. These factors are discussed in [Howard, 2001] and were considered in the choice of wheels. Other important factors considered in designing this robotic imaging system are shown in Table 3.1.

Mobility: The vehicle should be able to navigate from a known position to a desired new location and transmit the images seen by the camera.

Size: The vehicle should have a low profile and reasonable space to accommodate the cameras.

Field of view: The cameras in the vehicle should be capable of clearly viewing any object near the vehicle and transmitting the picture to a remote station.

Table 3.1: Important design parameters of the robot.

Figure 3.1: IRIS1 Imaging System ball transfer wheels.

The system navigates smoothly by means of four ball transfer wheels of 1.5-inch diameter. These wheels produce vibration-free, smooth images of the captured data; an example of one such wheel is shown in Figure 3.1. This simple-to-operate robot, known as IRIS1, provides clear, high contrast, real-time video inspection of the undercarriage of cars, vans, and trucks at parking lots or high security areas. This system overcomes many of the drawbacks of a traditional mirror-on-a-stick system. The sensors used in this system are a Polaris WP-300C lipstick camera, a near IR camera, and a Raytheon thermal camera. These three cameras are multiplexed through an Everplex 4CQ Quad Processor. A picture of this system is shown in Figure 3.2.

Figure 3.2: IRIS1 Imaging System.

IRIS1 is easy to set up and ready to use. The imaging system has four fluorescent tube lights that illuminate the underside of the vehicle during night inspections. A circuit breaker in the system trips the input power supply when there is any current leakage. An inside view of the system is shown in Figure 3.3.

Figure 3.3: IRIS1 Imaging System-Inside view.

3.2 Phase2-IRIS2 Design

The second phase of the hardware design involved the development of a platform known as IRIS2.


3.2.1 Initial Design and Its Drawbacks

The IRIS2 design is a modified version of the U.S. Army Omni-Directional Inspection System (ODIS). The ODIS robot is only 3-3/4 inches tall and incorporates eight processors, three-wheel independent steering, and a color camera with a transmitter. The ODIS robot has smooth, controlled navigation, and these advantages are utilized in the IRIS2 design. Two designs evolved for IRIS2, with the first shown in Figure 3.4.

Figure 3.4: IRIS2 Imaging Systems.

This first system incorporated many necessary features but had a few drawbacks, such as limited space for a thermal camera and complicated assembly for the user. The design was altered by placing the sensors on the other side of the robot in the second IRIS2 design. Other drawbacks of the first system are as follows:

• The battery requires manual inspection for low voltage conditions.
• The real time data is available only as a quad view, which lowers the resolution of the image.
• The user is unaware of short circuit situations.

These real time problems are solved using networked controls in the new IRIS2 design. The details of the system are explained in the following sections.

3.2.2 Design Improvements

The improved IRIS2 system is built on the ODIS robot by placing the sensors on the front side of the ODIS robot. As with the initial design, the sensors in this robot are the Raytheon IR camera, a near infrared camera, and a high resolution color camera. A mirror is used with each camera to maintain the low profile of the ODIS robot. The original ODIS robot has a height of 4 inches, but the improvements increased the IRIS2 height to 5.5 inches. The ground clearance of this robot is 0.56 inches. The video server and the quad splitter are placed in the front portion of the robot, and the weights are balanced using a spring arrangement. A design layout appears in Figure 3.5.


Figure 3.5: IRIS2 robot layout showing the position of cameras and sensors.

3.2.3 Components of IRIS2

The IRIS2 design consists of the features implemented in the previous IRIS1 design plus additional technology for TCP/IP communication with the robot. This networked design allows the robot to be controlled from a remote location through an internet connected computer.

Figure 3.6: Picture of IRIS2 robot -front view.

The following hardware systems and circuits were designed for under vehicle inspection using the TCP/IP network:

• a voltage circuit to send a low voltage signal to the Axis video server,
• a low current circuit to send a low current signal to the Axis video server,
• a control ON/OFF circuit to send signals to the Axis video server,
• a National I/O controller to send signals to the quad splitter,
• an interface circuit for the National I/O controller to communicate with the quad splitter and choose the channels, and


• an interface circuit for the National I/O controller to convert the RS-232 output of the Axis video server to switch the robot ON/OFF.

Figure 3.7: Picture of IRIS2 robot with two compartments and LEDs switched ON from a Tablet PC.

Figure 3.8: Inside components of IRIS2 robot. Compartment 1: thermal camera, near IR camera, color camera, mirror, and LED. Compartment 2: quad splitter, Axis video server, National I/O controller, short circuit tripper, and low voltage controller.

The IRIS2 robot has two compartments, which are shown in Figure 3.6 and Figure 3.7. Compartment 1 contains the imaging sensors used in the IRIS1 robot. The four sensor outputs (two from the thermal camera, one from the near IR camera, and one from the high resolution color camera) are connected to the quad splitter in Compartment 2. The mirror is mounted at a 45 degree angle and the sensors are focused on the mirror. A battery powers both compartments except for the thermal camera, which has a separate built-in battery. Figure 3.8 shows the details of each compartment. Compartment 2 contains the quad splitter, an Axis video server, a National I/O controller, a short circuit tripper, and a low voltage controller. The video signal from Compartment 1 is connected to the quad splitter, and the quad splitter, low voltage controller, and short circuit tripper are connected to the Axis video server.

3.2.4 ODIS Robot

ODIS is a prototype robot developed by the Army Tank-automotive Research, Development and Engineering Center and the Robotic Mobility Lab at Utah State University. ODIS is battery powered and stands only 3.75 inches tall. It uses three omni-directional wheels with odometry and navigational sensors to maintain steering and velocity. Figure 3.9 shows the ODIS robot before modification.

Figure 3.9: ODIS robot before modification.

3.2.5 Raytheon Thermal Camera

This Raytheon thermal camera is a portable, hand held thermal imaging system, shown in Figure 3.10. It translates infrared energy into electrical signals and can be used as a night vision system. Infrared energy, referred to as IR, is electromagnetic radiation that travels in a straight line through space, much as light energy does. The frequency of IR is below the red range of the visible spectrum; as the frequency of IR energy increases, its wavelength decreases. IR energy is less subject to scattering and absorption by smoke or dust than visible light.

The advantage of IR energy is that most objects emit strong IR energy irrespective of their temperature. The thermal camera sees this IR energy, so infrared imagery can be viewed at any brightness level and even in the dark.

Objects emit differing amounts of infrared energy according to their temperature and thermal characteristics. The Raytheon camera uses this principle of varying temperature to view objects: the sensors in the camera convert the differing amounts of thermal energy into equivalent amounts of light energy, so the various objects constituting a scene are rendered as visible light in the display. This capability allows the camera to capture images of objects by observing their thermal properties regardless of lighting conditions.


Figure 3.10: Raytheon PalmIR PRO thermal camera.

3.2.6 Polaris WP-300C Lipstick Camera

The Polaris WP-300C lipstick camera is a high quality concealed color camera. It has a 1/3 inch interline transfer CCD with 525 interlaced lines. Figure 3.11 shows the lipstick camera.

Figure 3.11: Polaris Wp-300c Lipstick camera.

The camera has a horizontal resolution of 400 lines, consumes 100 mA, and operates at temperatures from 23 to 104 degrees Fahrenheit. The camera also has auto white balance and weighs 150 g.

3.2.7 Ever Focus Quad Splitter

The Ever Focus unit is a digital quad processor and four channel switcher, shown in Figure 3.12. The maximum resolution of this quad splitter is 720x480 pixels. The display modes allow one of the four video inputs to be switched to the output or all four inputs to be multiplexed into a single quad display.


Figure 3.12: Quad splitter.

The splitter has a title generator, and each channel has independent brightness, color, and tint adjustments. A buzzer indicates alarm signals such as loss of video. The front panel of the splitter has keys for channel selection and includes special keys for freezing and zooming the output. The quad splitter can be programmed to display details such as the time, date, channel, and number of cameras as annotated text on the output.

3.2.8 Axis Video Server

The Axis video server is connected to the quad splitter. The quad video output of the quad splitter is compressed into JPEG images by the video server and uploaded to the network using HTTP protocols. The Axis video server can deliver 30 frames/second at a resolution of 352x288 pixels; the resolution can be increased to 704x576 with a reduction to 10 frames/second. The Axis video server has one video input and one video output, and the video output can be connected to test the received video instantly. Two serial ports are available on the video server and are used to control pan tilt cameras and input/output operations. The system is powered by either 12 VAC or 12 VDC. Figure 3.13 shows a picture of the Axis video server.

Figure 3.13: Axis video server.

Input/output operations are available through a 16 pin terminal block. Digital input and output operations can be executed on the video server through their respective URLs; a sketch of such a request follows the indicator lists below. The three LEDs on the front panel of the video server have the following indications.

Status LED

• Flashing green indicates a healthy status.
• Red indicates a problem with the server.
• Orange indicates that the system has been reset to factory defaults.

Network Indicator

• Yellow indicates that network activity is at 10 Mbps.
• Green indicates that network activity is at 100 Mbps.


• Red indicates that there is no physical network connection.

Power Indicator

• Red indicates power is ON.
• No light indicates power is OFF.
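To give a feel for how such URL-driven input/output requests look from the software side, the sketch below issues a hand-built HTTP GET over a TCP socket and prints the reply. The CGI path, address, and port are placeholder assumptions rather than the server's documented interface; the actual command URLs come from the Axis manual.

// Query a digital input port on the video server with a hand-built HTTP GET.
// The path "/io/input.cgi?port=1" is a placeholder, not the documented Axis URL.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <iostream>
#include <string>

static std::string httpGet(const std::string& host, const std::string& path) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) return "";
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(80);
    inet_pton(AF_INET, host.c_str(), &addr.sin_addr);
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) != 0) { close(fd); return ""; }

    std::string req = "GET " + path + " HTTP/1.0\r\nHost: " + host + "\r\n\r\n";
    send(fd, req.c_str(), req.size(), 0);

    std::string response;
    char buf[512];
    ssize_t n;
    while ((n = recv(fd, buf, sizeof(buf), 0)) > 0) response.append(buf, n);
    close(fd);
    return response;
}

int main() {
    // Placeholder address of the video server on the robot's wireless network.
    std::cout << httpGet("192.168.0.90", "/io/input.cgi?port=1") << "\n";
    return 0;
}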

3.2.9 Axis Device Point

The Axis video server is connected to the device point. This 802.11b Wireless Device Point from Axis is a bridge between cabled Ethernet and wireless Ethernet; Figure 3.14 shows a picture of the Axis Device Point. The device provides a connection between the video server and the network, giving the video server wireless functionality. The device point is placed in the robot and powered with a 5 VDC supply. Since the power supply available on the ODIS robot is 12 VDC, a 5 VDC regulator circuit is provided on the ODIS to power the Device Point.

Figure 3.14: Axis Device Point.

3.3 Network Controls

3.3.1 National Output Controller

The National output controller is a serial RS-232 controller that takes an RS-232 bit stream at its input and decodes this bit stream to control and drive a set of eight relays. Each relay is programmed with a hexadecimal address via the DIP switch settings on the controller card. The controller can also be programmed with a specific device number using the hardware settings on the controller; this device number allows many systems to be connected to the same serial port.

The heartbeat LED indicates the status of the controller to the user; a steady blinking of this LED is required for the card to function properly. The relay terminals are labeled NC/COM/NO in Figure 3.15. These terminals are as follows:

• NC - normally closed terminal,
• COM - common terminal, and
• NO - normally open terminal.


Figure 3.15: National Output controller with connection details.

Figure 3.16: National Output controller connection with the Axis server.

The input voltage is connected to the COM terminal and the output is connected to the NO terminal. When an output relay is selected, the voltage at terminal COM becomes available at output NO; in other words, the normally open circuit of the selected relay becomes a closed circuit. The status of each relay can be determined from LEDs 1 to 8 shown in Figure 3.16. The RS-232 data input of the controller is connected to the RS-232 data output of the Axis video server, and the RS-232 grounds of the controller and the Axis video server must be connected together to ensure that signal transmission levels are synchronized.
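Selecting a relay therefore reduces to writing a short command frame to a serial port whose settings match the controller. A minimal sketch using POSIX termios is shown below; the port name, baud rate, and the two command bytes (device number and relay address) are illustrative assumptions, with the real values fixed by the controller's jumper settings and documentation.

// Drive one relay on the serial output controller by writing a command frame.
// Port name, baud rate, and the frame bytes are illustrative assumptions.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdint>
#include <iostream>

int main() {
    int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);
    if (fd < 0) { std::cerr << "cannot open serial port\n"; return 1; }

    termios tio{};
    tcgetattr(fd, &tio);
    cfsetispeed(&tio, B9600);                    // baud rate must match the controller
    cfsetospeed(&tio, B9600);
    tio.c_cflag = (tio.c_cflag & ~CSIZE) | CS8;  // 8 data bits
    tio.c_cflag &= ~PARENB;                      // no parity
    tio.c_cflag &= ~CSTOPB;                      // 1 stop bit
    tio.c_cflag |= CREAD | CLOCAL;               // enable receiver, ignore modem lines
    tcsetattr(fd, TCSANOW, &tio);

    const uint8_t frame[] = {0xFE, 0x01};        // assumed {device number, relay address}
    write(fd, frame, sizeof(frame));             // intended effect: close that relay's NO contact
    close(fd);
    return 0;
}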


3.3.2 Input Controls

Figure 3.17: Hardware system design for Input control operations.

The paper [Mathew] discusses these techniques. Some of the advantages of such systems are as follows.

Increased technology transfer: hardware used at one site in the robot can be monitored at another site and debugged for errors.

Zero debugging distance: the interface to the robot need not run on the same machine or robot; their communication can be established through the network.

Expandability: any hardware can be added to the robot and monitored with the software from a remote system.

Transparent simulations: the system is reconfigurable by changing the hardware, and the application can be modified to make the system more robust to future requirements.

The control circuit allows the robot to operate autonomously, and thus the robot can be operated remotely in a dynamic environment. The control circuit for a robot can be made calibration independent [Koh, 1998].


The system circuit design for IRIS2 is calibration free; once the parameters are set for operation, the status of the robot is updated through the network. The parameters set include the voltage level in the control card.

The IRIS1 version of the under vehicle robot often had problems during data acquisition due to low battery voltage and accidental short circuit conditions. These problems are addressed using networked input controls.

The design of the hardware for input monitoring is discussed below. The block diagram in Figure 3.17 shows the device point that connects the robot to the TCP/IP network. The Axis video server has four input data ports that sense the input signals on request through HTTP commands from the user. Three circuits are connected to these input ports of the Axis video server to perform the required voltage and load monitoring tasks.

3.3.3 Battery Supply Monitor

Figure 3.18: Hardware system design to monitor the battery voltage.

The 12 VDC battery supply is connected to a low voltage control card that has a trip circuit to signal the video server when the voltage level falls below a threshold value. The input ports are monitored from the software, and any trip signal is communicated to the remote computer immediately when the voltage drops below the threshold. The card is programmed for 12 VDC using its hardware settings, and the card output is connected to terminal 9 of the Axis video server. The video server updates the input signal of this port; if the signal is not present at this port, the software decodes this value as low voltage. The card has an inbuilt relay that switches the voltage to the Axis video server depending on the battery voltage. The block diagram of the system operation is shown in Figure 3.18.


3.3.4 Overload Current Monitor

Figure 3.19: Hardware system design to monitor the overload current.

Terminal 11 of the Axis video server is an input port. This port is designated to report the system overload status and is connected to the overload switch. The overload switch is rated according to the load it protects: the driving motor circuit has overload protection set at 2 A, while the camera and other sensors are protected at 1 A. These two overload switches increase the protection in the system. The block diagram in Figure 3.19 explains the connection details.

3.3.5 Power Status Monitor

Terminal 13 of the Axis video server is an input port. This port is designated to report the system ON/OFF status. The control relay of the National Output controller is connected to this terminal of the Axis video server, as shown in Figure 3.20.

Figure 3.20: Hardware system design to monitor the power ON/OFF status.


3.3.6 Output Controls Monitor

The Axis video server has two RS-232 ports intended to control pan-tilt cameras. The output control system is built around an interfacing circuit that connects these RS-232 ports to the output controls. Figure 3.21 shows the block diagram of the hardware system design for networked output control selection.

Figure 3.21: Hardware system design for networked output controls selection.

3.3.7 Illumination Controls Monitor

The hardware settings of the National control card have been set to match the serial communication settings of the Axis video server. The following parameters are important for the RS-232 communication.

• The baud rates of the Axis video server and the National Output controller are matched.

• The start and stop bits are set to the same values in both the Axis server and the National controller.

• The ground terminals of the Axis server and the National controller are connected together.

• The jumper locations available in the controller are set as shown in Table 3.1.

The hexadecimal values corresponding to a relay are transmitted through the RS-232 port of the Axis video server, and the selected relay performs its designated operation. The hardware settings are chosen so that other devices connected to the same serial port do not respond to the same hexadecimal values, since the device number is unique for a given controller. The output relay selection is programmed using the jumper settings given in Table 3.1.


Table 3.1: Hardware settings to program the National controller for communicating with the video server.

Jumper  Default Setting                    Description
J1      Removed                            Address Select
J2      Removed                            Address Select
J3      Removed                            Address Select
J4      Removed                            Address Select
J5      Installed                          Baud Select 1
J6      Removed                            Baud Select 2
J7      Installed                          Data LED Brightness
J8      Installed Between Upper Two Posts  Data Output Type
J9      Installed Between Upper Two Posts  Data Input Type

Table 3.2: Hardware settings to program the National controller for 1200, 9600, and 19200 baud rates. The National controller and the Axis video server are both set to 9600 baud with one start bit and one stop bit, so jumper J5 is installed and J6 is removed.

Baud      J5         J6
1200      Removed    Removed
9600      Installed  Removed
19200     Removed    Installed
Reserved  Installed  Installed

Table 3.3: Hardware settings to program the National controller for relay selection.

ASCII Character  Function
0                Turn Off Relay 1
1                Turn Off Relay 2
2                Turn Off Relay 3
3                Turn Off Relay 4
4                Turn Off Relay 5
5                Turn Off Relay 6
6                Turn Off Relay 7
7                Turn Off Relay 8

The operations shown in Table 3.4 use the eight relays available in the controller. The hexadecimal values and their corresponding control relays are given in Table 3.4.


Relay number  Relay ON  Relay OFF  Control operation
01            FE 08     FE 01      Control ON/OFF
02            FE 09     FE 02      LED Bank 1 ON/OFF
03            FE 0A     FE 03      LED Bank 2 ON/OFF
04            FE 0B     FE 04      Channel 1 selection in Quad Splitter
05            FE 0C     FE 05      Channel 2 selection in Quad Splitter
06            FE 0D     FE 06      Channel 3 selection in Quad Splitter
07            FE 0E     FE 07      Channel 4 selection in Quad Splitter
08            FE 0F     FE 08      Quad selection in Quad Splitter

Table 3.4: Hexadecimal values used to program the National controller for relay selection.

The device number of the controller used is 0xFE. The jumper settings that select the device number are shown in Table 3.5.

NCD Device  ASCII Range  J1         J2         J3         J4
0           0 to 15      Removed    Removed    Removed    Removed
1           16 to 31     Installed  Removed    Removed    Removed
2           32 to 47     Removed    Installed  Removed    Removed
3           48 to 63     Installed  Installed  Removed    Removed
4           64 to 79     Removed    Removed    Installed  Removed
5           80 to 95     Installed  Removed    Installed  Removed
6           96 to 111    Removed    Installed  Installed  Removed
7           112 to 127   Installed  Installed  Installed  Removed
8           128 to 143   Removed    Removed    Removed    Installed
9           144 to 159   Installed  Removed    Removed    Installed
10          160 to 175   Removed    Installed  Removed    Installed
11          176 to 191   Installed  Installed  Removed    Installed
12          192 to 207   Removed    Removed    Installed  Installed
13          208 to 223   Installed  Removed    Installed  Installed
14          224 to 239   Removed    Installed  Installed  Installed
15          240 to 255   Installed  Installed  Installed  Installed

Table 3.5: Hardware settings to set the device number for the controller.


3.3.8 Quad Splitter Controls

Channel selection in the EverFocus quad splitter is normally done manually to view individual channels. The quad splitter combines four individual video signals, each of resolution 720x480, into one single frame of resolution 720x480. The Axis video server then downscales the quad image to 352x240 and transmits it over the network using the TCP/IP protocol at 30 frames per second, so the resolution of each individual video frame is reduced. To solve this problem, the channel selection circuit of the quad splitter was decoded and connected to the National Output controller. The circuit diagram of the quad splitter and the modification is shown in Figure 3.22. The selection values were determined by analyzing the circuit with a voltmeter and are not available in any manual: channel 1 is selected with the binary value 010, channel 2 with 001, channel 3 with 100, and so on.

Figure 3.22: Hardware system design for networked channel selection in the quad splitter.
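As an illustration of how the networked channel selection can be expressed in code, the following sketch maps a requested channel to the two-byte relay ON command taken from Table 3.4. The function name and the handling of the quad view are illustrative and are not taken from the SAFER source code.

    #include <cstdint>
    #include <utility>

    // Relay ON command bytes from Table 3.4: the first byte is the device
    // number (0xFE) and the second byte selects the relay that switches the
    // quad splitter channel. Any other channel value requests the quad view.
    std::pair<uint8_t, uint8_t> channelSelectCommand(int channel)
    {
        switch (channel) {
        case 1:  return { 0xFE, 0x0B };  // Channel 1 in quad splitter
        case 2:  return { 0xFE, 0x0C };  // Channel 2 in quad splitter
        case 3:  return { 0xFE, 0x0D };  // Channel 3 in quad splitter
        case 4:  return { 0xFE, 0x0E };  // Channel 4 in quad splitter
        default: return { 0xFE, 0x0F };  // Quad (four-channel) view
        }
    }

The two bytes returned by this helper are what would be written to the video server's serial port, as described in Section 3.3.7.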


Figure 3.23: Block diagram of the system connections for channel selection in the quad splitter.


4. Software Development

4.1 System Software Overview

The software components of the IRIS2 robotic imaging system upload images captured in real time to a wireless network using TCP/IP protocols. The images are uploaded at 30 frames per second at a resolution of 352x288 pixels. The architecture is designed to capture data at this rate while simultaneously sending commands to the control circuit on the robot.

Figure 4.1: Architecture of the GUI for data capturing and control operations.


The software design includes two important network sessions.

• Network session 1: This session executes HTTP commands to communicate with the network and is responsible for collecting the imaging data. It uses the ActiveX control obtained from the Axis video server.

• Network session 2: This session operates independently using Microsoft's networking capabilities available in .NET. It executes HTTP commands to monitor the control circuit on the robot through the RS-232 interface of the Axis video server.

Microsoft provides programming interfaces for client and server applications. The GUI for IRIS2 acts as the client and connects to the network using ActiveX controls and MFC classes. The ActiveX control is provided by the Axis video server and has routines to open an internet session to communicate with the server. Once this routine starts, the application initiates the network session and the imaging data is accessed from the video server on the robot. If this session were interrupted by other HTTP commands, the data connection for video transmission would be interrupted as well. To overcome this problem, the software architecture includes a separate network session (Network session 2) that opens a separate browser session within the application and controls the other functions of the video server using internet protocols.

The architecture shown in Figure 4.1 has three blocks. The GUI block receives the user interface commands. These commands are split between the two network sessions: the video capturing commands are sent to the Network session 1 block, which uses the ActiveX control, and the commands for input/output controls are sent to the Network session 2 block, which uses Windows MFC.

4.2 Network Session 1: GUI Design

The GUI for this application is developed using Microsoft Visual C++ .NET as a single document interface. The user controls provide easy access to the IRIS2 robotic imaging system, and the captured imaging data are visualized in real time through this GUI. The functions developed for this GUI are grouped into four sections.

4.2.1 Section 1: Hardware Selection

Figure 4.2 shows real time data and the four sections of the GUI. The first section of the GUI contains the following fields:

• Hardware/software selection switch,
• Channel selection, and
• Two illumination switches.


The Hardware/Software channel switch is used to select the mode of channel selection. At initial start up of the application the Software mode is selected, and the user can then make a selection in the list box control.

Figure 4.2: GUI - four sections.

Section 1 of Figure 4.2 shows the hardware/software switch, the channel selection control, and the LED1 and LED2 switches. In the software selection mode the real time data are viewed in Viewer 1 and the channel selection is made in software. The channel chosen in the selector is expanded to a resolution of 720x480 and displayed; the resizing is performed using bilinear interpolation. The four individual channels can be displayed by making the selection in the channel selector. The application captures the quad output and simultaneously displays the selected channel in Viewer 1.

The Hardware switch displays the channel chosen in the channel selector at high resolution by making the channel change in the quad splitter of the robot itself. This hardware selection makes Viewer 1 display a higher resolution image directly from the quad splitter.


The image displayed using the hardware selection is the image that is captured and saved.

Table 4.1: Resolution, file size, and maximum frame rate of the Axis video server (source: Axis web page).

The image capture rates and resolution details are shown in Table 4.1.

Figure 4.3: Data captured with excessive reflection due to the nature of the object.

Figure 4.4: Real time data captured with the illumination controlled by the remote switch in the GUI.

The LED1 and LED2 switches are provided to turn on the lights on the robot according to the illumination available under the vehicle. This helps control the illumination and avoid unnecessary reflections due to excessive light.

Figure 4.3 shows imaging data captured with excessive reflection due to the nature of the object. This excess reflection can be controlled by the user with the remote illumination control feature.

Resolution  File size (kB)  Max fps
704x480     7~159           10
352x240     1.4~40          30
176x112     0.3~10          30


Figure 4.4 shows the illumination from a single light source under the vehicle. This feature gives the user a convenient way of switching lights OFF based on visual inspection.

4.2.2 Section 2: Time Stamp and Robot Status

Section 2 of the GUI contains the following fields:

• Date
• Time
• Elapsed time
• Robot control ON/OFF status
• Robot voltage Normal/Low indication
• Robot power condition Normal/short circuit

The date and time fields display the current operating date and time of the system; the objects for these are derived from the CTime class provided by MFC. The operator name and station are entered manually by the operator of the robotic imaging system. These values are recorded for future reference in an internal system log and saved as a text file. Figure 4.5 shows the time, control OFF, and alarm status of the robot.
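A minimal sketch of how the date and time fields can be populated with MFC's CTime is shown below; the function and variable names are illustrative, not the actual member names used in the GUI.

    #include <afx.h>  // MFC core; CTime, CString

    // Populate the date and time fields of Section 2 using MFC's CTime.
    // The output parameter names are illustrative.
    void UpdateTimeStamp(CString& dateText, CString& timeText)
    {
        CTime now = CTime::GetCurrentTime();
        dateText = now.Format(_T("%m/%d/%Y"));   // e.g. 12/05/2003
        timeText = now.Format(_T("%H:%M:%S"));   // 24-hour clock
    }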

Figure 4.5: Time, control OFF, and alarm status of the robot.

The elapsed time in this section indicates how long data has been captured since the capture was started. It gives the user visual confirmation that data capture is in progress. Once data capture is stopped, the elapsed time is reset.

The robot control ON/OFF status is checked continuously and shown in the status bar (Figure 3.12). The system is monitored continuously and the details are updated in this status bar; this indicator confirms whether the robot is ON or OFF.

The low voltage indicator in the status bar indicates the voltage level of the robot's battery. The required voltage for operating the robot is 12 VDC; if the voltage drops below this level, a message is sent to the user through the GUI. Robot conditions such as a short circuit or the tripping of an overload in the wheels are also indicated in this portion of the status bar, and such overload information triggers an alarm in the GUI.

4.2.3 Section 3: Robot Control and Image Capture

This section of the GUI has the following features:

• Start Here button to set the initial parameters for the robot
• Control ON to start the robot
• Control OFF to stop the robot
• Begin Capture button to start the capture of data
• Stop Capture button to stop the capture


• Snap Shot button to take a snapshot of the data

The Start Here button opens a dialog box, shown in Figure 4.6, in which parameters such as the image type, number of frames per second, file name, and root directory for the images are stored. The file type can be selected as JPEG or BMP, and the capture frame rate can vary from 1 to 30 frames per second. The file name specified is used for the captured data files. Figure 4.6 shows Section 3 of the GUI with the start capture and stop capture buttons and the options dialog box.

Figure 4.6: GUI- Section 3 with start capture, stop capture and options dialog box.

If these values are not entered the system starts with default values as follows.

• Capture type - JPEG
• Number of frames - 5 frames per second
• Filename - Safer
• Root directory - Desktop

Figure 4.7: Section 3 of the GUI with button controls.

The Begin Capture button starts capturing images at the frame rate specified by the user through the Start Here button shown in Figure 4.7. The elapsed time control in the status bar is turned on once the Begin Capture button is clicked. The Snap Shot feature captures the image at that instant; during inspection, if the user finds any suspicious data, a snapshot of the real time data can be taken using this feature.


4.2.4 Viewer 1: Real-Time Video Display

This section displays the real time image at a resolution of 720x480. The captured image has a resolution of 352x240 at 30 frames per second and is enlarged to fit the window size. In the hardware mode the image seen in this viewer is the captured image, because the channel selection is made directly in the quad splitter. The resolution of the image displayed here can be increased by decreasing the data transfer frame rate. The image displayed in this viewer is the quad image of the near infrared, infrared, and color channels; the fourth image in the quad is the color mapped thermal image. Figure 4.8 shows real time data captured under the vehicle using the IRIS2 robot (a modified ODIS robot). The IRIS2 was driven with a radio controller and the data was captured on a Tablet PC at a rate of 5 frames per second.

Figure 4.8: Under vehicle data captured with GUI.

In the software mode the image displayed in the viewer is at 720x480 resolution: any channel selected in this mode is cropped from the quad image in real time and a bilinear transformation is applied to display it at 720x480. Figure 4.9 shows the original resolution of the data cropped by the GUI using software channel selection.

Figure 4.9: Data cropped by the GUI using Software channel selection.


This real time data is transformed to 720x480 using the bilinear transformation, and the data after transformation is shown in Figure 4.10. The image appears blurred with software switching, which is why hardware switching is also provided in this GUI to view a high resolution image directly from the robot.

Figure 4.10: Data cropped by the GUI using software channel selection, after bilinear transformation to 720x480.
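The following is a minimal sketch of the bilinear resize step described above, written for an 8-bit grayscale buffer (the GUI performs the equivalent operation on the color video frames). The function name is illustrative; the destination size is assumed to be at least 2x2.

    #include <vector>
    #include <cstdint>

    // Minimal bilinear resize of an 8-bit grayscale buffer, e.g. expanding
    // a cropped 352x240 channel to 720x480.
    std::vector<uint8_t> bilinearResize(const std::vector<uint8_t>& src,
                                        int srcW, int srcH, int dstW, int dstH)
    {
        std::vector<uint8_t> dst(dstW * dstH);
        for (int y = 0; y < dstH; ++y) {
            float fy = (srcH - 1) * static_cast<float>(y) / (dstH - 1);
            int y0 = static_cast<int>(fy);
            int y1 = (y0 + 1 < srcH) ? y0 + 1 : y0;
            float wy = fy - y0;
            for (int x = 0; x < dstW; ++x) {
                float fx = (srcW - 1) * static_cast<float>(x) / (dstW - 1);
                int x0 = static_cast<int>(fx);
                int x1 = (x0 + 1 < srcW) ? x0 + 1 : x0;
                float wx = fx - x0;
                // Interpolate horizontally on the two source rows, then vertically.
                float top = src[y0 * srcW + x0] * (1 - wx) + src[y0 * srcW + x1] * wx;
                float bot = src[y1 * srcW + x0] * (1 - wx) + src[y1 * srcW + x1] * wx;
                dst[y * dstW + x] = static_cast<uint8_t>(top * (1 - wy) + bot * wy + 0.5f);
            }
        }
        return dst;
    }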

4.2.5 Viewer 2: Video Snap Shot Display

This viewer displays the snapshot of the captured image. A real time snapshot is taken and the captured data is saved as a snapshot image with the date and time stamp included in the image file name.

Figure 4.11: Under vehicle image captured by the user, with the time and date stamped in the file name (SAFER071003.jpg).


4.2.6 Viewer 3: Future Display (Reserved)

This part of the GUI is reserved for future control of robot navigation. The software requirements to navigate the robot are already provided in the GUI: the user will be able to view the robot's navigation path and drive the robot under the vehicle.

The control developed for the quad splitter helps perform this operation. The quad splitter will be switched to view the camera placed for guiding the robot during navigation; after reaching the underside of the vehicle, the quad splitter can be switched back to the quad view. Figure 4.12 shows the future system.

The details of this system are included in the discussion of future work.


Figure 4.12: Future systems: (a) GUI with navigation controls in the viewer; (b) the robot driven with radio control under the vehicle for under vehicle inspection.

4.3 Network Session 2 - Input/Output Control Design Software

Teletype-style operations, such as switching ON the illumination, are implemented here as direct operations. These functions are incorporated into the application using MFC and make it easy for the user to interact with the robot. The commands to switch the illumination ON or OFF are issued from the tool bar provided on one side of the GUI, and the input status of the hardware is indicated in the status bar.

VC++ .NET provides an interface to the Win32 internet functions, either directly or through the MFC WinInet classes. The SAFER application uses WinInet to access common internet protocols; WinInet provides access to Gopher, FTP, and HTTP. The classes declared in the WinInet header file let client applications be written without dealing with WinSock, TCP/IP, or the details of the internet protocols. This software accesses the Win32 functions through the MFC WinInet classes. HTTP is the protocol used to transfer HTML pages from a server to a client browser, and the functions provided by the WinInet classes are used here to download the HTML files over HTTP.

The speed of this operation depends on the speed of the network. The application connects with the Axis video server at its HTTP address. After the connection is initiated, the computer and the video server carry out the protocol handshake before the actual connection is used to retrieve the files.
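A minimal sketch of such a WinInet request using the MFC CInternetSession class is shown below. The user-agent string and function name are illustrative, and exception handling (CInternetException) is omitted for brevity.

    #include <afxinet.h>  // MFC WinInet classes

    // Open an HTTP URL on the Axis video server and read the returned text.
    CString FetchStatusPage(LPCTSTR url)
    {
        CString page, line;
        CInternetSession session(_T("SAFER GUI"));
        CStdioFile* pFile = session.OpenURL(url);   // issues the HTTP GET
        if (pFile != NULL) {
            while (pFile->ReadString(line))
                page += line + _T("\n");
            pFile->Close();
            delete pFile;
        }
        session.Close();
        return page;
    }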



Figure 4.13: Under vehicle images captured using the GUI: (a) underside of the vehicle showing the exhaust line; (b) the battery on the underside of the vehicle; (c) the driving shaft.

The following are three important algorithms used to establish communication between the IRIS2 robotic imaging system and the SAFER GUI. The data in Figure 4.13 were captured during evening hours with the IRIS2 robot illuminated using the controls provided by this networked system.


The check box in the tool bar is used to switch ON the illumination.

4.3.1 Input Port

The data at the input port is made available through a .cgi request and transported using the HTTP protocol. An HTTP URL is used to check the status of the input controls. The software uses the second network session to communicate with the input ports and displays the status in the GUI. A loop monitors the input port over HTTP, and if the loop identifies an open circuit condition, the status is reported as an input failure in the GUI.

The input controls that are monitored in the software are:

• Control ON/OFF
• Low voltage/Normal voltage
• Current good condition/short circuit

This operation is performed by opening an internet session within the GUI and accessing the URL for the input port using HTTP. The HTML text is read by the functions provided in MFC, and the value of the input port is identified by decoding the text of the HTML file. The algorithm monitors these input ports continuously and reacts immediately to any abnormal values. Examples of such abnormal values are:

• The motor of the robot getting jammed due to overloading of the system.
• A short circuit in any of the robot's circuits.
• Low voltage, which may reduce the clarity of the data collected by the sensors during under vehicle inspection.

The following is the form of the URL used to read the input port details:

http://<servername>/axis-cgi/io/output.cgi?<parameter>=<value>

The hexadecimal value of the input port is sent with this URL, and the status of the port is returned as an integer value in the HTML code of the page. Figure 4.14 shows the algorithm used to monitor the robot so that, in case of a mishap due to overload or short circuit, the GUI turns OFF the system.

Table 4.2: Hexadecimal values and their corresponding input operations are shown.

Hexadecimal value  Input control
01                 Control On
02                 Voltage condition
03                 Current condition
04                 Future option


Figure 4.14: Algorithm for short circuit protection.

The input operations have to be monitored continuously without interrupting the other operations of the robot. Network session 2 sends a remote HTTP command to the Axis video server on the robot. The server responds to this HTTP request by sending the status value of the respective input control; the control codes are listed in Table 4.2. Each input control sent to the video server has a particular code, and for each of these the server responds with the value 0 or 1.

A value of 0 indicates that the input signal is not present, and a value of 1 indicates that it is. These values decide the operation of the robot: if the GUI finds a value of 0 on ports 01 or 02, the robot is immediately switched OFF and the user is alerted with an alarm sound. The flow chart in Figure 4.15 shows the algorithm used to monitor the input parameters of the robot.
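The decoding step can be sketched as follows. The assumption here is that the returned text contains the digit 0 or 1 for the requested port; the exact response format is defined by the Axis video server and may differ.

    #include <afx.h>   // CString, TCHAR

    // Decode the 0/1 status of one input control from the text returned by
    // the video server. The response format assumed here (first '0' or '1'
    // character gives the port status) is an assumption, not the documented
    // Axis format.
    int DecodeInputStatus(const CString& responseText)
    {
        for (int i = 0; i < responseText.GetLength(); ++i) {
            TCHAR c = responseText.GetAt(i);
            if (c == _T('0')) return 0;   // input signal absent
            if (c == _T('1')) return 1;   // input signal present
        }
        return -1;                        // status could not be decoded
    }

A decoded value of 0 for the 01 or 02 control would then cause the GUI to issue the Control OFF command and sound the alarm, as described above.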


Figure 4.15: Algorithm to monitor the data at the input port.

4.3.2 Output Port

The algorithm to control the output port is similar to that used for the input ports. There is one output port in the Axis video server, controlled from the GUI, and it is used to switch on the entire system. The HTTP command to control the output port has the form:

http://<servername>/axis-cgi/io/output.cgi?<parameter>=<value>

The output port number is one, and the port is switched by sending a forward slash '/' to switch the system ON and a backslash '\' to switch it OFF. The control loop for the output operation is shown in Figure 4.16.
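A sketch of how the ON/OFF request URL might be built is shown below. The parameter name "action" and the server address are placeholders, not the actual Axis CGI parameters, which should be taken from the video server documentation.

    #include <afx.h>

    // Build the output-port URL described above: '/' switches the system ON
    // and '\' switches it OFF. The parameter name is a placeholder.
    CString BuildOutputUrl(LPCTSTR server, bool turnOn)
    {
        CString url;
        url.Format(_T("http://%s/axis-cgi/io/output.cgi?action=%c"),
                   server, turnOn ? _T('/') : _T('\\'));
        return url;   // issued with CInternetSession::OpenURL as shown earlier
    }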


Figure 4.16: Algorithm for output operations in the robot.

4.3.3 Serial Port

Due to the limited number of output ports on the Axis video server, the serial port of the video server is used for the output control operations in the robot. The algorithm for these output controls is explained in Figure 4.17.

Figure 4.17: Algorithm for output operations through the serial port.

The GUI sends the appropriate hexadecimal commands through Network session 2. Using MFC, the hexadecimal commands are wrapped in HTTP commands for serial port communication and sent through the network to the video server.
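The following sketch illustrates wrapping a two-byte relay command (for example FE 0B for channel 1, from Table 3.4) in an HTTP request. The CGI path and parameter name are placeholders rather than the real Axis serial interface; the actual syntax must be taken from the video server manual.

    #include <afxinet.h>

    // Wrap a two-byte relay command in an HTTP request to the video server's
    // serial port. The CGI path and "write" parameter below are placeholders.
    void SendSerialCommand(LPCTSTR server, BYTE device, BYTE relay)
    {
        CString url;
        url.Format(_T("http://%s/axis-cgi/serial/serial.cgi?write=%02X%02X"),
                   server, device, relay);
        CInternetSession session(_T("SAFER GUI"));
        CStdioFile* pFile = session.OpenURL(url);   // issue the request
        if (pFile != NULL) { pFile->Close(); delete pFile; }
        session.Close();
    }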


4.3.4 User Parameter Recording

Figure 4.18: The .doc file created by the GUI for system status monitoring.

Every operation in the software is recorded in a log file saved as a .doc file in the program's parent directory. All the input parameter values are saved in this log with date and time details. The file is created using the CFile class in MFC, and all the input/output operations are written to it. This ensures that each stage can be monitored, and any failure in executing the HTTP commands can be analyzed from the log details.
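A minimal sketch of this logging step using MFC file and time classes is shown below; the log file name and function name are illustrative, and exception handling (CFileException) is omitted.

    #include <afx.h>

    // Append one input/output operation to the log file with a time stamp.
    void LogOperation(LPCTSTR message)
    {
        CStdioFile log(_T("SaferLog.doc"),
                       CFile::modeCreate | CFile::modeNoTruncate |
                       CFile::modeWrite | CFile::typeText);
        log.SeekToEnd();                                   // append to the log
        CTime now = CTime::GetCurrentTime();
        log.WriteString(now.Format(_T("%m/%d/%Y %H:%M:%S  "))
                        + CString(message) + _T("\n"));
        log.Close();
    }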

4.4 Data Processing Software Design and Implementation

The data collected by the robot is processed using software that can continuously read the data and segment the objects from the video signals as seen by the robot. Objects seen under the machine can be used to locate any missing components or damage.

Some of the operations developed in this software are edge detection, segmentation of objects from the background, and color coding to enhance data visualization and assist in obtaining an accurate diagnosis. The edge detection techniques used are the Canny, Sobel, Prewitt, zero crossing, and Roberts methods. The GUI for this software consists of file, basic operations, spatial filtering, frequency filtering, edge detection, morphological processing, and segmentation menus.


The advantage of this software is that the data can be processed in a continuous loop; i.e., the data that is output from one process (say segmentation) can be used as input to the next operation such as edge detection. The design of this software makes remote diagnosis of under-machine inspection more effective and user friendly.

4.4.1 Segmentation

Two types of segmentation algorithm are used in this graphical user interface. The block diagram of segmentation algorithm type 1 is given in Figure 4.19.

Figure 4.19: Block diagram of the segmentation algorithm.

The input to this algorithm is a high resolution image. The GUI has a provision to read the image from a drop down menu. The image is first low pass filtered to smooth the noise present in the image before thresholding [Jianping, 2001]; the threshold value is then used to create a binary image.

Figure 4.20: Segmented object obtained with the software for remote diagnostics (the object in the segmented picture is shown in black).

The binary image is used as the marker image after a morphological open operation is applied to it. The marker image is then multiplied with the original image to create the segmented image. A segmentation result using this algorithm is shown in Figure 4.20.
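A simplified sketch of this segmentation method is shown below; it smooths with a 3x3 mean filter, thresholds to form the binary marker, and multiplies the marker with the original image. The morphological open applied to the marker in the actual algorithm is omitted for brevity, and the function name is illustrative.

    #include <vector>
    #include <cstdint>

    // Simplified segmentation method 1: smooth, threshold into a binary
    // marker, then mask the original image with the marker.
    std::vector<uint8_t> segmentMethod1(const std::vector<uint8_t>& img,
                                        int w, int h, uint8_t threshold)
    {
        std::vector<uint8_t> smooth(img), out(img.size(), 0);
        // 3x3 mean filter (border pixels are left unfiltered)
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                int sum = 0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx)
                        sum += img[(y + dy) * w + (x + dx)];
                smooth[y * w + x] = static_cast<uint8_t>(sum / 9);
            }
        // Threshold the smoothed image to form the marker and multiply it
        // with the original image to keep only the segmented object.
        for (std::size_t i = 0; i < img.size(); ++i)
            out[i] = (smooth[i] > threshold) ? img[i] : 0;
        return out;
    }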


Figure 4.21: Segmented object obtained with the software for remote diagnostics (object in the segmented picture is shown black) using segmentation algorithm method 2.

Figure 4.22: Segmented object obtained with the software for remote diagnostics (the object in the segmented picture is shown with red edges).

Segmentation algorithm method 2 works by finding the edges of the image with the Canny edge detection filter. The detected edges are used as the marker image and are superimposed on the original image to determine the segmented image. The segmented objects with superimposed edges can be colored by applying a color map for better visualization. Results of object segmentation are shown in Figure 4.22, where the edges of the segmented objects are shown in red.

The graphical user interface of this software can work with both segmentation algorithms for better analysis of the data, and easy color application is provided to improve the visual effect. Two versions of the software (AVI Processor and Image Processor) were developed for under machine inspection, to segment both video data and images of any format. The AVI Processor processes video data; all the image processing operations, such as segmentation, edge detection, color mapping, and frequency filtering, can be applied to the data to analyze it for defects. The Image Processor has all the features of the AVI Processor and is used for still images of any format. Depending on the type of data collected by the robot, either of the two programs can be used to analyze the data and observe any defects in the inspected under machine parts. Noise in the captured images can be removed with the low pass and high pass filtering available in the system; filtering is provided in both the spatial and frequency domains. The color map applied to the segmented image can be changed with a simple slider, and the gray levels of the image can be remapped accordingly for better visualization. The GUI of the software is shown in Figure 4.23.


Figure 4.23: GUI for the remote diagnostics software.

4.4.2 Edge Detection and Radon Transformation

An edge detected image has discontinuities that make it hard to recognize the segmented objects. These discontinuities can be identified by applying the Radon transform, which identifies objects that lie along straight lines.

Figure 4.24: Block diagram of the edge detection system applying the Radon transform.


This concept is applied in the algorithm to link edges of objects that are not properly connected after edge detection. The key to effective use of the Radon transform is localization: applying the Radon transform to a part of the image and finding the connected edges specific to that area. In this way the lines detected by the Radon transform can be used intelligently to locate the missing edges in the image. Edges are detected using the Canny edge detection filter, which effectively detects edges with fewer discontinuities. If the threshold values of the Canny filter are adjusted, it is possible to recover the missing edges, but the noise level in the image increases.

Figure 4.25: Block Diagram of Edge detection algorithm-Labels gray level 1.0 at discontinuity.

Figure 4.25: Eight matrices used to traverse the pixels along the edges and set their gray level to one at a discontinuity.


The edges detected in the image are first processed to label the discontinuities. This is done by multiplying each edge point with one of eight matrices, each having a non-zero value only at one connected coordinate. The whole image is read in this way, and the coordinates at a discontinuity are determined and labeled with gray level 1. The algorithm operates on the outputs produced by these eight matrices: each pixel is multiplied with a matrix and the resulting value is stored in an array. The resulting values indicate whether the image pixels are connected and give the direction of the pixel traverse.
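A simplified sketch of this labeling step is shown below. Instead of the eight direction matrices it simply counts the 8-connected edge neighbours of each edge pixel: pixels with at most one neighbour are treated as discontinuities and labeled 1.0, while connected edge pixels are labeled 0.5. The function name is illustrative.

    #include <vector>

    // Label edge pixels: 1.0 at a discontinuity (endpoint), 0.5 if connected.
    // 'edge' holds 0/1 values, one per pixel, in row-major order.
    std::vector<float> labelDiscontinuities(const std::vector<int>& edge,
                                            int w, int h)
    {
        std::vector<float> label(edge.size(), 0.0f);
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                if (!edge[y * w + x]) continue;
                int neighbours = 0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx)
                        if ((dx || dy) && edge[(y + dy) * w + (x + dx)])
                            ++neighbours;
                label[y * w + x] = (neighbours <= 1) ? 1.0f : 0.5f;
            }
        return label;
    }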

Figure 4.26: Edge detection image produced by the algorithm-Labels gray level 1.0 at discontinuity

The edge detected image has a gray level of one along its edges and zero elsewhere. The algorithm automatically finds a starting pixel with gray level one and, from there, traverses along the direction of the edge. This traversal is made possible by taking each pixel as input to the matrix functions above and multiplying it with each of them. The matrices are arranged so that a value of two is produced only if a neighbouring edge pixel is detected, and the matrix results are used to identify the pixels at a discontinuity. If, during the traverse, no connected neighbour is found, the pixel value is set to one. In this way the pixel direction and the pixel connectivity are identified and labeled. The algorithm works with any edge detected image, and some results are shown below. Gray levels of connected edge pixels are labeled 0.5, while gray levels at a discontinuity are labeled one.

The Radon transform represents the collection of projections of an image in various directions, obtained by varying the projection angle. A projection of a two-dimensional function f(x,y) is a line integral in a certain direction; for example, the line integral of f(x,y) in the vertical direction is the projection of f(x,y) onto the x-axis, and the line integral in the horizontal direction is the projection onto the y-axis. The locations of strong peaks in the Radon transform matrix correspond to the locations of straight lines in the original image.


Although Canny edge detection gives good results in forming the edges of the objects, noise remains in the form of unlinked lines. These broken lines, seen within objects due to intensity variation, are studied by applying the Radon transform, which is similar to the Hough transform. The binary edge detected image, after the level 1 processing, has two gray levels (0.5 and 1.0) along its edges. A gray level of 1.0 indicates a possible discontinuity at that pixel, and the Radon transform has to be applied there using the localization concept. The width of the localization region is decided by the algorithm; the smaller the width, the more accurate the results in detecting the edges that are not connected. Some important concepts developed in this algorithm are as follows.

• The pixel at a discontinuity is selected automatically by the algorithm, and the algorithm collects the 10 pixels connected to the coordinate at the discontinuity.

• With these 10 pixels as input to the Radon transform, the coordinates of the maximum peak, where the curves intersect, are located.

• These coordinates are used to construct a line, and the approximate path of the edge is determined.

• This result is then dilated to a width of 20 pixels, and the dilated image is used as a marker to obtain the local area of the discontinuity.

• The input image to this algorithm is shown in Figure 4.27; the red circle in the image indicates the discontinuity in the edge.

The algorithm starts reading the image edge from the pixel shown as a discontinuity within the red circle; the next 20 pixels connected with this edge are read, and the resultant image of algorithm level 2 is shown in Figure 4.28. The first set of pixels connected with the edge at the discontinuity is passed to the Radon transform, and the pixels producing the maximum intercept are connected to form a line. This line is close to the pixel at the discontinuity, but it cannot be relied on completely because only a few pixels connected to the discontinuity were used.

Figure 4.27: Edge detection image input to the algorithm-level2.


Figure 4.28: Resultant image of algorithm level 2: the first set of 20 pixels read by level 2.

It is necessary to take the pixels associated with the pixels at the discontinuity; this is implemented with a localized application of the Radon transform. The Radon transform output produced with the 20 pixels shown above is given in Figure 4.28. The line detected by the Radon transform in algorithm level 2 is used to localize the pixels within the area of the discontinuity, as shown in Figure 4.29. As this line extends across the entire image, it has to be restricted to the localized area so that only the pixels associated with the discontinuity are detected. This is done by detecting the pixels connected to the discontinuity along the detected line and removing the excess detail, thus localizing the area associated with the discontinuity pixels. A marker image of zero gray level, with the same size as the image, is created; the Radon transform result is used as a reference and is dilated to a width of 20 pixels in the marker image. The resultant image created by this dilation, containing the Radon transform details, is shown in Figure 4.30.

Figure 4.29: Line detected by the radon transform with 20 pixels as input.


Figure 4.30: Line detected by the radon transform localized.

This localized marker image is applied to the edge detected image to extract the edge pixels that lie in the localized area of the image. When the Radon transform is applied to these pixels, it produces a line closer to the true edge than the one detected in level 2 of the algorithm, because it uses all the pixels surrounding the area of the discontinuity.

The Radon transform is then applied to the localized pixels close to the discontinuity, and the detected lines follow a path closer to the disconnected pixels. In the image above it can be seen that the pixels are localized, which gives more accurate Radon transform results because no pixels other than those associated with the discontinuity are included. If pixels not in line with the discontinuity were included, the detected line passing through the discontinuity pixel would not lie close to the pixels at the discontinuity.

Figure 4.31(a): Localized image dilated to create a marker.


Figure 4.31(b): Localized image dilated to create a marker.

This detected line is once again localized to the image pixels at the discontinuity for a given length, and the resultant image is shown in Figure 4.32. Finally, the algorithm traverses the pixels of the detected line and interpolates with each pixel at the discontinuity, looking for gray level one within this localized area.

Figure 4.32: Some results (a) Final line detected with the localized pixels (b) Localized output that is applied for interpolating discontinuity pixels.

Figure 4.33: Resultant automatic edge linking implemented in the algorithm.


If no pixel at a discontinuity (labeled gray level one) is found, the algorithm concludes that the pixels are not connected. In this image the algorithm identifies the pixels at the discontinuity and connects them, as shown in Figure 4.33.

4.4.3 Image Enhancement Algorithm

Due to the low illumination under the vehicle, the dark images in the acquired real time data have the components of their histograms on the darker side of the gray scale, while the components of brighter images are biased toward the high side of the gray scale. Histogram processing techniques enhance the image globally by modifying the pixels based on the gray level content of the entire image. Small areas in the image can instead be enhanced with a localized enhancement technique; the block diagram of the localized enhancement is shown in Figure 4.34.

Figure 4.34: Block diagram for local enhancement.

The technique adopted here is to define a square neighborhood and move its center from pixel to pixel. At each location the histogram equalization of the square neighborhood is calculated and the result is mapped to the center pixel of the square area. Some results are shown in Figure 4.35.
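A minimal sketch of this local histogram equalization is shown below for an 8-bit grayscale buffer; the window radius is a parameter, and the function name is illustrative.

    #include <vector>
    #include <cstdint>

    // Local histogram equalization: an NxN window (N = 2*half + 1) is moved
    // over the image and the center pixel is remapped using the cumulative
    // histogram of its neighbourhood.
    std::vector<uint8_t> localEqualize(const std::vector<uint8_t>& img,
                                       int w, int h, int half)
    {
        std::vector<uint8_t> out(img);
        for (int y = half; y < h - half; ++y)
            for (int x = half; x < w - half; ++x) {
                int hist[256] = { 0 };
                int count = 0;
                for (int dy = -half; dy <= half; ++dy)
                    for (int dx = -half; dx <= half; ++dx) {
                        ++hist[img[(y + dy) * w + (x + dx)]];
                        ++count;
                    }
                // Cumulative count up to the center pixel's gray level.
                int cdf = 0;
                uint8_t center = img[y * w + x];
                for (int g = 0; g <= center; ++g)
                    cdf += hist[g];
                out[y * w + x] = static_cast<uint8_t>((255 * cdf) / count);
            }
        return out;
    }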


Figure 4.35: Local Enhancement results (a) Original image (b) Enhanced image.

A graphical user interface created for this application is shown below in Figure 4.36.

Figure 4.36: GUI developed using VC++ shows local Enhancement results of under vehicle image.


5. Experimental Results

5.1 Data Acquisition

The IRIS2 robot has been tested for under vehicle inspection in the Neyland parking garage at UT. The data were captured on a remote Tablet PC; some sample data acquired on the Tablet PC are shown in Figure 5.1.


Figure 5.1: Under vehicle images acquired using the Tablet PC: (a) and (b) under vehicle quad images showing the gear shaft.


Some of the important features in the software are as follows:

• GUI developed using Visual Studio .NET that connects with the Axis ActiveX control.
• Second network session to perform input/output operations.
• Generates a separate .doc file to save the system details.
• Monitors the input status continuously and generates alarms for low voltage and overload conditions.
• Displays the real time input status in the status bar.
• Automatically switches off the robot in the event of overloading due to jamming of the wheels.
• Generates snapshots of the image and saves them as JPEG files.
• Establishes the network connection in the GUI by simply entering the IP address in the Options dialog.
• The resolution of the captured images is 352x288.

5.2 Robot Control

The IRIS2 robot has been used in a real time application to inspect a vehicle in the parking garage. The robot's controls are networked to a remote computer, so the illumination and channel selection can be changed remotely. The IRIS2 robot inspecting vehicles in the parking garage is shown in Figure 5.2.

Figure 5.2: IRIS2 robot in a parking garage before inspection.

The sensors and the power supply for the control systems are turned on from a remote computer. This robot has the following advantages:

• Radio controlled operation.
• Remote networked monitoring of voltage and current parameters.
• Remote networked channel selection for high resolution image acquisition.
• Remote networked data acquisition from a computer connected to the network anywhere in the world.
• Remote networked illumination switching to overcome reflections in the image due to excess light.


Figure 5.4 shows the IRIS2 inspecting the underside of the vehicle. The darkness under the vehicle makes it necessary to control the illumination from a remote system rather than with a manual switch on the robot. The IRIS2 robot inspecting vehicles in the parking garage is shown in Figure 5.3, and some of the interesting features of the robot are listed in Table 5.1.

Figure 5.3: IRIS2 robot in a parking garage before inspecting a van.

Table 5.1: IRIS2 robot features.

Some of the features of the robot:

• Data captured through the video server to a networked computer.
• Networked hardware so the status of the robot can be read from a computer.
• Hardware to switch the robot ON and OFF.
• Remote channel selection for high resolution images with networked controls.
• With additional drivers, the illumination under the vehicle can be controlled based on the real time data.
• System designed so that, with additional hardware, the robot can be driven from a computer.


Figure 5.4: IRIS2 robot inspecting a van in the parking garage.


6. Conclusion

The graphical user interface designed for the IRIS2 robot was tested successfully in data acquisition using a Tablet PC. The GUI provides the following simple interfaces for easy data acquisition:

• User centered graphical user interface with three windows for data visualization.
• Easy list box control to select data acquisition in software or hardware mode.
• Easy channel selection through the hardware or software mode.
• A second network session that allows the software to issue commands to control the hardware without interrupting the data acquisition.
• Check boxes to switch on the illumination on the remote robot.
• Status of the robot continuously monitored and updated in the status bar.

The IRIS2 robot has advanced features such as networked controls and radio controlled navigation. The real time data is uploaded to the wireless network using the TCP/IP protocol and can be viewed from a PC connected to the internet anywhere in the world. With the techniques developed here, an autonomous robot driven from a computer is possible; its block diagram follows.

The block diagram shown in Figure 6.1 is one way of navigating the under vehicle robot from a computer connected to the network. The Axis video server communicates with the National Output controller and the stepper motor controller; the National Output controller switches the stepper motor power supply between the two motors. This reduces the additional stepper motor controller electronics in the robot and provides a more robust controller for under vehicle inspection.

Figure 6.1: Block diagram to navigate the robot from a computer


Figure 6.2: Future robot design to control from a remote PC.

There are three interesting future prospects for this software:

• Software architecture support to drive the robot from any computer available on the network, using the existing capabilities of the software.
• Illumination manipulation, using the existing capabilities of the software, to reduce the reflection of excessive light on objects; this would be of great importance in segmenting under vehicle parts for object recognition.
• Development of the data acquisition and control software on a Handheld PC or Pocket PC.

The GUI developed here has the resources to generate commands to control a robot. The second network session is available as a separate function in the Saferview.cpp file and can be called from anywhere in the program to navigate the robot under the vehicle.

Another important problem that can be addressed with this software is the reflection of objects due to over lighting. Over lighting causes improper segmentation of objects and adds complexity to object recognition. A DC drive can be interfaced to the robot, and the software can adjust the illumination parameters under the vehicle with the appropriate hexadecimal values, reducing the brightness under the vehicle to a moderate value. Figure 6.2 shows the design of a future robot controlled from a PC.

Video acquisition and robot control from a Handheld PC or Pocket PC is another interesting application developed to make the robot easily controllable through the internet. Handheld and Pocket PCs run the Windows CE operating system, whose architecture is different from the desktop versions of Windows. A GUI developed for Windows CE using Embedded Visual C++ runs on an NEC MobilePro 900 and can issue controls for the input/output operations of the robot. The display part of this GUI is yet to be implemented.


Bibliography

Elhajj, I., H. Hummert and N. Yun-hui, "Real-Time Haptic Feedback in Internet-Based Telerobotic Operation", Internet Reference: http://www.egr.msu.edu/ralab/Publications_PDF/chicago00.pdf.

Gold, E., "Mobile Robot Control Interface", Internet Reference: http://www1.cs.columbia.edu/robotics/projects/avenueUI/oldUIwriteup.

Gilbreath, G. A., D. A. Ciccimaro and H. R. Everett, "An Advanced Telereflexive Tactical Response Robot", Autonomous Robots, Volume 11, pp. 39-47, July 2001.

Hirukawa, H. and I. Hara, "Web top robotics", IEEE Robotics & Automation Magazine, pp. 40-44, June 2000.

Howard, A. and H. Seraji, "An Intelligent Terrain-Based Navigation System for Planetary Rovers", IEEE Robotics & Automation, pp. 1-7, 2001.

Jianping, F., D. K. Y. Yau, et al., "Automatic Image Segmentation by Integrating Color-Edge Extraction and Seeded Region Growing", IEEE Trans. Image Processing, Volume 10, pp. 1454-1466, Oct 2001.

Kawamura, R. A. Peters II, C. Johnson, P. Nilas and S. Thongchai, "Supervisory Control of Mobile Robots using Sensory EgoSphere", 2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA 2001), Banff, Alberta, Canada, pp. 531-537, July 29 - August 1, 2001.

Koh, H., K. Igarshi and M. Asada, "Adaptive Hybrid Control for Visual and Force Servoing in an Unknown Environment", IEEE Robotics & Automation, pp. 1097-1103, 1998.

Kress, R. L., W. R. Hamel, P. Murray and K. Bills, "Control Strategies for Teleoperated Internet Assembly", IEEE/ASME Transactions on Mechatronics, Vol. 6, No. 4, pp. 410-416, December 2001.

Kuglin and D. Hines, "The phase correlation image alignment method", Proc. 1975 IEEE Int. Conf. on Cybernetics and Society, pp. 163-165, 1975.

Law Enforcement Associates, Inc., "Under Vehicle Inspection System (UVI)", Internet Reference: http://www.uvisystems.com/mil.htm.

Lumenyte International Corporation, "The Security Illumination Mat System (SIMS™)", Internet Reference: http://www.army-technology.com/contractors/civil/lumenyte/lumenyte1.html.

Malinowski, A. and B. Wilamowski, "Controlling Robots via Internet", 1st International Conference on Information Tech., Mechatronics, Turkey, October 1-3, 2001.

Vita

Balaji Ramadoss was born in Tamilnadu, India, on November 25, 1972, the son of Kalavathy and Ramadoss. After graduating from EVR School, India, in 1990, he pursued his undergraduate engineering degree at Mookambigai College of Engineering, Tamilnadu, India. He began his career as an Electronics Engineer at DATS, Chennai, India, and in 1996 joined the TAMIN plant (a Government of India company) as an Automation Engineer, where he worked for five and a half years. In 2001, he returned to graduate school as a Master's student at the University of Tennessee, Chattanooga. He joined the Imaging, Robotics, and Intelligent Systems Laboratory as a graduate research student, where he completed his Master's in Electrical and Computer Engineering in Fall 2003.