
Technical Documentation
Remotely Operated Underwater Vehicle
2019-01-04

Editor: Martin Drangel
Version: 1.0
Status: Reviewed (Martin Drangel), Approved (Fredrik Ljungberg)

TSRT10 - Control System Project Course
[email protected]


Project Identity: 2018/HT, ROV

Linköping University, Department of Electrical Engineering (ISY)

Project Email: [email protected]

Web site: www.isy.liu.se/edu/projekt/reglerteknik/2018/rov/

Client: Fredrik Ljungberg, Linköping University

Phone: +46 73 05 14 895,

E-mail: [email protected]

Customer: Rikard Hagman, Combine Control Systems AB

Phone: +46 72 964 70 59,

E-mail: [email protected]

Course Responsible: Daniel Axehill, Linköping University

Phone: +46 13 28 40 42,

E-mail: [email protected]

Group members:

Name               Responsibility        Phone           E-mail (@student.liu.se)
Kristian Ericsson  Project Leader (PL)   070 64 38 923   krier714
Martin Drangel     Documentation (DOC)   076 765 83 65   mardr222
Casper Johansson   Hardware (HW)         073 07 30 858   casjo385
Tommy Karlsson     Design                073 07 32 095   tomka260
Philip Öhrn        Software              073 07 59 759   phioh654
Victor Petersson   Testing               073 03 30 126   vicpe520
Mikael Nådin       Information           073 43 49 157   mikna029


Document history

Version  Date        Changes         Sign  Reviewed
0.1      2018-12-06  First draft.    MD    All
0.2      2018-12-13  Second draft.   MD    All
1.0      2018-12-18  First version.  MD    All


Contents

1 Introduction
  1.1 Purpose
  1.2 Goal
  1.3 Use
2 System Overview
  2.1 Hardware
  2.2 Software
  2.3 Modules
  2.4 Exclusions
  2.5 Design philosophy
3 New Hardware
  3.1 Raspberry Pi 3
  3.2 LED-lights
  3.3 Step-down voltage regulator
4 Communication
  4.1 Topics
5 Modelling
  5.1 Coordinate systems
  5.2 Dynamic model
  5.3 Euler angles
  5.4 Dynamics
    5.4.1 Kinematics
    5.4.2 Rigid-Body kinetics
  5.5 Hydrodynamics
  5.6 Hydrostatics
  5.7 Actuators
  5.8 Model in component form
  5.9 Simplified model structure
6 Sensor Fusion
  6.1 Sensors
    6.1.1 Camera
    6.1.2 Gyroscope
    6.1.3 Accelerometer
    6.1.4 Magnetometer
    6.1.5 Pressure sensors
    6.1.6 Sonar sensors
  6.2 Complementary filter
  6.3 Depth estimation
  6.4 Extended Kalman Filter
    6.4.1 Motion model
    6.4.2 Measurement update
  6.5 Further development
7 Graphical User Interface
  7.1 Overview
    7.1.1 Topic monitoring
    7.1.2 Plot
    7.1.3 Camera image
  7.2 Other rqt plugins
  7.3 Further development
8 Control System
  8.1 Control Theory
    8.1.1 Decentralized control
    8.1.2 PID
    8.1.3 Set-point weighting
    8.1.4 Decoupled control
    8.1.5 Feedback linearization
  8.2 General structure
  8.3 Controller
  8.4 Thrust decoupling
  8.5 Modes of operation
    8.5.1 Vision mode
    8.5.2 Path planner mode
    8.5.3 Publisher mode
    8.5.4 XBOX mode
  8.6 Communication
  8.7 Further development
9 Vision System
  9.1 Hardware
  9.2 Software
  9.3 Object
  9.4 Object detection
  9.5 Geometrical calculations and reference signals
    9.5.1 Intrinsic camera parameters and distance calculation
    9.5.2 Angle calculation
    9.5.3 Passing reference signals
10 Autonomy
  10.1 Move to predefined location
  10.2 Rise to surface
    10.2.1 Further development


Notations

CB   Center of Buoyancy
CG   Center of Gravity
EKF  Extended Kalman Filter
ESC  Electronic Speed Controller
DOF  Degree of Freedom
GCS  Global Coordinate System
IMU  Inertial Measurement Unit
LCS  Local Coordinate System
PCS  Pool Coordinate System
ROS  Robot Operating System
ROV  Remotely Operated Underwater Vehicle
RPI  Raspberry Pi
SCS  Sonar Coordinate System


1 Introduction

The area of autonomous vehicles is progressing rapidly. Every year more companies focus their research on this field and many large vehicle companies are joining the trend. No matter if the operation is at sea, in the air, in space or on land, there is a use for autonomous vehicles.

This project is the third instance of its kind in the course TSRT10 where a remotely operated underwater vehicle (ROV) is being developed. The earlier projects focused on basic functionality, simulation and modelling of the ROV, whereas this project focuses on furthering the existing functionality while also adding some new.

1.1 Purpose

The purpose of this project has been to enhance the performance of the ROV's control system and navigation system. This is to be achieved using the results of the previous projects as basis. The project also intends to further develop the ROV's vision system and sensor fusion estimates.

1.2 Goal

The goal for the project has been for the ROV to be able to carry out some predefined missions. One example of such missions could be to find and follow an object underwater, or to move to a predefined point.

Furthermore the control system should be robust, easy to modify and easy to understand for future project groups.

1.3 Use

When the project is finished, the result will be used to further develop the ROV. Ultimately, the goal is to have a fully autonomous vehicle. The ROV will also be used as a test platform for development of control systems for underwater vehicles.


2 System Overview

In this section an overview of both the hardware and the software of the system will be presented. The main components of the system are the ROV and the external workstation. Most of the functionality that was located on the external workstation in previous projects has been migrated to the new Raspberry Pi 3 B+ located on the ROV.

In Figure 1 an overview of the hardware and the software modules in the system, and how these are connected, is presented. In Figure 2 an overview of the interface and what information is sent between the modules is shown.

Figure 1: An overview of the system. Solid lines are power lines and dashed lines are communication lines. Sharp-edged boxes are hardware components and round-edged boxes are software modules.


Figure 2: An overview of the system modules. Sharp-edged boxes are hardware components and round-edged boxes are software modules.

2.1 Hardware

The hardware located on the ROV is listed below.

• A BlueROV acrylic chassis.

• An acrylic tube for electronics.

• Six 30A AfroESC (Electronic Speed Controller) flashed with Blue Robotics linearising firmware.

• Six Blue Robotics T200 thrusters.

• A Raspberry Pi 3 B+ with mounted heatsinks.

• A Raspberry Pi camera module v.2.

• A HK Pilot Mega 2.7 with the following sensors:

– Magnetometer - HMC5883L.

– Barometer - MS5611-01BA.

– IMU - MPU6000.

• Three sonar sensors - HRXL-MaxSonar-WR-MB7360.


• An external pressure sensor - MS5837-30BA.

• A battery - Turnigy 5000mAh 4S 25C LiPo Pack.

• A LiPo Battery Voltage Tester with low voltage buzzer alarm.

• A step-down voltage regulator - Pololu D24V50F5.

The external workstation is a laptop with Ubuntu 16.04, ROS (kinetic) and relevant ROS packages installed.

2.2 Software

Since the system is running on a ROS framework it is natural to describe the software components as nodes. The nodes included in the system and a description of the purpose of each node are listed below, followed by a minimal node sketch.

• roscore - Master node that tells other nodes how to find each other.

• raspicam_node - Handles image data from the camera.

• bluerov_arduino - Handles the sensors and ESCs on the HKPilot.

• heartbeat - Makes sure that a connection between the ROV and the workstation is established.

• teleop_xbox - Handles user input from the XBOX-controller.

• joy - Handles OS USB inputs.

• rqt - Handles the visual GUI.

• controller - Handles the implementation of the controllers on the ROV.

• complementary_filter_node - Handles the implementation of the complementary filter for attitude estimation.

• pos_est - Handles the implementation of the Kalman filter for position estimation.

• depth_est - Handles depth estimation.

• vision - Handles image processing.
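To illustrate the node structure, the following is a minimal sketch (not taken from the project code) of what a heartbeat-style publisher could look like as a Python/rospy node. The topic and message type follow the descriptions in Section 4.1, while the node name and publishing rate are assumptions.

    import rospy
    from std_msgs.msg import Bool

    def run_heartbeat(rate_hz=2.0):
        # Publish a periodic "alive" flag on the heartbeat topic (see Section 4.1).
        rospy.init_node('heartbeat_rovio')
        pub = rospy.Publisher('/rovio/heartbeat', Bool, queue_size=1)
        rate = rospy.Rate(rate_hz)
        while not rospy.is_shutdown():
            pub.publish(Bool(data=True))
            rate.sleep()

    if __name__ == '__main__':
        try:
            run_heartbeat()
        except rospy.ROSInterruptException:
            pass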


2.3 Modules

The I/O module consists of the Raspberry Pi (RPI) and the HKPilot. The RPI runs a ROS serial node and functions as master, while the HKPilot runs a ROS serial Arduino node and is responsible for sensor readings and for communicating sensor values to the RPI.

The sensor fusion module consists of implementations of filters and state estimators. Ittakes raw sensor data as input and outputs estimated states.

The control module consists of the implementations of the controllers used for controlling the ROV. It takes state estimates as input and outputs PWM control signals to the ESCs which control the thrusters.

The vision module is located on the external workstation and handles the image processing related to detecting and tracking an object.

The GUI module handles settings provided by the user from the graphic interface. It can also display a graphic representation of the ROV's position and orientation when running a simulation.

2.4 Exclusions

In agreement with the customer, the environment where the BlueROV system is to be developed and used is a pool with calm water. No regard was taken during development to fulfilling the requirements in environments with high disturbances, such as open water or similar outdoor environments.

2.5 Design philosophy

Development of software is conducted with modularity in mind to make it possible to modify or remove a module without damaging the larger system. It also makes it possible for several modules to be developed in parallel.

The code produced in the project adheres to the Google C++ or the ROS coding standard, depending on which programming language is used.


3 New Hardware

For the most part all hardware is the same as in last year's project, except for a new RPI, LED-lights and a different step-down voltage regulator.

3.1 Raspberry Pi 3

As requested by the client the Raspberry Pi was upgraded from an RPI2 to an RPI3, which has more computational power. The physical switch is straightforward as the models are almost identical with respect to mounting and cables. As for the software, a new installation was made to match the software on the older model.

3.2 LED-lights

To illuminate the environment and to visually communicate the status of the ROV, a couple of LED-strips are mounted inside the ROV. Depending on their position in the ROV the LED-lights have different functions. The lights in the front of the ROV are used to supply light for the camera, making it easier to detect objects. Some of the other LED-strips are used to give the operator a hint of the ROV's current attitude, making it easier to steer in manual mode. Lastly, the remaining LED-lights are used to communicate status and can be used to indicate errors in future implementations. While mostly redundant in the current state of the ROV, this will be valuable when the ROV is run autonomously. The colors of the LED-lights are relayed through ROS messages and depend on the current status of the ROV.

3.3 Step-down voltage regulator

During some early tests it was discovered that the step-down voltage regulator used in earlier years did not supply enough current to satisfy the needs of the six ESCs, the RPI and the Arduino. The old step-down regulator made the system unreliable: adding more functionality onto the RPI made it draw more current and, as a consequence, it shut down and restarted. The new step-down regulator is a "Step Down VR D24V50F5", which converts voltages between 6 and 38 volts to 5 volts and supplies a maximum of 5 amperes. When mounting the new step-down regulator some adjustments had to be made: a new connector from the battery to the ROV was added and some cables were soldered onto the regulator.


4 Communication

The system is developed to be modular, and ROS is used as a framework to encourage this. Communication between ROS nodes is generally done using topics or services, but for this project only topics were used. The reason for this is to keep the ROS syntax as easy and understandable as possible, since most current and future project members don't have any ROS experience. All nodes have the ability to subscribe and/or publish messages to a topic and the communication between them is carried out automatically by ROS. In Section 4.1 the topics and the message type they correspond to are presented. Which node subscribes and publishes to each topic is also displayed, along with a short description of the topic.

4.1 Topics

/rovio/enable_thrusters
Type: std_msgs: Bool
Subscribers: bluerov_arduino
Publishers: teleop_xbox
Description: Signals for enabling/disabling the thrusters.

/rovio/thrusters
Type: std_msgs: UInt16MultiArray
Subscribers: bluerov_arduino
Publishers: controller
Description: Control signals to the thrusters.

/imu/data_raw_compressed
Type: std_msgs: UInt16MultiArray
Subscribers: imu_converter
Publishers: bluerov_arduino
Description: The raw IMU measurements, compressed in order to not fill the serial buffer.

/imu/data_raw
Type: sensor_msgs: Imu
Subscribers: complementary_filter
Publishers: imu_converter
Description: The raw IMU measurements, decompressed into the standard IMU message.

/imu/data
Type: sensor_msgs: Imu
Subscribers: teleop_xbox, controller, vision, depth_est, pos_est
Publishers: complementary_filter
Description: The updated states after orientation sensor fusion is applied.


/imu/mag_compressed
Type: std_msgs: UInt16MultiArray
Subscribers: imu_converter
Publishers: bluerov_arduino
Description: The raw MAG measurements, compressed in order to not fill the serial buffer.

/imu/mag
Type: sensor_msgs: MagneticField
Subscribers: complementary_filter
Publishers: imu_converter
Description: The raw MAG measurements, decompressed into the standard MAG message.

/rovio/heartbeat
Type: std_msgs: Bool
Subscribers: bluerov_arduino, controller
Publishers: heartbeat_rovio
Description: Sent continuously to show that the communication between the ROV and the workstation works.

/cmd_vel
Type: geometry_msgs: Twist
Subscribers: controller
Publishers: teleop_xbox
Description: Desired forward force and desired angle from the controller.

/joy
Type: sensor_msgs: Joy
Subscribers: teleop_xbox
Publishers: joy_node
Description: Input from the XBOX controller.

/gyro/calibrate_offsets
Type: std_msgs: Bool
Subscribers: bluerov_arduino
Publishers: -
Description: Command to calibrate the gyro.

/rovio/accelerometer/calibrate_offsets
Type: std_msgs: Bool
Subscribers: bluerov_arduino
Publishers: -
Description: Command to calibrate the accelerometer.

/rovio/sonar/(front|right|left)
Type: std_msgs: Float32
Subscribers: pos_est
Publishers: bluerov_arduino
Description: Measurements from the sonar sensors.


/raspicam_node/compressed
Type: sensor_msgs: CompressedImage
Subscribers: vision
Publishers: raspicam_node
Description: Streams recorded images from the camera.

/rovio/depth/data
Type: std_msgs: Float32
Subscribers: teleop_xbox, controller
Publishers: depth_est
Description: Estimated depth of the ROV.

/rovio/water_pressure/data
Type: std_msgs: Float32
Subscribers: depth_est
Publishers: bluerov_arduino
Description: Pressure used to calculate the depth of the center of the ROV.

/controller/mode
Type: std_msgs: UInt16
Subscribers: bluerov_arduino, controller
Publishers: -
Description: The current control mode used.

/ref_pos
Type: geometry_msgs: Vector3
Subscribers: controller
Publishers: -
Description: Target position in the pool.

/rovio/pos_est
Type: std_msgs: Float32MultiArray
Subscribers: controller
Publishers: pos_est
Description: Contains the estimated states x-position, y-position, forward velocity and pool yaw offset.

/(Depth|Roll|Pitch|Yaw)/ref
Type: std_msgs: Float32
Subscribers: controller
Publishers: -
Description: Target reference for the ROV in one dimension when the corresponding control mode is used.

/PID/(x|z|roll|pitch|yaw)
Type: geometry_msgs: Vector3
Subscribers: controller
Publishers: -
Description: Used for temporarily trying different PID parameters. The parameters are set in the order P, I, D. That is: x = P, y = I, z = D.
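As a usage example, the sketch below (hypothetical, not project code) shows how two of the topics above could be used from a small rospy script: publishing a depth reference on /Depth/ref while printing the estimate received on /rovio/depth/data. The node name, rate and reference value are assumptions.

    import rospy
    from std_msgs.msg import Float32

    def on_depth(msg):
        # Print the depth estimate published by depth_est.
        rospy.loginfo('estimated depth: %.2f m', msg.data)

    def main():
        rospy.init_node('depth_ref_example')
        rospy.Subscriber('/rovio/depth/data', Float32, on_depth)
        pub = rospy.Publisher('/Depth/ref', Float32, queue_size=1)
        rate = rospy.Rate(1.0)
        while not rospy.is_shutdown():
            pub.publish(Float32(data=0.5))  # hold a 0.5 m depth reference
            rate.sleep()

    if __name__ == '__main__':
        main()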


5 Modelling

The model used during the project has not been changed from previous projects and this section therefore mainly presents results that were obtained at an earlier point. These results, however, have been useful throughout this project and are relevant to present. For a more detailed description, see the previous projects [4][5][6].

5.1 Coordinate systems

In the model, two orthogonal coordinate systems are used. These are the Pool Coordinate System (PCS) and the Local Coordinate System (LCS). The PCS has its origin in one corner of the pool with the x- and y-vectors spanning the horizontal plane and the z-vector pointing down towards the bottom of the pool. The LCS has its origin in the center of gravity of the ROV with the x-axis pointing towards the front, the y-axis pointing to starboard and the z-axis pointing downwards relative to the ROV. An overview of the GCS is presented in Figure 3 and the LCS in Figure 4.

Figure 3: Top down view of the pool showing the global coordinate system.


Figure 4: Top down view of the ROV showing the local coordinate system.

5.2 Dynamic model

The following dynamic model describes the forces acting on the ROV. The full physical model is first derived and then some assumptions are introduced that simplify the model.

The underwater vehicle with six degrees of freedom can be described by the following general equations

\dot{\eta} = J_\Theta(\eta)\nu,   (1a)
\tau = M\dot{\nu} + C(\nu)\nu + D(\nu)\nu + g(\eta),   (1b)

where the notations are defined as

\eta \triangleq [x\; y\; z\; \phi\; \theta\; \psi]^T, \qquad \nu \triangleq [u\; v\; w\; p\; q\; r]^T.   (2)

Some additional notation is introduced in order to make the equations more readable:


\eta_1 \triangleq p \triangleq [x\; y\; z]^T   (3a)
\eta_2 \triangleq \Theta \triangleq [\phi\; \theta\; \psi]^T   (3b)
\nu_1 \triangleq v \triangleq [u\; v\; w]^T   (3c)
\nu_2 \triangleq \omega \triangleq [p\; q\; r]^T.   (3d)

The vector \eta describes position and attitude in the PCS and \nu linear and angular velocities in the LCS. J_\Theta is a transformation matrix from the LCS to the PCS for linear and angular velocities. \tau contains the forces acting on the ROV generated by the thrusters. M, C(\nu), D(\nu) and g(\eta) respectively describe mass and moment of inertia, the Coriolis effect, damping forces, and the effect of gravity and buoyancy on the ROV.

The parameters used in the ROV model are listed in Table 1. If new sensors are added to the ROV, the mass and center of gravity parameters might need to be updated. Most of these parameters were determined in [4] by using grey-box modelling and an Extended Kalman Filter (EKF). By exciting the ROV in one or two DOFs at a time and analyzing the fit of the model with estimated parameters to validation data, the parameters that produced the highest fit to validation data were determined.

Table 1: Parameters used in the ROV model.

Parameter  Value                 Description
m          7.36 [kg]             The total mass of the ROV.
g          9.82 [m/s^2]          Gravitational constant.
ρ          1000 [kg/m^3]         Density of water.
lx1        0.19 [m]              Distance from thruster 1 to CG in the x-direction.
ly1        0.11 [m]              Distance from thruster 1 to CG in the y-direction.
lx2        0.19 [m]              Distance from thruster 2 to CG in the x-direction.
ly2        0.11 [m]              Distance from thruster 2 to CG in the y-direction.
ly3        0.11 [m]              Distance from thruster 3 to CG in the y-direction.
lz3        0.05 [m]              Distance from thruster 3 to CG in the z-direction.
ly4        0.11 [m]              Distance from thruster 4 to CG in the y-direction.
lz4        0.05 [m]              Distance from thruster 4 to CG in the z-direction.
lx5        0.17 [m]              Distance from thruster 5 to CG in the x-direction.
lz6        0.06 [m]              Distance from thruster 6 to CG in the z-direction.
zB         -0.018 [m]            Distance from CG to CB.
V          0.0075 [m^3]          Displaced volume.
B          73.7 [N]              The ROV's buoyancy (B = ρgV).
W          72.2 [N]              Force of gravity (W = mg).
Xu         -19.7 [kg/s]          Linear damping coefficient due to translation in the x-direction.
Xu|u|      -1.36·10^-5 [kg/m]    Quadratic damping coefficient due to translation in the x-direction.
Yv         -33.5 [kg/s]          Linear damping coefficient due to translation in the y-direction.
Yv|v|      -0.0022 [kg/m]        Quadratic damping coefficient due to translation in the y-direction.
Zw         -2.08·10^-5 [kg/s]    Linear damping coefficient due to translation in the z-direction.
Zw|w|      -121 [kg/m]           Quadratic damping coefficient due to translation in the z-direction.
Kp         -0.282 [kgm^2/s]      Linear damping coefficient due to rotation about the x-axis.
Kp|p|      -0.104 [kgm^2]        Quadratic damping coefficient due to rotation about the x-axis.
Mq         -0.496 [kgm^2/s]      Linear damping coefficient due to rotation about the y-axis.
Mq|q|      -0.104 [kgm^2]        Quadratic damping coefficient due to rotation about the y-axis.
Nr         -1.02 [kgm^2/s]       Linear damping coefficient due to rotation about the z-axis.
Nr|r|      -5.84·10^-8 [kgm^2]   Quadratic damping coefficient due to rotation about the z-axis.
Ap         0.167 [kgm^2]         Inertia and increased inertia around the x-axis.
Bq         0.334 [kgm^2]         Inertia and increased inertia around the y-axis.
Cr         0.718 [kgm^2]         Inertia and increased inertia around the z-axis.
Du         35.6 [kg]             Mass and added mass due to translation in the x-direction.
Ev         107.4 [kg]            Mass and added mass due to translation in the y-direction.
Fw         57.8 [kg]             Mass and added mass due to translation in the z-direction.

5.3 Euler angles

The ROV's orientation is described using the Euler angles yaw \psi, pitch \theta and roll \phi, which are rotations about the z-, y- and x-axis respectively. The rotation matrix from the LCS to the PCS can be defined with Euler angles as in (4).

R^P_L(\Theta) =
\begin{bmatrix}
C_\psi C_\theta & -S_\psi C_\phi + C_\psi S_\theta S_\phi & S_\psi S_\phi + C_\psi C_\phi S_\theta \\
S_\psi C_\theta & C_\psi C_\phi + S_\phi S_\theta S_\psi & -C_\psi S_\phi + S_\theta S_\psi C_\phi \\
-S_\theta & C_\theta S_\phi & C_\theta C_\phi
\end{bmatrix}   (4)


In (4), C_\cdot stands for \cos(\cdot), S_\cdot stands for \sin(\cdot) and, further on, T_\cdot stands for \tan(\cdot). The rotation matrix has the property of being orthogonal, i.e. R^P_L(\Theta)^T = R^P_L(\Theta)^{-1}. It then follows that R^P_L(\Theta) = R^L_P(\Theta)^T. The notation R^P_L \triangleq R is thus introduced to simplify the following equations. The transformation from LCS angular velocities to PCS angular velocities is \dot{\Theta} = T_\Theta(\Theta)\omega, where T_\Theta(\Theta) is

T_\Theta(\Theta) =
\begin{bmatrix}
1 & S_\phi T_\theta & C_\phi T_\theta \\
0 & C_\phi & -S_\phi \\
0 & S_\phi / C_\theta & C_\phi / C_\theta
\end{bmatrix}.   (5)
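As a numerical companion to (4)-(6), the sketch below builds R(\Theta), T_\Theta(\Theta) and the block-diagonal J_\Theta with numpy. It is only an illustration of the definitions above; the function names are assumptions.

    import numpy as np

    def R_euler(phi, theta, psi):
        # Rotation matrix R^P_L(Theta) from the LCS to the PCS, eq. (4).
        c, s = np.cos, np.sin
        return np.array([
            [c(psi)*c(theta), -s(psi)*c(phi) + c(psi)*s(theta)*s(phi),  s(psi)*s(phi) + c(psi)*c(phi)*s(theta)],
            [s(psi)*c(theta),  c(psi)*c(phi) + s(phi)*s(theta)*s(psi), -c(psi)*s(phi) + s(theta)*s(psi)*c(phi)],
            [-s(theta),        c(theta)*s(phi),                         c(theta)*c(phi)],
        ])

    def T_euler(phi, theta):
        # Angular velocity transformation T_Theta(Theta), eq. (5); singular at theta = +-pi/2.
        c, s, t = np.cos, np.sin, np.tan
        return np.array([
            [1.0, s(phi)*t(theta),  c(phi)*t(theta)],
            [0.0, c(phi),          -s(phi)],
            [0.0, s(phi)/c(theta),  c(phi)/c(theta)],
        ])

    def J_Theta(phi, theta, psi):
        # Block-diagonal transformation used in eta_dot = J_Theta(eta) * nu, eq. (6).
        J = np.zeros((6, 6))
        J[:3, :3] = R_euler(phi, theta, psi)
        J[3:, 3:] = T_euler(phi, theta)
        return J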

5.4 Dynamics

The ROV dynamics are divided into kinematics and kinetics. Kinematics describes the motion of a body due to its geometrical form, while the motion due to external forces is described by the kinetics.

5.4.1 Kinematics

The state equations for the kinematics are

\dot{\eta} = J_\Theta(\eta)\nu \iff
\begin{bmatrix} \dot{p} \\ \dot{\Theta} \end{bmatrix} =
\begin{bmatrix} R(\Theta) & 0_{3\times3} \\ 0_{3\times3} & T_\Theta(\Theta) \end{bmatrix}
\begin{bmatrix} v \\ \omega \end{bmatrix}.   (6)

5.4.2 Rigid-Body kinetics

The rigid-body kinetics are described, using the Euler angle formulation, as

\tau_{RB} = M_{RB}\dot{\nu} + C_{RB}(\nu)\nu,   (7)

where \tau_{RB} contains the forces and moments acting on the rigid body, M_{RB} is the inertia matrix of the rigid body and C_{RB}(\nu) describes the Coriolis effect and centripetal forces. Under the assumption that the geometrical center of the ROV and the ROV's CG coincide, the inertia matrix can be expressed as

M_{RB} =
\begin{bmatrix} mI_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & I_g \end{bmatrix},   (8)

where Ig is the inertia matrix for the CG of the ROV.

If the ROV is assumed to have three planes of symmetry, the matrix describing the Coriolis effect and centripetal forces can, after eliminating cross terms, be expressed as


C_{RB}(\nu)\nu =
\begin{bmatrix}
m(qw - rv) \\ m(ru - pw) \\ m(pv - qu) \\ qr(I_y - I_z) \\ rp(I_z - I_x) \\ qp(I_x - I_y)
\end{bmatrix}.   (9)

5.5 Hydrodynamics

The external forces caused by hydrodynamic effects can be expressed as

\tau_{Dyn} = -M_A\dot{\nu} - C_A(\nu)\nu - D(\nu)\nu   (10)

where M_A describes added mass and moment of inertia, C_A(\nu) describes the Coriolis effects from the added mass and moment of inertia, and D(\nu) models linear and quadratic damping effects.

Under the assumption that the ROV operates at low speeds and has three planes of symmetry, the matrices are defined as

M_A = -
\begin{bmatrix}
X_u & 0 & 0 & 0 & 0 & 0 \\
0 & Y_v & 0 & 0 & 0 & 0 \\
0 & 0 & Z_w & 0 & 0 & 0 \\
0 & 0 & 0 & K_p & 0 & 0 \\
0 & 0 & 0 & 0 & M_q & 0 \\
0 & 0 & 0 & 0 & 0 & N_r
\end{bmatrix},   (11)

C_A(\nu)\nu =
\begin{bmatrix}
Y_v vr - Z_w wq \\
Z_w wp - X_u ur \\
X_u uq - Y_v vp \\
(Y_v - Z_w)vw + (M_q - N_r)qr \\
(Z_w - X_u)uw + (N_r - K_p)pr \\
(X_u - Y_v)uv + (K_p - M_q)pq
\end{bmatrix},   (12)

D(\nu)\nu = -
\begin{bmatrix}
(X_u + X_{u|u|}|u|)u \\
(Y_v + Y_{v|v|}|v|)v \\
(Z_w + Z_{w|w|}|w|)w \\
(K_p + K_{p|p|}|p|)p \\
(M_q + M_{q|q|}|q|)q \\
(N_r + N_{r|r|}|r|)r
\end{bmatrix}.   (13)


5.6 Hydrostatics

The hydrostatic forces relate to the gravitational pull and the buoyancy force and can be expressed as

τStat = −g(η), (14)

where τStat are generalised forces and the hydrostatic forces are described by

g(\eta) =
\begin{bmatrix}
(W - B)\sin\theta \\
-(W - B)\cos\theta\sin\phi \\
-(W - B)\cos\theta\cos\phi \\
-z_B B\cos\theta\sin\phi \\
-z_B B\sin\theta \\
0
\end{bmatrix}.   (15)

5.7 Actuators

For the modelling of the forces generated by the actuators, a thrust geometry matrix T and a look-up table f(u) are used. The general model is described by

\tau_{Act} \triangleq T f(u),   (16)

where the forces are split into linear and angular parts and the thrust geometry matrix is defined as

\tau_{Act} =
\begin{bmatrix} \tau_u \\ \tau_v \\ \tau_w \\ \tau_p \\ \tau_q \\ \tau_r \end{bmatrix}, \quad
T =
\begin{bmatrix}
0 & 0 & 1 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & -1 \\
-1 & -1 & 0 & 0 & -1 & 0 \\
l_{y1} & -l_{y2} & 0 & 0 & 0 & l_{z6} \\
l_{x1} & l_{x2} & -l_{z3} & -l_{z4} & -l_{x5} & 0 \\
0 & 0 & l_{y3} & -l_{y4} & 0 & 0
\end{bmatrix},

where l_{ai} is the distance offset along the a-axis to thruster i.

The look-up table has been obtained from thrust measurement data and is split into one table for each thruster.

f(u) =
\begin{bmatrix} f(u_1) & f(u_2) & f(u_3) & f(u_4) & f(u_5) & f(u_6) \end{bmatrix}^T.
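To show how (16) can be evaluated in practice, the sketch below assembles the thrust geometry matrix T with the lever arms from Table 1 and applies it to per-thruster forces. The linear PWM-to-thrust map is a placeholder assumption standing in for the measured look-up tables f(u_i).

    import numpy as np

    def thrust_geometry(lx1=0.19, ly1=0.11, lx2=0.19, ly2=0.11, ly3=0.11, lz3=0.05,
                        ly4=0.11, lz4=0.05, lx5=0.17, lz6=0.06):
        # Rows: tau_u, tau_v, tau_w, tau_p, tau_q, tau_r; columns: thrusters 1-6.
        return np.array([
            [0.0,  0.0,  1.0,  1.0,  0.0,  0.0],
            [0.0,  0.0,  0.0,  0.0,  0.0, -1.0],
            [-1.0, -1.0, 0.0,  0.0, -1.0,  0.0],
            [ly1, -ly2,  0.0,  0.0,  0.0,  lz6],
            [lx1,  lx2, -lz3, -lz4, -lx5,  0.0],
            [0.0,  0.0,  ly3, -ly4,  0.0,  0.0],
        ])

    def thruster_force(u_pwm):
        # Placeholder for the measured look-up table f(u); an illustrative linear map.
        return 0.01 * (np.asarray(u_pwm, dtype=float) - 1500.0)

    def actuator_forces(u_pwm):
        # tau_Act = T f(u), eq. (16).
        return thrust_geometry() @ thruster_force(u_pwm)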


5.8 Model in component form

In this section the accelerations in all six degrees of freedom of the ROV are described by the following equations:

\dot{u} = \frac{1}{m - X_u}\left(\tau_u + (X_u + X_{u|u|}|u|)u + (B - W)\sin\theta + m(rv - qw) - Y_v rv + Z_w qw\right),   (17a)

\dot{v} = \frac{1}{m - Y_v}\left(-\tau_v + (Y_v + Y_{v|v|}|v|)v + (B - W)\cos\theta\sin\phi + m(pw - ru) - X_u ru + Z_w pw\right),   (17b)

\dot{w} = \frac{1}{m - Z_w}\left(\tau_w + (Z_w + Z_{w|w|}|w|)w + (B - W)\cos\theta\cos\phi + m(qu - pv) - X_u qu + Y_v pv\right),   (17c)

\dot{p} = \frac{1}{I_x - K_p}\left(\tau_p + (K_p + K_{p|p|}|p|)p + Bz_B\cos\theta\sin\phi + qr(I_y - I_z + N_r - M_q) + vw(Z_w - Y_v)\right),   (17d)

\dot{q} = \frac{1}{I_y - M_q}\left(\tau_q + (M_q + M_{q|q|}|q|)q + Bz_B\sin\theta + pr(K_p - N_r + I_z - I_x) - uw(Z_w - X_u)\right),   (17e)

\dot{r} = \frac{1}{I_z - N_r}\left(\tau_r + (N_r + N_{r|r|}|r|)r + pq(M_q - K_p + I_x - I_y) + uv(Y_v - X_u)\right).   (17f)

5.9 Simplified model structure

The model in Section 5.8 has many cross terms which are hard to estimate, since estimating them requires excitation in more than one degree of freedom and the current sensor setup is limited. The cross terms should be negligible at low speeds. This gives a simplified model with only one-degree-of-freedom terms; these terms were estimated in [6] and are listed in Table 1.

\dot{u} = \frac{1}{D_u}\left(\tau_u + (X_u + X_{u|u|}|u|)u\right),   (18a)

\dot{v} = \frac{1}{E_v}\left(\tau_v + (Y_v + Y_{v|v|}|v|)v\right),   (18b)

\dot{w} = \frac{1}{F_w}\left(\tau_w + (Z_w + Z_{w|w|}|w|)w - (B - W)\right),   (18c)

\dot{p} = \frac{1}{A_p}\left(\tau_p + (K_p + K_{p|p|}|p|)p + Bz_B\sin\phi\right),   (18d)

\dot{q} = \frac{1}{B_q}\left(\tau_q + (M_q + M_{q|q|}|q|)q + Bz_B\sin\theta\right),   (18e)

\dot{r} = \frac{1}{C_r}\left(\tau_r + (N_r + N_{r|r|}|r|)r\right).   (18f)
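The simplified model lends itself to a direct implementation. The sketch below evaluates (18a)-(18f) with the parameter values from Table 1 and takes one forward-Euler step; the sample time is an assumption and the attitude angles \phi, \theta are taken as inputs.

    import numpy as np

    # Parameter values from Table 1.
    B, W, zB = 73.7, 72.2, -0.018
    Xu, Xuu = -19.7, -1.36e-5
    Yv, Yvv = -33.5, -0.0022
    Zw, Zww = -2.08e-5, -121.0
    Kp, Kpp = -0.282, -0.104
    Mq, Mqq = -0.496, -0.104
    Nr, Nrr = -1.02, -5.84e-8
    Du, Ev, Fw, Ap, Bq, Cr = 35.6, 107.4, 57.8, 0.167, 0.334, 0.718

    def nu_dot(nu, tau, phi, theta):
        # Accelerations [u', v', w', p', q', r'] according to (18a)-(18f).
        u, v, w, p, q, r = nu
        return np.array([
            (tau[0] + (Xu + Xuu*abs(u))*u) / Du,
            (tau[1] + (Yv + Yvv*abs(v))*v) / Ev,
            (tau[2] + (Zw + Zww*abs(w))*w - (B - W)) / Fw,
            (tau[3] + (Kp + Kpp*abs(p))*p + B*zB*np.sin(phi)) / Ap,
            (tau[4] + (Mq + Mqq*abs(q))*q + B*zB*np.sin(theta)) / Bq,
            (tau[5] + (Nr + Nrr*abs(r))*r) / Cr,
        ])

    def euler_step(nu, tau, phi, theta, Ts=0.01):
        # One forward-Euler integration step of the simplified model.
        return np.asarray(nu, dtype=float) + Ts * nu_dot(nu, tau, phi, theta)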


6 Sensor Fusion

This section describes the sensor fusion as it is implemented on the ROV. The purpose of the sensor fusion is to receive raw measurements from all the sensors and to reduce the amount of disturbance and bias. This makes it possible for the sensor fusion to estimate the states of the ROV, such as the attitude and the velocities.

6.1 Sensors

This section describes the current sensor configuration and its capabilities. All the sensors are placed on the ROV and thus the measurements are expressed in, or closely related to, the LCS.

6.1.1 Camera

A camera is mounted on the ROV and is used, for example, to follow objects, which is explained further in Section 9.

6.1.2 Gyroscope

The gyroscope measures angular velocities which can be integrated to angles. The measurements from the gyroscope are sent without any manipulation directly to the complementary filter described in Section 6.2, and the measurement equation of the gyroscope is thus y_{gyro} = \omega + e_{gyro}, where the measurement error is modeled as e_{gyro} \sim N(0, R_{gyro}) and \omega is expressed in the LCS.

6.1.3 Accelerometer

Equation (19) describes the measurement of the accelerometer, where the measurement error is modeled as e_{acc} \sim N(0, R_{acc}) and g is the gravitational acceleration. These measurements are sent directly to the complementary filter described in Section 6.2.

y_{acc} = R(\Theta)^T \begin{bmatrix} 0 \\ 0 \\ -g \end{bmatrix} + e_{acc}   (19)

6.1.4 Magnetometer

Since the Earth's magnetic field is known, a magnetometer is a good way to measure the ROV's rotation relative to the Earth's magnetic field, which can be related to the fixed


PCS. Thus, the equation for the magnetometer can be simplified to

y_{mag} = R(\Theta)^T \begin{bmatrix} \sqrt{m_x^2 + m_y^2} \\ 0 \\ m_z \end{bmatrix} + e_{mag}   (20)

Here the measurement error is modeled as e_{mag} \sim N(0, R_{mag}) and m_x, m_y, m_z are the components of the magnetic field measured when the ROV and the PCS are aligned. The measurements are sent to the complementary filter, but in the current implementation the magnetometer tends to drift even after a large amount of time spent on calibration. Therefore the complementary filter does not use the received measurements.

6.1.5 Pressure sensors

On the ROV there are two pressure sensors, one inside the tube with the hardware and one outside in the water. The one in the water is used to estimate depth. The pressure sensor inside the tube, with its pressure offset, is currently not used, and the use of the outside pressure sensor is described in Section 6.3.

6.1.6 Sonar sensors

A sonar is a sensor which measures distance using ultrasound and is a type of time-of-flight (TOF) detector. The sonars are for practical reasons not placed in the origin of the LCS, and thus all measurements are translated into the LCS. They measure a distance that is projected onto the xy-plane of the LCS before being sent to the measurement update stage. The measurement equation for the sonars is y_{s,i} = h_i(\eta) + e_{s,i}, where s stands for sonar, e_{s,i} is the measurement error for each of the sensors and i = 1, 2, 3. The handling of the sonar measurements is described in Section 6.4.

6.2 Complementary filter

The complementary filter is used to estimate the ROV's angles using the HKPilot's gyroscope, accelerometer and magnetometer; for a more in-depth explanation see [8]. A problem with the gyroscope is that the received data drifts during longer tests, so the estimated angles no longer stay true. As the accelerometer is bad at measuring fast angle changes but works well at slow changes, it is used to compensate for the gyroscope's drifting behaviour. The magnetometer is used to find the direction of the ROV relative to the magnetic field of the Earth, which can be used to simplify the positioning and is useful if the global position is of interest. These problems are compensated for by using a "complementary" filter, as it uses the angle of the gyroscope with the addition of the data from the accelerometer, see equation (21) and Figure 5.


The angle \alpha is the estimated angle of the ROV, which is a combination of the angle of the gyroscope \alpha_G and its time derivative through a high-pass filter, and the angle from the magnetometer and accelerometer \alpha_{GM} through a low-pass filter.

\alpha = A \cdot \left(\alpha_G + \frac{d\alpha_G}{dt}\right) + (1 - A) \cdot \alpha_{GM}, \quad 0 \leq A \leq 1   (21)

Figure 5: A block diagram describing the complementary filter using the gyroscope, accelerometer and magnetometer.
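A minimal discrete-time sketch of the blending in (21) is given below for a single angle: the gyroscope rate is integrated (high-pass branch) and blended with the angle computed from the accelerometer/magnetometer (low-pass branch). The gain A and the sample time are assumptions, not the project's tuned values.

    def complementary_update(alpha_prev, gyro_rate, alpha_am, Ts, A=0.98):
        # alpha_prev : previous angle estimate [rad]
        # gyro_rate  : angular rate from the gyroscope [rad/s]
        # alpha_am   : angle from the accelerometer/magnetometer [rad]
        # Ts         : sample time [s]
        return A * (alpha_prev + gyro_rate * Ts) + (1.0 - A) * alpha_am

    # Example at 100 Hz: roll = complementary_update(roll, gyro_x, roll_from_acc, Ts=0.01)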

6.3 Depth estimation

Using the pressure sensor mounted outside of the acrylic tube, the depth estimation is initialized with the ROV out of the water and some measurements are recorded, which are used to calculate a reference value. This reference value is used as an offset when calculating the pressure when the ROV is submerged, giving a pure underwater pressure value which is converted to meters and published as a topic. With the pressure sensor mounted on the end of the acrylic tube, the pressure is not measured at the center of the LCS. Therefore the depth estimation subscribes to the topic published by the IMU to receive the attitude of the ROV in quaternions. By definition, the conversion from quaternions to Euler angles is as stated in equation (22), where \theta is the rotation in pitch and \phi is the rotation in roll, and q_0, q_1, q_2 and q_3 give the attitude of the ROV in quaternions.

\begin{bmatrix} \theta \\ \phi \end{bmatrix} =
\begin{bmatrix} \arcsin(2(q_0 q_2 - q_3 q_1)) \\ \arctan\left(\frac{2(q_0 q_1 + q_2 q_3)}{1 - 2(q_1^2 + q_2^2)}\right) \end{bmatrix}   (22)

Having the rotation of the ROV in pitch and roll makes it possible to calculate the depth at the center of the LCS according to equation (23). The variable h is the depth in meters,


p_{pressure} is the measured pressure, p_{ref} is the air/reference pressure, \rho_w is the density of water and g is the gravitational constant. The constant x_{offset} is the distance from the pressure sensor to the LCS origin in the x-direction and z_{offset} is the distance in the z-direction.

h = \frac{p_{pressure} - p_{ref}}{\rho_w g} + x_{offset}\sin\theta - z_{offset}\cos\phi\cos\theta   (23)

Since the pressure sensor was quite precise, there was no need to filter the data.
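The depth estimation in (22)-(23) can be summarized as in the sketch below; numpy's arctan2 is used for numerical robustness, and the mounting offsets are placeholder values rather than the ROV's actual dimensions.

    import numpy as np

    RHO_W = 1000.0  # density of water [kg/m^3]
    G = 9.82        # gravitational constant [m/s^2]

    def pitch_roll_from_quaternion(q0, q1, q2, q3):
        # Pitch theta and roll phi from a unit quaternion, eq. (22).
        theta = np.arcsin(2.0 * (q0*q2 - q3*q1))
        phi = np.arctan2(2.0 * (q0*q1 + q2*q3), 1.0 - 2.0 * (q1**2 + q2**2))
        return theta, phi

    def depth_at_center(p_pressure, p_ref, theta, phi, x_offset=0.2, z_offset=0.05):
        # Depth h at the center of the LCS, eq. (23).
        h_sensor = (p_pressure - p_ref) / (RHO_W * G)
        return h_sensor + x_offset*np.sin(theta) - z_offset*np.cos(phi)*np.cos(theta)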

6.4 Extended Kalman Filter

As described in Section 6.2, the complementary filter estimates the attitude, and Section 6.3 describes the estimation of the z-position. The EKF therefore only contains position and velocity states,

x = [p_x \; p_y \; v_{LCS,x}]^T   (24)

where p_x is the x-position in the PCS, p_y is the y-position in the PCS and v_{LCS,x} is the velocity in the roll direction of the LCS projected onto the xy-plane of the PCS. The EKF uses a time-discrete motion model and measurement model

x_{k+1} = f(x_k, u_k) + v_k
y_k = h(x_k, u_k) + e_k   (25)

Here f(x_k, u_k) is the time-discrete motion model described in Section 6.4.1 and h(x_k, u_k) is the measurement model described in Section 6.4.2. v_k and e_k, the latter for the sonar sensors described in Section 6.4.2, are the process noise and the measurement noise. They are both assumed to be Gaussian distributed, with v_k \sim N(0, Q_k) and e_k \sim N(0, R_k). The whole EKF algorithm is described in Algorithm 1.


Algorithm 1: Extended Kalman Filter
The filter is initialized with x_{1|0} = x_0 and P_{1|0} = P_0.

Measurement update:
  Innovation covariance: S_k = R_k + H_k P_{k|k-1} H_k^T
  Kalman gain: K_k = P_{k|k-1} H_k^T S_k^{-1}
  Innovation: \varepsilon_k = y_k - h(x_{k|k-1})

  x_{k|k} = x_{k|k-1} + K_k \varepsilon_k   (26)
  P_{k|k} = P_{k|k-1} - P_{k|k-1} H_k^T S_k^{-1} H_k P_{k|k-1}   (27a)

Time update:
  x_{k+1|k} = f(x_{k|k})   (27b)
  P_{k+1|k} = Q_k + F_k P_{k|k} F_k^T   (27c)

where H_k = \partial h / \partial x evaluated at x_{k|k-1} and F_k = \partial f / \partial x evaluated at x_{k|k}, u_k.
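The sketch below mirrors Algorithm 1 with numpy; the measurement function h and the Jacobians H and F are passed in pre-evaluated, so it is a template under those assumptions rather than the project implementation.

    import numpy as np

    def ekf_measurement_update(x_pred, P_pred, y, h, H, R):
        S = R + H @ P_pred @ H.T                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        eps = y - h(x_pred)                      # innovation
        x_upd = x_pred + K @ eps
        P_upd = P_pred - P_pred @ H.T @ np.linalg.inv(S) @ H @ P_pred
        return x_upd, P_upd

    def ekf_time_update(x_upd, P_upd, f, F, Q):
        x_pred = f(x_upd)                        # predicted state
        P_pred = Q + F @ P_upd @ F.T             # predicted covariance
        return x_pred, P_pred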

6.4.1 Motion model

Here a modified constant-velocity model is used, with the three states described above. The complementary filter estimates the attitude and the angular velocities come from the gyroscope; as such, these are assumed to be known by the EKF. The motion model is given as

x_{k+1} =
\begin{bmatrix} p_x(k+1) \\ p_y(k+1) \\ v_{LCS,x}(k+1) \end{bmatrix} =
\begin{bmatrix} p_x(k) + T_s\cos(\psi - \psi_{offset})\,v_{LCS,x}(k) \\ p_y(k) + T_s\sin(\psi - \psi_{offset})\,v_{LCS,x}(k) \\ v_{LCS,x}(k) \end{bmatrix} +
T_s I_{3\times3}
\begin{bmatrix} v_{p_x} \\ v_{p_y} \\ v_{v_{LCS,x}} \end{bmatrix}   (28)

where ψoffset is the initial angle difference between the PCS and LCS.
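For reference, the noise-free part of the motion model (28) can be written as the short function below; \psi from the complementary filter is treated as a known input, and the function name is an assumption.

    import numpy as np

    def motion_model(x, psi, psi_offset, Ts):
        # f(x_k) in (28): constant velocity along the heading psi - psi_offset.
        px, py, v_lcs_x = x
        heading = psi - psi_offset
        return np.array([
            px + Ts * np.cos(heading) * v_lcs_x,
            py + Ts * np.sin(heading) * v_lcs_x,
            v_lcs_x,
        ])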

6.4.2 Measurement update

Once the sonar measurements are received, the measurement update is run. The measurement update first calculates which wall the sonar sensor measures the distance from, using the current yaw, yaw offset and sensor offset, where the yaw offset is the angle between the Earth's magnetic field and the PCS. When the wall has been estimated, the measurement is projected into an x and a y part in the PCS. This value can be used as a reference from a wall, is denoted h(\eta) and is used, as seen in Figure 6, to update the p_x or the p_y state if the measurement passes the outlier rejection criteria.


So h(\eta) can be described as

h(\eta) =
\begin{cases}
X_{max} - \cos(\alpha)\,\Delta p, & \text{if the left wall is hit} \\
\cos(\alpha)\,\Delta p, & \text{if the right wall is hit} \\
Y_{max} - \sin(\alpha)\,\Delta p, & \text{if the upper wall is hit} \\
\sin(\alpha)\,\Delta p, & \text{if the lower wall is hit}
\end{cases}

depending on which wall was hit, where \Delta p is the received distance from the sensor projected onto the xy-plane and \alpha = yaw - yaw_{offset} + sensor_{offset}.

Figure 6: Explanation of how the sonar measurement updates the global position
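The wall-dependent measurement model above translates into a simple case distinction, sketched below. Here the wall is passed in explicitly as an assumption; in the real implementation it is estimated from the yaw, the yaw offset and the sensor offset as described above.

    import numpy as np

    def sonar_measurement(alpha, delta_p, x_max, y_max, wall):
        # h(eta) for one sonar: predicted x or y coordinate given the wall that is hit.
        # alpha = yaw - yaw_offset + sensor_offset, delta_p = projected sonar distance.
        if wall == 'left':
            return x_max - np.cos(alpha) * delta_p
        if wall == 'right':
            return np.cos(alpha) * delta_p
        if wall == 'upper':
            return y_max - np.sin(alpha) * delta_p
        if wall == 'lower':
            return np.sin(alpha) * delta_p
        raise ValueError('unknown wall: %s' % wall)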

6.5 Further development

Even though the logged data looks good when displayed, the magnetometer sometimes increases the yaw estimate when the ROV is standing still. This is probably due to non-static disturbances in the magnetic field. These disturbances can come from the ROV or the pool, so further investigation is needed. If the disturbances come from the ROV, a new, more exact magnetometer placed where the disturbances are smaller could solve the problem and thus further improve the angle estimates of the complementary filter.

More precise sensors are required for the position filter to work as intended. If new, more precise sensors are installed, a SLAM algorithm could be implemented rather than the currently implemented position filter with known initial position.


7 Graphical User Interface

The following section describes the graphical user interface (GUI) and how it was implemented for the ROV. The GUI is based on rqt and its purpose is to make the ROV easier to use. It can, for example, be used to visualize data and to change different settings and parameters of the ROV.

7.1 Overview

The user is able to set which mode the ROV is operated in through the GUI. In its current state the ROV is able to run in four different operating modes: vision, path planner, publisher and XBOX mode. The user can also change different camera settings, PID parameters etc. All topics, such as thruster forces, sensor fusion states and reference signals, can also be visualized in a real-time plot in the GUI.

The rqt GUI is based on different plugins, see [7], which are installed on the workstation. At the moment five plugins are used for the GUI: rqt_bag, rqt_plot, rqt_publisher, rqt_topic and rqt_image_view.

When the workstation is activated the main GUI is started automatically with it. The main GUI consists of the Topic monitoring, Plot and Camera image perspectives.

7.1.1 Topic monitoring

This part of the GUI is built from three different plugins: rqt_bag, rqt_publisher and rqt_topic. The topic monitoring window consists of these three plugins combined in a user-friendly layout.

7.1.2 Plot

The plot window works as intended out of the box; it is the rqt_plot plugin and nothing has been modified.

7.1.3 Camera image

The last part of the GUI is the camera image. It is started together with the workstation, but in order to see the camera image you need to manually subscribe to the image topic. This is the rqt_image_view plugin and nothing has been modified.


7.2 Other rqt plugins

The rqt packages provide a very powerful and easy tool for implementing different GUI tools. All plugins under the category common plugins for rqt can be added to the GUI. In this project a few plugins were used, and although only five ended up in the final GUI some others were used during the project. One plugin which was found very useful was rqt_graph, which gives a plot of the current ROS system together with all nodes and topics. In this plot it can be seen which node publishes each topic and which nodes subscribe to it. This is only one of many useful plugins, see [7] for more.

How the GUI looks and a description on how to use it is found in the user manual [3].

7.3 Further development

The GUI has a great deal of functionality and is implemented to be self-explanatory and user friendly. In forthcoming development there might be incentive to make it more aesthetically appealing; the implementation is grey and plain in its current state, which might not appeal to the user. It can also become even more user friendly. As an example, buttons could be added to change operating mode rather than sending a topic command.


8 Control System

For this year's project an entirely new control system was developed with the intent of having it running on the ROV. The tool used to create the control system was Simulink, from which C++ code was generated, as requested by the customer. In accordance with the long-term goal of developing a fully autonomous ROV, the controller was moved to run on the ROV platform instead of on the workstation as in previous years.

The controller implemented is a decentralized controller with six PIDs, one in each DOF.

The ROV has four operating modes: Path planner, Vision, Manual publishing of reference signals and XBOX mode. The third mode is mainly intended to be used when developing the ROV. XBOX mode is used for open-loop control of the ROV, while Path planner and Vision are used to perform different autonomous functions.

The previous year's project implemented an LQ regulator which relies heavily on the model. Due to unsuccessful tests of their controller, a concern about the reliability of the model was raised during this project, especially when operating the ROV in directions for which the model's cross terms are not negligible, since the model parameters were derived using telegraph tests in one DOF at a time. The uncertainty of the model was kept in mind when designing the control system. Another aspect that was considered was how much capacity the RPI3 of the ROV has. Effort has thus been made to give the controller as low a computational cost as possible.

8.1 Control Theory

This section briefly introduces the theory of control methods used in the project.

8.1.1 Decentralized control

The simplest way to deal with input/output cross couplings in a system is to simply disregard them and instead establish pairs between input and output signals. In order to achieve as weak cross couplings as possible, pairs are identified by studying the RGA matrix, which states how inputs and outputs are coupled. The result is a diagonal controller, where in this project PID controllers were implemented for each input/output pair. The method relies upon the cross couplings being weak enough to be negligible, as well as having the same number of inputs as outputs. [9]

8.1.2 PID

The control system consists of five different PID regulators, one for each degree of freedom except the y-axis velocity. A PID controller consists of a proportional, an integral and a derivative part, which compute the control signal according to (29).


u(t) = K_p x(t) + K_i \int_0^t x(\tau)\,d\tau + K_d \frac{dx(t)}{dt}   (29)

The constants K_p, K_i and K_d in the PID controller can be tuned to achieve a satisfactory behaviour.
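A discrete-time sketch of (29), with forward-Euler integration and a backward difference for the derivative, could look as follows; the class name and sampling interface are assumptions and not the generated Simulink code.

    class Pid(object):
        def __init__(self, kp, ki, kd, ts):
            self.kp, self.ki, self.kd, self.ts = kp, ki, kd, ts
            self.integral = 0.0
            self.prev = None

        def update(self, x):
            # Control signal u for the input x in (29).
            self.integral += x * self.ts
            deriv = 0.0 if self.prev is None else (x - self.prev) / self.ts
            self.prev = x
            return self.kp * x + self.ki * self.integral + self.kd * deriv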

8.1.3 Set-point weighting

In order to reduce large initial peaks in the control signals and to minimize overshoots when changing references, set-point weighting can be introduced. It entails introducing two parameters, \beta and \gamma, which scale the references that the proportional and derivative parts act on respectively, see (30).

u(t) = K_p(\beta r(t) - y(t)) + K_i \int_0^t (r(\tau) - y(\tau))\,d\tau + K_d\left(\gamma\frac{dr(t)}{dt} - \frac{dy(t)}{dt}\right)   (30)

In this project \beta has been chosen as one and \gamma as zero to reduce the initial peak of the control signals when changing references. It is mainly used as a means to avoid saturated control signals and to get better reference tracking. [11]
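Continuing the PID sketch above, the set-point-weighted form (30) with the choice \beta = 1 and \gamma = 0 could be written as below; the derivative part then acts on the measurement only, which removes the derivative kick at reference steps. Function and attribute names are assumptions reusing the Pid sketch.

    def pid_setpoint_weighted(pid, r, y, beta=1.0, gamma=0.0):
        # Proportional and integral parts of (30).
        pid.integral += (r - y) * pid.ts
        u = pid.kp * (beta * r - y) + pid.ki * pid.integral
        # Derivative part acts on gamma*r - y; gamma = 0 removes reference kicks.
        e_d = gamma * r - y
        deriv = 0.0 if pid.prev is None else (e_d - pid.prev) / pid.ts
        pid.prev = e_d
        return u + pid.kd * deriv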

8.1.4 Decoupled control

In order for decentralized control to work there must be a natural pairing of input and output signals. If this is not the case, a change of variables can be made to make the transfer matrix from input to output as diagonal as possible. In this project decoupling was used from thruster control signals to forces and torques in the ROV's 6 DOFs [9].

8.1.5 Feedback linearization

Because most control methods are based on linear systems, nonlinear systems are usually transformed to equivalent linear systems.

One method to do so is to use feedback linearization. Feedback linearization transforms a nonlinear system of the form shown in (31a), (31b) to a linear system by developing a control input as shown in (31c). The aim of the developed control input is to yield a linear relation between the newly introduced control input, v, and the system's output.

\dot{x} = f(x) + g(x)u    (31a)
y = h(x)    (31b)


u = a(x) + b(x)v (31c)

For this model a feedback linearization can be developed according to [2]. The new control signal is then chosen to be the body-fixed accelerations a_b. This results in the linearised system

\dot{ν} = a_b    (32)

The dynamic system model in (1) is rewritten in (33a) so that it only contains one term including all nonlinearities. It has also been rewritten to be on the form of (31a). Here, ν is equivalent to x in the general example, f(x) is a nonlinear function, τ represents the input control signals and g(x) is an identity matrix.

M\dot{ν} = -G^{-1}(ν, η) + τ    (33a)

where G^{-1}(ν, η) = C(ν)ν + D(ν)ν + g(η).    (33b)

To get the linear system in (32) the new control signal is to be chosen as

τ = M a_b + G^{-1}(ν, η)    (34)

a(x) and b(x) in (31c) are represented by G^{-1}(ν, η) and M, respectively, in (34).
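A sketch of (33)-(34) is given below. The Eigen types, function names and placeholder model terms are assumptions; the real M, C(ν), D(ν) and g(η) come from the identified ROV model and are not reproduced here.

#include <Eigen/Dense>

using Vec6 = Eigen::Matrix<double, 6, 1>;
using Mat6 = Eigen::Matrix<double, 6, 6>;

// Placeholder model terms standing in for the identified expressions.
Mat6 massMatrix()            { return Mat6::Identity(); }  // M
Mat6 coriolis(const Vec6&)   { return Mat6::Zero(); }      // C(nu)
Mat6 damping(const Vec6&)    { return Mat6::Zero(); }      // D(nu)
Vec6 restoring(const Vec6&)  { return Vec6::Zero(); }      // g(eta)

// Feedback linearization according to (34): tau = M*a_b + G^{-1}(nu, eta),
// with G^{-1}(nu, eta) = C(nu)*nu + D(nu)*nu + g(eta) as in (33b).
Vec6 feedbackLinearize(const Vec6& a_b, const Vec6& nu, const Vec6& eta) {
    Vec6 Ginv = coriolis(nu) * nu + damping(nu) * nu + restoring(eta);
    return massMatrix() * a_b + Ginv;
}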


8.2 General structure

An overview of the control system's general structure is presented in Figure 7. The control system contains a method to pass references, a controller to compute control signals and an observer used to estimate states, which are used in a feedback loop.

Figure 7: An overview of the ROV control system.

8.3 Controller

The implemented controller is a decentralized controller of PIDs used to compute desired accelerations in the ROV's six DOFs. The control variables used are displayed in (35). (The value in parentheses is used in XBOX mode only.)

x = \begin{bmatrix} e_x^b (a_x^b) \\ 0 \\ z^p \\ φ \\ θ \\ ψ \end{bmatrix}    (35)

The choice not to control the ROV in LCS y was based on observations of poor performance in open-loop control and insufficient insight into the model quality. Depth control is done along the GCS z-axis and yaw in the GCS xy-plane. Pitch and roll motion are controlled about the LCS axes.


The controllers compute the desired accelerations according to

\begin{bmatrix} a_x^b \\ 0 \\ a_z^p \\ a_φ^b \\ a_θ^b \\ a_ψ^p \end{bmatrix} = K_p e(t) + K_i \int_0^t e(τ) dτ + K_d \frac{dx_{est}}{dt}    (36)

The derivative part only acts on the state estimate to avoid the large derivatives (and thus control signals) that occur for rapid reference changes. Also note that the PID parameters are different for all six states.

From the accelerations, the desired forces and torques along and about the ROV's LCS axes are to be computed. However, this requires all accelerations to be expressed in LCS. Since the depth and yaw controllers output accelerations in GCS, these have to be transformed into LCS. For depth control while pitching or rolling, transforming a_z^p into local acceleration yields components along the LCS x and y axes. The transformation was done by using the third column of the rotational transformation matrix passed to the controller from the sensor fusion module (corresponding to rotation about the GCS z-axis) and multiplying it with a_z^p, see (37).

\begin{bmatrix} a_x^b \\ a_y^b \\ a_z^b \end{bmatrix} = R(Θ)^T \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} a_z^p    (37)

The same reasoning applies to yaw motion, which means a_ψ^p needs to be transformed into local accelerations, yielding acceleration components about the LCS x and y axes. Yaw motion still concerns rotations about the GCS z-axis, and so the same sub-matrix of the rotational transformation matrix was used here too, see (38).

\begin{bmatrix} a_φ^b \\ a_θ^b \\ a_ψ^b \end{bmatrix} = R(Θ)^T \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} a_ψ^p    (38)

The choice of controlling the ROV's depth in GCS has the perk of allowing the ROV to have an arbitrary attitude when changing depth while still retaining its x and y positions. Controlling the ROV's yaw in the GCS xy-plane allows the ROV to have an arbitrary attitude when yawing while maintaining fixed positions in GCS x, y and z. It also makes the ROV easier to operate in XBOX mode.

With all accelerations represented in the LCS, the corresponding forces and torques are computed using feedback linearization, i.e. applying equation (34).
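The sketch below shows one possible implementation of (37)-(38): the third column of R(Θ)^T is multiplied by the GCS depth and yaw accelerations to obtain their LCS components, after which the remaining controller outputs (e.g. a_x^b) can be added on top. The Eigen types and the function name are illustrative assumptions.

#include <Eigen/Dense>

// Transform GCS depth and yaw accelerations to LCS according to (37)-(38).
// R is the rotation matrix passed from the sensor fusion module.
void gcsToLcs(const Eigen::Matrix3d& R, double a_z_p, double a_psi_p,
              Eigen::Vector3d& a_lin_b,    // (a_x^b, a_y^b, a_z^b)
              Eigen::Vector3d& a_ang_b) {  // (a_phi^b, a_theta^b, a_psi^b)
    const Eigen::Vector3d e_z(0.0, 0.0, 1.0);
    const Eigen::Vector3d thirdCol = R.transpose() * e_z;  // third column of R^T
    a_lin_b = thirdCol * a_z_p;    // equation (37)
    a_ang_b = thirdCol * a_psi_p;  // equation (38)
}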


8.4 Thrust decoupling

Decoupling of the thruster forces is done by calculating the inverse of the thrust geometry matrix, T^{-1}. T^{-1} is multiplied on both sides of the equal sign of (16), as in (39).

T^{-1} τ_{Act} = T^{-1} T f(u) \iff T^{-1} τ_{Act} = I f(u)    (39)

(39) is used to compute thruster forces from the desired forces and torques acting along and about the LCS axes passed from the controller. The actual thruster signals are computed using a look-up table from force to control signal.
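A sketch of the decoupling and look-up step is given below, assuming a square, invertible thrust geometry matrix (as implied by the use of T^{-1} in (39)). The interpolation routine and any table values are illustrative; the real force-to-signal table belongs to the project.

#include <Eigen/Dense>
#include <cstddef>
#include <utility>
#include <vector>

// Per-thruster forces from the desired 6-DOF wrench, f(u) = T^{-1} * tau_Act.
Eigen::VectorXd thrusterForces(const Eigen::MatrixXd& T, const Eigen::VectorXd& tauAct) {
    return T.fullPivLu().solve(tauAct);
}

// Linear interpolation in a sorted (force [N], control signal) look-up table.
double forceToControlSignal(double force,
                            const std::vector<std::pair<double, double>>& table) {
    if (force <= table.front().first) return table.front().second;
    for (std::size_t i = 1; i < table.size(); ++i) {
        if (force <= table[i].first) {
            double t = (force - table[i - 1].first) /
                       (table[i].first - table[i - 1].first);
            return table[i - 1].second + t * (table[i].second - table[i - 1].second);
        }
    }
    return table.back().second;
}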

8.5 Modes of operation

The ROV can be operated in four different modes, which determine which reference signals are passed on to the control system. The four modes, and the corresponding reference signals for each mode, are presented in this section.

8.5.1 Vision mode

In vision mode, reference signals from the vision module are passed on to the controllers. The reference signals are the distance error along the LCS x-axis calculated according to (49), the yaw reference calculated according to (50) and the desired depth. Reference signals in other degrees of freedom are set to zero.

8.5.2 Path planner mode

In path planner mode, reference signals are passed on to the controllers from the path planner. The path planner calculates reference signals for the distance error along the LCS x-axis, the yaw angle, and the depth. Reference signals in other degrees of freedom are set to zero.

8.5.3 Publisher mode

In publisher mode, reference signals sent by the ROV operator through the GUI are passed on to the controllers. The operator has the possibility to input reference signals in depth, roll, pitch and yaw. The reference signals in other degrees of freedom are set to zero.


8.5.4 XBOX mode

In XBOX mode, reference signals from the XBOX controller are passed on to the ROV controller. From the XBOX controller, reference signals can be sent for all degrees of freedom except the LCS y-axis distance error. Signals for enabling and disabling the thrusters can also be sent from the XBOX controller.

8.6 Communication

The controller was developed using Simulink, from which C++ code was generated. It was then uploaded to a ROS node called Controller running on the RPI. The controller communicates only via topics.

The topics the controller subscribes to and publishes can be found in section 4.1, together with details about the message types.
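A minimal sketch of a topic-only ROS node in the style of the Controller node is shown below. The topic names and the message type are illustrative assumptions; the actual topics and message types are those listed in section 4.1.

#include <ros/ros.h>
#include <std_msgs/Float64MultiArray.h>

ros::Publisher thrustPub;

// Callback: in the real node the generated controller code would run here.
void referenceCallback(const std_msgs::Float64MultiArray::ConstPtr& msg) {
    std_msgs::Float64MultiArray cmd;
    cmd.data = msg->data;  // placeholder pass-through
    thrustPub.publish(cmd);
}

int main(int argc, char** argv) {
    ros::init(argc, argv, "controller_sketch");
    ros::NodeHandle nh;
    thrustPub = nh.advertise<std_msgs::Float64MultiArray>("/thruster_cmd", 10);
    ros::Subscriber refSub = nh.subscribe("/reference", 10, referenceCallback);
    ros::spin();
    return 0;
}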

8.7 Further development

For further development of the controller, a feed-forward term might be of interest. It was experimented with in this project but ended up being too time consuming and not necessary, as the regulator worked well enough without one. A feed-forward term might help the ROV operate at higher speeds, as it reduces overshoots by knowing the desired control signals in advance.

The PID parameters used in this project are probably not optimal; they were instead settled at values that give adequate performance in accordance with the requirements. Thus, the performance of the ROV can probably be improved by better tuning of the PID parameters. The β and γ parameters of the set-point weighting can probably also be chosen better. In this project no wider tests were done to decide them; they were instead chosen conservatively as β = 1 and γ = 0.

Since the model is nonlinear, there are no PID parameters that are optimal over the ROV's entire operating range. One interesting topic is therefore whether the PID parameters can be tuned dynamically, depending on the operating point, to give good performance independently of how the ROV is operated.

It is uncertain how much of the RPI3's processing power is used today. Perhaps an MPC or LQ regulator, which demands more computing power, could be implemented. This of course depends on how well the model works. However, the initial belief of a poor model has changed, and the current belief is that the model may be adequate enough to implement a model-based regulator. It will probably depend on how much impact the neglected cross terms have.


9 Vision System

The following section describes how the vision system is implemented. How the module is incorporated in the ROS framework, as well as the object detection and tracking, is discussed in this section.

The main purpose of the vision system is to detect and track the distance and angles to an object. An autonomous function to search for the object when detection is lost is also implemented.

9.1 Hardware

A Raspberry Pi Camera V2 is used as the camera in the vision system. The camera is located inside the acrylic tube of the ROV and is facing towards the front. The camera is connected to a Raspberry Pi 3 B+ and the image processing is handled on the workstation.

9.2 Software

Communication between the camera and the Raspberry Pi is handled through a ROS node called raspicam_node. This package uses two topics for communicating camera images and camera info. It also makes it possible to set different camera parameters such as framerate, height, width and quality.

Some camera calibration has to be performed to increase the camera's ability to detect objects in underwater conditions. The calibration is done with a calibration script and a laminated checkerboard. Preferably, the calibration is performed under water since that is the environment the ROV operates in. The calibration process is described in further detail in the user manual, see [3].

The OpenCV library [10] is used for the image processing. OpenCV contains many ready-made functions that are useful when developing image processing algorithms like object detection.

The vision system code is located in src/vision/ and consists of two files, object_detection.h and object_detection.cpp.

9.3 Object

The object that is used for detection is a brightly colored yellow plastic ball. The yellow ball performs best under water compared to its blue and green counterparts since it gives better contrast against the color of the water. The final color available, red, might have given even better contrast but poses some challenges when thresholding the image with HSV parameters. More on this in the next section.


9.4 Object detection

With the specified object, a simple threshold is used to get a binary image containing the object (along with some minor noise). To get a detected object with a smooth shape, a Gaussian blur is applied to the original image to smooth out its colors. Small noise and holes in the detected binary object can be remedied using binary opening and closing, respectively. These concepts are illustrated in Figure 8. Using some basic image processing techniques, the radius and center of the object can be calculated. By using an object of known size, the radius can be used to conclude how far away the object is located. An HSV conversion is made before thresholding. HSV separates the color space of the image into three different matrices: hue, saturation and value. Hue represents the color, and saturation/value represent how mixed the color is with white/black, respectively. This allows for easier thresholding than simply using the given RGB values.

[Figure 8 consists of four image panels: "Original image", "Thresholding performed", "Binary closing performed" and "Biggest object extracted".]

Figure 8: Overview of required object detection steps.

The following list summarizes the implemented object detection algorithm (a code sketch follows the list):

1. Read the original image from the ROS topic /raspicam_node/image/compressed.


2. Apply a Gaussian blur to the image.

3. Convert the blurred image from RGB to HSV.

4. Perform HSV thresholding.

5. Perform binary opening and closing.

6. Find contours in the image and extract the contour with the largest area.

7. Fit a circle to enclose all pixels in the contour.

8. Perform geometrical calculations based on the radius of the enclosing circle, the position of the center of the circle and the intrinsic camera parameters.
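The sketch below follows the listed steps using OpenCV. The HSV limits, the kernel size and the assumption of a BGR input image are illustrative choices made here; the tuned thresholds live in object_detection.cpp.

#include <opencv2/opencv.hpp>
#include <algorithm>
#include <vector>

// Returns true if an object was found; outputs its enclosing circle.
bool detectBall(const cv::Mat& bgr, cv::Point2f& center, float& radius) {
    cv::Mat blurred, hsv, mask;
    cv::GaussianBlur(bgr, blurred, cv::Size(11, 11), 0);            // step 2
    cv::cvtColor(blurred, hsv, cv::COLOR_BGR2HSV);                  // step 3
    cv::inRange(hsv, cv::Scalar(20, 100, 100),
                cv::Scalar(35, 255, 255), mask);                    // step 4, yellow-ish range
    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5));
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN, kernel);           // step 5, remove noise
    cv::morphologyEx(mask, mask, cv::MORPH_CLOSE, kernel);          // step 5, fill holes
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                     cv::CHAIN_APPROX_SIMPLE);                      // step 6
    if (contours.empty()) return false;
    auto largest = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });
    cv::minEnclosingCircle(*largest, center, radius);               // step 7
    return true;
}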

9.5 Geometrical calculations and reference signals

When the vision system has extracted the largest object from an image, the next step is to perform geometrical calculations on this object to be able to pass reference signals to the control module. This section describes how this is implemented.

9.5.1 Intrinsic camera parameters and distance calculation

The geometrical calculations depend on some intrinsic camera parameters. The equation used to estimate the distance to the ball is

D = \frac{F r_{obj}}{S},    (40)

where D [mm] is the distance to the object, F [mm] is the focal length, r_{obj} [mm] is the known radius of the object and S [mm] is defined as

S = \frac{r}{P},    (41)

where r [pixels] is the average detected radius in the image over multiple frames and P [pixels/mm] describes how pixels are related to real-world distances.

The focal length is given in the datasheet for the camera and has the value F = 3.04 mm. The radius of the brightly colored ball was measured and found to be r_{obj} = 3.5 mm. To calculate how the pixels are related to real-world distances, P needs to be computed. This parameter depends on two additional intrinsic camera parameters, f_x and f_y, which are the focal lengths in the x- and y-directions expressed in pixel units. With these additional parameters, P can be calculated as

P = \frac{(f_x + f_y)/2}{F}.    (42)


This equation gives a reasonable estimate if f_x and f_y are somewhat similar, which is the case for the camera in this project, where f_x = 446.690641 and f_y = 444.360137. These values can be calculated with the calibration script included in the project; see the user manual [3] for more about the calibration process.

With all this information, the distance in millimeters to the detected object can be estimated.
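As a worked example of (40)-(42), the sketch below computes the distance from a detected radius using the camera parameters quoted above. The function name is an assumption made here for illustration.

// Distance estimation following (40)-(42).
double distanceToObjectMm(double r_pixels) {
    const double F = 3.04;          // focal length [mm]
    const double r_obj = 3.5;       // known object radius [mm]
    const double fx = 446.690641;   // focal length in pixel units, x
    const double fy = 444.360137;   // focal length in pixel units, y
    const double P = ((fx + fy) / 2.0) / F;  // pixels per mm, (42)
    const double S = r_pixels / P;           // (41)
    return F * r_obj / S;                    // D, (40)
}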

9.5.2 Angle calculation

The angle to the detected object can be calculated in a similar fashion, with the addition of information about where the center of the detected object is located in the image. The image coordinate system is defined so that the x-axis points to the right along the width of the image and the y-axis points down along the height of the image, see Figure 9.

Figure 9: Coordinate system in images

A pixel value deviation from the center of the object to the middle of the image can be calculated as

∆x = x_{obj} - x_c    (43)


∆y = y_{obj} - y_c.    (44)

The deviations can then be translated into millimeters as follows

∆x_{mm} = \frac{∆x}{r/r_{obj}}    (45)

∆y_{mm} = \frac{∆y}{r/r_{obj}}.    (46)

The respective angles to the object are then calculated as

α_x = -\arcsin\left( \frac{∆x_{mm}}{D} \right)    (47)

α_y = \arcsin\left( \frac{∆y_{mm}}{D} \right)    (48)

9.5.3 Passing reference signals

Since the control module is implemented to calculate control signals based on an error from a nominal value, these errors need to be calculated in the vision module. A distance error is calculated as

D_{ref} = (D - D_{nom}) \cdot 0.01    (49)

where D_{nom} = 0.5 m and the scaling factor 0.01 is a design choice to get errors of a magnitude that is reasonable for the control module.

A yaw reference is calculated as

ψ_{ref} = ψ + α_x    (50)

where ψ is obtained from the sensor fusion and converted from a quaternion to radians, since the reference ψ_{ref} is passed to the control module in radians.

Finally, a depth reference is calculated as

z_{ref} = z + \frac{α_y D}{100}    (51)

where z is the current depth obtained from the sensor fusion and z_{ref} is the reference signal sent to the control module.
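The sketch below gathers (43)-(51) into one routine, assuming D is expressed in millimeters so that D_{nom} = 0.5 m corresponds to 500 mm; the variable and function names are chosen here for illustration only.

#include <cmath>

struct VisionReferences { double D_ref, psi_ref, z_ref; };

// x_obj, y_obj, r: detected circle center and radius [pixels]; x_c, y_c: image
// center [pixels]; r_obj: known object radius [mm]; D: distance from (40) [mm];
// psi, z: current yaw [rad] and depth from the sensor fusion.
VisionReferences computeReferences(double x_obj, double y_obj, double r,
                                   double x_c, double y_c, double r_obj,
                                   double D, double psi, double z) {
    double dx_mm = (x_obj - x_c) / (r / r_obj);   // (43), (45)
    double dy_mm = (y_obj - y_c) / (r / r_obj);   // (44), (46)
    double alpha_x = -std::asin(dx_mm / D);       // (47)
    double alpha_y =  std::asin(dy_mm / D);       // (48)
    const double D_nom = 500.0;                   // nominal distance, 0.5 m in mm (assumed unit)
    VisionReferences ref;
    ref.D_ref   = (D - D_nom) * 0.01;             // (49)
    ref.psi_ref = psi + alpha_x;                  // (50)
    ref.z_ref   = z + alpha_y * D / 100.0;        // (51)
    return ref;
}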


9.6 Further development

As the image processing is sensitive to the lighting of the surroundings, this should be the main focus in further development of the vision module. During this project, some ideas on how to get stable color rendering were investigated.

The first idea was to mount a light source on the ROV pointing in the direction of the camera. LED strips were mounted inside the acrylic tube facing towards the front of the ROV, but this approach turned out to create too much reflection in the acrylic tube, which caused disturbances in the image processing. A powerful external light source mounted outside the acrylic tube might improve the performance of the image processing, since it would give similar lighting conditions for the camera regardless of the surroundings.

A second approach was to change camera parameters such as auto white balance, exposure and saturation. During the project it became apparent that the colors in the image changed a lot depending on the lighting, since the camera makes automatic adjustments to the previously mentioned parameters. If these automatic adjustments could be disabled or compensated for somehow, the color rendering might become more stable.

To make further developments in autonomy, a possible improvement to the vision module could be to not only extract the largest detected object in an image but to search for objects of a certain shape. It might also be of interest to increase the resolution of the camera image to get better performance when detecting small objects or objects that are far away. During this project the vision module was initially developed to run on the Raspberry Pi but was then deemed too computationally heavy and placed on the external workstation, and thus the performance with a higher resolution camera image was not investigated.


10 Autonomy

This section describes the autonomous functionality of the ROV.

10.1 Move to predefined location

The ROV has the ability to perform a mission where it moves to a predefined location within a known pool environment, with a deviation of at most 50 cm in X, Y and Z in PCS. The ROV performs such a mission using the path planner developed during the 2018 project. This is a part of the controller module that uses the coordinates of a desired position, together with the estimated position, to control the ROV to the desired position. Using this, the path planner calculates reference values for the following:

• Position in PCS z-axis

• Yaw angle, ψ, in PCS

• Distance in LCS x-axis

Using the path planner algorithm, the ROV reaches the predefined location through sequential maneuvering. The path planner first controls the ROV such that the desired position along the PCS z-axis is achieved within a specified error tolerance. When the position is within the error tolerance, the path planner also sends a reference signal for the yaw angle. When the yaw angle is within the corresponding error tolerance, the path planner outputs the distance to the desired position along the LCS x-axis to the PID controllers.

The path planner receives the desired positions along the PCS X, Y and Z axes by subscribing to the ROS topic /ref_pos. To receive the current state estimates for X, Y, Z and yaw in PCS coordinates, the path planner subscribes to the ROS topic /rovio/pos_est.
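The sequential behaviour described above can be sketched as a small state machine. The tolerances, names and the reference struct below are illustrative assumptions, not the actual path planner implementation.

#include <cmath>

enum class Stage { Depth, Yaw, Surge };

struct References { double depth_ref, yaw_ref, surge_dist_ref; };

// One planner step: hold depth first, then add the yaw reference, and finally
// command the remaining surge distance once depth and yaw are within tolerance.
References pathPlannerStep(double z, double z_goal,
                           double psi, double psi_goal,
                           double surge_dist_to_goal,
                           Stage& stage) {
    const double depthTol = 0.1;  // [m], assumed tolerance
    const double yawTol   = 0.1;  // [rad], assumed tolerance
    References ref{z_goal, psi, 0.0};            // always hold the desired depth
    if (stage == Stage::Depth && std::fabs(z - z_goal) < depthTol)
        stage = Stage::Yaw;
    if (stage == Stage::Yaw || stage == Stage::Surge) {
        ref.yaw_ref = psi_goal;                  // add the yaw reference
        if (std::fabs(psi - psi_goal) < yawTol)
            stage = Stage::Surge;
    }
    if (stage == Stage::Surge)
        ref.surge_dist_ref = surge_dist_to_goal; // finally command the surge distance
    return ref;
}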

10.2 Rise to surface

In the event of the ROV losing connection to the workstation, i.e. no heartbeat, the ROV will slowly rise to the surface of the water. The main advantage of this functionality is that it facilitates the process of recovering the ROV from the water, since it will not sink to large depths unless the battery is drained. In Figure 10, the rise-to-surface functionality is graphically presented using data logged during ROV operation.
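One plausible way to realize this behaviour is a heartbeat watchdog of the kind sketched below. The topic name, the timeout and the riseToSurface() action are assumptions made here for illustration; how the vehicle actually rises (e.g. by releasing depth hold or commanding a shallow depth reference) is a project-specific choice not detailed in this document.

#include <ros/ros.h>
#include <std_msgs/Empty.h>

ros::Time lastHeartbeat;

void heartbeatCallback(const std_msgs::Empty::ConstPtr&) {
    lastHeartbeat = ros::Time::now();
}

void riseToSurface() {
    // Placeholder: e.g. command a shallow depth reference or disable depth hold.
}

int main(int argc, char** argv) {
    ros::init(argc, argv, "heartbeat_watchdog_sketch");
    ros::NodeHandle nh;
    lastHeartbeat = ros::Time::now();
    ros::Subscriber sub = nh.subscribe("/heartbeat", 10, heartbeatCallback);
    ros::Rate rate(10);                                        // check at 10 Hz
    while (ros::ok()) {
        ros::spinOnce();
        if ((ros::Time::now() - lastHeartbeat).toSec() > 2.0)  // assumed 2 s timeout
            riseToSurface();
        rate.sleep();
    }
    return 0;
}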


Figure 10: The estimated depth when the heartbeat signal is lost.

10.2.1 Further development

There are possibilities to further develop the ROV autonomy based on the current go-to-point functionality. Since the ROV is able to move to a predefined location, a natural further development is to extend the control system module such that the ROV can reach several predefined points in a predefined order.

Another area of possible further development is the path planner algorithm. As explained, the current path planner algorithm reaches the desired position through three sequences. If functionality is developed that allows the ROV to reach several points in a predefined order, a new path planner algorithm might be suitable to achieve a smoother and possibly less time consuming trajectory. There are, however, some limitations to this. For example, the performance of the controllers when the ROV is excited in more than one degree of freedom at a time has not been tested thoroughly. Since the parameters corresponding to the cross terms in the ROV model have not been estimated, and the inverse of the model is included in the controller, the performance when controlling the ROV in several degrees of freedom at a time might be insufficient. Therefore, if a new path planner is developed that excites the ROV in several degrees of freedom simultaneously, it might be necessary to estimate the cross terms in the model.

TSRT10 - Control System Project CourseTechnical Documentation 41

[email protected]

Page 49: Technical Documentation - isy.liu.se...Remotely Operated Underwater Vehicle 2019–01–04 1 Introduction The area of autonomous vehicles is an area with an aggressive progression

Remotely Operated Underwater Vehicle 2019–01–04

References

[1] A. Aili and E. Ekelund. Model-Based Design, Development and Control of an Underwater Vehicle. MSc Thesis LiTH-ISY-EX-16/4979-SE, Linköping University, Sweden, 2016.

[2] T. Fossen. Handbook of Marine Craft Hydrodynamics and Motion Control. Chichester, West Sussex, U.K.; Hoboken, N.J.: Wiley, 2011.

[3] M. Drangel. User Manual, Remotely Operated Underwater Vehicle. http://www.isy.liu.se/edu/projekt/tsrt10, 2018.

[4] A. Aili and E. Ekelund. Model-Based Design, Development and Control of an Underwater Vehicle. MSc Thesis LiTH-ISY-EX-16/4979-SE, Linköping University, Sweden, 2016.

[5] N. Sundholm. Technical Documentation, Remotely Operated Underwater Vehicle. http://www.isy.liu.se/edu/projekt/tsrt10/2016/rov/documents/TSRT10_ROV_2016_technical_documentation.pdf, 2016.

[6] M. Homelius. Technical Documentation, Remotely Operated Underwater Vehicle. http://www.isy.liu.se/edu/projekt/tsrt10/2017/rov/doc/TSRT10_ROV_Technical_Documentation.pdf, 2017.

[7] rqt_common_plugins. http://wiki.ros.org/rqt_common_plugins

[8] R. G. Valenti, I. Dryanovski and J. Xiao. Keeping a Good Attitude: A Quaternion-Based Orientation Filter for IMUs and MARGs.

[9] T. Glad and L. Ljung. Reglerteori: Flervariabla och olinjära metoder.

[10] OpenCV Library. https://opencv.org/. Accessed: 2018-12-11.

[11] Integrator Wind-up. https://www.cds.caltech.edu/~murray/courses/cds101/fa04/caltech/am04_ch8-3nov04.pdf. Accessed: 2018-12-12.
