MAS Industrial Robotics
Release latest (Kinetic)
Jun 19, 2021
Contents
1 About
2 Getting started
  2.1 Install Ubuntu
  2.2 Git - Version Control
  2.3 ROS - Robot Operating System
  2.4 Setup catkin workspace
  2.5 Bring up the robot and its basic components
3 Contributing
  3.1 Robot Operating System (ROS)
  3.2 Coding guidelines
  3.3 Toolkit
4 Perception
  4.1 Camera
  4.2 Dataset
  4.3 Training
  4.4 Object segmentation
  4.5 Object recognition
  4.6 Barrier tape detection
  4.7 Cavity detection
5 Manipulation
6 Navigation
  6.1 On workstation or your PC
  6.2 2D SLAM
  6.3 2D Navigation
7 Planning
  7.1 Refbox parser
  7.2 Planning bringup
  7.3 Planning core
  7.4 Task planning
  7.5 Planner executor
  7.6 PDDL problem generator
  7.7 Knowledge base analyzer
  7.8 MIR Knowledge
  7.9 Actions
  7.10 Planning Visualisation
8 Simulation
  8.1 Mapping and Navigation for Simulation
  8.2 World Generation
9 Perception Documentation
10 Manipulation Documentation
  10.1 mir_moveit_scene_ros
  10.2 mir_pregrasp_planning_ros
11 Navigation Documentation
  11.1 mir_move_base_ros
12 Planning Documentation
  12.1 mir_actions
  12.2 mir_knowledge_base_analyzer
  12.3 mir_knowledge_ros
  12.4 mir_manipulate_drawer
  12.5 mir_planning_visualisation
  12.6 mir_task_planning
  12.7 planner_wrapper
  12.8 mir_states
  12.9 mir_refbox_parser_ros
13 Simulation Documentation
  13.1 mir_world_generation
CHAPTER 1
About
CHAPTER 2
Getting started
2.1 Install Ubuntu
The repository and its related components have been tested under the following Ubuntu distributions:
• ROS Kinetic: Ubuntu 16.04
If you do not have an Ubuntu distribution on your computer, you can download it here.
2.2 Git - Version Control
• Install Git Software
Install the Git core components and some additional GUIs for version control:
sudo apt-get install git-core gitg gitk
• Set Up Git
Now it’s time to configure your settings. To do this you need to open a new terminal. First you need to tell git your name, so that it can properly label the commits you make:
git config --global user.name "Your Name Here"
Git also saves your email address into the commits you make.
git config --global user.email "[email protected]"
• GIT Tutorial If you have never worked with git before, we recommend going through the basic git tutorial.
2.3 ROS - Robot Operating System
• Install ROS
The repository has been tested successfully with the following ROS distributions. Use the link behind a ROS distribution to get to the particular ROS Kinetic installation instructions.
Note: Do not forget to update your .bashrc!
• ROS Tutorials
If you have never worked with ROS before, we recommend going through the beginner tutorials provided by ROS.
In order to understand at least the different core components of ROS, you should start from tutorial 1 (“Installing and Configuring Your ROS Environment”) and continue through tutorial 7 (“Understanding ROS Services and Parameters”).
2.4 Setup catkin workspace
• Create a catkin workspace
source /opt/ros/kinetic/setup.bash
mkdir -p ~/kinetic/src; cd ~/kinetic/src
catkin_init_workspace
catkin build
• Clone and compile the MAS industrial robotics software
First of all you have to clone the repository.
cd ~/kinetic/src
git clone git@github.com:b-it-bots/mas_industrial_robotics.git
Then go on with installing further external dependencies:
cd ~/kinetic/src/mas_industrial_robotics
./repository.debs
source ~/kinetic/devel/setup.bash
The last command should be added to your ~/.bashrc file so that it does not need to be executed every time you open a new terminal.
And finally compile the repository:
cd ~/kinetic
catkin build
If no errors appear everything is ready to use. Great job!
• Setting the Environment Variables
– ROBOT variable
With the ROBOT variable you can choose which hardware configuration should be loaded when starting the robot. The following line will add the variable to your .bashrc:
echo "export ROBOT=youbot-brsu-1" >> ~/.bashrc
source ~/.bashrc
– ROBOT_ENV Variable
The ROBOT_ENV variable can be used to switch between different environments. The following line will add the variable to your .bashrc:
* Real robot
echo "export ROBOT_ENV=brsu-c025" >> ~/.bashrc
source ~/.bashrc
* Simulation
echo "export ROBOT_ENV=brsu-c025-sim" >> ~/.bashrc
source ~/.bashrc
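The bringup scripts read these environment variables to decide which configuration to load. A minimal sketch of that lookup in plain Python (the config paths here are hypothetical; the real launch files resolve these inside roslaunch):

```python
import os

def select_config(default_robot="youbot-brsu-1", default_env="brsu-c025"):
    """Pick hardware and environment config names from the shell environment.

    Falls back to the defaults used in this tutorial when ROBOT / ROBOT_ENV
    are unset. Sketch only, to show how the variables are consumed.
    """
    robot = os.environ.get("ROBOT", default_robot)
    robot_env = os.environ.get("ROBOT_ENV", default_env)
    return {
        "robot_config": "config/%s.yaml" % robot,      # hypothetical path
        "map_config": "environments/%s" % robot_env,   # hypothetical path
    }
```

For example, after exporting ROBOT_ENV=brsu-c025-sim the simulation environment would be selected while the robot hardware configuration stays unchanged.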
2.5 Bring up the robot and its basic components
• In Simulation
roslaunch mir_bringup_sim robot.launch
In a new terminal you can open the Gazebo GUI to see the environment and the robot
rosrun gazebo_ros gzclient
• At the Real Robot
roslaunch mir_bringup robot.launch
• Test the base
roslaunch mir_teleop teleop_keyboard.launch
• Visualize the robot state and sensor data
rosrun rviz rviz
• Build a map for base navigation
roslaunch mir_2dslam 2dslam.launch
• Use autonomous navigation
– Omni-directional navigation
roslaunch mir_2dnav 2dnav.launch nav_mode:=dwa
Click on the menu bar “File -> Open Config”, navigate to “~/kinetic/src/mas_industrial_robotics” and select the “youbot.rviz” file.
CHAPTER 3
Contributing
3.1 Robot Operating System (ROS)
3.1.1 ROS packages naming and structure
Creating a new package
• Naming
In order to create a new ROS package for one of the repositories, some rules need to be considered:
1. The package name always has the following format:
prefix_my_package_name
2. Use the right prefix for every repository:
a. mas_domestic_robotics: mdr_
b. mas_industrial_robotics: mir_
c. mas_common_robotics: mcr_
3. Use lowercase.
4. Separate words in the package name by underscores (_).
Examples of creating packages according to the rules described above:
catkin create_pkg mdr_grasp_planning
catkin create_pkg mir_whole_body_control
catkin create_pkg mcr_object_detection
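The naming rules above can be checked mechanically. A small helper, not part of the repositories and shown only to make the rules concrete:

```python
import re

# repository prefixes from the rules above
VALID_PREFIXES = ("mdr_", "mir_", "mcr_")

def is_valid_package_name(name):
    """Check a package name against the rules: a repository prefix,
    lowercase letters/digits only, words separated by underscores."""
    if not name.startswith(VALID_PREFIXES):
        return False
    return re.fullmatch(r"[a-z][a-z0-9]*(_[a-z0-9]+)*", name) is not None
```

For example, mir_whole_body_control passes, while mdr_Grasp_Planning is rejected for its uppercase letters.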
• Folder structure
Every ROS package within our repositories has to strictly match the following structure:
.
├── common
│   ├── config
│   ├── include
│   │   └── <package_name>
│   ├── src
│   │   └── <package_name>
│   ├── test
│   └── tools
├── ros
│   ├── config
│   ├── include
│   │   └── <package_name>
│   ├── launch
│   │   └── <package_name>.launch
│   ├── rviz
│   │   └── <package_name>.rviz
│   ├── scripts
│   │   └── <package_name>
│   ├── src
│   │   └── <package_name>_node
│   ├── test
│   └── tools
├── CMakeLists.txt
├── package.xml
├── setup.py
└── README.md
In short:
– ROS-independent code goes into the common folder
– the ros folder contains a ROS-wrapper for the functionality you are adding
Meta-packages
If the package you are creating is meant to contain other packages, it instead needs to have the following structure:
./<meta_package_name>
└── <meta_package_name>
    ├── CMakeLists.txt
    ├── package.xml
    └── README.md
Note: It is extremely important to keep your package.xml up to date with its dependencies. Not doing so results in the need for specialized tools or manual inspection of launch files and source code to discover your package’s dependencies.
3.1.2 Messages, services and actions
Creating a new message, service or action: if your package defines its own messages, services or actions, you should add them to the corresponding meta-package:
./<package_name>_msgs
├── action
│   └── MyAction.action
├── msg
│   └── MyMessage.msg
├── srv
│   └── MyService.srv
├── CMakeLists.txt
├── package.xml
└── README.md
Note: The srv file name should start with a verb, e.g. RecognizeImage.srv
Depending on the repository you are working on, the meta-package is related to the domain, e.g. mdr_planning_msgs or mdr_navigation_actions.
3.1.3 Linting
Running roslint with catkin
Before merging into the main repository, roslint is run on all merge requests. Unless all errors are resolved, the merge request will be rejected. To test locally whether your changes would pass the roslint check:
• Add the following lines to your CMakeLists.txt:
find_package(catkin REQUIRED COMPONENTS roslint ...)
roslint_python()  # pep8 linting
roslint_cpp()     # ROS wrapper of Google's cpplint
Your package.xml should include roslint as a build dependency:
<build_depend>roslint</build_depend>
• Build target roslint:
– with catkin_make:
catkin_make roslint_<package_name>
– with catkin_tools:
catkin build --no-deps <package_name> --make-args roslint_<package_name>
• If the build fails, copy and execute the gray line that looks something like the following to see more detailed errors:
cd <package_source_directory>
catkin build --get-env <package_name> | catkin env -si /usr/bin/make roslint --jobserver-fds=6,7 -j; cd -
Running catkin_lint
You should also make sure that the catkin_lint tests pass. From the root of your catkin workspace, run:
catkin_lint --strict --ignore CRITICAL_VAR_APPEND,LINK_DIRECTORY src/mas_domestic_robotics
See Also:
• roslint
• catkin_lint
Proposed linters:
• C++
• Python
• ROS
3.2 Coding guidelines
Note: Make sure that your text editor is properly configured to use spaces instead of tabs. All C/C++ code should be written and formatted according to the Google style guide (with exceptions to the column limit and brace breaking; see .clang-format for more details). All Python code should adhere to the PEP-8 style guide.
3.2.1 Coding style
C/C++
• Linting
cpplint is used to lint C++ code according to the Google style guide. cpplint can be installed using pip (in a Python 3.7 environment):
pip install cpplint
Run cpplint on a file/directory as follows
cpplint <filename/directory>
• Code formatting
Clang format is used to format C/C++ code. Install clang format as follows.
sudo apt-get install clang-format
All configurations are present in the .clang-format file and it is mandatory to use this file to format the code in this repository. Please note that it is necessary to run clang-format from the repository’s root folder so that it uses the repository’s .clang-format file. To run clang-format on a single C++ file, use
clang-format -i <C++ filename>
To run clang-format on files inside a directory recursively, use
find . -regex '.*\.\(cpp\|hpp\|cu\|c\|h\)' -exec clang-format -i {} \;
• Static code analysis
Cppcheck is used for static code analysis in order to detect bugs and undefined behavior caused by bad coding constructs. Install cppcheck using
sudo apt-get install cppcheck
To run cppcheck on a file or directory, run
cppcheck <filename/directory>
Python
• Linting
Python code should follow the PEP-8 style guide. Installing linters in your Python environment ensures compliance with the PEP-8 style guide. The pre-commit hooks for this repository use pylint in a Python 3.7 environment, which can also lint Python 2 code. Pylint can be easily installed using pip.
pip install pylint
In order to analyze files for linting errors, manually run pylint using the following command.
pylint <python filename or directory name>
• Sorting imports
isort organizes and sorts imports in python files. Install isort using pip.
pip install isort
To run isort on a python file use
isort <python filename>
• Code formatter
Black is used to format Python code. Please ensure that your code is formatted using black before committing your changes. Black can be installed using pip (again in a Python 3.7 environment).
pip install black
To format existing python code using black, run the following
black <python filename/directory>
Note: Pre-commit hooks have been added to this repository. Please note that you will not be able to locally commit your changes to git until all the checks in .pre-commit-config.yaml pass. Some serious violations of the standard coding guidelines will result in errors while running git commit and have to be fixed manually before the commit is accepted. Please ensure that git commit and the pre-commit hooks (not the code itself) are run in a Python 3.7 environment, as configured in .pre-commit-config.yaml.
Warning: Alternatively, one can also verify that the pre-commit hooks pass before actually committing the code to git. To do so, run the following command after making the necessary changes to your code.
pre-commit run --all-files
This is, however, currently discouraged because there are several linting errors in the whole repository yet to be fixed, and one doesn’t want to end up fixing thousands of errors when just trying to add their contribution.
3.2.2 Editors for software development
• Visual Studio Code
• Vim
• Pycharm
Install the necessary Python, C++ and ROS plugins after installing a desired editor. Other editors which support ROS are listed here.
Configuring editors
It is important to configure your editor settings so that linters, code formatters and code checkers check for errors (and solve them if possible) automatically upon saving your changes in a file. Below is an illustration of the settings that need to be modified in Visual Studio Code to avoid manually performing the checks described in Coding style. Similar configurations can be done in other editors too.
The settings can be configured through the Settings option in File menu or in settings.json.
• Python linting By default pylint is enabled in Visual Studio Code; however, pylint has to be installed using pip in your chosen Python interpreter path. Please do not enable other linters, as this could create a conflict while running pre-commit hooks. Please check the VS Code website for more information.
• Python code formatting Since the pre-commit hooks use black to format Python code, this can be very conveniently added to your editor so that the file is auto-formatted by black upon saving. Add the following to your settings.json to enable black code formatting.
"python.formatting.blackArgs": ["--line-length=79"],
"python.formatting.provider": "black",
"[python]": {
    "editor.codeActionsOnSave": {
        "source.organizeImports": true
    }
},
• C++ linting Install the cpplint extension for VS Code to enable the cpplinter. This then highlights the linting errors in the C++ code with squiggly lines.
• C++ code formatting Clang-format is used to format C++ code. This can be configured in settings.json after installing the official Microsoft C/C++ extension. Add the following lines to your settings.json file so that the configurations from .clang-format in the repository are used by VS Code to format the C++ files.
"C_Cpp.clang_format_style": "file",
"C_Cpp.formatting": "clangFormat"
3.3 Toolkit
CHAPTER 4
Perception
MAS industrial robotics - perception tutorial
4.1 Camera
4.1.1 Arm camera calibration
Camera calibration
4.1.2 RealSense2 camera
How to use the RealSense2 camera
1. Installation
Go to the intel-ros github page. Clone the realsense repository into your catkin workspace inside src:
git clone git@github.com:intel-ros/realsense.git
2. Camera Output
Run the following to get access to the camera:
roslaunch realsense2_camera rs_rgbd.launch
Open rviz to visualize the camera output.
3. Configure camera output (OPTIONAL)
Run the following to open the dynamic reconfigure window:
rosrun rqt_reconfigure rqt_reconfigure
You can also try to change the “octree_resolution” value:
cd *catkin workspace*/src/mas_perception/mcr_scene_segmentation/ros/config
gedit scene_segmentation_constraints.yaml
4. Setup Base Frame
Run the following:
rosrun tf static_transform_publisher x y z yaw pitch roll base_link camera_link 100
where x, y, z are the distances and yaw, pitch, roll are the rotations from the base_link to the camera_link. (Note the argument order: tf’s static_transform_publisher expects yaw pitch roll, not roll pitch yaw.)
To visualize your frames in rviz, add the TF display in the rviz menu.
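As a sanity check on the rotation arguments, the rotation that the static transform encodes can be sketched in plain Python (no ROS needed; composition order Rz·Ry·Rx, i.e. yaw, then pitch, then roll, matching tf’s convention):

```python
import math

def rotation_matrix(yaw, pitch, roll):
    """3x3 rotation matrix R = Rz(yaw) * Ry(pitch) * Rx(roll).

    Sketch only, to illustrate how the three published angles combine
    into one rotation from base_link to camera_link.
    """
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
```

A yaw of 90 degrees, for instance, rotates the x-axis of base_link onto the y-axis, which is easy to verify visually with the TF display in rviz.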
5. Save Point Clouds
If it’s your first time saving point clouds, you need to choose where you want to save them and enable data collection:
cd *catkin workspace*/src/mas_perception/mcr_scene_segmentation/ros/launch
gedit scene_segmentation.launch
Change the value of “dataset_collection” from “false” to “true”. Change the value of “logdir” from “/temp/” to the path on your computer where you want to save the files.
Run the following to get access to the point clouds given by the camera:
roslaunch mcr_scene_segmentation scene_segmentation.launch input_pointcloud_topic:=/camera/depth_registered/points
Publish the message ‘e_start’:
rostopic pub /mcr_perception/scene_segmentation/event_in std_msgs/String "data: 'e_start'"
Publish the message ‘e_add_cloud_start’:
rostopic pub /mcr_perception/scene_segmentation/event_in std_msgs/String "data: 'e_add_cloud_start'"
This last one will save the current point cloud of the observed object in your system.
Warning: Sometimes the camera won’t save the point cloud (don’t worry, it’s not your fault). Just try a different position for the object until it works.
6. Visualize Point Cloud
Run the following in the folder where you saved the point clouds:
pcl_viewer *.pcd file you want to open*
4.2 Dataset
4.2.1 Dataset collection
3D dataset collection
We use a rotating table to collect point cloud data.
Note: This only works with a single object.
Setup:
1. Using robot arm camera
2. Using camera
• Launch the camera
Go to RealSense2 camera for more information about the camera.
• Apply static transform from camera_frame to base_link as explained in RealSense2 camera
Make sure the point cloud of the plane is parallel to the ground in rviz by transforming/rotating it.
Note: The passthrough filter will not work if the cloud is not parallel to the ground
• Then launch multimodal object recognition
roslaunch mir_object_recognition multimodal_object_recognition.launch debug_mode:=true
Note: Dataset collection requires debug mode to be enabled. You can also point to a specific logdir to save the data, e.g. logdir:=/home/robocup/cloud_dataset
• Trigger data collection mode
rostopic pub /mir_perception/multimodal_object_recognition/event_in std_msgs/String e_data_collection
• Start collecting the dataset
rostopic pub /mir_perception/multimodal_object_recognition/event_in std_msgs/String e_start
• Stop data collection mode
rostopic pub /mir_perception/multimodal_object_recognition/event_in std_msgs/String e_stop_data_collection
2D dataset collection
• Collect dataset using rotating table
• Collect dataset using random surface
• Collect dataset during competition
4.2.2 Dataset preprocessing
2D dataset preprocessing
• Object detection
– Download labelme
– Label the data
– Augment data
– Convert to a particular format: VOC, KITTI etc
4.3 Training
4.4 Object segmentation
4.4.1 3D object segmentation
4.4.2 2D object segmentation
4.5 Object recognition
4.5.1 3D object recognition
4.5.2 2D object recognition
4.6 Barrier tape detection
4.7 Cavity detection
CHAPTER 5
Manipulation
MAS industrial robotics - manipulation tutorial
CHAPTER 6
Navigation
6.1 On workstation or your PC
• To ssh into the youbot (in all terminals):
yb4
Note: alias yb4='ssh -X robocup@youbot-brsu-4-pc2'
• Export the youbot ssh alias
export_yb4
Note: alias export_yb4='export ROS_MASTER_URI=http://youbot-brsu-4-pc2:11311'
• Run rviz
rosrun rviz rviz
Set the global frame to base_link
6.2 2D SLAM
• Run roscore
roscore
• Launch the robot
roslaunch mir_bringup robot.launch
• Run 2D SLAM
roslaunch mir_2dslam 2dslam.launch
Note: The map is built using the front laser only
• Run the map saver
Go to the map configuration directory
roscd mcr_default_env_config
By using ls you can see several folders corresponding to existing environments. You can either use an existing map or create a new one:
mkdir [map_name]
cd [map_name]
And then run:
rosrun map_server map_saver
This will create two files: map.pgm and map.yaml.
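The generated map.yaml references the image and describes how to interpret it. A typical file looks like this (the values shown are illustrative defaults, not from a specific map):

```yaml
image: map.pgm          # occupancy grid image saved alongside this file
resolution: 0.050000    # meters per pixel
origin: [-10.0, -10.0, 0.0]  # x, y, yaw of the lower-left pixel in the map frame
negate: 0
occupied_thresh: 0.65   # pixels darker than this are considered occupied
free_thresh: 0.196      # pixels lighter than this are considered free
```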
Finally, to use the map that you just created, you need to check which map will be loaded by the navigation stack:
echo $ROBOT_ENV
If you need to change it:
export ROBOT_ENV=[map_name]
Note: Usually the .rosc script is used to set the environment, among other variables.
6.3 2D Navigation
• Bringup the robot
First export the environment to be used:
export ROBOT_ENV=brsu-C025
Launch the robot:
roslaunch mir_bringup robot.launch
• Launch the navigation node
roslaunch mir_2dnav 2dnav.launch
• Create navigation goals and orientations
First you need to create the files where goals will be saved:
touch navigation_goals.yaml
touch orientation_goals.yaml
• Localize the robot
In rviz:
1. Select the 2D pose estimate
2. Click the position near the robot
3. Move with joystick
4. Launch navigation tools in yb2
• Save the navigation and orientation goals
roscd mcr_default_env_config
cd brsu-C025
rosrun mcr_navigation_tools save_map_poses_to_file
• Test navigation goal using move_base
rosrun mir_move_base_safe move_base_safe_server.py
rosrun mir_move_base_safe move_base_safe_client_test.py [source] [dest]
• Navigation test using refbox
roslaunch mir_basic_navigation_test refbox_parser.py
CHAPTER 7
Planning
MAS industrial robotics - task planning tutorial
7.1 Refbox parser
This node contains components to store the world model of the environment and the goals of the tasks in the knowledge base of the robot. The world model is defined as the locations and properties of the different movable objects (e.g. robot, boxes) in the environment. The world model node subscribes to the incoming data stream that reports the current state of the world (e.g. the referee box) and stores it in a format that other components can use.
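The storage role described above can be sketched as a simple store keyed by object name (plain Python, no ROS; in the real node, subscribers and the knowledge base take the place of this dict):

```python
class WorldModelStore:
    """Minimal sketch of a world model: movable objects and their locations.

    The real refbox parser feeds updates from the referee box into the
    robot's knowledge base; a dict stands in for that storage here.
    """

    def __init__(self):
        self._objects = {}

    def update(self, name, location, **properties):
        """Record or overwrite the location (and properties) of an object."""
        self._objects[name] = dict(location=location, **properties)

    def objects_at(self, location):
        """Return the names of all objects currently at a location."""
        return sorted(n for n, o in self._objects.items()
                      if o["location"] == location)
```

Other components can then query the store, for example to list the objects at a workstation before planning a pick.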
7.1.1 Requirements
sudo apt-get install flex ros-kinetic-mongodb-store ros-kinetic-tf2-bullet freeglut3-dev
• Rosplan
The following folders are not required; they can be excluded by running touch CATKIN_IGNORE inside the respective directories: rosplan_demos, rosplan_interface_mapping, rosplan_interface_movebase
7.1.2 Usage
1. Launch the component (example):
roslaunch mir_refbox_parser refbox_parser.launch
7.1.3 Testing
1. The knowledge base service must be running to test the loading of the knowledge base:
roslaunch mir_pddl_problem_generator rosplan_knowledge_base_example.launch
Before running the test, the file has to be copied to the mir_pddl_problem_generator/ros/test/example_domain folder.
2. For testing navigation we need the navigation action server
roslaunch mir_move_base_safe move_base.launch
Input(s)
• event_in: trigger to start the node.
• refbox: the ros topic (string format) on which the BTT, BNT, BPT, BMT and CBT messages are published
Output(s)
• e_status: returns whether the data storage was successful
7.2 Planning bringup
Contains a launch file which brings up all the necessary modules for the robocup @work competition.
7.3 Planning core
Handles plan generation, execution, monitoring and replanning (if needed)
7.3.1 Dependencies
• mir_planning_msgs
• mir_states
• mir_pddl_problem_generator (PDDL problem generator)
• mir_task_planning (Task planning)
• mir_planner_executor (Planner executor)
• Optionally (with real robot) mir_actions (Actions)
7.3.2 Usage
This is not a standalone package (it acts as a coordinator) and thus it should not be executed without its dependencies.
To test with mockup action servers without a robot
roscore
roslaunch mir_planning_core task_planning_components.launch
rosrun mir_task_executor task_executor_mockup
roslaunch mir_task_planning upload_problem.launch
roslaunch mir_planning_core task_planning_sm.launch
7.4 Task planning
mir_task_planning generates task plans using a classical planner.
This module provides
• ROS-independent wrapping for planners (currently mercury and lama) in /common/planner_wrapper/.
Based on the planner requested, this module executes a shell command as specified in the config file (see Configuration). This generates a number of files in the /tmp/plan/ directory. The wrapper parses the most optimal plan file and returns it as a list of strings. The remaining files are deleted for a fresh start for the next request.
• a ROS wrapper on top of the previously mentioned module, available in the /ros folder. The ROS wrapper is implemented as an ActionServer.
Upon a request from an ActionClient containing domain_file, problem_file, planner and mode, the server tries to plan with the provided configuration and returns a CompletePlan.
– domain_file - file path of domain file (it should be a .pddl file)
– problem_file - file path of problem file (it should be a .pddl file)
– planner - name of planner to use (lama or mercury)
– mode - PlanGoal.NORMAL or PlanGoal.FAST (fast mode implies that the first plan found will be returned; no further optimisation will be performed by the planner)
Note: Fast mode should only be used when a small number of goals is provided, as it can produce a very non-optimal plan.
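The placeholder substitution described above (all-CAPS words in the configured command being replaced) can be sketched in plain Python. File handling and plan parsing are omitted, and resolving rospkg_name + executable_path via rospkg is replaced here by an explicit executable argument:

```python
def build_planner_command(config, domain, problem, out_file,
                          time_limit="60", executable=None):
    """Fill the all-CAPS placeholders of a planner command template.

    `config` is one entry of `planner_commands` (e.g. the 'mercury' dict
    from the Configuration section). Sketch only.
    """
    replacements = {
        "TIMELIMIT": time_limit,
        "EXECUTABLE": executable or config["executable_path"],
        "DOMAIN": domain,
        "PROBLEM": problem,
        "FILENAME": out_file,
    }
    command = config["command"]
    for placeholder, value in replacements.items():
        command = command.replace(placeholder, value)
    return command
```

The same templating works for both planners since only the command string differs between the mercury and lama config entries.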
7.4.1 Requirements
• mercury_planner
• lama_planner
Note: g++-multilib is needed to install these packages.
7.4.2 Usage
• Without ROS
python planner_wrapper.py
• With ROS
roscore
roslaunch mir_task_planning task_planner.launch
roslaunch mir_task_planning task_planner_client_test.launch
7.4.3 Configuration
# shell commands to execute planners
planner_commands:
    # all CAPS words will be replaced in the code
    mercury:
        command: 'timeout TIMELIMIT EXECUTABLE DOMAIN PROBLEM FILENAME'
        rospkg_name: 'mercury_planner'
        executable_path: 'build/Mercury-fixed/seq-sat-mercury/plan'
    lama:
        command: 'EXECUTABLE --search-time-limit TIMELIMIT --alias seq-sat-lama-2011 --plan-file FILENAME DOMAIN PROBLEM'
        rospkg_name: 'lama_planner'
        executable_path: 'fast-downward/fast-downward.py'
7.4.4 Additional files
• ros/launch/upload_problem.launch and ros/test/upload_problem can be used for testing the whole planning pipeline without the Refbox. This launch file acts as the refbox, refbox client and refbox parser. It reads a .pddl problem file and uploads instances and facts to the knowledge base. To test the whole planning pipeline without the refbox, execute the following:
roscore
roslaunch mir_planning_core task_planning_components.launch
rosrun mir_task_executor task_executor_mockup
roslaunch mir_task_planning upload_problem.launch
roslaunch mir_planning_core task_planning_sm.launch
• common/pddl contains robocup’s domain file and a bunch of problem files for testing.
Note: This folder is needed by default if common/planner_wrapper/planner_wrapper.py is running standalone.
7.5 Planner executor
planner_executor is the main executable, which runs when planner_executor.launch is launched. It creates a set of BaseExecutorAction objects. When an execute-plan goal is received, it calls each individual action’s execute function.
Most of the time, this execute function will change the names of the parameters obtained from the planner to something that makes sense. After that, it calls the run function, which
• creates a goal for the action server
• sends this goal to the action server
• waits for the server to respond within a certain time duration
• returns true if the server responds with success, otherwise false
Some actions are Combined, which means they share most things at the planning and execution level but use different servers (for example, perceive location and perceive cavity). Such an action checks one of its parameters and, based on this check, calls one of the servers as described above.
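The execute/run split described above can be sketched as a small class hierarchy (plain Python, no ROS action servers; the parameter mapping shown is hypothetical, only the control flow mirrors the description):

```python
class BaseExecutorAction:
    """Sketch of the executor's base action."""

    def run(self, params):
        """In the real executor: build an action goal, send it to the
        action server, wait for a timed response, return success."""
        raise NotImplementedError


class ExecutorAction(BaseExecutorAction):
    # maps planner parameter names to names the action server expects
    RENAMES = {}

    def execute(self, params):
        """Rename planner-supplied parameters, then call the server."""
        renamed = {self.RENAMES.get(k, k): v for k, v in params.items()}
        return self.run(renamed)


class MoveAction(ExecutorAction):
    RENAMES = {"param_1": "destination_location"}  # hypothetical mapping

    def run(self, params):
        # stand-in for calling the move_base action server
        return "destination_location" in params
```

A Combined action would override execute to inspect one parameter and dispatch to one of two run implementations, each targeting a different server.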
7.5.1 Usage
Normal use
roslaunch mir_planner_executor planner_executor.launch
Testing
roslaunch mir_planner_executor planner_executor.launch
roslaunch mir_planner_executor planner_executor_test.launch
7.5.2 OOP Structure
BaseExecutorAction
|
'- ExecutorAction
|  |
|  '- MoveAction
|  |
|  '- PickAction
|  |
|  '- PlaceAction
|  |
|  '- StageAction
|  |
|  '- UnstageAction
|  |
|  '- BasePerceiveAction
|  |  |
|  |  '- PerceiveAction
|  |  |
|  |  '- PerceiveCavityAction
|  |
|  '- BaseInsertAction
|     |
|     '- InsertAction
|     |
|     '- InsertCavityAction
|
'- CombinedPerceiveAction
|
'- CombinedInsertAction
7.6 PDDL problem generator
Automatically generates a PDDL problem definition from a knowledge base snapshot.
7.7 Knowledge base analyzer
1. Queries the knowledge base and tells you if there are pending goals
2. Queries the knowledge base and informs you if there is new knowledge
7.7.1 Usage
Get objects for a particular location:
rostopic pub /mir_knowledge_base_analyzer/knowledge_base_queries/query_param std_msgs/String "WS02"
rostopic pub /mir_knowledge_base_analyzer/knowledge_base_queries/query std_msgs/String "get_objects_at_location"
rostopic echo /mir_knowledge_base_analyzer/knowledge_base_queries/objects_at_location
Get current location of robot:
rostopic pub /mir_knowledge_base_analyzer/knowledge_base_queries/query std_msgs/String "get_robot_location"
rostopic echo /mir_knowledge_base_analyzer/knowledge_base_queries/robot_location
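The query-dispatch pattern behind these topics can be sketched in plain Python. The class below is a ROS-free model of the analyzer's behaviour as implied by the commands above; the knowledge base contents and method names are fabricated for illustration:

```python
# Minimal sketch of the knowledge base analyzer's query dispatch, modelled
# without rospy. Query strings follow the rostopic commands above.

class KnowledgeBaseAnalyzerSketch:
    def __init__(self, facts, robot_location):
        self.facts = facts              # e.g. {"WS02": ["M20", "S40_40_G"]}
        self.robot_location = robot_location
        self.query_param = None

    def set_query_param(self, param):   # models .../knowledge_base_queries/query_param
        self.query_param = param

    def query(self, name):              # models .../knowledge_base_queries/query
        if name == "get_objects_at_location":
            return self.facts.get(self.query_param, [])
        if name == "get_robot_location":
            return self.robot_location
        raise ValueError("unknown query: %s" % name)

kb = KnowledgeBaseAnalyzerSketch({"WS02": ["M20", "S40_40_G"]}, "WS01")
kb.set_query_param("WS02")
print(kb.query("get_objects_at_location"))  # ['M20', 'S40_40_G']
print(kb.query("get_robot_location"))       # WS01
```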
7.8 MIR Knowledge
Stores semantic knowledge about the industrial domain.
7.9 Actions
A collection of action servers for performing basic RoboCup actions.
The RoboCup@Work domain is partitioned into six basic actions:
• move base
• pick
• perceive
• place
• stage
• unstage
Each action is implemented as a SMACH state machine wrapped with an ActionServer. An ActionClient needs to send its request using a GenericExecuteGoal.
This request message contains a single dictionary-like field called parameters of type diagnostic_msgs/KeyValue[].
# goal definition
diagnostic_msgs/KeyValue[] parameters
---
# result definition
diagnostic_msgs/KeyValue[] results
---
# feedback
string current_state
string text
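Packing and unpacking the parameters field can be sketched without ROS by modelling each diagnostic_msgs/KeyValue as a (key, value) pair. The helper names below are illustrative, not the actual Utils.py API:

```python
# Sketch of how a client might pack, and a server unpack, the `parameters`
# field (diagnostic_msgs/KeyValue[]) of a GenericExecuteGoal.

def pack_parameters(d):
    """dict -> list of KeyValue-like pairs, as sent in a goal."""
    return [(k, str(v)) for k, v in sorted(d.items())]

def unpack_parameters(pairs):
    """list of KeyValue-like pairs -> dict, as a server would read them."""
    return {k: v for k, v in pairs}

goal_params = pack_parameters({"location": "WS01", "object": "M20"})
print(goal_params)                      # [('location', 'WS01'), ('object', 'M20')]
print(unpack_parameters(goal_params))   # {'location': 'WS01', 'object': 'M20'}
```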
Each server needs different information from this request message. Please see the following sections for details:
7.9.1 Move base safe
Move robot base from a known/mapped location to another known/mapped location while looking for barrier tape.
Goal parameter description
• destination_location: name of known/mapped location (e.g. WS01, SH02, PP01)
• dont_be_safe: bool determining if barrier tape detection should be used or not (e.g. true, false)
• arm_safe_position: joint configuration name for the arm while the base is in motion. This is used for barrier tape detection. (e.g. barrier_tape, folded)
7.9.2 Perceive location
Perceive a workstation and detect and classify objects on it.
Goal parameter description
• location: name of known/mapped location (e.g. WS01, SH02, PP01)
7.9.3 Perceive cavity
Perceive a precision placement workstation and detect and classify cavities in it.
Goal parameter description
• location: name of known/mapped location (e.g. PP01, PP02)
7.9.4 Perceive mock
Perceive an ArUco cube on a workstation.
Related to: mir_perceive_aruco_cube
Goal parameter description
• location: name of known/mapped location (e.g. WS01, SH02, PP01)
Usage
Make sure the Planning bringup launch file is already running.
roslaunch mir_perceive_mock perceive_aruco_server.launch
rosrun mir_perceive_mock perceive_aruco_client_test
7.9.5 Pick object
Pick an object from a workstation at a known/mapped location
Goal parameter description
• location: name of known/mapped location (e.g. WS01, SH02)
• object: name of object to be picked (e.g. M20, S40_40_G)
7.9.6 Place object
Place object in gripper on a workstation at a known/mapped location
Goal parameter description
• location: name of known/mapped location (e.g. WS01, SH02)
7.9.7 Stage object
Put the object in gripper on robot’s back platform.
Goal parameter description
• platform: name of robot’s back platform (e.g. PLATFORM_MIDDLE, PLATFORM_LEFT)
7.9.8 Unstage object
Pick an object from robot’s back platform.
Goal parameter description
• platform: name of robot’s back platform (e.g. PLATFORM_MIDDLE, PLATFORM_LEFT)
7.9.9 Insert object
Insert an object into a container on a workstation after picking it from robot’s back platform.
Goal parameter description
• platform: name of robot’s back platform (e.g. PLATFORM_MIDDLE, PLATFORM_LEFT)
• hole: name of container on workstation (e.g. CONTAINER_BOX_BLUE)
7.9.10 Insert cavity
Insert an object into a cavity on a precision placement workstation after picking it from robot’s back platform.
Goal parameter description
• peg: name of object which needs to be inserted (e.g. M30, F20_20_B)
• platform: name of robot’s back platform (e.g. PLATFORM_MIDDLE, PLATFORM_LEFT)
• hole: name of container on workstation (e.g. PP01, PP02)
Additionally, the Planner executor also sends next_action as one of the parameters. Action servers can use this to execute arm motions in parallel while the base is in motion, to save some time. At the moment, only Move base safe uses this information.
This package also contains Utils.py, which provides utility functions for the action servers.
7.9.11 Usage
roscore
roslaunch mir_actions run_action_servers.launch
To execute an action, call the corresponding action client with appropriate arguments. For example, move base safe:
rosrun mir_move_base_safe move_base_safe_client_test.py WS01
7.10 Planning Visualisation
Visualise the planning related knowledge in RViz.
Knowledge base and plan visualised
1. move_base actions according to the current plan
2. Unfinished goals (objects that need to be placed) [Green]
3. Objects that need to be picked according to the current plan [Blue]
4. Normal objects that need not be interacted with
5. Objects stored on the robot's platform
Knowledge base visualised without plan with fake objects
7.10.1 Configuration
The configuration file for generating a marker from a 3D model is defined as follows:
m20:
  file_name: 'm20.stl'
  scale: 0.001
  color:
    r: 0.1
    g: 0.1
    b: 0.1
  offset:
    x: 0.0
    y: 0.0
    z: 0.0
    roll: 0.0
    pitch: -90.0
    yaw: 180.0
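Since the roll/pitch/yaw offsets above are given in degrees, they presumably get converted to a quaternion before being written into a marker message. The sketch below shows one plausible conversion (roll-pitch-yaw, ZYX convention); whether this matches the real code is an assumption, see Utils.get_marker_from_obj_name_and_pos() for the actual behaviour:

```python
# Sketch: turn the m20 config entry above into marker orientation fields.
# Degrees-to-radians and the ZYX euler-to-quaternion formula are assumptions.

import math

def euler_deg_to_quaternion(roll, pitch, yaw):
    """Roll-pitch-yaw angles in degrees -> (x, y, z, w) quaternion (ZYX)."""
    r, p, y = (math.radians(a) / 2.0 for a in (roll, pitch, yaw))
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y), math.sin(y)
    return (sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy,
            cr * cp * cy + sr * sp * sy)

config = {  # mirrors the m20 entry above
    "file_name": "m20.stl",
    "scale": 0.001,
    "color": {"r": 0.1, "g": 0.1, "b": 0.1},
    "offset": {"x": 0.0, "y": 0.0, "z": 0.0,
               "roll": 0.0, "pitch": -90.0, "yaw": 180.0},
}

o = config["offset"]
quat = euler_deg_to_quaternion(o["roll"], o["pitch"], o["yaw"])
print([round(q, 3) for q in quat])
```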
The marker should be created at the bottom center of the given position. (See mir_planning_visualisation.utils.Utils.get_marker_from_obj_name_and_pos() for more info.)
7.10.2 Test
roscore
roslaunch mir_planning_core task_planning_components.launch
roslaunch mir_task_planning upload_problem.launch
rosrun mir_planner_executor planner_executor_mockup
roslaunch mir_planning_visualisation test_planning_visualiser.launch
roslaunch mir_planning_core task_planning_sm.launch
See Planning Visualisation for further info on the gif.
CHAPTER 8
Simulation
8.1 Mapping and Navigation for Simulation
8.1.1 Mapping
Run simulation related nodes
• Run roscore
roscore
• Launch the robot (In another terminal)
roslaunch mir_bringup_sim robot.launch
• Run gazebo simulator (In another terminal)
rosrun gazebo_ros gzclient
• Run rviz (In another terminal)
rosrun rviz rviz
Note: To set up RViz, please go to the bottom of this page.
8.1.2 Generate map
• Run 2D SLAM (In another terminal)
roslaunch mir_2dslam 2dslam.launch
Now there should be a robot in an empty map in RViz.
• Move the robot around the map. (In another terminal)
roslaunch mir_teleop teleop_keyboard.launch
The WSAD keys set the robot in motion and the space bar stops it.
As you move the robot around, you should see walls appearing in the map in RViz, while all other area will be marked as free. After you have mapped the whole environment, you can save the map in the map config directory.
• Move to map config directory (In another terminal)
roscd mcr_default_env_config
• Then make a directory and move inside that newly created directory
mkdir test_map
cd test_map
• Now you can save the map that you just created
rosrun map_server map_saver
This should create two files, map.pgm and map.yaml. Now you can stop the mir_2dslam node. You can also exit mir_teleop, gazebo, and mir_bringup_sim.
8.1.3 Making the map usable
To use this map for navigation in the future, follow these steps:
• Add files where the goals will be saved (in the same directory where the map files have been saved)
touch navigation_goals.yaml
touch orientation_goals.yaml
• Make a copy of the existing launch file.
cd ~/catkin_ws/src/mas_common_robotics/mcr_environments/mcr_gazebo_worlds/ros/launch
cp brsu-c025-sim.launch test_map.launch
Note: Inside test_map.launch, edit the argument of robot_env (line 10): replace brsu-c025-sim with test_map and save the file.
• Make a copy of the xacro file.
cd ~/catkin_ws/src/mas_common_robotics/mcr_environments/mcr_gazebo_worlds/common/worlds/
cp brsu-c025-sim.urdf.xacro test_map.urdf.xacro
Note: Now your newly created map should be ready for use.
8.1.4 Navigation
• Run the commands from “Run simulation related nodes” as mentioned above to bring the robot up.
• Launch the navigation node
roslaunch mir_2dnav 2Dnav.launch
• Open RViz
Add a PoseArray in RViz and change its topic to /particlecloud. You should now see red arrows around the robot; these arrows show pose hypotheses of the robot.
• You now need to localize the robot to get its correct pose. Move the robot around the map. (In another terminal)
roslaunch mir_teleop teleop_keyboard.launch
Rotate the robot in place using the QE keys. You will notice the red arrows converging around the robot. Once the robot is reasonably localised, you can navigate it around in 2 ways:
1. GUI (RViz)
Click on the 2D Nav Goal and select a goal on the map.
2. Terminal based (ROS Action Server Client)
For the action server client, the robot first needs named poses for the source and destination places (it cannot use raw x, y, theta values).
To name the poses, execute save_base_map_poses_to_file:
roscd mcr_default_env_config
cd test_map
rosrun mcr_navigation_tools save_base_map_poses_to_file
This is an interactive, terminal-based program. It will ask you to name positions.
1. You can now navigate the robot to your desired position (using GUI of RViz or mir_teleop).
2. Once your robot is at the desired position, you can enter a name and press enter.
Note: The name of the location should be ALL CAPS, for example CORNER_1, MAIN_DOOR, etc. If the name contains any lower case character, the server will not work. You will see the pose of the robot inside square brackets on the next line and be prompted for another name.
3. Repeat step 1 and 2 to add multiple names to different locations inside the map.
4. To close this interactive program, press Ctrl + z. (Note: Ctrl + c won't work.) Then kill this process.
ps
Take a note of the PID of the python process.
kill -9 <pid_number>
Now stop mir_2dnav and start it again.
Launch move base launch file (In another terminal)
roslaunch mir_move_base_safe move_base.launch
Run the server file. (In another terminal)
rosrun mir_move_base_safe move_base_safe_server.py
You can test everything by running client test file. (In another terminal)
rosrun mir_move_base_safe move_base_safe_client_test.py SOURCE_NAME DESTINATION_NAME
For example, if you want to move the robot from MAIN_DOOR to CORNER_1, then
rosrun mir_move_base_safe move_base_safe_client_test.py MAIN_DOOR CORNER_1
Note: The source location is irrelevant for the client test file. Your robot can be anywhere and the program will still work correctly; just give some valid location name as a placeholder. The client will print success : True if it was able to navigate to the destination successfully.
8.1.5 RViz setup
• Add Map, RobotModel, and LaserScan using the "Add" button in the bottom left corner of the "Displays" panel of RViz.
• In Map, change the topic to /map
• In LaserScan, change the topic to /scan_front. Add another LaserScan and change its topic to /scan_rear.
• In global option, change the Fixed Frame to map.
Note: You can also add another PoseArray and change its topic to /move_base/GlobalPlannerWithOrientation to visualise the plan created by the mir_2dnav node.
8.2 World Generation
Procedurally generates an @work arena for the Gazebo simulation.
Generates
• .xacro file containing the world model for the Gazebo simulator
• map.pgm and map.yaml files for the occupancy grid used by ROS navigation
• navigation_goals.yaml file for the Move base safe action
Change the generation parameters in the common/config/config.yaml file. (see Configuration)
8.2.1 Usage
Generate all necessary files using grid_based_generator.py
roscd mir_world_generation/common/mir_world_generation
python grid_based_generator.py
Visualise all the generated files using example .launch file
roslaunch mir_world_generation sim.launch
8.2.2 Configuration
# how many workstations of each type are required
ws_type_to_num:
  ws: 8
  sh: 2
  pp: 1

# path where all the generated files should be stored
generation_dir: '/tmp'

# number of rows and columns of cells in the grid
num_of_rows: 3
num_of_cols: 4

# threshold for generating a wall between 2 cells [0.0 - 1.0]
# increase this value to generate a more complex environment (more walls and/or more ws)
wall_generation_threshold: 0.3

# threshold for generating noise in the occupancy grid [0.0 - 1.0]
noise_threshold: 0.15

# number of retries before giving up
max_retries_allowed: 5

# distance from the base link of the robot to the center of a workstation (used for nav goals)
base_link_to_ws_center: 0.65
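How a threshold like wall_generation_threshold could drive procedural wall placement can be sketched as below. The actual logic lives in grid_based_generator.py; this is only an illustration of the threshold idea, with the edge enumeration and function name invented here:

```python
# Sketch: a threshold in [0.0, 1.0] decides, per pair of neighbouring grid
# cells, whether a wall separates them. Higher threshold -> more walls.

import random

def generate_walls(num_of_rows, num_of_cols, wall_generation_threshold, seed=42):
    """Return the set of (cell, neighbour) edges that get a wall."""
    rng = random.Random(seed)
    walls = set()
    for r in range(num_of_rows):
        for c in range(num_of_cols):
            # consider the south and east neighbours of each cell once
            for nr, nc in ((r + 1, c), (r, c + 1)):
                if nr < num_of_rows and nc < num_of_cols:
                    if rng.random() < wall_generation_threshold:
                        walls.add(((r, c), (nr, nc)))
    return walls

walls = generate_walls(3, 4, 0.3)
print(len(walls) <= 17)  # True: a 3x4 grid has at most 17 internal edges
```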
8.2.3 Examples
CHAPTER 9
Perception Documentation
CHAPTER 10
Manipulation Documentation
10.1 mir_moveit_scene_ros
10.1.1 mir_moveit_scene_ros.attach_grasped_object module
10.1.2 mir_moveit_scene_ros.restrict_arm_workspace module
10.2 mir_pregrasp_planning_ros
10.2.1 mir_pregrasp_planning_ros.pose_mock_up_gui module
10.2.2 mir_pregrasp_planning_ros.pregrasp_planner_node module
10.2.3 mir_pregrasp_planning_ros.simple_pregrasp_planner_utils module
CHAPTER 11
Navigation Documentation
11.1 mir_move_base_ros
11.1.1 mir_move_base_ros.move_base module
CHAPTER 12
Planning Documentation
12.1 mir_actions
12.1.1 mir_actions.utils module
12.2 mir_knowledge_base_analyzer
12.2.1 mir_knowledge_base_analyzer_ros.knowledge_base_analyzer module
12.3 mir_knowledge_ros
12.3.1 mir_knowledge_ros.problem_uploader module
12.4 mir_manipulate_drawer
12.5 mir_planning_visualisation
12.5.1 mir_planning_visualisation.kb_visualiser module
12.5.2 mir_planning_visualisation.plan_visualiser module
12.5.3 mir_planning_visualisation.utils module
12.6 mir_task_planning
12.6.1 mir_task_planning.utils module
12.7 planner_wrapper
12.8 mir_states
12.8.1 common.action_states module
12.8.2 common.basic_states module
12.8.3 common.manipulation_states module
12.8.4 common.navigation_states module
12.8.5 common.perception_mockup_util module
12.8.6 common.perception_states module
12.9 mir_refbox_parser_ros
12.9.1 mir_refbox_parser_ros.refbox_parser module
CHAPTER 13
Simulation Documentation
13.1 mir_world_generation
13.1.1 mir_world_generation.grid_based_generator module
13.1.2 mir_world_generation.node module
13.1.3 mir_world_generation.utils module