
IJBSCHS Biomedical Soft Computing and Human Sciences, Vol.23, No.2, pp.83-89 [Original article] Copyright©1995 Biomedical Fuzzy Systems Association
(Accepted in November 2018)
Personal Care Robot: Mechanism, Motion Control, and Pose Estimation
Guang YANG1, Shuoyu WANG1, Bo SHEN1, Hayato ENOKI2, Kenji ISHIDA3, Kazuo OKUHATA4, Yoshinobu MIZOBUCHI4 and Shingo INO4
1School of System Engineering, Kochi University of Technology
2Faculty of Health and Welfare, Tokushima Bunri University 3Kochi Medical School Hospital, Kochi University
4Medical Division, Satt Systems Co., Ltd.
Abstract: To confront the long-term daily care challenge caused by the shortage of nursing personnel, we are developing a single-body personal care robot for walking-disabled and frail elderly people. In this research, we propose a novel personal care robot with 23 degrees of freedom. Following an introduction to the robot's mechanical structure and sensor configuration, a motion control method is provided. Additionally, to ensure accurate object manipulation, we present a pose estimation pipeline for important daily necessities in personal care. Ultimately, the proposed motion control and pose estimation methods were verified by experiments.
Keywords: Personal care robot, Motion control, Pose estimation
1. Introduction
The lack of a care workforce for elderly and disabled people is now a worldwide problem. According to data published by the Ministry of Health, Labour and Welfare in January 2017, the population in Japan requiring daily support and nursing care is approximately 12.45 million people (almost one tenth of the total population). As for nursing care personnel, owing to the requirement for advanced skills (depending on the varied characteristics of the people served and the diversity of the care tasks), the gap between supply and demand keeps increasing [1].
Confronted with these challenges, robots that can provide daily personal care will be of great help. Traditional industrial manipulators can perform complex and accurate object manipulation missions [2] but suffer from a lack of mobility. Service robots [3] focus mostly on indoor navigation and human-robot interface techniques using voice recognition and touch screens; therefore, they are weak in physical interaction with objects. There are also products such as the "KUKA youBot", which integrates a manipulator and a mobile platform [4]. Still, considering its joint configuration, attaching various types of sensors and performing personal care tasks in a household environment are difficult.
In this paper, we present a single-body multifunctional personal care robot called "KUT-PCR". With its humanoid upper body and omnidirectional lower platform, KUT-PCR can take care of users in various ways. We then propose a motion control method for the robot to navigate the indoor environment and manipulate intended objects. Additionally, to provide the perception ability required in a complete fetching and serving task, we propose a pose estimation algorithm. Finally, we demonstrate the possibility for KUT-PCR to provide household living support using the proposed methods.
The remainder of this paper is organized as follows. We start with an overview of personal care environments and care tasks in Section 2. Section 3 introduces the mechanical design and sensor configuration of KUT-PCR. The proposed control method and pose estimation algorithm are introduced in Sections 4 and 5. We then evaluate our approaches through experiments, including object pose estimation, fetching, and serving, in Section 6. The paper concludes with a summary of the work and plans for future research.
(Correspondence: 185 Miyanokuchi, Tosayamada, Kami City, Kochi 782-8502, Japan; Phone: +81-0887-57-2013; e-mail: [email protected])
2. Personal Care Environments and Tasks
2.1. Personal care environments
When designing personal care robots, the operating environment must be considered. In this research, we assumed an ordinary household environment for two reasons. First, according to our survey, lower-limb disabled or frail, aged people who live independently usually get help from care providers once or twice per day within a fixed period of time. Therefore, long-term home care is in short supply. Second, an ordinary household environment that has not been modified to accommodate the application of robots is usually narrow and crowded. Therefore, if our robot can perform life support tasks in such an environment, it should also work in places that can be modified to facilitate robot applications (elderly care centers or hospitals, for example).
Figure 1 shows an indoor environment that includes a bedroom, kitchen, bathroom, and toilet. The patient (H) suffers from a walking disability caused by age or illness; therefore, he cannot move around freely. Consequently, he spends most of the day lying in bed. When there are no housekeepers or family members at home, it is impossible for him to get a drink from the refrigerator on his own, even if he feels thirsty.
Figure 1. An indoor household environment for personal care robots
2.2. Personal care tasks
Three kinds of tasks are critical to maintaining a healthy life: eating (keeping physical strength and obtaining nutrition), excreting (eliminating waste and toxins), and taking a bath or shower (personal hygiene). If personal care robots help with these three types of tasks, patients can live a comfortable life.
In this study, we focused on the task of serving food and drink. In Figure 1, a refrigerator is in the kitchen. If the patient (H) wants a bottle of tea, the personal care robot (R) needs to navigate from the bed to the refrigerator, recognize and pick up the tea, and then return to the patient. Figure 2 demonstrates a scenario involving the object fetching and serving task.
Figure 2. Personal care task: serve food and drink
3. Personal Care Robot KUT-PCR
Figure 3 shows our newly developed personal care robot, the KUT-PCR, based on the research in [5]. To perform various types of care tasks with only one robot, KUT-PCR was given two characteristics: an omnidirectional lower body and a humanoid upper body. In this section, we introduce its mechanical structure and sensor configuration.
3.1. The lower body
The lower body is an omnidirectional platform equipped with four omnidirectional wheels. The platform can move in any direction and rotate in the horizontal plane. Therefore, the KUT-PCR can move smoothly, even in narrow indoor spaces.
Also, six ultrasonic sensors (Figure 3, number 11) can measure nearby obstacles and moving people; touch sensors (Figure 3, number 10) can detect touches on the lower body; and bumper sensors (Figure 3, number 12) alert the controller whenever the robot contacts the surrounding environment, such as walls or furniture. Additionally, two laser rangefinders (Figure 3, number 9) are mounted to face the front and back of the robot, allowing wide-range detection of the surroundings in real time. Apart from obstacle avoidance, the robot can also perform simultaneous localization and mapping (SLAM) using the laser data.
3.2. The upper body
The upper body consists of the head, neck, chest, waist, and arms. The neck (Figure 3, number 3) can rotate with two degrees of freedom and is equipped with an RGB-D camera, a set of microphones, and stereo speakers (Figure 3, numbers 1, 2, and 6). The waist (Figure 3, number 5) can rotate similarly to a human's, with three degrees of freedom. Each of the two manipulators consists of an arm (Figure 3, number 8) with six degrees of freedom and an end effector (Figure 3, number 4) with one degree of freedom. Also, we attached an RGB-D camera (Figure 3, number 7) to the chest of the robot to measure objects in front of the robot's torso. The purpose of using cameras on both the head and chest was to obtain more environment information in cases where the directions of the head and chest differ.
Given that this personal care robot consists of omnidirectional wheels and various external sensors, it can move freely while avoiding obstacles, even in a narrow household environment. Also, with the coordinated operation of the waist, neck, and manipulators, tasks such as object transportation can be easily completed. Additionally, users can accept the help of the robot more naturally because of its friendly, human-like appearance.
4. Independent Control Method for the Personal Care Robot
For safe and reliable control of the KUT-PCR, we propose a control method involving independent operation of the upper body and lower body. Specifically, the humanoid upper body remains steady while the lower platform performs omnidirectional motions; conversely, when the upper body is manipulating objects, the lower platform stays still. Based on this strategy, the robot can achieve smooth navigation and solid manipulation missions with high reliability and low computational complexity.
The robot's kinematics are modeled using the coordinate frame configuration demonstrated in Figure 4.
The kinematic equations are given as Equations (1)-(4). Equation (1) describes the lower platform kinematics in the world reference frame; Equation (2) describes the kinematics of the left manipulator with respect to the upper body frame; Equation (3) describes the kinematics of the right manipulator with respect to the upper body frame; and Equation (4) describes the kinematics of the object with respect to the lower platform frame.

$r_0 = [X_0, Y_0, \theta_0]^T = F_0(\phi_1, \phi_2, \phi_3, \phi_4)$  (1)

$r_L = [x_L, y_L, z_L, \alpha_L, \beta_L, \gamma_L]^T = F_L(\theta_{L1}, \ldots, \theta_{L7}, \theta_{w1}, \theta_{w2}, \theta_{w3})$  (2)

$r_R = [x_R, y_R, z_R, \alpha_R, \beta_R, \gamma_R]^T = F_R(\theta_{R1}, \ldots, \theta_{R7}, \theta_{w1}, \theta_{w2}, \theta_{w3})$  (3)

$r_o = [x_o, y_o, z_o, \alpha_o, \beta_o, \gamma_o]^T = F_o(r_L, r_R)$  (4)

where $(X_0, Y_0, \theta_0)$ is the position and orientation of the lower platform; $\phi_1, \phi_2, \phi_3, \phi_4$ are the rotation angles of the omnidirectional wheels; $(x_L, y_L, z_L, \alpha_L, \beta_L, \gamma_L)$ is the position and orientation of the left end effector; $\theta_{L1}, \ldots, \theta_{L7}, \theta_{w1}, \theta_{w2}, \theta_{w3}$ are the joint angles of the left manipulator and waist; $(x_R, y_R, z_R, \alpha_R, \beta_R, \gamma_R)$ is the position and orientation of the right end effector; $\theta_{R1}, \ldots, \theta_{R7}, \theta_{w1}, \theta_{w2}, \theta_{w3}$ are the joint angles of the right manipulator and waist; $(x_o, y_o, z_o, \alpha_o, \beta_o, \gamma_o)$ is the position and orientation of the object; and $r_L, r_R$ are the positions and orientations of the left and right end effectors.
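The explicit form of $F_0$ depends on the wheel geometry defined in Figure 4, which is not reproduced here. As an illustrative sketch only, assuming four mecanum-type omnidirectional wheels of radius $R$ mounted at half-wheelbase $l_x$ and half-track $l_y$, a standard body-frame inverse velocity kinematics is:

$$
\begin{bmatrix} \dot{\phi}_1 \\ \dot{\phi}_2 \\ \dot{\phi}_3 \\ \dot{\phi}_4 \end{bmatrix}
= \frac{1}{R}
\begin{bmatrix}
1 & -1 & -(l_x + l_y) \\
1 & 1 & (l_x + l_y) \\
1 & 1 & -(l_x + l_y) \\
1 & -1 & (l_x + l_y)
\end{bmatrix}
\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta}_0 \end{bmatrix}
$$

Rotating the world-frame velocities $(\dot{X}_0, \dot{Y}_0)$ into the body frame by $-\theta_0$ recovers the world-frame form of Equation (1); KUT-PCR's actual wheel arrangement may differ from this assumed configuration.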
The control flow is as follows. First, based on Equation (1), the omnidirectional platform moves so that the object falls within the range of manipulation. Then, based on Equation (4), the operation poses of the left and right manipulators are obtained from the position and orientation of the object. Next, the joint angles of the left/right manipulators and waist are calculated using Equations (2) and (3). Finally, the target angle values are sent to the servo system to actuate the joints.
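As a minimal sketch of this control flow, the steps could be coded as below. All helper names, offsets, and numbers are illustrative assumptions; the explicit forms of F_0, F_L, F_R, and F_o are robot-specific and are not reproduced in this paper.

import numpy as np

REACH = 0.6  # assumed manipulation range of the arms [m]

def platform_goal(object_xy):
    """Step 1 (Eq. (1)): choose a platform pose that brings the object
    within arm reach, stopping REACH metres short of it and facing it.
    Coordinates are world-frame with the platform at the origin."""
    direction = object_xy / np.linalg.norm(object_xy)
    xy = object_xy - REACH * direction
    theta = np.arctan2(direction[1], direction[0])
    return np.array([xy[0], xy[1], theta])

def operation_pose(object_pose):
    """Step 2 (Eq. (4)): derive an end-effector target from the object
    pose; here, an assumed grasp 5 cm above the object with its yaw."""
    x, y, z, roll, pitch, yaw = object_pose
    return np.array([x, y, z + 0.05, roll, pitch, yaw])

def inverse_kinematics(ee_pose):
    """Step 3 (Eqs. (2)-(3)): joint angles (7 arm + 3 waist) for the
    target pose. A real solver is robot-specific; this stub only
    fixes the interface."""
    return np.zeros(10)  # placeholder joint vector

if __name__ == "__main__":
    object_pose = np.array([1.2, 0.4, 0.8, 0.0, 0.0, 0.3])
    print("platform target:", platform_goal(object_pose[:2]))
    q = inverse_kinematics(operation_pose(object_pose))
    print("joint targets (to servo system):", q)  # Step 4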
5. Pose Estimation Pipeline
In addition to the motion control method, pose estimation of the intended object is necessary to ensure a successful pick-and-deliver task. A pose estimation pipeline takes raw sensor data as input and outputs the categories and 6-DoF (3-DoF position and 3-DoF orientation) poses of the recognized objects. Afterward, the category-pose information set {"Juice Box," (x, y, z, roll, pitch, yaw)} can be used by the upper body motion planner to generate manipulation actions.
Figure 5 illustrates the proposed pose estimation pipeline. At the start, an RGB-D camera obtains a scene cloud and RGB images. Then, taking an RGB image as input, the pre-trained convolutional neural network (CNN) [6] model produces the category and bounding box of the intended object. On one hand, the model library provides the corresponding object model for the recognized category; the original model is then processed using multiple methods, including down-sampling, to yield a model point cloud suitable for registration in the next step. On the other hand, the object point cloud is extracted from the scene cloud according to the bounding box. At this point, we have a model cloud that purely represents the object features and an object cloud that contains part of the object surface and some background. We then use the iterative closest point (ICP) algorithm to register the model cloud to the object cloud. The resulting transformation matrix contains the position and orientation of the object. Eventually, the 6-DoF pose and category are passed to the robot controller, which carries on with the manipulation task (using the control method discussed in Equations (1) to (4)).
Figure 5. Proposed pose estimation pipeline
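As a minimal sketch of the registration step, assuming the Open3D library (the paper does not name its implementation), illustrative file names, and an object cloud already cropped from the scene using the CNN bounding box:

"""Sketch of the ICP registration step of the pipeline (Sec. 5).
Open3D as the point-cloud library, the file names, and the voxel
size / correspondence threshold are all illustrative assumptions."""
import numpy as np
import open3d as o3d
from scipy.spatial.transform import Rotation

# Model cloud from the library, down-sampled for registration speed.
model = o3d.io.read_point_cloud("model_library/juice_box.pcd")
model = model.voxel_down_sample(voxel_size=0.005)

# Object cloud, assumed already cropped from the scene cloud using
# the CNN bounding box (partial surface plus some background).
obj = o3d.io.read_point_cloud("object_crop.pcd")

# Register the model cloud to the object cloud with point-to-point ICP.
result = o3d.pipelines.registration.registration_icp(
    model, obj,
    max_correspondence_distance=0.02,  # 2 cm search radius
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration
        .TransformationEstimationPointToPoint(),
)

T = result.transformation  # 4x4 homogeneous transformation matrix
x, y, z = T[:3, 3]
roll, pitch, yaw = Rotation.from_matrix(T[:3, :3]).as_euler("xyz")
print({"Juice Box": (x, y, z, roll, pitch, yaw)})  # to the controller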
6. Experiment
We carried out object pose estimation and drink-serving experiments to validate the proposed pose estimation algorithm and motion control method.
To use the pose estimation pipeline proposed in Section 5, a deep-learning module providing object detection for daily containers was required. First, we built a daily container dataset consisting of 20 categories of containers, including a juice box, green tea bottle, cola can, and milk box, which are frequently seen in an ordinary household environment. For each category, 100-200 pictures were taken from different angles and with various backgrounds, with hand-labeled bounding boxes. Then, we selected the CNN model "MobileNet" [7], which was pre-trained on the Microsoft COCO dataset [8]. We retrained the model on our dataset using transfer learning [9] so that the expected 20 categories could be detected.
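The paper's retraining used MobileNet within its own toolchain; as a stand-in illustration of the same transfer-learning idea (keep a COCO-pretrained detector, replace its classification head, fine-tune on the container dataset), a torchvision-based sketch might look like this:

"""Sketch of transfer learning for 20-class container detection (Sec. 6).
Faster R-CNN is a substitute for the paper's MobileNet-based detector;
the optimizer settings are illustrative assumptions."""
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights)
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 21  # 20 container categories + background

def build_model():
    # COCO-pretrained backbone and detector.
    model = fasterrcnn_resnet50_fpn(
        weights=FasterRCNN_ResNet50_FPN_Weights.COCO_V1)
    # Replace the classification head for the new categories.
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(
        in_features, NUM_CLASSES)
    return model

def finetune(model, data_loader, epochs=10):
    """data_loader yields (images, targets) with hand-labeled
    'boxes' and 'labels', as in the container dataset."""
    params = [p for p in model.parameters() if p.requires_grad]
    optimizer = torch.optim.SGD(params, lr=0.005, momentum=0.9)
    model.train()
    for _ in range(epochs):
        for images, targets in data_loader:
            losses = model(images, targets)  # dict of component losses
            loss = sum(losses.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()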
Figure 6 shows the experimental scenario. Multiple containers, including a juice box, tea bottle, and two paper cups, were placed on the table along with numerous tools. Among them, the paper cups and tools were not included in the daily container dataset and were therefore regarded as background noise. The pose estimation target was set to be the juice box.
To validate the accuracy of the pose estimation method, we used an augmented reality (AR) marker board to provide the ground truth of the object's 6-DoF pose (Figure 6).
Figure 6. Evaluation of pose estimation accuracy
The pose estimation pipeline was executed 10 times, with the objects on the table reconfigured between trials. The position and orientation errors are shown in Figures 7 and 8, respectively. In all trials, the position error was less than 0.0035 m, and the orientation error was less than two degrees.
Figure 7. Position error
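The paper does not detail its error metrics; a common choice, sketched here as an assumption, is the Euclidean distance between the estimated and marker-derived positions and the geodesic angle between the two orientations:

"""Sketch of pose-error metrics against AR-marker ground truth.
These definitions reflect standard practice, not a published detail
of the paper's evaluation."""
import numpy as np

def position_error(t_est, t_gt):
    """Euclidean distance between estimated and true positions [m]."""
    return np.linalg.norm(np.asarray(t_est) - np.asarray(t_gt))

def orientation_error_deg(R_est, R_gt):
    """Geodesic angle between two 3x3 rotation matrices [degrees]."""
    R_rel = R_est @ R_gt.T
    cos = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos))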
Additionally, we carried out a pick-and-deliver experiment to evaluate the application of the control method and pose estimation on the KUT-PCR. The table was set without placing markers, and user A was lying on the bed nearby. The KUT-PCR was supposed to pick up a tea bottle from the table, reach a waypoint beside the bed, and serve the drink. The trial was repeated ten times, with eight successes and two failures. The experimental scenario is illustrated in Figure 9.
Figure 8. Orientation error
7. Conclusions
In this paper, we introduced the multifunctional personal care robot KUT-PCR, considering its mechanical design and sensor configuration. We also proposed a motion control method and a pose estimation algorithm, which were then evaluated through experiments.
In future work, we will further improve the robot's perception and coordinated operation skills. We intend to develop algorithms that will allow the performance of more personal care tasks, which could increase the quality of life of patients with lower-limb disabilities and of the elderly.
Acknowledgments
This work was supported in part by the … Promotion Foundation.
References
[1] …: Future nursing care supply and demand, http://www.meti.go.jp/press/2018/04/20180409004/
[2] …: robot manipulator for human-robot interaction, Therm. Sci., Vol.20, pp.S537-S548, 2016.
[3] Triebel, R. et al.: SPENCER: A socially aware service robot for passenger guidance and help in busy airports, Springer Tracts Adv. Robot., Vol.113, pp.607-622, 2016.
[4] Bin, M. et al.: Force adaptation algorithm for finger exercise using KUKA youBot, J. Telecommun. Electron. Comput. Eng., Vol.9, No.3, pp.27-30, 2017.
[5] Wang, S., Ishida, K. and Fujie, M.: A multi-function type human support robot for independent living, WWLS 2010, pp.536-537, 2010 (in Japanese).
[6] Ren, S., He, K., Girshick, R. and Sun, J.: Faster R-CNN: Towards real-time object detection with region proposal networks, IEEE Trans. Pattern Anal. Mach. Intell., Vol.39, No.6, pp.1137-1149, 2017.
[7] Howard, A. G. et al.: MobileNets: Efficient convolutional neural networks for mobile vision applications, arXiv preprint, 2017.
[8] Lin, T. Y. et al.: Microsoft COCO: Common objects in context, Lect. Notes Comput. Sci., Vol.8693, pp.740-755, 2014.
[9] Pan, S. J. and Yang, Q.: A survey on transfer learning, IEEE Trans. Knowl. Data Eng., Vol.22, No.10, pp.1345-1359, 2010.
Guang YANG
He received the B.E. and M.E. degrees in Electrical Engineering from Shenyang University of Technology, China, in 2014 and 2017. He has been pursuing the PhD degree at Kochi University of Technology since 2017. He is a student member of the IEEE and the Robotics Society of Japan. His present research interests include robot control, machine vision, and task planning.
Shuoyu WANG
He received the B.E. and M.E. degrees in Control Engineering from Shenyang University of Technology, China, in 1983 and 1988, and the PhD degree in Electrical Engineering from Hokkaido University, Japan, in 1993. Dr. Shuoyu Wang is currently a professor in the School of Systems Engineering and a director of the Advanced Robot Research Center at the Kochi University of Technology, Kochi, Japan. His research interests include walking rehabilitation robots, control, and fuzzy reasoning. He is an academician of the Engineering Academy of Japan and a fellow of the Japan Society of Mechanical Engineers (JSME). He is also a member of the following associations: IEEE, the Robotics Society of Japan, the Japanese Society of Instrument and Control Engineers, the Japanese Society for Medical and Biological Engineering, and the Japan Society for Fuzzy Theory and Intelligent Informatics. His former positions include editor-in-chief of the Journal of the Robotics Society of Japan (2014-2015) and president of the Japanese Biomedical Fuzzy Systems Association.
Bo SHEN
He received the B.E. degree in Engineering from the Northeast University of China in 2010, the M.E. degree in Detection Technology and Automation Equipment from Shenyang University of Technology in 2013, and the Doctor degree in System Engineering from Kochi University of Technology, Japan, in 2016. He has since been with Kochi University of Technology. He is a member of the following associations: IEEE, the Robotics Society of Japan, the Japanese Society of Mechanical Engineers, and the Biomedical Fuzzy Systems Association. His present research interests include human intention recognition, life support robots, rehabilitation robots, motion control, and fuzzy reasoning.
Hayato ENOKI
He received his degree from Kochi Medical School, Kochi, Japan, in 2006. He worked as a physical therapist at the Kochi Medical School Hospital, Kochi University, from 2000 to 2014. He is currently an associate professor at the Department of Physical Therapy, Faculty of Health and Welfare, Tokushima Bunri University. His current research interests are walking rehabilitation and walking ability assessment. He is a member of the Japanese Association of Rehabilitation Medicine, the Japanese Society for Musculoskeletal Medicine, and the Japanese Physical Therapy Association.
Kenji ISHIDA
He received his degrees from Kochi Medical School, Kochi, Japan, in 1987 and 2002. He was an associate professor at the Kochi Medical School Hospital, Kochi University, from 2002 to 2013. He is currently an associate professor at the Research and Education Faculty, Medical Sciences Cluster, Clinical Medicine Unit. His current research interests are walking rehabilitation and walking ability assessment, and he performs clinical application of an intelligent walking-support robot. He is a member of the Japanese Association of Rehabilitation Medicine, the Japanese Orthopaedic Association, and the Japan College of Rheumatology. He has been an instructor of the Japanese Association of Rehabilitation Medicine since 2004.
Kazuo OKUHATA
He worked at … (1991-1994), and then on the design, development, and manufacture of electrical and communication equipment at SISTEC CO., LTD. (1994-2007). Since 2008, he has worked on mechanical design, development, and manufacture at Satt Systems Ltd.
Yoshinobu MIZOBUCHI
He received his degrees from Kochi University of Technology, Kochi, Japan, in 2001, 2003, and 2006. He worked at Ebisu Denki Ltd. (2006-2018) and is currently working at Satt Systems Ltd. (2018-). His interests include electronic circuit design, microcomputer software, and image processing.
Shingo INO
He received his degree from … University, Kochi, Japan, in 1988. He is currently the chairman of BASARA Ltd., Satt Systems Ltd., and ZERO Ltd. His interests include mechanical design and drawing, and hydraulic circuit design.