Proposal to Participate in RoboCup 2010 Standard Platform League
Nanyang Technological University
I-Ming Chen Ming Xie
Robotics Research Center
School of Mechanical and Aerospace Engineering
Nanyang Technological University
1. Statement of Commitment to Participate in RoboCup 2010 SPL
RoboCup is an important international robotics competition that offers a standard platform on which
researchers and students in robotics, AI, and other relevant fields can put their work to the test.
The School of Mechanical and Aerospace Engineering (MAE) of Nanyang Technological University
strongly supports the participation of Team Nanyang in RoboCup 2010 and future RoboCup events
by providing financial, technical, and manpower support to the team. Team Nanyang, with
Professors I-Ming Chen and Ming Xie as team leaders, commits to participate in the 2010 edition of
RoboCup in the Standard Platform League.
2. Leaders and Constitution of the Team
2.1 Team history
Team Nanyang was officially formed in May 2009 in the School of Mechanical and Aerospace
Engineering as part of a program to train students’ hands-on engineering and system integration
capabilities and to nurture future robotics and mechatronics researchers. The team currently has six
of the latest NAO V3+ (academic edition) robots for the RoboCup 2010 competition, together with a
dedicated lab space and a competition field for practice matches and research.
The School of Mechanical and Aerospace Engineering has a strong robotics research program with
more than 15 faculty members from different disciplines, more than 40 full-time researchers, and
more than S$8 million in current active research funding. RoboCup 2010 will be our first RoboCup
participation. Besides the SPL, we will also send a team to the RoboCupRescue project for RoboCup 2010.
Prior to RoboCup 2010, we have competed many times in local robotics competitions such as the
Singapore Robotic Games. Of particular note is our participation in the TechX Robotics Challenge
hosted by the Ministry of Defence of Singapore in 2008, which called for an intelligent robotic
platform that could autonomously navigate outdoor and indoor environments, climb stairs, and
operate elevators. The final winning award of S$1 million was not claimed by any of the contesting
teams owing to the high complexity of the required tasks. However, among the 6 teams that qualified
for the final (out of the 24 that entered), our team was the best performer and came closest to the
final targets. In terms of research capabilities, we are strong in robot vision, navigation, locomotion,
control, and system integration.
2.2 Team Composition
Our Standard Platform League team comprises mostly 3rd-year Mechatronics undergraduate students
from the School of Mechanical and Aerospace Engineering, supported by faculty, research students,
and lab technicians. The Mechatronics students in the School of MAE are trained not only in
fundamental mechanical engineering subjects but also in electronics, embedded systems, computer
programming, and control. Hence, they have very good knowledge of robotics and a passion for
pursuing robotics projects. Some of the team members gained experience in RoboCup and other
international robotics competitions during their prior study in polytechnics. Thus, the team is not
entirely new to such events.
2.2.1 Team Leaders
Professor I-Ming Chen – Dr. Chen is currently Director of Intelligent Systems Center in NTU. He
was JSPS Visiting Scholar in Kyoto University, Japan in 1999, Visiting Scholar in the Department of
Mechanical Engineering of MIT in 2004, and Fellow of Singapore-MIT Alliance Program from 2003
to 2007. His research interests are in wearable sensors, human-robot interaction, reconfigurable
automation, parallel kinematics machines (PKM), biomorphic underwater robots, and smart material
based actuators. Dr. Chen has published more than 190 papers in refereed international journals and
conferences as well as book chapters. He is now serving on the editorial boards of IEEE Transactions
on Robotics, IEEE/ASME Transactions on Mechatronics, Mechanism and Machine Theory, and
Robotica, and is also Associate Editor-in-Chief of Frontiers of Mechanical Engineering (Springer-
Verlag). He was General Chairman of the 2006 IEEE Conferences on Cybernetics & Intelligent
Systems and Robotics, Automation & Mechatronics (CIS-RAM) in Thailand, and General Chairman
of the 2009 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM2009)
in Singapore. He is a senior member of
IEEE and member of ASME, and member of RoboCup Singapore National Committee.
Professor Ming Xie – Dr. Xie is now concurrently holding the positions of Associate Professor at
Nanyang Technological University, Editor-in-Chief of International Journal of Humanoid Robotics,
and Associate Editor of the IEEE Transactions on Autonomous Mental Development. In addition, he has
served as the General Chair of International Conference on Climbing and Walking Robots in 2007,
and International Conference on Intelligent Robotics and Applications in 2009. Prof Xie has
published one best-seller book in robotics, has won two scientific awards, and also has won two best
conference paper awards. He has worked with Renault Automation (Paris/France) in 1986, INRIA
Sophia-Antipolis (Nice/France) between 1990 and 1993, and Singapore-MIT Alliance between 2000
and 2004. Prof Xie’s research strengths are in machine intelligence and humanoid robotics.
2.2.2 Team Members
Hen Ree Ooi, MAE BEng Student (Mechatronics)
Agus Herryanto, MAE BEng Student (Mechatronics)
Zhao Chen, MAE BEng Student (Mechatronics)
Daniel Chan Bin Yang, MAE BEng Student
Yan Ding, MAE BEng Student (Mechatronics)
Edward Cahyono, MAE BEng Student (Mechatronics)
Bing Bing Li, MAE BEng Student (Mechatronics)
Ning Li, MAE BEng Student (Mechatronics)
Narendra, MAE BEng Student (Mechatronics)
Kwang Ying Seet, MAE BEng Student (Mechatronics)
Roland Rustan Tan, MAE BEng Student (Mechatronics)
Minh Tri Vu, MAE BEng Student (Mechatronics)
Jian Fang Xiao, MAE BEng Student (Mechatronics)
Chao Zhang, MAE BEng Student (Mechatronics)
Peng Zeng, MAE BEng Student (Mechatronics)
3. Team’s Research Interests and Planned Activities
3.1 Research Interests
In the School of MAE, we have built a number of humanoid robot platforms for research and industry.
For example, a life-size Low Cost Humanoid (LOCH) is currently being developed by Prof Ming Xie
to achieve vision-guided mobile manipulation with fully functional humanoid arms and hands. A
life-size robotic lion dance system (a two-humanoid dynamic manipulation system,
http://interactiverobotics.Blogspot.com) designed by Prof I-Ming Chen performed in a live ceremony
on Oct 17, 2008. The system features a novel dexterous lion-head manipulation mechanism and a
highly coordinated robot motion control system with multimedia performance capability. In addition
to these systems, extensive investigations have been made in the
study of humanoid robot locomotion, robot vision, and task coordination. With NAO humanoid robots
and RoboCup events, we hope to strengthen our current humanoid robotics research in the following
areas:
• The humanoid robot platform as a system integration tool
• Robot vision for humanoid robot task coordination and guidance
• Humanoid robot navigation
• Robot motion learning from motion-capture data and wearable sensors
• Biped locomotion and its use for self-localization
• Robot formation coordination and control
3.2 Planned Activities
With 6 NAO robots in the lab, we are currently working on the following issues related to RoboCup
with smaller student groups: 1) Robot soccer behavior and locomotion; 2) Vision analysis; 3)
Software interfacing and control. We will hold frequent internal matches between teams to improve
the programming, control, and game strategy prior to RoboCup 2010.
Soccer behavior & Locomotion
The study of different robot soccer behaviors is a prerequisite to evaluating the possibilities for
locomotion. All robot behaviors are assumed possible when designing the soccer strategy. Each
behavior consists of a number of functional robot movements, such as walking, running, and kicking.
Functional movements are analyzed with humanoid robot kinematics for a feasibility check;
functions that fail the check have to be redesigned and tested repeatedly (Fig. 1).
Figure 1. Soccer behaviors and locomotion
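The behavior-to-movement decomposition and feasibility check described above can be sketched as follows. The behavior names, movement names, required angles, and joint limits are illustrative placeholders of our own, not part of the Aldebaran API.

```python
# Hypothetical sketch: each soccer behavior is a list of functional
# movements, and each movement is checked against joint limits.
BEHAVIORS = {
    "chase_ball": ["walk_to_ball", "align_to_goal", "kick"],
    "goalkeeping": ["track_ball", "side_step", "dive"],
}

# Illustrative joint limits in radians (not the NAO's actual values).
JOINT_LIMITS = {"HipPitch": (-1.77, 0.48), "KneePitch": (-0.09, 2.11)}

def is_feasible(movement, required_angles):
    """Check that every joint angle a movement needs is within limits."""
    for joint, angle in required_angles.get(movement, {}).items():
        lo, hi = JOINT_LIMITS[joint]
        if not lo <= angle <= hi:
            return False
    return True

def check_behavior(behavior, required_angles):
    """Return the movements of a behavior that fail the kinematic check."""
    return [m for m in BEHAVIORS[behavior]
            if not is_feasible(m, required_angles)]
```

Movements flagged by `check_behavior` would then be redesigned and re-tested, matching the loop in Fig. 1.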
NAO Vision analysis
The cameras installed in the NAO robot capture images continuously at a fixed interval determined
by the frame rate. The parameters of the cameras will be investigated in order to process the camera
images for ball and goal identification and for localization of teammates. Useful physical outputs,
such as the goal position and distance, can be tabulated for a quick decision-making process.
Software interfacing and control
The various desired robot soccer functions will be identified and implemented for the goalkeeper
robot and the field robots. We will focus on using different learning algorithms with the Aldebaran
library functions to interface with and control the robot. Robust communication among the NAO
robot team members will be studied and implemented for competition.
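As a sketch of the team communication layer, each robot could periodically broadcast a small status packet to its teammates. The message layout below is our own illustrative convention, not an official SPL or Aldebaran protocol.

```python
import struct

# Hypothetical teammate status message: robot id, field position (x, y),
# and distance to the ball. The layout is our own convention.
FMT = "<Bfff"  # little-endian: uint8 id, three float32 values

def pack_status(robot_id, x, y, ball_dist):
    """Serialize one status update into a compact byte string."""
    return struct.pack(FMT, robot_id, x, y, ball_dist)

def unpack_status(data):
    """Decode a status packet received from a teammate."""
    robot_id, x, y, ball_dist = struct.unpack(FMT, data)
    return {"id": robot_id, "x": x, "y": y, "ball_dist": ball_dist}

# In practice each NAO would broadcast this over UDP, e.g.:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
#   sock.sendto(pack_status(2, 1.0, -0.5, 0.8), ("255.255.255.255", 10100))
```

A compact fixed-size packet keeps per-message overhead low, which matters when several robots broadcast many times per second over wireless.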
4. Summary of Past Relevant Work and Scientific Publications
4.1 NAO related works
NAO Robot kinematics
From the perspective of its kinematics, the NAO robot has five kinematic chains: one neck chain,
with the head as the end-effector (since it carries the primary cameras); two leg chains, with the feet
as end-effectors; and two arm chains, with the fingers as end-effectors. Each chain comprises a
number of motors that can be actuated independently of one another. In total, the NAO robot has 21
motors, from its neck down to its feet. We have developed kinematics models based on D-H notation
for the legs and arms of the NAO for posture control.
Figure 2. NAO robot structure
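A forward-kinematics model of one chain can be built by composing standard Denavit-Hartenberg link transforms, as sketched below. The D-H parameters passed in are illustrative, not the NAO's actual values.

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform as a 4x4 matrix."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def matmul(A, B):
    """Multiply two 4x4 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_params):
    """Chain the link transforms; returns the end-effector pose (4x4)."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for theta, d, a, alpha in dh_params:
        T = matmul(T, dh_matrix(theta, d, a, alpha))
    return T
```

For example, a planar two-link chain with unit link lengths and zero joint angles places the end-effector at x = 2 in the base frame; substituting the NAO's measured link parameters gives the leg and arm poses needed for posture control.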
Study and definition of soccer behavior
We have investigated basic robot soccer functions on the field based on the robot kinematics namely:
• Looking for ball
• Going to ball
• Penalty shoot-out
NAO robot locomotion
For each robot soccer behavior, a number of functional robot movements are designed, such as
standard walking, kicking, getting up from a fall, and diving. In designing each movement, the
postures, joint speeds, and stability of the robot are the main concerns.
NAO walking study
The walking that the NAO robot performs comprises several phases, each of which is important for
maintaining the stability of the robot. The phases are shown in Fig. 3. These phases are split into
further sub-phases defining the lift, swing, and weight acceptance of the alternating legs. The
walking cycle starts by tilting the NAO’s body toward the left or right so that one leg can be raised
above the other. In this single-support phase, the NAO can move the raised leg forward. The next
phase sets the angles of the motors in the forward-moving leg; several sets of joint angles are needed
to produce this forward motion. After the leg has moved forward and landed, the NAO passes from
the single-support phase to double support. The remaining half of the cycle is symmetrical to the
first.
Figure 3. Biped walking
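The phase cycle described above can be written down as a simple sequence. The phase names follow the text; their number and ordering here are an illustrative simplification of the actual sub-phase timing.

```python
# Minimal sketch of the walking phase cycle: tilt, single support and
# swing on one side, land, then mirror the sequence on the other side.
PHASES = [
    "double_support",        # both feet on the ground
    "weight_shift_left",     # tilt the body so the right leg unloads
    "single_support_left",   # swing the right leg forward
    "double_support",        # the right foot lands
    "weight_shift_right",    # mirror half of the cycle
    "single_support_right",  # swing the left leg forward
]

def walk_cycle(half_steps):
    """Yield the phase sequence for a given number of half-steps."""
    for i in range(half_steps):
        yield PHASES[i % len(PHASES)]
```

A gait controller would attach joint-angle trajectories and durations to each phase; the cycle simply repeats for continuous walking.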
In order to maintain the stability of the NAO robot while it is walking, the following issues are
considered.
A. Statically balanced walking
A robot is statically stable if it remains stable indefinitely when all motion of the robot is stopped (frozen)
at any point in time. Statically balanced walking is achieved by projecting the Centre of Gravity
(CoG) on the ground and ensuring that it is always inside the foot support area. The foot support area
is defined as the foot surface of the supporting leg in a single support phase or as the minimum
convex area containing both foot surfaces in a double support phase. As the dynamics of the robot are
not taken into account, static walking speeds will remain slow enough that inertial forces are
negligible. This usually results in static walkers having large feet and strong ankle joints. This initial
approach to walking has been largely abandoned due to its slow speed and lack of realistic, agile
motion.
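The static-stability test above reduces to a point-in-polygon check: project the CoG onto the ground and test whether it lies inside the support area. The sketch below assumes a convex support polygon given in counter-clockwise order; the foot dimensions in the example are illustrative.

```python
def point_in_convex_polygon(p, poly):
    """True if point p lies inside the convex polygon poly (CCW vertices)."""
    px, py = p
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        # For a CCW polygon the cross product must be non-negative
        # for every edge if the point is inside.
        if (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1) < 0:
            return False
    return True

def statically_stable(cog_xy, support_polygon):
    """Static balance: CoG ground projection inside the support area."""
    return point_in_convex_polygon(cog_xy, support_polygon)
```

In the double-support phase the polygon would be the convex hull of both foot outlines, as defined in the text.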
B. Dynamically Balanced Walking
Dynamically balanced walking allows for the CoG of the robot to be outside of the support region
whilst remaining dynamically stable. There is no single clear criterion for determining whether a gait
is dynamically stable; however, most dynamically balanced walking algorithms use two stability criteria:
the Zero Moment Point and/or the angular momentum of the robot, treating the robot as an inverted
pendulum pivoting on the support area of the foot.
NAO robot vision
The hardware for the vision system of NAO consists of two cameras in front of the head of NAO
arranged vertically with a maximum 640x480 resolution. As shown in Fig. 4, the two cameras cover
different areas in front of the robot. There is a band of roughly 5 cm, lying about 1 m in front of the
robot, that neither camera covers. With a field of view of 58 degrees, the two cameras are able to
cover half the match field. The focus range of the cameras is from 30 cm to infinity; mounted at a
height of about 40 cm, they allow the NAO to capture the position of the ball on the field.
Figure 4. Field of view of NAO cameras
During the computation, the positions of the ball are determined from the images captured by the
cameras. The images are sent to the RAM of the NAO’s processor and then turned into commands
for the robot to act on. The speed of the robot’s reaction depends on the image processing algorithm.
The simple face and logo detection functions built into the NAO will be used to increase the NAO’s
response speed in the visual detection of the ball.
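As a simple alternative view of the detection step, a ball can also be located by scanning a frame for ball-coloured pixels and taking their centroid. The sketch below works on a raw RGB pixel list; the colour thresholds are illustrative, and real code would read frames through the Aldebaran vision API instead.

```python
# Illustrative colour-based ball detector over a raw RGB frame.
def find_ball_center(pixels, width, height, is_ball_color):
    """Return the centroid (x, y) of pixels matching the ball colour,
    or None if no matching pixel is found."""
    xs, ys = [], []
    for y in range(height):
        for x in range(width):
            if is_ball_color(pixels[y * width + x]):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def orange(rgb):
    """Crude threshold for an orange ball (illustrative values)."""
    r, g, b = rgb
    return r > 180 and 60 < g < 160 and b < 80
```

The centroid returned here plays the same role as the detection position stored in the NAO's memory: its offset from the image centre drives the turn-angle and distance computations that follow.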
When a camera is located at a fixed position, the real positions of the objects inside its field of view
have a fixed relation to their positions in the picture the camera captures. Consider the camera as a
simple convex lens; in most cases the objects lie beyond the camera’s focal length, which means a
real image is formed on the camera’s CMOS sensor. For a 2-D system, regardless of an object’s
geometry (i.e., treating it as a point) and taking the camera as the origin, the relation between the real
object and its image is as follows.
Figure 5. Vision computation in NAO
The position of the real object A lies on the line determined by the origin and the image A’, and the
relation between the coordinates is

x_A’ = α · x_A,   y_A’ = α · y_A

where α is the magnification of the camera lens (a negative value), which is determined by the focal
length of the lens when the focal length is far less than the object distance.
For a 3-D situation, viewing the match from the top, the situation is similar to the 2-D one: the
coordinate system can be changed to a polar coordinate system with the same relation holding. Based
on this relation, we can change the configuration of the face detection function so that it detects other
NAO robots rather than human faces; likewise, the configuration of logo detection can be changed to
detect the ball in the match. The results of the detections can then be read from the NAO’s memory
and used to analyze the positions of the ball and the other robots.
For the case above, when the ball is detected, the position of the detection cross is stored in memory.
With a resolution of 640x480, the vector from the center of the image to the center of the cross can
be obtained by simple arithmetic. The horizontal component Δx indicates the direction the robot
needs to turn, and its magnitude gives the turn angle through the relation

θ = (Δx / 320) × 28

where θ is in degrees, the constant 320 is half the horizontal resolution, and 28 is half the field of
view of the camera in the NAO. The vertical component is related to the distance the NAO needs to
walk.
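The turn-angle relation can be sketched directly, with the pixel offset measured from the image centre:

```python
HALF_RES_X = 320   # half the 640-pixel horizontal resolution
HALF_FOV_X = 28.0  # half the horizontal field of view, in degrees

def turn_angle_deg(ball_px_x):
    """Horizontal angle (degrees) to turn toward a ball detected at
    pixel column ball_px_x of the 640x480 image; positive = image right."""
    return (ball_px_x - HALF_RES_X) / HALF_RES_X * HALF_FOV_X
```

A ball at the image centre needs no turn, while a ball at the image edge corresponds to the full half field of view of 28 degrees.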
Assuming the NAO’s head is level, the ball, being on the ground, will always appear below the
center line of the image, which means the vertical component of the vector is always negative. In
this case, however, the position of the ball cannot be found by a simple calculation. If the angle
between the axis of the NAO’s camera and the horizontal is α, and the height of the camera is h, the
ground distance d between the image center line and the bottom of the image can be found by

d = h/tan(α) − h/tan(α + 17.25°)

where h/tan(α) is the distance from the robot’s feet to the point where the axis of the camera meets
the ground, and the constant 17.25 is half the camera’s vertical view range, so the second term is the
distance from the robot’s feet to the near view limit of the camera. The distance to travel can then be
estimated by noting that each vertical pixel corresponds to roughly 1/240 of d.
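The distance estimate can be sketched as follows, combining the ground-span relation with the per-pixel fraction of d; the camera height and tilt in the example are illustrative values.

```python
import math

HALF_FOV_Y = 17.25  # half the camera's vertical field of view, degrees
V_RES_HALF = 240    # half the 480-pixel vertical resolution

def ground_view_span(h, alpha_deg):
    """d: ground distance between the image centre line and the bottom of
    the image, for camera height h and tilt alpha below horizontal."""
    a = math.radians(alpha_deg)
    b = math.radians(alpha_deg + HALF_FOV_Y)
    return h / math.tan(a) - h / math.tan(b)

def ball_distance(h, alpha_deg, pixels_below_centre):
    """Rough walking distance to a ball seen pixels_below_centre rows
    below the image centre: start from the centre-line intersection and
    subtract the fraction of d covered by the pixel offset."""
    d = ground_view_span(h, alpha_deg)
    return (h / math.tan(math.radians(alpha_deg))
            - pixels_below_centre * d / V_RES_HALF)
```

A ball on the image centre line gives h/tan(α); a ball at the very bottom of the image gives h/tan(α + 17.25°), the near view limit, consistent with the relation above.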
 Xie M., 2003, The Fundamentals of Robotics: Linking Perception to Action, World Scientific
 Xie M, Xian L. B., Wang L. and Li J., 2009, Mobile Robots: State of the Art in Land, Sea, Air and
Collaborative Missions, Biologically-inspired Design of Humanoids, I-Tech Education and Publishing.
 M. Xie, 2009, Five Steps of Evolution from Non-life to Life-like Robots, International Journal of
Humanoid Robotics, Vol. 6, No. 2, pp.307-327.
 Xie M, Zhong ZW, Zhang L, Yang HJ, Song CS, Li J, Xian LB and Wang L., 2008, Self Learning
of Gravity Compensation by LOCH Humanoid Robot, IEEE International Conference on
Humanoid Robots, Daejeon, South Korea.
 Wang L, Xie M., Zhong, Z. W., Wang C. and Zhang L., 2008, Power Analysis and Structure
Optimization in the Design of a Humanoid Robot, 11th International Conference on Climbing
and Walking Robots, Coimbra, Portugal.
 Rinaldo Christian Tanumara, Xie M., Au Chi Kit, 2006, Learning Human-like Color
Categorization through Interaction, International Journal of Computational Intelligence, Vol. 3,
No. 4, pp. 338-345
 Chen G. D., Sun, L. N. and Xie M, 2009, Camera Calibration Based on Extended Kalman Filter
Using Robot Motion, IEEE/ASME International Conference on Advanced Intelligent Mechatronics.
 Xie M., 2004, Robot Vision: A Holistic View, International Conference on Climbing and
Walking Robots, Spain
 Jin Yi, Xie M., 2000, Vision Guided Homing for Humanoid Service Robot, 5th International
Conference on Computer Integrated Manufacturing, Singapore, pp. 499-509.
 Wong Swee Meng, Xie M., 1999, Vision Functions for the Guidance of Smart Vehicle,
International Conference on Advanced Robotics, Japan, Japan Robot Association, pp. 457-462.
 Chen, I.-M., Tay, R., Xing, S., Yeo, S. H., “Marionette: From Traditional Manipulation to
Robotic Manipulation,” IEEE Robotics and Automation Magazine, Vol. 12, No. 1, pp. 59-74.
 Lim, K.Y., Dong, W., Goh, Y.K., Nguyen, K.D., Chen, I-M., Yeo, S.H., Duh, B.-L., “A
Wearable, Self-Calibrating, Wireless Sensor Network for Body Motion Processing,” Proc. IEEE
Int. Conf. Robotics and Automation, Pasadena, CA, USA, 2008.
 Luo Z Q, Duh B L, Chen I-M, Luo W S, “Spatial Navigation in a Virtual Multilevel Building:
The Role of Exocentric View in Acquiring Survey Knowledge,” Proc. Virtual and Mixed Reality,
Ed. R. Shumaker, Lecture Notes in Computer Science, Vol. 5622, pp. 60-69, Springer-Verlag.
 Yan L, Chen I-M, Yeo S H, Chen Y, Yang G, “A High-Dexterity Low-Degree-of-Freedom
Hybrid Manipulator Structure for Robotic Lion Dance,” Journal of Zhejiang University SCIENCE
A: Applied Physics & Engineering, accepted for publication, 2009.
 Nguyen K D, Chen I-M, Luo Z Q, Yeo S H, Duh B L, “A Wearable Sensing System for Tracking
and Monitoring of Functional Arm Movement,” IEEE/ASME Trans. Mechatronics, accepted for
publication.
 Chen C Y, Chen I-M, Cheng C C, “Integrated Design of a Legged Mechatronic System,” Frontiers
of Mechanical Engineering, Springer-Verlag, Vol. 4, No. 3, pp. 264-275, 2009.
 Yan L, Chen I-M, Yeo S H, Chen Y, Yang G, “A High-Dexterity Low-Degree-of-Freedom
Hybrid Manipulator Structure for Robotic Lion Dance,” 1st IFToMM Int’l Symposium on Robotics
and Mechatronics, Hanoi, Vietnam, Sep 2009.
 Nguyen K D, Chen I-M, Luo Z Q, Yeo S H, “A Body Sensor Network for Tracking and
Monitoring of Functional Arm Motion,” Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems
(IROS), St Louis, MO, USA, 2009.
 Han B S, Ho D, Tay A, Ng T L, Yow A P, Chen I-M, Yeo S H, Li H Z, “A Life-Size Robotic
Lion Dance System with Integrated Motion Control,” Proc. 18th IEEE International Symposium
on Robot and Human Interactive Communication, Toyama, Japan, 2009.