

29 APR 2019 Seminar

Multimodal Event-based Perceptive Control for Humanoid Whole Body Coordinating Actions

Mr. WANG Siyu

Abstract:

Robot deployments are gradually moving from structured environments, such as factories, to daily scenarios where people live. Since humanoid robots are designed for human-like usage, they cannot avoid interactions with unexpected and unstructured human behaviors and/or environmental effects, which is one of the biggest challenges for traditional robot planning and control.


The humanoid robot is one of the most complicated and highly coupled robotic systems because of its high DoF (Degree of Freedom) and rapidly changing contact and support conditions. Being equipped with human-like limbs, the humanoid robot is expected to gain human-like coordination ability between limbs under unexpected disturbances. This differs from traditional multi-robot cooperation, which relies on time-based motion references at the joint level. Humans, by contrast, possess the intelligence to respond to unexpected events and to handle locomotion and manipulation tasks by coordinating limb motions, relying on perceptions as references and on task-level scheduling.


Inspired by this, an event-based perceptive control framework will be adopted for multi-limb coordination of the humanoid robot, which also allows sensory motion references and the integration of joint-level and task-level control. For different tasks (such as locomotion and manipulation), different event-based sensory references are defined (a balance criterion and position error, respectively) for state-of-the-art task execution under unexpected disturbances. In addition, for limbs involved in multiple tasks (such as the arms), a hybrid system will be proposed in which the event-based reference changes as the current task priority changes (for example, the relative priority of locomotion and manipulation tasks). Two representative tasks, locomotion and manipulation, are selected, and the coordination scheme will be experimentally implemented and tested on a full-size 29-DoF humanoid robot.
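
As an illustration only (not taken from the speaker's implementation), the following Python sketch shows one way such priority-driven reference switching could be organized for a dual-task limb: the arm tracks either a balance criterion or an end-effector position error, and switches references when a balance event fires. All class names, signals, and thresholds are hypothetical assumptions.

# Hypothetical sketch of event-based reference switching for a dual-task limb.
# Task names, thresholds, and signals are illustrative assumptions, not the
# speaker's actual implementation.
from dataclasses import dataclass
from enum import Enum, auto


class Task(Enum):
    LOCOMOTION = auto()    # tracked via a balance criterion (e.g. ZMP error)
    MANIPULATION = auto()  # tracked via end-effector position error


@dataclass
class SensoryState:
    balance_error: float    # deviation of the ZMP from the support-polygon center
    position_error: float   # end-effector distance from the manipulation target


class EventBasedArmController:
    """Switches the arm's event-based reference when task priority changes."""

    def __init__(self, balance_threshold: float = 0.05):
        self.balance_threshold = balance_threshold
        self.active_task = Task.MANIPULATION

    def update_priority(self, s: SensoryState) -> Task:
        # Event: a balance violation pre-empts manipulation; otherwise the arm
        # returns to tracking the manipulation reference.
        if s.balance_error > self.balance_threshold:
            self.active_task = Task.LOCOMOTION
        else:
            self.active_task = Task.MANIPULATION
        return self.active_task

    def reference_error(self, s: SensoryState) -> float:
        # The controller tracks whichever sensory reference the active task defines.
        if self.active_task is Task.LOCOMOTION:
            return s.balance_error
        return s.position_error


if __name__ == "__main__":
    ctrl = EventBasedArmController()
    for state in [SensoryState(0.01, 0.20), SensoryState(0.08, 0.20)]:
        task = ctrl.update_priority(state)
        print(task.name, ctrl.reference_error(state))

The point of the sketch is that the switching condition is a sensory event (balance violation) rather than a point on a pre-planned timeline, which is the distinction the abstract draws between event-based and time-based references.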


From another point of view, the humanoid robot is also expected to interact with humans in human-like, natural ways. Intuitive and natural interaction modalities between robotic systems and humans have been increasingly studied and adopted in recent years. Unlike traditional robot inputs, which require prior knowledge or training, natural language, EMG (Electromyogram) and other instinctive human-robot interfaces offer versatility and place fewer requirements on the users.


Present natural-language or EMG-based control still uses time as the principal reference and requires a correct temporal sequence of tasks. However, users are likely to issue language commands or physical EMG signals in a mixed order, based on direct observation and intuitive human thinking. It is therefore vital for the robot to determine the task/subtask order, as well as the corresponding motion planning and control, from its own sensory information, which will be integrated into the event-based control framework.
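
For illustration only, the hypothetical Python sketch below shows how mixed-order user commands could be dispatched when the robot's own sensory preconditions are satisfied, rather than in the order they were spoken. The command names, sensor fields, and predicates are assumptions, not part of the presented work.

# Hypothetical sketch: dispatching mixed-order user commands by sensory
# preconditions instead of by the order in which they were received.
# Commands, predicates, and sensor fields are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Sensors:
    at_table: bool = False        # robot has walked to the table
    object_grasped: bool = False  # gripper reports a stable grasp


@dataclass
class Command:
    name: str
    precondition: Callable[[Sensors], bool]  # sensory event that must hold first
    action: Callable[[Sensors], None]        # effect on the (simulated) robot state


def dispatch(commands: List[Command], sensors: Sensors) -> List[str]:
    """Execute commands as their sensory preconditions become true, not in
    the (possibly mixed) order the user issued them."""
    executed, pending = [], list(commands)
    while pending:
        ready = [c for c in pending if c.precondition(sensors)]
        if not ready:
            break  # wait for new sensory events (omitted in this sketch)
        for c in ready:
            c.action(sensors)
            executed.append(c.name)
            pending.remove(c)
    return executed


if __name__ == "__main__":
    s = Sensors()
    # The user speaks the commands in reverse of their feasible order.
    spoken = [
        Command("hand over", lambda st: st.object_grasped,
                lambda st: None),
        Command("grab the cup", lambda st: st.at_table,
                lambda st: setattr(st, "object_grasped", True)),
        Command("go to the table", lambda st: True,
                lambda st: setattr(st, "at_table", True)),
    ]
    print(dispatch(spoken, s))  # ['go to the table', 'grab the cup', 'hand over']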


Overall, a novel perceptive planning and control scheme will be explored to tackle the research gap in humanoid robotics concerning limb cooperation under unexpected disturbances. Further uses of the scheme, such as human-machine interfaces, can be studied as well.

Venue

HW 8-28

Speakers

Mr. WANG Siyu

Date

29 April 2019

Time

9:00am – 11:00am
