Dr. Yan Wu

Deputy Head of the Robotics and Autonomous Systems Department

A*STAR Institute for Infocomm Research

Biography

Yan Wu is the Deputy Head of the Robotics and Autonomous Systems Department at the A*STAR Institute for Infocomm Research, where he is also a Principal Scientist and Leader of the Manipulation and Human-Robot Collaboration Group. Yan received his BA (Hons) in Engineering from the University of Cambridge in 2007 and his PhD in Electrical Engineering from Imperial College London in 2013. From August 2012, he worked concurrently at the UCL Institute of Child Health as a Research Associate and at Great Ormond Street Hospital as a Research Fellow. Since December 2013, he has been with the A*STAR Institute for Infocomm Research, Singapore.

Yan is the current Chair of the IEEE Systems, Man and Cybernetics Society, Singapore Chapter, Vice President of the Pattern Recognition and Machine Intelligence Association, and a member of the IEEE Robotics and Automation Society Technical Committees on Cognitive Robotics, Haptics, and Neuro-Robotics Systems. He has volunteered at various conferences, including as Programme Chair of HFR2018, Local Chair of ASRU2019, Programme Chair of ICSR2021, Finance Chair of ICASSP2022, and Tutorial & Workshop Chair of IECON2023. He serves or has served as an Associate Editor on various editorial boards, including the IEEE Robotics and Automation Society Conference Editorial Board, the IEEE Intelligent Transportation Systems Society Conference Editorial Board, and Frontiers in Robotics and AI. Yan is a Senior Member of the IEEE. His research interests include human-robot interaction, dexterous manipulation, robot learning, and service and assistive robotics.

 

Title

Veni, Sensi, Vici: Interacting with the world through touch

Abstract

The sense of touch is arguably the first human sense to develop and the most important sensing modality in physical interaction. It enriches our multimodal perception, gives rise to dexterity, and acts as the last line of defence for safety. Meanwhile, robots are increasingly required to work outside their historical cages and to manipulate a growing range of objects. However, the lack of a standardised representation for tactile signals has left the exploration and application of tactile perception far behind those of its visual and auditory siblings. In this talk, we address key questions in building effective representation and control models for tactile perception and tactile-guided robot manipulation of the environment.