Sunday, November 2, 2008

Other Collaborative Projects

TechX Robot Challenge
Organized by the Defence Science and Technology Agency (DSTA), MINDEF of Singapore, this event is the equivalent of the DARPA Grand Challenge in the USA, with an award of S$1 million for the final winner. In the final competition at Republic Polytechnic on Sep 22, 2008, our team, Evolution, with a tracked mobile platform called Uni-seeker, went the furthest among the 6 finalists. No winner was declared in this round of the competition. Please refer to the link at:

Interactive Digital Media Research for Education
Enhancing Education In Environmental Awareness: A Game-Based Approach To Ambient Learning

PI: Prof. Vivian Chen (SCI)
Co-PI: Prof. Henry Duh (NUS), Prof Chen I-Ming, Dr. Lim Hock Beng (IntelliSys), Prof Zuo Peiwen (NIE)

Weather patterns, climate change and global warming are important environmental phenomena in the world today. As a tropical island, Singapore is heavily affected by the changing global climate. These are complex phenomena for children to learn and understand. The main goal of this project is to develop educational games that help children understand weather patterns, climate and environmental changes, such as the relationship between weather, mosquito breeding and the occurrence of dengue fever. Through ambient and mobile technology, students and teachers can dynamically interact with real-time weather data to gain a better understanding.
(Project from Sep 2008 - Aug 2011)

Interactive Robotic Puppet System

String-operated puppets, or marionettes, a disappearing art form in Singapore, exist in various cultural themes. The puppeteering techniques involved have become a cultural and national heritage; the making of marionette figures is a fine folk art. In modern elementary education, puppetry is also a very effective tool for shaping children’s personality, creativity and thinking. The objective of this project is to develop modern puppetry using robotics and information technology, both to evoke and stimulate public interest in this traditional art form and to create new forms that combine the arts with science and technology. This project is a long-term pursuit of the fusion of robotics with cultural elements under the theme of human-robot interaction.

Over the years, we have built quite a few robotic puppetry systems. The most recent one can be seen on YouTube here.

Replicating and Processing of Human Body Motion for Participatory Interaction in Co-Space

(new project, Nov 2008 - Oct 2011)
Human body motion is a very important means of expressing a person’s emotions, knowledge and experience, as well as an effective communication tool in inter-personal interaction. This project investigates human body motion as a representation of human experience in Co-Space. It is envisaged that every real human will have a digital counterpart in the Co-Space. We aim to provide a methodology for seamless integration of movements between the real human and the virtual one in Co-Space, enabling motion replication in both directions. A set of hardware and software tools will be developed to: 1) facilitate high-fidelity 3D human body motion replication in Co-Space via a wearable interface suit; 2) represent, calibrate and process 3D human body motion information in digital form across various platforms; 3) allow physical users to conduct virtual activities in Co-Space jointly with others across the Internet and obtain real physical interactions. A Co-Space yoga group lesson application will be implemented in a Second-Life-like virtual environment, with participants from different parts of the world, to demonstrate participatory body motion interaction. This will be achieved with the tangible hardware prototypes and software applications resulting from this research work.
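To give a flavour of what points (2) and (3) involve, here is a minimal sketch of packing a set of body-joint angles into a compact binary frame for streaming across the Internet. The joint set, frame layout and function names are illustrative assumptions for this post, not the project's actual protocol.

```python
# Illustrative sketch: serialize a timestamped body pose for network streaming.
# The joint list and binary layout are assumptions, not the real Co-Space protocol.
import struct
import time

JOINTS = ["elbow_l", "elbow_r", "knee_l", "knee_r"]  # assumed minimal joint set

def pack_pose(angles_deg):
    """Pack a capture timestamp plus one 32-bit float per joint (little-endian)."""
    assert len(angles_deg) == len(JOINTS)
    return struct.pack("<d%df" % len(JOINTS), time.time(), *angles_deg)

def unpack_pose(frame):
    """Recover (timestamp, joint angle list) from a packed frame."""
    vals = struct.unpack("<d%df" % len(JOINTS), frame)
    return vals[0], list(vals[1:])
```

A receiving client would unpack each frame and drive the corresponding joints of the user's digital counterpart; a real system would add joint identifiers, orientation data and sequence numbers on top of this.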
Project funded under NRF Interactive Digital Media Initiative on Co-Space.
MDA press release is here:

Two (2) PhD student openings are available. For more information, please contact Prof I-Ming Chen (

Micro Motor and Actuator for Therapeutic Ingestible Microcapsule

(new project Sep 2008 - Aug 2011)
The main objective of this proposal is to analyze and design potential micro actuator and motor solutions for a therapeutic ingestible capsule. The motor/actuator will be used for orienting the onboard vision system, onboard targeted labelling, and onboard biopsy. This is a collaborative project between DSI and NTU under the ASTAR MedTech Research Programme (Medical Robotics) (PI – Dr. Bi Chao, DSI).

One PhD student opening available.
Several undergraduate student research opportunities are available.

Ultrasonic Motor Transfer Technology

(on-going project)
This research is a continuation of our earlier work on piezoelectric ultrasonic motors (USMs) for industrial technology transfer. In this project, a linear USM and a rotary USM will be designed, fabricated and assembled according to the specified requirements.

SmartSuit - Wearable embedded human motion processing systems

(completed project)
This project is to develop a high-performance untethered wearable human motion processing system for real-time sensing and processing of human motion data. Embedded sensing techniques for human movement detection, motion data fusion and processing, ergonomics of the wearable system, and interaction between the wearable sensing system and the physical/virtual environment are the main research issues. Major applications of this wearable sensing system are in entertainment, animation, medicine, sports and virtual manufacturing.

In this project we have developed a low-cost, miniature, flexible, low-component-count goniometer that uses optical linear encoders (OLEs) packed in a small pliable casing mounted on a flexible substrate for measuring body joint angles. A flexible steel wire guided by a Teflon tube is attached to the linear encoder to allow linear sliding movement due to the stretch of the skin surface as the body joint bends. The linear displacement read from the linear encoder can be converted into an angle proportional to the angle formed by the particular body joint. Shown in the figure are sensor pads, sewn with several OLEs, for detecting movements of different body joints. Based on the modularly designed sensor pads, we will be able to develop a full-body motion suit for digital animation as well as critical joint measurement tools for medical and sports applications. A US provisional patent was filed in May 2007.
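The displacement-to-angle conversion described above can be sketched in a few lines: if the wire's sliding displacement approximates an arc length s = r·θ over the joint, then θ = s/r. The function name, encoder resolution and effective joint radius below are illustrative assumptions, not the actual SmartSuit calibration values.

```python
# Hypothetical sketch of the OLE conversion: encoder counts -> wire
# displacement -> joint angle. Parameter values are assumptions, not
# the real SmartSuit firmware constants.
import math

def ole_angle_deg(counts: int,
                  resolution_mm: float = 0.085,  # assumed wire travel per count
                  radius_mm: float = 30.0        # assumed effective joint radius
                  ) -> float:
    """Convert optical-linear-encoder counts to a joint angle in degrees.

    The wire displacement approximates an arc length s = r * theta on the
    skin surface as the joint bends, so theta = s / r.
    """
    displacement_mm = counts * resolution_mm
    theta_rad = displacement_mm / radius_mm
    return math.degrees(theta_rad)
```

In practice the per-joint radius would be obtained by calibration (e.g. reading counts at two known joint angles), since skin stretch varies between users and joints.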
An application of the OLE sensors for rehabilitation was demonstrated at the SICEX exhibition in Suntec City, 12-13 Jan 2008.
This project is supported under ASTAR TSRP Embedded Hybrid Systems Phase II. For the program description and our project details, please refer to:

Interactive Robotic Lion Dancing System

(completed project)
Lion dancing is a folk art performed during many Chinese festivals and important events, and a widely seen art form in many parts of the world. The lion dance is usually performed by well-trained acrobats with years of practice. In this project, we aim to use advanced robotic technology to develop a mechatronic system that can perform life-size lion dancing in the traditional lion dance outfit. We intend to experiment with the fusion of a traditional art form and robotic technology, so as to stimulate people’s interest in this disappearing traditional art form and folklore, and to give new meaning to the new art form. The completed system was showcased at the opening of ASTAR Fusionopolis on 17 October 2007.
(Project funded by ASTAR. Special thanks go to Jenny Ang, Han Boon Siew, Desmond Goh, Yan Liang and many ASTAR researchers and NTU students.)
ASTAR press release:


Current research students and staff (Dec 2009)

Research Fellows
Dr Ding Zhongqiang (Embedded systems for human motion replication)
Dr Lim Chee Kian (Ultrasonic motors and spherical motors)
Dr Peter Luo Zhiqiang (Human robot interaction)
Dr Yan Liang (Spherical motors and actuators)

Research Associate
Yang Wei-Ting (Motion replication algorithms)

Project Officers
Gu Chao (InterfaceSuit hardware)
Li Kang (SmartGlove)
John Nguyen Kim Doang (SmartSuit)
Tee Ke Yeng (Co-space visualization and interaction design)

PhD students
Guo Wenjiang (Nano precision surface feature detection)
Lee Shang Ping (Human motion replication)

Teo Tat Joo (PhD 2009)
Tang Xueyan (PhD 2007)
Ni Wei (PO 2009)

Welcome to Interactive Sensing and Robotics Blog

This is my first attempt at using a blog to keep my research website up to date.
For my research work prior to 2005, please refer to my NTU research webpage: