(new project, Nov 2008 - Oct 2011)
Human body motion is an important means of expressing a person’s emotions, knowledge and experience, as well as an effective communication tool in interpersonal interaction. This project investigates human body motion as a representation of human experience in Co-space. It is envisaged that every real human will have a digital counterpart in the Co-space. We aim to provide a methodology for seamless integration of movements between the real human and the virtual one in Co-space, enabling motion replication in both directions. A set of hardware and software tools will be developed to: 1) facilitate high-fidelity 3D human body motion replication in Co-space via a wearable interface suit; 2) represent, calibrate and process 3D human body motion information in digital form across various platforms; 3) allow physical users to conduct virtual activities in Co-space jointly with others across the Internet, with real physical interaction. To demonstrate participatory body motion interaction, a Co-space yoga group lesson will be implemented in a Second-Life-like virtual environment with participants from different parts of the world. This demonstration will build on the tangible hardware prototypes and software applications resulting from this research.
Project funded under the NRF Interactive Digital Media Initiative on Co-Space.
MDA press release is here:
Two (2) PhD student openings are available. For more information, please contact Prof I-Ming Chen (firstname.lastname@example.org)