Human Avatar Project
The Pesaran Lab at NYU’s Center for Neural Science develops advanced brain-machine interfaces that enable dexterous control of robotic systems. As part of the lab’s research agenda, I am developing a real-time virtual avatar driven by a state-of-the-art motion capture system that tracks the full movement of the human arm and hand, 27 joint angles in all, during object reach and grasp.
Skeleton data is captured by an infrared motion capture system, processed in Cortex to calibrate bone segments, and passed to SIMM to resolve joint angles. I use this segmental model to define the degrees of freedom of a virtual avatar in 3DS Max, and the result is presented in Vizard. The avatar responds to the subject’s arm and hand movements: as the subject moves in physical space, the avatar moves in tandem in virtual space, with the virtual arm rendered in real time at ~20 Hz.
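To make the pipeline concrete, below is a minimal sketch of the kind of Vizard update loop described above. Vizard is scripted in Python; the avatar model file, bone names, and the read_joint_angles() stream are placeholders invented for illustration, not the lab’s actual assets or code.

```python
# Minimal Vizard sketch of the real-time avatar loop. The model file,
# bone names, and read_joint_angles() are hypothetical placeholders.
import viz
import vizact

viz.go()

avatar = viz.addAvatar('avatar_arm.cfg')  # placeholder rigged arm model

# Lock the bones we drive manually so Vizard's animation system
# doesn't overwrite our joint angles each frame.
JOINTS = ['shoulder', 'elbow', 'wrist']  # placeholder bone names
bones = {}
for name in JOINTS:
    bone = avatar.getBone(name)
    bone.lock()
    bones[name] = bone

def read_joint_angles():
    """Placeholder for the SIMM-resolved joint-angle stream."""
    return {'shoulder': (0, 0, 0), 'elbow': (0, 45, 0), 'wrist': (0, 0, 10)}

def update():
    # Apply the latest joint angles to the avatar's skeleton.
    angles = read_joint_angles()
    for name, (yaw, pitch, roll) in angles.items():
        bones[name].setEuler([yaw, pitch, roll])

# ~20 Hz update, matching the rendering rate described above.
vizact.ontimer(1.0 / 20, update)
```

Locking each bone hands its control to the script, so the incoming joint angles, rather than any canned animation, pose the skeleton on every update.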
Role: Producing the real-time graphic rendering of the arm by skinning the skeletal model to a 3D mesh; defining degrees of freedom using a hierarchical translation-rotation model for the joint angles; ensuring system performance during live subject tests.
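As a sketch of what a hierarchical translation-rotation model means here: each joint’s world pose is its parent’s pose composed with a fixed local translation (the bone offset) and a variable local rotation (the joint angle), so a rotation applied at the shoulder propagates down the chain to the elbow and wrist. A minimal numpy illustration with invented segment lengths and angles (the real model spans all 27 degrees of freedom):

```python
# Hierarchical translation-rotation sketch: each joint's world pose is
# parent pose * local translation (bone offset) * local rotation (joint
# angle). Segment lengths and angles below are made up for illustration.
import numpy as np

def translation(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def rotation_z(deg):
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    R = np.eye(4)
    R[:2, :2] = [[c, -s], [s, c]]
    return R

# Chain: shoulder -> elbow -> wrist, flexing about z for simplicity.
# Each entry: (offset along parent segment in metres, joint angle in degrees)
chain = [((0.0, 0.0, 0.0), 30.0),   # shoulder
         ((0.3, 0.0, 0.0), 45.0),   # elbow, 30 cm upper arm
         ((0.25, 0.0, 0.0), 10.0)]  # wrist, 25 cm forearm

world = np.eye(4)
for offset, angle in chain:
    world = world @ translation(*offset) @ rotation_z(angle)
    print(np.round(world[:3, 3], 3))  # joint position in world space
```

This is the same composition the 3DS Max rig encodes: the skinned mesh follows the bones, and the bones follow the joint angles resolved upstream in SIMM.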