HIWI Job:
Adjusting Human Models using a Depth Sensor
In the new research group wearHEALTH, we want to estimate human dynamics from body-worn inertial
measurement units (IMUs). As a basis for this, we need to estimate the approximate body shape of a
person. At the same time, we need a human model representation that is fully rigged and can be used
in motion capture scenarios. We therefore aim to adjust existing parametric human models using a depth
sensor, such as the Kinect.
Your tasks:
• Extract a body pose from a depth sensor and transform this pose to a MakeHuman skeleton
• Align the resulting parametric model to the depth data (a minimal alignment sketch follows this list)
• Create a pipeline and evaluate it on different recordings
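As a rough illustration of the alignment step, the following minimal Python sketch shows how joint positions estimated from depth data could be rigidly aligned to the corresponding joints of a MakeHuman skeleton via the Kabsch algorithm. The joint coordinates and the function name rigid_align are purely hypothetical placeholders; the actual pipeline will be developed as part of the job.

    # Minimal sketch (illustrative only): rigidly align joint positions estimated
    # from the depth sensor to the corresponding joints of a MakeHuman skeleton.
    import numpy as np

    def rigid_align(source, target):
        """Kabsch algorithm: rotation R and translation t mapping source onto target."""
        src_mean, tgt_mean = source.mean(axis=0), target.mean(axis=0)
        src_c, tgt_c = source - src_mean, target - tgt_mean
        U, _, Vt = np.linalg.svd(src_c.T @ tgt_c)   # SVD of the cross-covariance matrix
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = tgt_mean - R @ src_mean
        return R, t

    # Hypothetical corresponding joints (depth-sensor frame vs. model frame).
    kinect_joints = np.array([[0.0, 1.0, 2.5], [0.2, 0.7, 2.5], [-0.2, 0.7, 2.5]])
    model_joints = np.array([[0.0, 1.6, 0.0], [0.2, 1.3, 0.0], [-0.2, 1.3, 0.0]])
    R, t = rigid_align(kinect_joints, model_joints)
    print("rotation:\n", R, "\ntranslation:", t)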
Requirements:
• Experience in 3D Computer Vision and Point Cloud Processing
• Knowledge of human character rigging and experience with the BVH format
• Advanced implementation skills in Python and C++
References:
• http://www.makehuman.org/
• http://openkinect.org/wiki/Main_Page/
• http://en.wikipedia.org/wiki/Biovision_Hierarchy
The position will be in the new research group wearHEALTH (Department of Computer Science,
Technical University Kaiserslautern). We work in close collaboration with the Augmented Vision
department at DFKI, led by Prof. Didier Stricker (http://av.dfki.de/en/).
Please send your application to the contact provided below.
Dr. Bertram Taetz
Technical University Kaiserslautern
Department of Computer Science
AG wearHEALTH
Gottlieb-Daimler-Straße 48
D-67663 Kaiserslautern
Phone: +49 (0)631 205 2644
Email: [email protected]
www.wearhealth.org
24.04.2015