Researchers from MIT have built a studio where a person's movements are recorded, and then a student wearing a special suit has a go at mimicking them: "When their movements don't match the teacher's, the suit vibrates at key locations around the joints to encourage them to make the correct ones."
I still think that the LightStage blogged about earlier has a lot more potential. Similar stuff has cropped up a fair bit before, e.g. here.
The concept of using part of, or the whole, body as an interface to control a computer (program) has been explored by various companies, including the makers of the zcam.
Conventional mocap can use reflective dots/LEDs/IR markers etc. Cameras capture the light from these points and track their movement, which can then be turned into a computer-generated skeleton, and from there into an animated character.
Adelsberger at ETH, along with researchers at MIT and Mitsubishi, has looked into cheaper mocap systems that can be used outside the lab or studio. Their approach puts sensors on the body: accelerometers and gyroscopes measure motion, alongside ultrasound ranging. A computer such as a laptop can then analyse all the sensor data and build a model.
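To give a rough idea of how accelerometer and gyroscope data get combined in systems like this, here's a minimal sketch of a complementary filter for one joint axis. This is my own illustrative example, not the researchers' actual method: the gyroscope gives a smooth rotation rate but drifts over time, while the accelerometer gives a noisy but drift-free tilt estimate from gravity, so the filter blends the two.

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse one axis of motion data.

    angle      -- previous angle estimate (radians)
    gyro_rate  -- angular velocity from the gyroscope (rad/s)
    accel_x/z  -- accelerometer readings used to estimate tilt from gravity
    dt         -- time step between samples (seconds)
    alpha      -- how much to trust the (smooth, drifting) gyro integration
    """
    # Tilt angle implied by the gravity vector (noisy, but no drift)
    accel_angle = math.atan2(accel_x, accel_z)
    # Blend: mostly the integrated gyro, gently corrected by the accelerometer
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Running this per sample per joint gives stable angle estimates cheaply, which is part of why sensor-based mocap can work on just a laptop.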
Keepon is a cute lil yellow robot -
Keepon can find the beat in music and move to it. It can also detect movement and track the rhythmic motion of objects, including people.
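For a flavour of how a robot might find the beat, here's a small energy-based onset detector. This is just an illustrative sketch, not Keepon's actual algorithm: it flags audio windows whose energy clearly exceeds the recent average, which is the simplest way beats tend to be picked out of a signal.

```python
def detect_beats(samples, window=1024, threshold=1.5, history=43):
    """Return sample offsets where a window's energy spikes above
    the average energy of the preceding windows (a crude beat)."""
    # Mean energy of each non-overlapping window
    energies = [
        sum(s * s for s in samples[i:i + window]) / window
        for i in range(0, len(samples) - window + 1, window)
    ]
    beats = []
    for i, e in enumerate(energies):
        recent = energies[max(0, i - history):i]
        # A beat is a window much louder than the recent local average
        if recent and e > threshold * (sum(recent) / len(recent)):
            beats.append(i * window)
    return beats
```

Fed a quiet signal with periodic loud bursts, this returns the offsets of the bursts; a dancing robot could then nod or bob on those timestamps.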
More information from New Scientist here
For a yellow blob it can dance pretty well!
Doing the robot
Bill Gates: right at the end of the recent Seinfeld/Gates Microsoft ad, there's a Bill Gates robot.
About 4 minutes 10 seconds in