Project: PuppetFist

Animation is one of my hobbies, but I’m not particularly good at it. It’s an art form that requires patience, practice, and subtlety - none of which appear on my resume. But just because I suck at something doesn’t mean I’m not going to do it. It just means that I’m going to lean on my other skills to lower the total suckage. In this case, that means programming and electronics.
PuppetFist is the project that will let me create decent quality animation quickly, by capturing a performance instead of slogging through all that tedious keyframing. And the performance part couldn’t be easier. If you can sock-puppet, you can PuppetFist.
◇◆◇◆◇◆◇◆◇◆◇◆◇◆◇◆◇◆◇
This project actually started 9 or 10 years ago, and I managed to get a prototype running, but then life intervened and everything had to go into mothballs — temporarily, I thought. Well, "temporarily" can mean ten years later, so that's what's happening. Call it a creative pause.
Anyway, the idea of PuppetFist is pretty simple. By using orientation sensors, I can measure how much an object is leaning. So if I attach one to my forearm, I can measure its angles. And if I attach one to my hand, I can measure those angles too. So if you think of your arm and hand as being the torso and head of a puppet, then you have all the information you need to capture a Muppet’s movements.
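To make that concrete, here's a minimal sketch (my own illustration, not the project's actual code) of the core math. Assuming each sensor reports its orientation as a unit quaternion, the hand's rotation *relative* to the forearm — i.e., the puppet head's pose in the torso's frame — is just the forearm orientation's inverse composed with the hand orientation:

```python
import math

def quat_conjugate(q):
    """Conjugate of a unit quaternion (w, x, y, z), which is its inverse."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def relative_orientation(q_forearm, q_hand):
    """Head pose expressed in the torso's frame (unit quaternions assumed)."""
    return quat_multiply(quat_conjugate(q_forearm), q_hand)

# Made-up example readings: forearm held at identity, hand tilted 90° about x.
half = math.radians(90) / 2
q_forearm = (1.0, 0.0, 0.0, 0.0)
q_hand = (math.cos(half), math.sin(half), 0.0, 0.0)
q_rel = relative_orientation(q_forearm, q_hand)
# With the forearm at identity, q_rel equals q_hand: the head nods 90°.
```

The real sensors would feed live readings into `relative_orientation` every frame; the sensor names and sample values above are stand-ins.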
Well, not quite. To really capture the performance, you also need to encode the open/close position of the mouth, but that’s just another sensor, which I can control using my thumb.
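The mouth channel is simpler still. As a hypothetical sketch: if the thumb drives something like a flex sensor or potentiometer that returns a raw ADC reading, calibrating against the fully-closed and fully-open readings gives a clamped 0-to-1 mouth value (the calibration constants here are made up):

```python
def mouth_open_amount(raw, closed_raw=120, open_raw=890):
    """Map a raw thumb-sensor reading to a 0..1 mouth-open fraction.

    closed_raw and open_raw are invented calibration constants standing in
    for real per-sensor readings taken with the mouth fully closed/open.
    """
    span = open_raw - closed_raw
    value = (raw - closed_raw) / span
    # Clamp so readings outside the calibrated range stay in 0..1.
    return max(0.0, min(1.0, value))

print(mouth_open_amount(120))   # fully closed -> 0.0
print(mouth_open_amount(505))   # halfway -> 0.5
print(mouth_open_amount(1000))  # past calibration -> clamped to 1.0
```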
The result isn’t going to win any Academy Awards, and in its initial form it can’t handle more complicated puppets, like marionettes, but for the quick capture of a Muppet speaking to camera, it’s surprisingly effective.
The work of this project now is to resurrect my old files, finish the testing and tweaking, and then publish everything so others can use it. (Plus there will probably be some fun demonstration videos created along the way. :-)
Related Documents
- Nothing posted yet.