Project: DoppelRender

Ever notice how much time you spend rendering identical images? Maybe your animated character pauses to think, or maybe you’re just reusing a limited set of poses. Sometimes more than half the frames you devote precious render time to are identical to frames you’ve already rendered. Sure, you can try to keep track of which ones are duplicates and then wrangle those files by hand, but that gets complicated quickly, and it ends up being easier to just eat the extra render time and save yourself the headache.
Well, DoppelRender solves all that. It’s not a new renderer. It’s not an image processor. It’s just a really simple Python script that uses a couple of clever tricks to figure out which frames in your animation are duplicates, and then renders each unique frame only once. It fills in all the other doppelganger frames by soft-linking to the single rendered image. Easy peasy. And really, really fast.
In my test cases (real animation projects creating illustrations for my YouTube series), DoppelRender cut my rendering time in half.
I’ve been using DoppelRender in my own production process for a couple of weeks now, and I think it’s time to start getting some feedback from other users.
The Trick
The core of the idea is to recognize that if two images are identical, they will still be identical when rendered at a small size. And if they aren’t identical, the small versions will still differ in at least a pixel value or two. So DoppelRender renders out the entire sequence at a tiny size (5% seems to be good) and then compares the MD5 checksums of all the rendered frames. Any sets of frames with identical checksums are grouped together and only rendered once, with the resulting render soft-linked to all the sibling frames in the group.
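The grouping and linking steps above can be sketched in a few lines of Python. This is a minimal illustration, not DoppelRender’s actual code: the `render_frame` callback and the output naming are hypothetical stand-ins for whatever the real script does.

```python
import hashlib
import os
from collections import defaultdict

def file_md5(path):
    """MD5 checksum of a file's raw bytes."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def group_frames(preview_paths):
    """Group tiny preview renders whose pixel data is byte-identical."""
    groups = defaultdict(list)
    for path in sorted(preview_paths):
        groups[file_md5(path)].append(path)
    return list(groups.values())

def link_duplicates(groups, render_frame):
    """Render one representative per group at full size, then
    soft-link every sibling frame to that single image."""
    for frames in groups:
        master_output = render_frame(frames[0])  # hypothetical full-size render call
        for dup in frames[1:]:
            target = dup + ".full.png"           # hypothetical output naming scheme
            if os.path.lexists(target):
                os.remove(target)
            os.symlink(master_output, target)
```

With a 5% preview render, hashing the files is nearly free compared to a full-size render, so the whole comparison pass costs only the time of the tiny-resolution render sequence.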
Addendum
This algorithm requires a deterministic renderer: identical keyframes must always produce byte-identical images. It worked great with Blender’s internal renderer, but when I switched to Cycles, its stochastic sampling made the algorithm useless.
In the spirit of sharing the algorithm, I posted the code on GitHub, but I never intended to polish it up myself. Fortunately, somebody else took over.
Related Documents
- None posted yet.