Microtubule parody animation
This post covers some behind-the-scenes details of an animation I rendered in Blender that parodies the ad for the original Microsoft Surface Book. Watch it here.
Building a microtubule
The most important step is to set up a reasonably accurate model of a microtubule. For this animation I used the PDB entry 3EDL; I have a separate post on how to import this into Blender here. In this case, I aligned the tubulin dimer orientation by eye. A better alternative would be to take a PDB entry containing a whole microtubule and then remove all but one dimer pair. I then used two approaches to build the microtubule:
- 2 pairs of array modifiers
- 1 array modifier and follow spline modifiers per protofilament
An array modifier repeats (i.e. copy-pastes) the object at a specified offset, rotation, etc. One modifier was set up to reproduce the 3-start helix of a 13-protofilament microtubule (a 3-monomer rise per turn); the other simply repeats dimers along the protofilament (i.e. elongates the microtubule). This works well for the ‘blunt’ microtubule scenes.
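As a rough illustration, here is how such a setup could be scripted with Blender's Python API (bpy). The object and empty names, the 1 unit = 1 nm scale, and the exact offsets are assumptions for the sketch, not the values used in the actual project:

```python
import bpy
import math

dimer = bpy.data.objects["TubulinDimer"]          # hypothetical name of the imported dimer
                                                  # (assumes its origin sits on the microtubule axis)

# Empty whose transform defines the per-copy rotation and rise of the helical repeat
# (~27.7 degrees and ~0.92 nm rise per lateral neighbour for a 13_3 lattice; illustrative values)
offset_empty = bpy.data.objects.new("HelixOffset", None)
bpy.context.collection.objects.link(offset_empty)
offset_empty.rotation_euler = (0.0, 0.0, math.radians(360.0 / 13.0))
offset_empty.location = (0.0, 0.0, 12.0 / 13.0)   # assuming 1 Blender unit = 1 nm

# Array modifier 1: lateral repeat around the tube, following the 3-start helix
helix = dimer.modifiers.new(name="Helix", type='ARRAY')
helix.count = 13
helix.use_relative_offset = False
helix.use_object_offset = True
helix.offset_object = offset_empty

# Array modifier 2: repeat dimers along the protofilament axis (~8 nm per dimer)
grow = dimer.modifiers.new(name="Elongate", type='ARRAY')
grow.count = 20
grow.use_relative_offset = False
grow.use_constant_offset = True
grow.constant_offset_displace = (0.0, 0.0, 8.0)
```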
For the scenes with a flaring microtubule, each protofilament (offset appropriately in z to get the 3-start helix) was animated by having each dimer follow a spline (see image below). The splines were offset in z to give the microtubule end a more ragged appearance, and the number of repeats was also varied over time and between protofilaments. Timing was all done via keyframes (animating the z position of each spline, the number of repeats in the array modifier, etc.). The resulting animation is of course not scientifically accurate (tip dynamics are stochastic), but it gives an impression. One could make it more accurate by using the built-in Python scripting in Blender.
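As a hedged sketch of how those keyframes could be set from Python rather than by hand, the snippet below keyframes the z position of one guide spline and the repeat count of one protofilament's array modifier. The object names, frame numbers, and values are made up for illustration:

```python
import bpy

# Animate the z position of one guide spline so the protofilament appears to peel away
spline_obj = bpy.data.objects["Protofilament_Spline_01"]
spline_obj.location.z = 0.0
spline_obj.keyframe_insert(data_path="location", index=2, frame=1)
spline_obj.location.z = -40.0
spline_obj.keyframe_insert(data_path="location", index=2, frame=120)

# Animate the number of repeats in the array modifier to grow or shrink the protofilament
pf = bpy.data.objects["Protofilament_01"]
arr = pf.modifiers["Elongate"]
arr.count = 5
arr.keyframe_insert(data_path="count", frame=1)
arr.count = 25
arr.keyframe_insert(data_path="count", frame=120)
```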
Microtubule shading
I opted for a cartoon look for the microtubule, using a very simple toon shader laid out as in the image below. To get the black outline, a new black material was added together with a solidify modifier and flipped normals. Flipping the normals means the material is only visible ‘from the inside’ of the object, so it only shows up where you would expect an outline.
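A minimal sketch of that inverted-hull style outline in bpy, assuming Eevee and a hypothetical object name; the thickness value and material slot layout are placeholders to tweak per model:

```python
import bpy

obj = bpy.data.objects["Microtubule"]  # hypothetical object name

# Black outline material, hidden on front faces via backface culling (Eevee setting)
outline_mat = bpy.data.materials.new("Outline")
outline_mat.use_nodes = True
bsdf = outline_mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Base Color"].default_value = (0.0, 0.0, 0.0, 1.0)
outline_mat.use_backface_culling = True
obj.data.materials.append(outline_mat)     # assumes this lands in the second material slot

# Solidify modifier creates a slightly larger shell with flipped normals,
# so the black material is only visible around the silhouette
solid = obj.modifiers.new(name="Outline", type='SOLIDIFY')
solid.thickness = 0.1        # how far the shell extends; sign and size need tweaking per model
solid.offset = 1.0           # push the shell outward rather than inward
solid.use_flip_normals = True
solid.material_offset = 1    # shell uses the black outline material slot
```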
Video export and text
Blender outputs a series of PNG images, one for each frame. I used DaVinci Resolve (free software) to import these images, add the text, and render the exported video. I also use this software for timelapses, as it automatically imports image sequences as clips and handles the export settings for YouTube.
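For reference, a small bpy sketch of render settings that produce such a PNG sequence; the output path, resolution, and frame rate are just example values, not the ones used for this animation:

```python
import bpy

scene = bpy.context.scene

# Write one PNG per frame into a folder next to the .blend file
scene.render.image_settings.file_format = 'PNG'
scene.render.filepath = "//render/frame_"   # '//' means relative to the .blend file
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.render.fps = 30

# bpy.ops.render.render(animation=True)     # uncomment to render the full frame range
```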
What I would do differently next time
I would probably spend a bit more time reducing the vertex count before animating, as some scenes were tricky to work with in terms of performance. Blender has the problem that as soon as you keyframe (i.e. animate) any array modifier, it rebuilds the whole scene on every frame (not just the frames where the array modifier value changes), which made rendering and working with scenes like the microtubule growing and depolymerising take a long time. I would probably also work on making the growth and shrinking more accurate by doing some scripting in Python.
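If I did script it, a starting point might look something like the sketch below: a biased random walk on each protofilament's array count, keyframed at regular frame steps. All object and modifier names, probabilities, and step sizes here are hypothetical placeholders, not a model of real microtubule kinetics:

```python
import bpy
import random

random.seed(1)

# Crude stand-in for dynamic instability: each protofilament mostly grows by one dimer,
# but occasionally loses several at once (a toy 'catastrophe')
for i in range(1, 14):
    pf = bpy.data.objects[f"Protofilament_{i:02d}"]
    arr = pf.modifiers["Elongate"]
    count = 10
    for frame in range(1, 251, 5):
        step = 1 if random.random() < 0.8 else -4
        count = max(count + step, 1)
        arr.count = count
        arr.keyframe_insert(data_path="count", frame=frame)
```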