These past few months I’ve been beavering away at Lola Post on two series of shows, creating VFX of a weathery, Earth-scale nature for Britain’s Most Extreme Weather, and shots of all scales for series 3 of How The Universe Works.
Ordinarily I’d put together blog posts before a show goes to air, but in the case of Britain’s Most Extreme Weather it slipped from my mind as soon as I rocked back onto How The Universe Works. Much of my weathery input was particle systems and strands, either using existing setups from previous shows or creating new ones as appropriate. A particular favourite of mine was a system showing the movement of air around cyclones and anticyclones: a strand system that rotates particles around many points, allowing them to move fluidly from one direction to another as if they were air, all wrapped around a lovely spherical Earth.
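The idea behind that strand system can be sketched in plain Python. This is my own toy version, not the actual ICE setup: each rotation centre contributes a swirling, tangential velocity that falls off with distance, and `swirl_velocity` is a hypothetical name for illustration only.

```python
def swirl_velocity(px, py, centres):
    """Sum tangential velocities from (cx, cy, strength) rotation centres.
    Positive strength spins anticlockwise (a cyclone, northern-hemisphere
    convention); negative spins clockwise (an anticyclone)."""
    vx, vy = 0.0, 0.0
    for cx, cy, strength in centres:
        dx, dy = px - cx, py - cy
        r2 = dx * dx + dy * dy + 1e-6   # avoid a divide-by-zero at the eye
        # The tangential direction is the radius vector rotated 90 degrees,
        # so particles orbit each centre rather than falling into it.
        vx += strength * (-dy) / r2
        vy += strength * (dx) / r2
    return vx, vy

# A particle due east of a single cyclone gets pushed north:
vx, vy = swirl_velocity(1.0, 0.0, [(0.0, 0.0, 1.0)])
```

With several centres in the list, a particle moving between them blends fluidly from one rotation to the next, which is what gives the air-like motion.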
How The Universe Works is a series I’ve been on for many many months now. I first started on it in November I think. The first episode, all about our Sun, is to be shown on 10th July on Science in the USA.
For that show I took Lola’s existing Sun cutaway setup, introducing a more boiling lava-like feel through judicious use of animated fractals and grads.
Overall I’ve worked on 8 episodes with a handful of shots in each show. After all that dedication to spheres in space I am now supervising the VFX on one of the last shows for this series!
More geeky details and videos for both shows to come!
Recently in America, The History Channel broadcast The Bible Series, knocking American Idol into the weeds in the ratings. The real reason to celebrate this, of course, is that I worked on VFX for it, along with many others hired by or working at Lola Post, London.
There were hundreds of shots. As the series covers many well-known events that are either epic in scale or miraculous in nature, it’s hard to cut corners with this kind of content.
One of the advantages of VFX is the ability to extend sets or create new ones. The most used model shared amongst the 3d crew was that of Jerusalem. It was originally an off-the-shelf model of a real scale model, intended to be seen from a distance, so it needed to be tweaked and improved upon where appropriate on a shot-by-shot basis. With so many artists having touched the model at one point or another, the lighting setup, materials and textures were improved to the extent that once composited, the shots really shone. Many of the shots I did for The Bible featured Jerusalem, either as an entirely CG set or as an extension tracked into existing footage.
One story covered in the show is that of Moses parting The Red Sea, with the Israelites being chased by Egyptians through the parted waves. The shot I did for this sequence is a slightly top-down shot, following the fleeing crowds through the freshly created gap in the ocean. To achieve this, I effectively split the 3d ocean into horizontal grids and vertical grids. The horizontal grids were simulated with aaOcean in Softimage. The vertical ones were distorted to represent the sea walls, textured with composited footage of waterfalls running upwards. The join where the two sets of grids met was blended using a matte and Nuke’s iDistort node. Softimage’s CrowdFX was used for the fleeing crowd. Twirling smoke elements were added once the shot was passed to comp.
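The seam blend between the two sets of grids comes down to a per-pixel lerp driven by a matte. Here's a minimal plain-Python sketch of that comp operation (not Nuke's actual API; `matte_blend` and the sample values are mine for illustration):

```python
def matte_blend(floor_px, wall_px, matte):
    """Linear blend of two rows of pixel values by a 0-1 matte (1 = wall)."""
    return [f * (1.0 - m) + w * m for f, w, m in zip(floor_px, wall_px, matte)]

floor_row = [0.2, 0.2, 0.2, 0.2]    # sample from the horizontal ocean render
wall_row  = [0.8, 0.8, 0.8, 0.8]    # sample from the vertical wall render
soft_edge = [0.0, 0.25, 0.75, 1.0]  # matte ramping softly across the join
blended = matte_blend(floor_row, wall_row, soft_edge)
```

In the real shot the matte was paired with an iDistort warp, which pulls each pixel from an offset position so the two renders meet without a hard line.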
An advantage of Softimage’s ICE simulation system is that making a convincing cloud or mist is a fairly straightforward procedure. I was tasked with creating a storm over Jericho, a swirling mass of cloud and debris that had to look huge and imposing whilst looking down through the eye of the storm. With clouds, water, and many other fluids, scale can be half the battle. A large wave only looks large if surrounded by smaller ones, a cloud only looks like a huge ominous mass if seen as a collection of smaller masses, but go too small and the effect is lost entirely. In the case of the cloud, if too many small details were apparent it very quickly seemed fluffy. Cute a storm is not. Once the cloud’s scale was correct, there was the issue of it having to spin, distort and generally seem organic. Handily, ICE has a node for rotating clouds around points in space, so that solved that one. The distortion was shape animation applied to a lattice attached to the cloud.
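The lattice trick is worth unpacking: a coarse cage of control points deforms the dense cloud points sitting inside it, so animating a handful of cage points drags the whole mass around organically. Below is a toy single-cell 2d version of that idea in plain Python (my own sketch, not Softimage's lattice implementation; `lattice_deform` is a hypothetical name):

```python
def lattice_deform(px, py, corners):
    """Bilinearly interpolate a point through one deformed lattice cell.
    corners = [(x00,y00), (x10,y10), (x01,y01), (x11,y11)] are the deformed
    positions of the cell's corners; (px, py) in [0,1]^2 are the point's
    coordinates within the undeformed cell."""
    (x00, y00), (x10, y10), (x01, y01), (x11, y11) = corners
    x = (x00 * (1 - px) + x10 * px) * (1 - py) + (x01 * (1 - px) + x11 * px) * py
    y = (y00 * (1 - px) + y10 * px) * (1 - py) + (y01 * (1 - px) + y11 * px) * py
    return x, y

# An undeformed cell leaves the point where it was...
identity = [(0, 0), (1, 0), (0, 1), (1, 1)]
centre = lattice_deform(0.5, 0.5, identity)

# ...while shearing the top corners drags interior points with them,
# which is what the animated lattice did to the cloud.
sheared = [(0, 0), (1, 0), (0.3, 1), (1.3, 1)]
dragged = lattice_deform(0.5, 0.5, sheared)
```

Shape animation on the cage then amounts to keyframing those corner positions over time.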
The rest of my involvement on The Bible was tracking shots in PFTrack and adding in set extensions. Most of the 3d content was rendered using Solid Angle’s Arnold Renderer.
The work in the following reel was created using Softimage, Terragen, Nuke and PFTrack.
Text in the bottom right shows what I created for each shot.
See PDF for further details. Download PDF shot breakdown
Edited on 15th Oct – Now updated with work from The Bible Series and How To Build a Planet
For the past few months I’ve been working at Lola Post, London, on Mankind, soon to be shown on the History channel both here in the UK and the USA.
I worked on quite a few sequences, 30 shots in total. Most of these involved creating projectiles of differing sorts, predominantly arrows: people firing arrows, being shot by arrows, and avoiding arrows while simultaneously cheating the whole archer deal by using guns. All arrows in the sequence above are CG.
As with many documentaries, a number of shots on Mankind were illustrative map shots, presented as full-scale Earth scenes; being full CG, they were subject to much change. Luckily, the flexibility of CGI makes it easy to work outside the boundaries of reality and to change one’s mind.
A few of the shots I worked on involved creating digital sets. Firstly I created an aqueduct for a sequence of shots featuring Caesar. This was a case of tracking shots, matching on-set details and extending upwards.
The trickiest shot was a bullet time shot, first in the sequence above, showing an Irish navvy unwittingly getting a little too close to a tunnel blast within the Appalachians. The original footage was green screen with the actor effectively sitting on a green pole with the camera moving around him. This introduced a wobble but was significantly easier and cheaper than a timeslice rig. As the footage was ramped up and down as well as being slow-mo, getting rid of the wobble was a high priority, and after many tests it was eventually solved with simple yet nifty 3d camera trickery.
To smooth out the wobble, I followed a suggestion of Lola’s MD, Grahame. Having tracked the raw footage in PFTrack I projected that original footage through the camera in Softimage onto a card, positioned where the actor should be. That way the actor stayed in the same place in 3d space whilst I moved my new 3d camera around him.
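The maths behind that card trick is just pinhole projection run backwards and then forwards. Here's a minimal sketch of the two halves (my own toy functions, not PFTrack's or Softimage's API; cameras look straight down +z with no rotation, purely to keep the illustration short):

```python
def unproject_to_depth(cam_pos, focal, screen_xy, depth):
    """Place a screen sample on a card 'depth' units in front of the camera,
    i.e. project the original footage through the tracked camera onto a card."""
    sx, sy = screen_xy
    return (cam_pos[0] + sx / focal * depth,
            cam_pos[1] + sy / focal * depth,
            cam_pos[2] + depth)

def project(cam_pos, focal, point):
    """Project a 3d point back to screen space through a camera at cam_pos."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    return (focal * x / z, focal * y / z)

# Round trip: a screen sample unprojected onto the card, then re-projected
# through the same camera, lands back where it started.
card_point = unproject_to_depth((0.0, 0.0, 0.0), 50.0, (10.0, 4.0), 5.0)
screen = project((0.0, 0.0, 0.0), 50.0, card_point)
```

The payoff is that however the original camera wobbled frame to frame, the card itself sits still in world space, so the new 3d camera can orbit it as smoothly as you like.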
The entire environment in that shot is a 3d set I threw together out of multiple particle instances of the same handful of rock models.
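Instancing like that boils down to storing a lightweight transform per particle rather than duplicating geometry. A hypothetical plain-Python sketch of the scatter (the function name, parameters and ranges are all mine, not the actual ICE setup):

```python
import random

def scatter_rocks(num, num_models, area, seed=0):
    """Scatter 'num' instances drawn from 'num_models' source rocks across a
    square of half-width 'area'. Each instance is just an index plus a
    transform; the heavy geometry is shared."""
    rng = random.Random(seed)  # seeded so the set doesn't change per render
    instances = []
    for _ in range(num):
        instances.append({
            "model": rng.randrange(num_models),  # which rock model to instance
            "pos": (rng.uniform(-area, area), 0.0, rng.uniform(-area, area)),
            "rot_y": rng.uniform(0.0, 360.0),    # random spin hides repetition
            "scale": rng.uniform(0.5, 2.0),      # varied size hides it further
        })
    return instances

rocks = scatter_rocks(500, 4, area=100.0)
```

Random rotation and scale are doing most of the work here: with only a handful of source models, they're what stop the eye spotting repeats.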
Most of the other shots were relatively straightforward, the exception being another bullet time shot, this one actually being one of the first bullets ever fired! The footage for the start of the shot was different to that of the end, so although the start had lots of people thrusting spears and poles in a smokey landscape, the end was completely clear of people and smoke, plus the target dummy was way too near. To solve this I made a new 3d gun, texturing it with various camera-projected textures from the original footage, then made a new background out of a humongous PSD stitched together from footage and photos. In the end none of the original footage is used as footage, more as texturing inspiration! It’s a really long shot so I split it in the sequence above.
All the work I did on this show, bar the Earth-scale shots, was rendered using Arnold. It has an advantage over Mental Ray in being a fast way to get realistic lighting complete with indirect light bouncing. The quality is superb. To me, Mental Ray is much more flexible, but Arnold trumps it for speed between initial light placement and realistic render. I’m very glad I’ve forced myself to learn it.
A few of the aforementioned Earth-scale map shots are shown below.