
MGPjr 2015


On 10 Nov 2015 | By antiform


One week from shoot to TX – and six crippling disasters that almost stopped the whole thing.


I have seldom gone through more trouble on one project to get it on air. Here’s what happened and how I solved it.

  1. Friday – No measurements taken on set
  2. Tuesday – Photogrammetry for backgrounds didn’t work out
  3. Tuesday – Computer crash
  4. Wednesday – Render 300 hours of CG when you only have 48
  5. Friday – Nuke Studio refuses to export a single frame
  6. Friday – Your 300 hour render isn’t really up to scratch

No measurements taken on set

I was actually aware that I hadn’t taken any measurements of the distances between trackers on set, which is a cardinal sin when you have the opportunity. I figured “how hard can it be” to just eyeball it afterwards. Turns out it’s actually pretty hard to gauge how high a door needs to be for a person on green screen to be able to walk beneath it. It took probably 15 test renders before I got the speed of the camera and the scale of the scene right. And it would have been so much easier if I had just taken the time to measure Margrethe’s height, and the distance and height between a couple of the tracking markers.
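The math a single on-set measurement buys you is trivial, which is what makes skipping it so painful. A minimal sketch with hypothetical numbers – the point is the ratio, not the values:

```python
# Back-of-envelope scale check one on-set measurement would have enabled.
# All numbers here are hypothetical, purely for illustration.

def scene_scale_factor(real_height_cm: float, scene_height_units: float) -> float:
    """How many scene units correspond to one real-world centimetre."""
    return scene_height_units / real_height_cm

# Say the presenter is 168 cm tall and her green-screen card measures
# 140 units in the eyeballed 3ds Max scene:
units_per_cm = scene_scale_factor(168.0, 140.0)

# A door she should comfortably walk under (say 210 cm) then needs to be:
door_height_units = 210.0 * units_per_cm
print(round(door_height_units, 1))  # 175.0 units
```

One known height on set and every other dimension in the CG scene falls out of the same ratio, instead of 15 rounds of test renders.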


One measurement that I luckily did bring home from set was a lens grid. Look how much curvature there is in this 28 mm lens.

You’re kinda asking for trouble when you have five working days for post and post means 420 frames of full-CG backgrounds, 3D tracking and green screen, so I knew I was in trouble the minute we called “it’s a wrap” on set. It already began on ingest – one of the cards from the RED had an unreadable file, which prevented the following shots from being copied with the Offload ingest software. Luckily they could be read manually, and hopefully the unreadable file wasn’t the magical Academy Award take that could never be used. The card has since been taken out of circulation.


The concept behind the promo was that Margrethe – the presenter – would make the same journey the kids at MGPjr make: from their private rooms, through rehearsals, into recording and finally onto the stage at the finale in Oslo Spektrum. I was inspired by AB/CD/CD’s music video for Uffie’s “Difficult”, which I find intriguing from both a technical and an aesthetic viewpoint. I assume it’s done partially on green screen, with CG doors concealing the cuts. So the decision was made to shoot Margrethe walking on a treadmill against green. To lower the difficulty setting, we didn’t shoot any feet.



Doing our promo on green screen was rather convenient, since Margrethe’s schedule two weeks before the show didn’t allow for an entire day zipping across the city for a promo shoot. Doing it all in camera would have been difficult anyway. So we decided to do the first three rooms in CG, and the final stage shot for real.

Photogrammetry for backgrounds didn’t work out

The initial plan was to visit real-world locations, shoot them from every angle and recreate them in 3D using photogrammetry. Autodesk’s Memento was the weapon of choice.

Unfortunately, the locations had very little natural light, and time and budget restrictions didn’t allow us to light them properly. They didn’t really have the layout to let Margrethe walk from room to room either. So I had to scrap that idea.

Screenshot of Memento’s interpretation of PhatCat Studios, downtown Oslo

Enter TurboSquid. I found a teenager’s room and a mixing studio, both from the same highly skilled author. With a few adjustments I was able to get the architecture right for Margrethe to walk through them sequentially. Most such architectural models are made with V-Ray materials, and I don’t have V-Ray. The materials can be converted, but doing it on models this complex is quite a mission, and if you make mistakes it can be really hard to get right again. The plan was to render the CG on a local render farm (since I knew I was short on render time), and as genius as Backburner is, it’s also really picky about missing shaders and maps – it will fail and halt at the slightest sight of them. So after converting most of the teenager room’s materials manually and finally messing it up, I had to turn to scripting to save my ass. Luckily there are other stubborn mental ray users out there, and with a single click Thorsten Hartmann’s Shader Utilities converted, in a few seconds, a scene that had taken me most of a day.
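The gist of what such a conversion script automates is a parameter mapping applied uniformly to every material, so the scene is never left half-converted. A toy sketch of the idea – the class and parameter names below are made up; real tools like Shader Utilities operate on live 3ds Max materials, not dictionaries:

```python
# Illustration of batch shader conversion: map source-renderer parameters
# onto target-renderer ones in one pass over the whole scene.
# Parameter names are hypothetical stand-ins, not real V-Ray/mental ray names.

PARAM_MAP = {
    "diffuse": "diffuse",
    "reflect": "reflectivity",
    "refl_glossiness": "glossiness",
}

def convert_material(vray_mat: dict) -> dict:
    """Translate one 'V-Ray-style' material into a 'mental-ray-style' one."""
    mr_mat = {"type": "Arch&Design"}
    for src_key, dst_key in PARAM_MAP.items():
        if src_key in vray_mat:
            mr_mat[dst_key] = vray_mat[src_key]
    return mr_mat

def convert_scene(materials: list) -> list:
    # All-or-nothing: the render farm never sees a mix of converted
    # and unconverted shaders, which is what makes Backburner halt.
    return [convert_material(m) for m in materials]
```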

Before and after comp, walking in parallel universes.

Computer crash

Yes, I work locally. Yes, it’s stupid. Yes, the hardware could fail. But you know, it never has, so I naively trust my HP Z800 without hesitation. However, on this unlucky day, on one of the tightest, most important deadlines of the year, it crashed. Google Drive had been filling up my OS drive to the brink without me noticing, and with less than 10 MB of free space, my main computer crashed 20 minutes after I last checked it and went to bed. So Backburner, which at least runs from an external computer, suddenly had nowhere to write its renders to. Major problem. We are now on Wednesday. Deadline is Friday.
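The watchdog that would have saved me is about five lines of Python – refuse to start (or keep) a render when the output drive is nearly full. The threshold and path are arbitrary examples:

```python
# Tiny disk-space gate for render output. Threshold is arbitrary;
# the point is to fail loudly before the OS drive hits zero bytes free.
import shutil

def enough_space(path: str, min_free_gb: float = 5.0) -> bool:
    """True if the filesystem holding `path` has at least min_free_gb free."""
    free_gb = shutil.disk_usage(path).free / 1e9
    return free_gb >= min_free_gb

# e.g. gate a render submission on it:
# if not enough_space("C:/renders"):
#     raise RuntimeError("Output drive nearly full -- aborting render")
```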



Frame from the final render

Render 300 hours of CG when you only have 48.

So returning to work on Wednesday morning and realizing what had happened, I quickly understood I had to render externally. These services aren’t cheap, though, and the frames were quite heavy, going up to 35 minutes per frame on my speediest PC. So before sending it off, I took the utmost care to make sure the animation and timing were airtight. To get quick results I used the Viewport Grab functionality in 3ds Max and composited it below the green-screen footage using Nuke Studio and the excellent (and free) DJV Imaging frame-buffer software. Rebusfarm’s plugin was installed, and on submitting the finished scene – it failed. Ten minutes before I had to leave work.

The error was nonsensical as well: Rebusfarm couldn’t locate my RED footage! Every single take of the footage was for some reason referenced in my 3ds Max project. After some fiddling I found the culprit: PFTrack, which I used to do the initial track.

It exports a script which for some reason keeps unnecessary references to all the video footage. Luckily, the support guys at Rebusfarm knew what to do and advised me to create a new scene and merge everything from the failing project into it. Voilà – job submission accepted, and off the frames went. What the support guys failed to mention, though (which ultimately isn’t their responsibility either), is that 3ds Max doesn’t import your render settings, file types or render elements from the original scene. So with minutes to go before having to leave, I was ignoring the fact that what I had ordered was 16-bit TIFFs and no additional passes.

Nuke Studio refuses to export a single frame

Nuke is a moody beast. What can be accomplished (or rather fixed) with it is stunning, but it’s also quite picky about what kind of footage it will read without crashing. So returning to work on Thursday and realizing that 16-bit TIFFs were what I got for my $400, Nuke gave me quite a hard time trying to merge them with the 5K RED files. The footage is perfectly lit for green screen, so the IBK keyer keys Margrethe with almost a single click. The render looked less than perfect, though, since I had very high light values (due to my lack of experience with 3D). The highlight values went into the 15s in the 32-bit render, which isn’t catastrophic, but in 16-bit integer files those values get clipped and look choppy. I also had no Z-depth or vector channels to give the render the post treatment I had intended. As a last-minute effort, I sent off a local Backburner render using scanline, overriding the materials with a quick-to-render standard material, hoping it would finish within 24 hours and before the deadline.
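Why superbright highlights die in integer files is worth spelling out: a 32-bit float EXR happily stores pixel values above 1.0, but a normalized 16-bit integer TIFF tops out at 65535, i.e. 1.0. A quick sketch with an illustrative pixel value:

```python
# Demonstrates highlight clipping when quantizing float pixels to
# normalized 16-bit integers. The pixel value is illustrative.

MAX_16BIT = 65535

def to_16bit_int(value: float) -> int:
    """Quantize a linear float pixel to a normalized 16-bit integer."""
    return min(MAX_16BIT, max(0, round(value * MAX_16BIT)))

highlight = 15.0                    # a hot CG highlight, fine in 32-bit float
stored = to_16bit_int(highlight)    # clamped to the integer ceiling
recovered = stored / MAX_16BIT
print(recovered)  # 1.0 -- everything above 1.0 is flattened, hence the choppy look
```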

Thankfully it did, though only barely. I’m still searching for a quick way to render utility passes like Z-depth, vectors and masks, but have yet to find one. The scanline render using standard materials still came in at 20 minutes per frame.

Nuke features a hard-to-beat autosave, so even after crashing dozens of times during Friday I didn’t lose a click of work. Exporting the final promo, however, turned out to be close to impossible. Using the export functionality in Nuke Studio only left me with “Error -1”, which you can google and gasp at. It just would not spit out a single frame. The next solution was to manually make a Write node in the Nuke script – but that takes ages to write frames the frame server in Nuke Studio does in minutes. I didn’t have those ages, and Nuke Studio had already written a sequence of full-quality DPX files that it uses for previewing the comp within the application itself. It just wouldn’t export them, and importing them into After Effects or Premiere Pro left me with a Cineon-looking, low-saturation piece of footage that did my perfectly crafted Nuke comp no justice.

Left side shows how After Effects displays DPX files. There is no way to get it to look like it did in Nuke without Sony’s OCIO plugin.

Enter the Sony OCIO plugin for After Effects. I was barely aware of its existence up until this point, but a desperate Google search turned up the most relevant result I have ever gotten. Install the plugin, load a particular Nuke Cineon-to-sRGB LUT, and before you lies a perfectly recreated look, read from the ridiculously fast-rendered DPX files which Nuke Studio wouldn’t let you export if you begged it.
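For the curious, what a Cineon-to-linear conversion is actually doing follows the standard Kodak Cineon transfer curve (10-bit code values, 685 = reference white, 95 = reference black, 0.002 density per code value, 0.6 display gamma). This mirrors Nuke’s Cineon colorspace; the exact LUT shipped with the plugin may differ:

```python
# Standard Kodak Cineon log-to-linear conversion, as used by Nuke's
# Cineon colorspace. Constants are the published Cineon reference values.

REF_WHITE, REF_BLACK = 685, 95   # 10-bit code values
DENSITY, GAMMA = 0.002, 0.6      # density per code value, display gamma

def cineon_to_linear(code_value: int) -> float:
    """Map a 10-bit Cineon code value to scene-linear light."""
    black = 10 ** ((REF_BLACK - REF_WHITE) * DENSITY / GAMMA)
    raw = 10 ** ((code_value - REF_WHITE) * DENSITY / GAMMA)
    return (raw - black) / (1.0 - black)

print(round(cineon_to_linear(685), 3))  # 1.0 (reference white)
print(round(cineon_to_linear(95), 3))   # 0.0 (reference black)
```

Without this transform (or the LUT that encodes it), DPX footage shows up as the flat, low-saturation log image After Effects was displaying.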


So, against absurdly bad odds, the promo aired on Friday as the marketing strategy intended.

Your 300 hour render isn’t really up to scratch.

But although it had already had its premiere, I was less than satisfied with the result. So during the weekend I sent off a high-quality FG map for every 20 frames and re-rendered the entire CG on my local Backburner farm (consisting of four dedicated Core i7 machines, my dual-Xeon 5660 workstation and three run-of-the-mill open-plan office PCs. Autodesk has incredibly flexible licensing when it comes to render nodes: just install the 3ds Max trial on any PC and render away. If you leave your PC unlocked in my vicinity, be prepared to have it included in the render farm minutes later).

The initial comp as it looked on air when it premiered. The poster was dull and uninteresting, the walls were empty and dead, the light was overblown and horrible.

But the result was not optimal (which is what you get when you play the roles of director, CG artist, compositor, render engineer and IT department all at once): the design of the teenager’s room was kind of dull, and there was flickering in the highlights of the rehearsal room. I had finally gotten all my passes correct, though, all in EXR format, which Nuke seems to love and churns through like cornflakes. Submitting another render was futile, though, since it would barely finish before the promo had run its course.

I didn’t like the poster on which the camera starts, and the room itself lacked a few details to really give it the feeling of being a teenager’s room.


So what to do now? Well, the render was done in passes, so I had diffuse, indirect diffuse, reflections and highlights all ready to be plussed together. So if I changed the poster and added some details to the walls and cabinets in the diffuse+indirect diffuse pass, I could still comp the highlights and reflections on top and make it look like a part of the final render.
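The arithmetic behind that trick is simply that the beauty render is (approximately) the sum of its lighting passes, so you can repaint one pass and re-add the others unchanged. A sketch with single RGB pixel values standing in for whole images:

```python
# Additive pass recombination: beauty ~= diffuse + indirect + reflection
# + specular, per channel. Pixel values below are illustrative.

def recombine(diffuse, indirect, reflection, specular):
    """Sum the lighting passes channel by channel into a beauty pixel."""
    return tuple(d + i + r + s
                 for d, i, r, s in zip(diffuse, indirect, reflection, specular))

# Original passes for one pixel (RGB):
diffuse    = (0.30, 0.20, 0.10)
indirect   = (0.05, 0.04, 0.03)
reflection = (0.10, 0.10, 0.10)
specular   = (0.02, 0.02, 0.02)

beauty = recombine(diffuse, indirect, reflection, specular)

# Swap in a repainted diffuse (the new poster) -- the untouched reflection
# and specular passes still sit on top, so the result matches the render:
new_beauty = recombine((0.10, 0.25, 0.30), indirect, reflection, specular)
```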

To allow this, I used the excellent Nuke’m script for 3ds Max, which lets you paste 3D cameras and objects into Nuke. PASTE. I made some really simple geometry to put a new poster onto, and likewise some simple sticker details for the wall and the cabinet in the corner. Use Nuke’s built-in ScanlineRender to add them onto the diffuse pass and there you go: 30-minute renders updated in seconds. I also exported parts of the wall geometry and applied FillMat to mask the changes off when they go behind walls and doors.


The flickering was also fixed using this method. I exported the part of the wall that had flickering in the reflections from Max using Nuke’m, and projected a static, flicker-free frame onto the reflections pass. Again I masked it off with some more geometry and FillMat, and I was left with a fully automated (and roto’d) fix, rendered in a few seconds.


Okay, so that’s a pretty thorough walkthrough of all the trouble I went through to get this promo right. Hopefully I won’t make these exact mistakes again, and maybe there’s a lesson in here for somebody else as well.

Here’s the finished result:

And here it is on Facebook:

Software in use:

Editing: Premiere Pro CC 2015
Tracking: PFTrack 2015
3D modeling/rendering: 3ds Max with mental ray and Backburner
3ds Max plugins: Nuke’m 2.9.4 and Shader Utilities Pro
Compositing: Nuke Studio 9.0v8

Director and all VFX: Pål Gustav Widerberg
DOP: Martin Edelsten
Lighting: Ole Bernt Helgås

This article was first published at