Q&A with Animal Logic’s Max Liani – Creator of ‘Glimpse’ Ray-Trace Rendering Engine

In every way you look at it, Warner Bros.’ The Lego Movie has been a monumental success for the studio, taking over $337 million (USD) at the worldwide box office so far (source). Much of the CG in the film was created by the Sydney-based Australian VFX house Animal Logic. Recently CGSociety.org held a Q&A with Max Liani (Key Lighter at Animal Logic) to discuss his work at ‘Animal’ and the use of his ray-trace rendering engine ‘Glimpse’ (used on The Lego Movie); you can read the entire thread here. In addition, CGWorkshops (the training arm of CGSociety) is conducting a live webinar session with Max at 6pm PST (LA time) on Monday 17th March (which you can register for here). Here is the Q&A, reproduced below:

The Lego Movie - Image Courtesy of Warner Bros. Pictures.


Q: Thanks for joining us Max! Can you tell us how you got your break as an artist, and what inspired you to join the industry?

A: I’ve always loved movies, but when I was a kid I never imagined I would make my living out of them. Of course I was inspired by the spectacular imagery of Jurassic Park and Toy Story, but that was so far from where I was. I grew up in Italy, where the visual effects industry doesn’t amount to much.
My passion developed in the early 90’s when I was 16, taking photos and creating some rather unlikely compositions of cubes and spheres with my shiny new 386 PC.

Q: How did you learn? It must have been difficult to find material to learn from – much harder than it is today.

A: It was hard back then to find information outside a good university. Books were difficult to find and very expensive. None of us had an internet connection at home, and there were no tutorials, no schools to teach me. I studied on my own and I met a good friend who shared my passion. We refined our skills by collaborating on projects. I went to my very first real job interview holding a couple of 3.5” floppy disks in my hand… that was all I had… and I got the job! I spent most of my time on those Silicon Graphics workstations I couldn’t possibly buy. I studied software engineering to implement solutions to the technical problems I faced. I started to teach students.
Jumping between small companies, 10 years went like that.

Q: You’re in Australia working at Animal Logic at the moment. How did you get that break after 10 years in Italy?

A: Despite all the hard work, it had been difficult to get into the big game. I needed a decent reel, good skills, some connections and some lucky timing. A friend I knew from one of my classes told me that Animal Logic was looking for people for the upcoming Happy Feet, back in 2005. My reel was weak, but he pushed from the inside to get me hired. He knew I was good; I only needed a chance to prove myself. I took a plane to Australia on a Saturday with my beloved wife, and I started working on Monday morning, jet-lagged like hell after a 24-hour flight.

For the first time in a long while I was surrounded by people with great experience; I felt like a dry sponge, absorbing everything I could. In six months I went from mid lighter to senior lighter to key lighter. I was trusted with some of the strangest and most visually complex sequences and hero shots in the movie. When production ended, most artists were let go; I was one of the few lighters chosen to stay.

Q: Why do you think that was?

A: The fact that for most of my life I had to find solutions by myself certainly taught me how to connect the dots. To me, knowing how things get done is not enough; I want to know how things really work, from a theoretical standpoint, I mean. Knowing what is inside that box allows me to think outside of it.
In most studios, it doesn’t matter how good you are when you get in. Far more important is the pace at which you improve once you are there. Now that I’m on the other side of the desk, running the interviews, I value an artist who comes in as a junior and in six months performs as a “mid” more than an artist who starts as a “high mid” but doesn’t improve much from there. Does that make sense?

Q: How has your role developed since you’ve been at Animal Logic?

A: After a couple of years at Animal Logic I started to innovate. I designed from first principles three generations of our shading system, used in every production since 2007. I grew from senior lighter on “Happy Feet”, to lead on “Legend of the Guardians: The Owls of Ga’Hoole”, to supervisor on “Walking with Dinosaurs”. It has been a constant learning path. When the lesson was not artistic or technical, it was about management, diplomacy and trust. In general, when you want to make changes you make friends and supporters, but some people might still be unhappy. You might be the best artist or engineer in the world, but you have to learn a great deal about diplomacy and respect if you want to succeed in a team.



Q: Ok, so let’s talk about The Lego Movie. You stepped out of your regular role for this movie to create something new and very necessary. Can you tell us about this?

A: During production of The Lego Movie I stepped aside from my supervision role to fill a much more needed position. We already had a great team in Lighting Supervisor Craig Welsh and Art Director Grant Freckelton. What the production needed was technical solutions that were outside the box.

Our facility was using Pixar’s RenderMan. It’s a good engine. Unfortunately what we had to render for The Lego Movie was a toxic combination of everything that makes that engine suffer. I’m talking about raytracing without radiosity caching (specular and glossy reflections and refractions), subsurface scattering and dense geometry, all of it, everywhere.

Think about “plastic”. Most of us will say, “How hard can it be?” But the fact is everybody has an idea imprinted from childhood of what that Lego plastic looks like! There is nowhere to hide. Traditional CG lighting tricks won’t work. Some approximate global illumination won’t cut it. The last thing we wanted was for this movie to look like… well, “CG”.

Q: So what led you down the path of writing your own renderer? Surely it can’t have been your first solution?

A: We discussed many possible solutions, most of which would have forced us over time or budget, or offered too little margin for success. I’m sure there are other renderers out there that would have been fine with the specific mix of rendering challenges we were facing, but the truth is you do not switch render engines during a production. It’s like “the first rule” in Fight Club. You could do that in a small studio, for a small production where artists are working almost entirely with off-the-shelf tools. In large studios there are millions of dollars of investment in proprietary tools, pipeline and automation. It takes years to adopt a technology as pervasive as a new render engine, unless, of course, you write your own renderer to be compatible with existing assets and processes. But even that takes years. In the end it comes down to an intuition of what you feel you are going to need in a few years’ time, so that you are ready when the time comes. This takes us back to the end of 2010.

Q: It began before “Lego”?

A: When I was assigned the lighting supervision of Walking with Dinosaurs 3D, I spent one year in research and development before hiring my crew. I was pushing to renew our shading system to better serve both the “Walking with Dinosaurs” production and the upcoming “The Great Gatsby”. So for one year my effort went first-hand into researching, designing and developing our physically based shading and lighting system “PHX” (before Prman had any). But I wanted to give lighters some edge. I wanted to push the boundaries of the quality we could deliver within the budget we had. In my own time I began developing an experimental interactive path tracer under the name “Glimpse”.



Q: Why the name “Glimpse”?

A: Artists are hardly creative if they don’t see what they are doing, if they are unable to explore the creative process. For a lighter, Glimpse is the rationalisation of that process. It is the compelling need to work interactively; to “have a glimpse” of what the lighting looks like.

Q: Can you explain how you went about achieving this lighting glimpse within your renderer?

A: You can argue there are many ways to achieve that. I believed stochastic path tracing was the solution. My crew, for the first time, was able to work interactively. I could do my rounds, ask them to change something, give them my approval on the spot and tell them to render overnight.

Q: How is that different from relighting systems like LPics or “LightSpeed” or even modern openGL rendering?

A: Stochastic path tracing is fundamentally the very same algorithm that can produce the final high quality result. It allows for very fast interaction without any setup or pre-computation. You press “render” and in 1 second you are there. It won’t do 60 frames per second but if you sculpt a piece of geometry to “paint with light and shadows”, or tap the timeline to get to another pose of your characters, it’s accurate and instantaneous.
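That “noisy glimpse that converges to the final frame” behaviour comes straight from Monte Carlo integration, which is what stochastic path tracing is built on. As a minimal illustration (this is not Glimpse’s code, just a sketch of the underlying principle), here is a Python estimate of the irradiance on a surface under a constant white sky: the very same loop that gives a grainy result after a handful of samples converges to the exact analytic answer (π) as samples accumulate.

```python
import math
import random

def sample_uniform_hemisphere(rng):
    """Uniform random direction on the unit hemisphere around the z axis."""
    u1, u2 = rng.random(), rng.random()
    z = u1                                  # cos(theta) is uniform in [0, 1)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def irradiance_estimate(n_samples, seed=0):
    """Monte Carlo estimate of E = integral of L_i * cos(theta) over the
    hemisphere, for a constant sky L_i = 1. The analytic answer is pi.
    Few samples -> a fast, noisy 'glimpse'; many samples -> the final value,
    using exactly the same estimator."""
    rng = random.Random(seed)
    pdf = 1.0 / (2.0 * math.pi)             # uniform hemisphere pdf
    total = 0.0
    for _ in range(n_samples):
        direction = sample_uniform_hemisphere(rng)
        cos_theta = direction[2]
        total += cos_theta / pdf            # unbiased single-sample estimate
    return total / n_samples
```

With 8 samples the estimate is visibly noisy; with a few hundred thousand it sits within a fraction of a percent of π. That is the property the interview describes: no precomputation, no separate preview algorithm, just fewer or more samples of the same unbiased estimator.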

Q: But that was just to preview lighting. How did you make use of it for final rendering?

A: That’s right. If I’d gone for some OpenGL solution I would have been stuck.
For Lego we just had to push the boundaries once more. I needed to turn my “experimental” engine into a solid production renderer, and we had less than a year to do so, while people needed to use it all along from day one. I’m talking about everything we take for granted: a robust API, scene file format, plugins, programmable shaders, procedural geometry, per-light channels, extra output channels, statistics and error logging, unit tests, stereo rendering, depth of field, and the list goes on and on. And we were mainly two engineers. We had some help from other R&D engineers to pipeline the tool and with some other math-heavy challenges, but for most of the ride it was me and the talented Luke Emrose. We had to write it, and we also had to integrate it inside Prman, so that we could migrate our rendering one feature at a time while supporting the production crew. At first it was just shadow casting, quickly followed by global illumination, then subsurface scattering. As we switched more over to Glimpse, the quality of our imagery improved substantially, the lighting setup got simpler and quicker, and rendering got faster and faster.

Q: So now you have a complete renderer?

A: Can I answer with a “yes and no”? Towards the end of the movie, I believe it was two or three weeks before the end, we had our renderer! We achieved more than we planned for. Unfortunately not all lighters switched over, because it was a bit late in the game, and some kept using the hybrid approach. But some brave fellows did it, and we had a small bundle of shots with background/midground and even close-up elements and characters rendered directly in Glimpse. To better answer your question, we have a great renderer for hard surfaces. It doesn’t handle volumetrics, hair or particles yet. So we are still using Prman plus Glimpse for many of our daily challenges.

For more information check: http://www.cgsociety.org/
For more information on Animal Logic check: http://www.animallogic.com/
