The Man in the High Castle, Season 2 VFX

Watch the VFX breakdown reel by Barnstorm VFX for Season 2 of the Emmy Award-winning show The Man in the High Castle (Amazon Video). The visual effects seen here were nominated at the 15th Annual Visual Effects Society (VES) Awards and for an Emmy for "Outstanding Special Visual Effects".

Amazing.
February 23, 2017
These effects are all made in Blender?
Cycles?
Lawson
February 23, 2017
Yes, we used Blender for almost all of our work and rendered in Cycles. Nuke and After Effects were used for compositing. You can check out our Reddit AMA where we discuss our process here: https://www.reddit.com/r/IAmA/comments/5rvwo2/we_produced_the_visual_effects_for_man_in_the/
Oturner
February 26, 2017
Wow, that is amazing... I've never seen Blender used in serious production before, so it's really good to hear this. Would you also consider using open-source compositing software in the future? I'm thinking of Natron, or even Blackmagic Fusion. Keep up the good work, very inspiring, cheers!
Lawson
February 26, 2017
We are pretty happy with Nuke and After Effects for compositing and already have a workflow that utilizes them very effectively, but we're always experimenting with new things. I know a few guys who have messed around with Natron and Fusion. We landed on Blender in a pretty organic way, but it's already a challenge in this industry to find good Blender people, and I think it would be very difficult to find a critical mass of artists who aren't Nuke or AE people, enough to justify the change.
Michael Berhanu
February 23, 2017
Did you track the footage in Blender?
Lawson
February 23, 2017
Most of our footage tracking was done in NukeX (for 3D tracking) and Mocha (for 2D/planar tracking). We used Blender for nearly all the 3D modeling, animation, and rendering, but we don't use Blender for comp work at our company, though it's obviously capable of it.
Stephan Koppel
February 23, 2017
You are one of the few companies that isn't afraid to use Blender and, I believe, other open-source software. Can you tell us why you chose Blender?
And also: did you make your own "improvements" to Blender? I am a long-time Blender user (about 7 or 8 years now), and only a few days ago I found out that the default rendering profile is wrong (https://www.blenderguru.com/tutorials/secret-ingredient-photorealism/). That's what I'm asking about: are there many things that need to change to use Blender for big projects?

And one more question: what kind of specialists are you looking for? And would you take a foreign specialist (from Russia, for example) if you found their portfolio and personal knowledge really interesting?
Lawson
February 23, 2017
We made a conscious decision to switch to Blender a few years ago because we really liked its features and the way it gives the individual artist a lot of power compared to, say, Maya, which is designed around the idea of having a lot of specialists and support personnel.

We did do research into ways of improving Blender and have funded some development designed to make it better for visual effects artists. These improvements will hopefully make it into the core software sometime soon.

Regarding Blender's colorspace issues: we've been aware of them for a while and have been using an 'ACES' colorspace in Blender for some time now: https://blenderartists.org/forum/showthread.php?414363-More-OCIO-Colorspaces
Note also that we render out our scenes in multi-layer EXR files for compositing. OpenEXR uses a linear 32-bit colorspace and thus does not have the same issues with luminance clipping that the default Blender sRGB does. As such, we built all of our scenes with a realistically high dynamic range of lighting and 'graded' them in comp. We've discovered that Blender also has some issues with the math that Cycles uses (including more colorspace idiosyncrasies), so we have been working on ways to get around these challenges.
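For anyone curious what that kind of output setup looks like in Blender's Python API, here's a minimal sketch; the file paths are hypothetical placeholders, and this is just an illustration of the idea, not our exact pipeline configuration:

    import bpy

    # Render to multilayer OpenEXR so every pass stays in linear float for comp.
    scene = bpy.context.scene
    scene.render.image_settings.file_format = "OPEN_EXR_MULTILAYER"
    scene.render.image_settings.color_depth = "32"   # full-float output, no luminance clipping
    scene.render.image_settings.exr_codec = "ZIP"    # lossless compression
    scene.render.filepath = "//renders/shot_0040_"   # hypothetical output path

    # An external OCIO config (an ACES config, for example) is picked up from the
    # OCIO environment variable when Blender starts, e.g. in the launching shell:
    #   export OCIO=/path/to/aces/config.ocio        # hypothetical path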

We are looking for all kinds of good Blender artists currently (and we work with people all over the world), so if you have talent and the ability to solve problems and create photographic-looking work, whether it's modeling, texturing, animation, lighting, or rendering, we can talk. Send inquiries and portfolios to jobs@barnstormvfx.com
David
February 27, 2017
What workstation do you use?
Lawson
February 28, 2017
We use a number of custom-built, higher-end consumer machines. The minimum spec currently is a Core i7-5820K processor, 32 GB of RAM, and a GTX 980 Ti card, but we have some more heavily armed boxes as well, with more RAM, more processor cores, and dual Titan Z cards for rendering.
Manas Roy
February 28, 2017
02:59:69
Are you Blender people working to include something like Maya's motion graphics or C4D's MoGraph type of features? May I know your render machines' configuration details?
Lawson
February 28, 2017
The render machine specs specifically are 64 GB of RAM, dual Titan Z video cards, and dual processors, though we did a lot of rendering on Amazon AWS as well. The answer to your other question, I believe, is below.
Anonymous
February 28, 2017
There is an add-on called Animation Nodes for that. Blender is in fact capable of everything you would need for a professional pipeline.
Michał
February 28, 2017
Wow, just wow O_O
Alex
March 1, 2017
What process did you use to bring the 3D tracking data from NukeX into Blender? I've been wanting to use Blender as a VFX solution for ages but the tracking system in Blender isn't that good.
Lawson
March 1, 2017
There's an add-on for importing .chan files into Blender (it ships with the software by default but has to be enabled in the User Preferences menu). This allows cameras to be passed back and forth between Nuke and Blender (you can also export your tracked camera from Nuke as a .chan and import it). You just need to make sure in the import dialog in Blender that you set the film back size to match what's in Nuke and change the transform order to ZXY, which is what Nuke uses for its cameras.
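If you'd rather script it than use the import dialog, here's a rough sketch of what that kind of import boils down to, assuming the standard .chan layout (frame, translate XYZ, rotate XYZ in degrees, optional vertical FOV). The file path and camera name are placeholders, not anything from our pipeline:

    import math
    import bpy

    CHAN_PATH = "/path/to/tracked_camera.chan"  # hypothetical path
    cam = bpy.data.objects["Camera"]            # hypothetical camera name
    cam.rotation_mode = 'ZXY'                   # match Nuke's camera rotation order

    with open(CHAN_PATH) as f:
        for line in f:
            parts = line.split()
            if not parts or parts[0].startswith("#"):
                continue
            frame = int(float(parts[0]))
            tx, ty, tz, rx, ry, rz = (float(v) for v in parts[1:7])

            cam.location = (tx, ty, tz)
            cam.rotation_euler = (math.radians(rx), math.radians(ry), math.radians(rz))
            cam.keyframe_insert("location", frame=frame)
            cam.keyframe_insert("rotation_euler", frame=frame)

            # Optional vertical FOV column: convert it to a Blender lens angle.
            if len(parts) > 7:
                cam.data.sensor_fit = 'VERTICAL'
                cam.data.angle_y = math.radians(float(parts[7]))
                cam.data.keyframe_insert("lens", frame=frame)

    # Note: Nuke scenes are Y-up while Blender is Z-up, so depending on how the
    # plate and scene were set up you may still need an extra parent rotation.

The same idea works in reverse: the bundled exporter writes a .chan that Nuke's camera can read back, as long as the film back and rotation order match on both sides.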
Alex
March 1, 2017
Thank you! This is really useful for me. It's great to see Blender being used professionally.

You guys have done really amazing work, keep it up!
odoyo favour.
March 1, 2017
I am very happy to see that Blender is part of a great studio like yours.
Right from the day I started using Blender, I have always believed that this software can do more than I see some of my folks using it for.
Please, I want to ask:
how many machines would I need to render an animated feature, and what would the possible specifications be?
Thank you.
Lawson
March 2, 2017
For an animated feature, many, many high-end machines would be necessary. At our peak, we were leveraging something like 200 computers on AWS to render the material we needed for this show (which is only a few minutes of CGI footage). I know that companies like Disney and Pixar render on more machines than that, day in and day out, for the full length of their features. When we weren't rendering on the AWS render farm, we rendered smaller shots internally. For internal rendering, we had machines with 64 GB of RAM, 2x Titan Z video cards, and dual processors.
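Just to give a rough sense of what farm rendering looks like mechanically, here's a toy Python sketch that splits a shot's frame range into chunks and builds a standard headless Blender command line for each node. The .blend path, frame range, and node count are made up, and a real farm (or an AWS fleet) would submit these through its own queue rather than just printing them:

    # Toy example: split a frame range across render nodes for headless rendering.
    BLEND_FILE = "/shots/example_shot/lighting_v012.blend"  # hypothetical path
    FRAME_START, FRAME_END = 1001, 1240                     # hypothetical range
    NODES = 8

    frames = list(range(FRAME_START, FRAME_END + 1))
    chunk = -(-len(frames) // NODES)  # ceiling division

    for node, i in enumerate(range(0, len(frames), chunk)):
        start = frames[i]
        end = frames[min(i + chunk, len(frames)) - 1]
        # Standard Blender CLI flags: -b = background, -s/-e = frame range,
        # -a = render the animation using the scene's own output settings.
        print(f"node {node:02d}: blender -b {BLEND_FILE} -s {start} -e {end} -a")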
drgci
March 2, 2017
Really amazing what you can get from a free, open-source program. This is the reason why I love Blender: it's simply the best program out there, powerful and flexible, plus free :)
odoyo favour
March 2, 2017
Thank you very much for your reply, I will keep that in mind.
Пездец
March 16, 2017
This is just f***ing awesome, damn.

I can translate to English, but I might make some mistakes. It is the most exciting project I've ever seen made in a free, open-source program.