This past Friday I was asked to do a Blender demo for the Facebook CHIbug meeting at RoboModo Games. Since RoboModo is a game production house, I wanted to focus on something real-time and game-y. "PBR" seems to be the buzzword these days, so I decided to explore that concept within Blender's viewport.
The video below is a pre-recorded version of the Google Hangouts demo, made just in case there were connectivity issues - hence the odd intro/outro. It's not a tutorial, but simply a couple of insights into achieving PBR-like shading in the viewport.
First, what is "PBR"? This helpful article reveals that it stands for "Physically-Based Rendering", but what does that even mean, especially in the context of real-time applications? As far as I can tell (not being an avid gamer / game artist myself), PBR is the incorporation of physically accurate lighting, shading, and rendering concepts into the real-time realm: things like ambient occlusion, HDRi environment lighting, blurry and fresnel reflections, and sometimes global illumination.
The demo .blend file used in the video above is available for download here!
If you come from the non-realtime side of computer graphics like I do, you know that these concepts are far from new. In fact, they've been popular in standard render engines for decades. So why all the hullabaloo about PBR? I honestly can't say I know the answer to that question. To me it seems like a natural progression of technology that games would figure out how to approximate such concepts. But the closer you look at PBR, the more you'll notice that conventional shading terms have been swapped for new ones: "Diffuse" becomes "Albedo" and reflectivity becomes "Metallic". I think this is where a lot of the confusion comes from, making PBR sound like a new thing when it's not so much. I tend to agree with Ton Roosendaal on the misleading hype around it.
Ok Kent, enough personal opinion regarding PBR. More importantly, can we implement the concept of physically-based rendering within Blender's OpenGL viewport to mimic programs like Marmoset Toolbag? Yes! Or at least we can get very close. Let's call it "physically-inspired" realtime rendering.
I will venture to state that the most impressive thing about PBR systems is the ability to use HDRi images as the primary source of illumination and reflection. While we cannot currently sample lighting information from HDRi environment textures within Blender's OpenGL viewport, we can employ an effective real-time lighting scenario with standard spot lamps complete with soft shadows - thanks to some hidden light settings buried in the BGE and the new screen-space ambient occlusion available in 2.74.
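For the curious, here's a rough Python sketch of those buried settings. It's untested as written and assumes Blender 2.74's API, with "Spot" as a placeholder lamp datablock name - run it from Blender's own Python console:

```python
import bpy  # Blender's Python API; only available inside Blender

lamp = bpy.data.lamps["Spot"]          # "Spot" is a placeholder name
lamp.use_shadow = True
lamp.shadow_method = 'BUFFER_SHADOW'   # buffer shadows display in the GLSL viewport
lamp.shadow_buffer_size = 2048         # higher resolution = crisper shadow edges
lamp.shadow_buffer_soft = 10.0         # softness - one of those BGE-era settings
lamp.shadow_buffer_samples = 16        # more samples = smoother penumbra

# The new screen-space AO (2.74) is a property of the 3D View itself:
for area in bpy.context.screen.areas:
    if area.type == 'VIEW_3D':
        fx = area.spaces.active.fx_settings
        fx.use_ssao = True
        fx.ssao.factor = 1.0        # AO strength
        fx.ssao.distance_max = 0.5  # occlusion search radius
```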
The second important facet of physically-inspired rendering is reflections. The more I explore shaders in general, the more important I think reflections are for realism and appeal. Realtime rendering is no exception. Currently Blender's viewport doesn't support ray-traced reflections from the World (like PBR systems do), but the effect can be approximated quite convincingly by mapping an equirectangular environment texture onto the model using the "reflection" UV coordinate option. From there we can create two versions of the reflection - one blurry, one sharp - and interpolate between them with a normal falloff to achieve fresnel.
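That blurry/sharp mix can be sketched in a few lines of plain Python, using Schlick's approximation as the normal-falloff curve. The sample values are made-up stand-ins for texture lookups; in Blender this mixing happens in material nodes, not in a script:

```python
# Sketch of fresnel-driven interpolation between two reflection lookups.
def schlick_fresnel(cos_theta, f0=0.04):
    """Reflectance from the angle between normal and view (cos_theta = N.V)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def mix(a, b, t):
    """Linear interpolation, like a MixRGB node with factor t."""
    return a * (1.0 - t) + b * t

blurry = 0.2  # sample from the pre-blurred environment texture
sharp = 0.8   # sample from the sharp environment texture

facing = mix(blurry, sharp, schlick_fresnel(1.0))    # head-on: mostly blurry
grazing = mix(blurry, sharp, schlick_fresnel(0.05))  # grazing: mostly sharp
```

At glancing angles the fresnel factor rises toward 1, so the sharp reflection dominates - which is exactly the falloff behavior the node setup fakes.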
After that I simply plugged in my remaining texture maps like color, scratches, and normal, then mixed shader elements similar to the way I mix in Cycles to achieve the final material. This setup requires Blender Internal's node editing implementation for mixing. It's different from Cycles', though: less intuitive and clunky by comparison, but it gets the job done.
I think that overall, Blender's current tools offer impressive PBR emulation, though it's not perfect. Here's a small list of things that aren't ideal:
- Faked reflections don't reflect actual geometry.
- SSAO in the viewport multiplies over top of reflections and shading, which causes inaccurate darkening.
- Blender's viewport mipmap settings could benefit from more user control. Currently we have two options: 1) mipmap filtering OFF, resulting in extreme pixelation of textures, and 2) mipmap filtering ON, resulting in washed-out texture details. Finer control over this filtering would preserve texture fidelity.
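The SSAO darkening issue can be illustrated with single-pixel arithmetic (the numbers are made up): multiplying occlusion over the whole shaded result dims reflections that shouldn't be occluded, whereas applying it to the diffuse term alone leaves them intact.

```python
# Hypothetical single-pixel values (not taken from Blender):
diffuse = 0.30     # diffuse lighting contribution
reflection = 0.50  # environment reflection contribution
ao = 0.40          # occlusion factor: 0 = fully occluded, 1 = fully open

# Viewport-style: SSAO multiplies over everything, reflections included.
naive = (diffuse + reflection) * ao

# More accurate: occlusion dims the diffuse term; reflections pass through.
better = diffuse * ao + reflection
```

The naive result darkens the pixel roughly twice as much, which is the inaccurate darkening you see in the viewport.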
I learned a lot about physically-inspired rendering in realtime with Blender's viewport. I'm looking forward to exploring more and recording some future training about it!