Rocket League is one of the most popular eSports today and has maintained an active community for nearly 5 years.
Developed by San Diego-based studio Psyonix, the game revolves around a simple (yet brilliant) concept: rocket-powered cars playing soccer. The game's physics engine fosters an ever-expanding skill ceiling and competitive scene, while perpetual cosmetic upgrades fuel a thriving in-game marketplace.
Take it from me, a dedicated casual player.
Recently, I connected with two artists at Psyonix who use Blender as their primary tool.
In this interview, Ethan Snell and Eric Majka share valuable insight into how Blender fares on the front lines of professional game production. Buckle up for an exhilarating conversation!
Hi Ethan and Eric! What are your roles on the Psyonix team?
Ethan: Hi, I’m Ethan. I’m a 3D cinematics artist at Psyonix.
I’m part of the Video Production team where we make all the promotional video and trailer content for Rocket League. My role on the team is working on custom animated scenes for videos or trailers, so anything that’s not recorded gameplay or doable in the game engine.
I end up being a 3D generalist; I take any ideas we have through full production—from design and blocking, animation, lighting, and shading—all the way through rendering and compositing.
Eric: Hello! I’m one of three Lead Artists at the company.
Really what that means is that I’m here to support the rest of the art team, so if someone is trying to learn something new or to solve a problem, I do what I can to teach or to fix things. I still make a lot of art, and although I do a good bit of wheels and hats, I’m primarily in charge of map (Arena) production.
How much work do you do with Blender?
Ethan: Almost all of my work involves Blender in some way.
When I’m working on motion graphics or logos, I’ll often jump over to Cinema 4D, since it has tools that make working on those things much easier. I’m also working in Photoshop and After Effects a large portion of the time. I’d say I’m probably in Blender for 60-70% of what I do.
Eric: All of my modeling at this point is done in Blender.
For years, I used more mainstream software, but after seeing what 2.80 was going to have, and then seeing all of the amazing add-ons available, I switched over.
I like using a variety of software for other things, like Allegorithmic’s suite of software, but modeling is all Blender.
How does Blender cooperate with other software in the pipeline?
Ethan: Blender is very powerful for me because for the most part it’s the full package.
It’s got great tools for modeling, shading, and animating. Cycles, as well as Eevee, continues to become faster and more powerful too. This works really well for video… I don’t need Blender to work with any tools other than handing off good renders to After Effects or Premiere for additional compositing. Since I’m not integrated in the game art pipeline, it’s much simpler for me.
Eric will likely have more to talk about there.
That being said, Blender’s render exports could be aimed at integration a little more.
Rendering out multiple passes (depth, mist, normal, etc.) is very possible, but it requires setup in nodes and setting an export path individually for each new file.
This is a daily frustration.
Also, new tools like Cryptomatte (which I thank the engineers at Tangent for daily) have node setups that must be rebuilt from scratch for every file… it might sound lazy to complain about this, but when you have to set it up manually for every shot of every new project, it really starts to wear.
These options are things that I wish were a little more accessible.
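The repetitive setup Ethan describes can be scripted away with Blender's Python API. As a rough sketch (this must run inside Blender, since `bpy` only exists there, and the output path is a placeholder), a few lines can enable the extra passes and wire a File Output node for each one:

```python
# Sketch: automate the per-file pass-export setup described above.
# Run inside Blender; "//renders/passes/" is a placeholder path.
import bpy

scene = bpy.context.scene
view_layer = bpy.context.view_layer

# Enable the extra passes on the active view layer.
view_layer.use_pass_z = True
view_layer.use_pass_mist = True
view_layer.use_pass_normal = True

# Build the compositor graph: Render Layers -> File Output.
scene.use_nodes = True
tree = scene.node_tree
rl = tree.nodes.new("CompositorNodeRLayers")
out = tree.nodes.new("CompositorNodeOutputFile")
out.base_path = "//renders/passes/"  # relative to the .blend file

# The File Output node ships with one "Image" slot; add one per pass.
tree.links.new(rl.outputs["Image"], out.inputs["Image"])
for pass_name in ("Depth", "Mist", "Normal"):
    out.file_slots.new(pass_name)          # adds a matching input socket
    tree.links.new(rl.outputs[pass_name], out.inputs[pass_name])
```

Saved in a .blend startup file or registered as a small operator, a script like this turns the per-shot node wiring into one click.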
I’ve been asked before why I don’t do compositing in Blender, since Blender’s compositing nodes are reasonably robust. And it’s true: it’s a powerful system for what it is, but it’s very rigid compared to Photoshop or After Effects, where changing the order of effects is a simple layer drag-and-drop, compared to re-arranging and relinking nodes in Blender.
That’s not to mention Photoshop’s powerful integration with After Effects and other products in the Adobe suite… it’s easy for a collaborator to save a Photoshop file with adjustment layers that I can simply import into After Effects.
Cinema 4D has an option to render to a Photoshop file with all of the passes intact (diffuse, spec, etc., as well as object mattes and custom passes).
If someone could write something like that for Blender, I’d use it daily haha.
Eric: There were some “gotchas” that I had to figure out when I made the switch to Blender, but once I figured those out, things were really easy. Mostly it was figuring out the proper scaling, pre-triangulating the mesh a certain way, and so on.
One pain point that I found was the UV mapping tools. Personally, they just didn’t do a lot of the things that I wanted them to do, or they worked in a way that I had to re-learn (which isn’t always what you want in a production environment, when something that should be simple isn’t).
I mostly UV map in external software though, so I just stuck with that.
Is Blender a popular tool at Psyonix?
Ethan: I’m seeing more and more people using it, which is exciting, but I think the majority of 3D artists at Psyonix are using other software.
As long as it works within the pipeline though, artists are pretty free to use the software that works for them, which is awesome.
Eric: It wasn’t, but it’s getting there.
It’s hard to deny how nice 2.8x is and all of the amazing things that are flying out of the Foundation.
A few people on our concept art team had struggled to learn 3D with other software, but after seeing some of the stuff I was doing, they asked if I could teach them Blender, so I did.
They found it easy enough with the help of some online tutorials, and as a result, they’re now pretty comfortable blocking stuff out for their drawings (which has also saved the modeling team a lot of time and guess-work).
What is Blender's reputation among non-users?
Ethan: I mean, Blender isn’t a hindrance in any way… the fact that it keeps up with other tools speaks for itself, especially with the work that Eric does.
I think most people are starting to regard it as a package that is comparable to any other.
2.8 has certainly turned a lot of heads, and the video team is looking at using Blender to support a lot of projects coming down the line.
Additionally, we’re seeing our parent company Epic Games start to support it in relation to Unreal Engine 4 (UE4), which is really exciting for longer-term usage and makes it look a lot more sustainable.
Eric: I think lots of people, myself included, had dismissed Blender previously as a niche software. I still get teased about making the switch by some artists, not because what I make isn’t up-to-par, but more that they know I’ll talk at any opportunity about how much I enjoy the software.
Why use Blender when you could use commercial software?
Ethan: Blender is a tool, much like any other.
I do use commercial software when it works better for me. Cinema 4D has a much more powerful and intuitive procedural animation system, and when I do motion graphics it’s the obvious choice. It also imports and works with vectors much better than Blender does… I build logos exclusively in C4D.
Despite using Blender for most of my professional life, I don’t think I’m as much of a Blender fanboy as others might be, haha. Blender has just always been a reliable tool that I can access anywhere, and it’s become more and more sustainable (read: renders faster) even in just the past year. For me, that’s more important than it being open source, I guess. That being said, I’ve been using Blender for over a decade (which is wild to say), so it is my go-to.
Eric: For me, it’s the speed of development out of both the Blender Foundation as well as the community. I’ve been using 3D software for 20+ years now—13 of which at Psyonix—and there were some bugs in commercial software that persisted over YEARS of releases.
On the flip side, I feel like I can’t keep up with how quickly Blender is progressing, which is a good problem to have. And if things are missing from the main software, there’s most likely an add-on out there, either for free or for incredibly cheap, that will make your life easier.
Have there been moments of Blender "saving the day" or making a task significantly easier?
Ethan: For sure, there have definitely been moments where a feature drop has been super timely.
In a recent project I worked on, I had a fairly heavy volumetrics scene that I needed to render in Cycles. Trying to get it to a point where it was noiseless was taking too long, and as our publish date loomed, I was actively getting concerned about how we were going to accomplish that. The built-in denoiser wasn’t giving great results and Cycles simply wasn’t fast enough to render it in the time we had.
At that time, the new compositing denoiser was added into one of the nightly builds.
I downloaded that and, with some tweaking, it ended up being much more feasible in terms of render time. While the rest of the project was all built in 2.80, that specific scene was built and rendered in the much riskier nightly build so I could take advantage of the denoising.
Eric: Absolutely! Most of it comes in the form of add-ons to me, so my Gumroad and Blender Market libraries are a little out of hand.
The biggest ones for me have been Team C’s Hard Ops and Boxcutter, and MACHIN3’s DECALmachine, MESHmachine, and MACHIN3tools. Some of the things they do have either saved me hours of manual labor or flat-out fixed something that I’d otherwise have had to redo.
On the other hand, have there been moments of Blender completely dropping the ball? (I hope you detected my Rocket League pun...)
Ethan: There have been a couple of panicky moments, mostly relating to rendering.
I’ve been trying to use Blender’s new collection override system and it’s been great for the most part, but at some points it will just completely fall apart with no warning… opening up a file to find that, for no good reason, everything is unlinked and won’t update isn’t super great.
I look forward to the development team working out the linking system a little more because it’s still on shaky ground right now.
Overall, rendering is still a little bit of a pain point in Blender. I haven’t found a good distributed render farm setup that works for us yet, and Cycles alone doesn’t quite feel optimized against its competition yet. In terms of balancing speed vs. accuracy, there is no situation in which I would choose accuracy, honestly. If there’s a way to fake something while having it look non-distracting, I will always take it.
I often do render passes, rendering scenes partially in Eevee and only rendering in Cycles when I need to. There are 2.5 shots in the RLCS S9 trailer rendered in Eevee, while most are in Cycles. Try to guess which ones are which!
There are good ways to optimize Cycles to speed it up. I never use the default amount of light bounces… if I can get away with 1 bounce on everything, I’ll take it. I often use the AO bounces in the Simplify panel as well.
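The bounce and AO settings Ethan mentions all live on the scene, so they can be captured as a reusable snippet. A minimal sketch (run inside Blender; the exact values are illustrative, not Psyonix's settings):

```python
# Sketch of the Cycles speed/accuracy trade-offs described above.
# Run inside Blender; the specific values here are illustrative.
import bpy

scene = bpy.context.scene
cycles = scene.cycles

# Cut light bounces far below the defaults when the shot can get
# away with it; fewer bounces means dramatically less noise per sample.
cycles.max_bounces = 1
cycles.diffuse_bounces = 1
cycles.glossy_bounces = 1

# Replace distant bounces with cheap ambient occlusion via Simplify.
scene.render.use_simplify = True
cycles.ao_bounces = 1         # viewport preview
cycles.ao_bounces_render = 1  # final renders
```

Dropping these into a startup file or a per-project preset makes the "fast" configuration one keystroke instead of a trip through several panels.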
I’m looking forward to 2.83, since it looks like they are adding in some speed upgrades to Cycles. When you’re rendering thousands of frames per project, every second of render time counts and it can really make the difference on if an effect or feature is possible or not.
Eric: Yeah, for sure. Like any software, there are some things that I wish were improved. A lack of radial/circular array has been a huge annoyance of mine given how many times a month I make wheels for our cars.
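A common community workaround for the missing radial array (not necessarily what Eric does, just a well-known technique) is an Array modifier whose offset comes from an Empty rotated by 360/n degrees. A sketch, assuming the active object is a single wheel spoke and the script runs inside Blender:

```python
# Workaround sketch for the missing radial/circular array:
# an Array modifier driven by an Empty rotated one step around Z.
# Assumes the active object is one spoke, centered on the wheel axis.
import math
import bpy

spoke = bpy.context.active_object
count = 8  # number of copies around the circle

# Empty at the wheel's center, rotated by one step (360/count degrees).
pivot = bpy.data.objects.new("RadialPivot", None)
bpy.context.collection.objects.link(pivot)
pivot.rotation_euler[2] = math.radians(360.0 / count)

# Array modifier duplicates the spoke around the Empty's rotation.
mod = spoke.modifiers.new("RadialArray", type="ARRAY")
mod.use_relative_offset = False
mod.use_object_offset = True
mod.offset_object = pivot
mod.count = count
```

Each copy inherits the Empty's rotation cumulatively, so eight copies land evenly around the circle; rotating the Empty re-spaces the whole wheel live.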
Which Rocket League car design is your favorite?
Ethan: Fennec for sure. I’ve always driven a hatchback in real life, so when I saw the design I fell in love.
Eric: Scarab. Just because everyone hates it. When we released the prequel to Rocket League, which was unfortunately named “Supersonic Acrobatic Rocket-Powered Battle-Cars”, I made all but one of the vehicles and just came up with silly ideas on my own. Scarab was one of them and it’ll always have a place in my heart.
What's the process for building cars?
Eric: We always start with a very rough blockout of the mesh based on approved concept art.
That way we can get it in the game, see how it feels, and see how it looks from the player’s perspective. After that we move on to modeling the body, which is done without any normal maps at all, just edge loops to control the smoothing. The chassis parts are all low-poly meshes with baked normals, AO, curvature, and so on.
We’re using Unreal Engine 3 for Rocket League, so we don’t have the benefits of PBR materials. Once all of that is done, we get the car in the game and set up the materials, then work on the various skins (Decals/paint) that the player can equip.
Are cinematic and game models developed independently or does their production overlap?
Ethan: Right, so because I’m on the video production team, I’m not part of the art team working on the game. Obviously, I work very closely with the art team, but I don’t necessarily have a hand in designing any of the battle cars. My role is taking what they’re working on, and making sure it looks good in any animation we want to do.
To that end, cars are built for cinematics on an as-needed basis.
When I build a car for cinematics, I always start with the game models. That’s the base standard and for the most part, we want to make sure the cars are accurate.
Our artists are rockstars, and the in-game models are really fantastic.
Depending on the objective though, we may want to try to be higher fidelity than the game, and in that case, I’ll reach out to the artists involved for additional assets (be that high-poly models, pre-bake or otherwise non-optimized assets).
These assets would never work in-game, but poly optimization isn’t as important for renders so I have a little more flexibility.
Rigs and materials need to be rebuilt for Blender, obviously. With materials, I often take a little more liberty. The game doesn’t support PBR, so just setting up materials with metalness and roughness makes them look super good. For the rig, I’m using a custom one I’ve built that took inspiration from digicreature’s rigacar.
My rig has a lot more flexibility though, like the ability to stretch the car to the absurd dimensions we did for the climax of the Rocket Pass 5 trailer.
Sometimes, we want to push past the boundary of what our game looks like.
For the RLCS Season 9 announcement, the in-game models didn’t work at all. Specifically, because the video is so driven by the reflections of the environment, the low-poly nature of the car was super evident. Because the cars are built with in-game use in mind, they strategically use tris and topology tricks that work really well in engine, but react poorly to a pasted-on subsurf modifier.
Luckily, Eric, who was already working with me on that project, was able to take the car and basically rebuild the geometry, so we were really able to focus on the reflections. He’s a poly wizard and I’m really excited it came out as nicely as it did!
Eric: It depends on what it’s being used for. Typically we just make stuff for the game since that’s our primary focus but Ethan will oftentimes need to use the high poly meshes we’ve made and baked from since polycount is less of an issue for him compared to visual fidelity.
On the RLCS Season 9 trailer I was fortunate enough to work with him on making some assets from scratch to fill out the scene, as well as make a higher poly version of the car that was used. The video had a lot of reflections on the car, so using a low-poly version just wasn’t going to cut it.
What software do you use to bake your normal maps?
Ethan: Haha. I try to avoid using normals as much as possible, since I’m not limited by geometry.
Eric: Personally, I use Marmoset Toolbag because I love how you can paint the skew mask and have it update immediately.
It's been revealed that Rocket League cars are notably smaller than real-world cars, with the Octane being 4.72 ft long. I've long wondered: Is there a straightforward reason for this smaller-than-real-world scale? Is it funky developing at such a scale?
Ethan: I’ve actually never heard this. This is wild talk. I drive to work every day in my 17.5 foot long Octane, so I have no idea what you’re talking about.
No, but for real, I don’t know why that is, but the scale is something that shouldn’t limit picture making. Good composition + animation > accuracy when it comes to video. If a shot makes a car look too small, I’ll adjust the sizes so it looks correct.
At the end of the day, my job is about making things look good. Whatever trick or technique it takes to get there, I’m jumping on it, even if it breaks the rules of physically based rendering. Blender is awesome enough to give me a lot of flexibility to make it there!
Eric: Straightforward? No. Let’s say it is part of the lore of the game ;)
Unreal doesn’t really care what your scale is - it just reads the numbers. Once you get your grid set up in Blender, everything is fine. I’m sure there’s some physics reason for the size everything is.
You may also recognize Ethan as a bona fide CG Cookie instructor! Learn from him here:
- Production Design with Blender 2.8 and Eevee
- Masking & Layering Techniques for Blender Materials