Technical Lighting Artist. Or maybe VFX and Material Artist. Also Optimization and some other stuff. Changes from project to project, really.
— PROJECT
The Outer Worlds 2
— PLATFORM
Windows PC, Xbox, PlayStation
— ENGINE
Unreal Engine 5
— ROLE
Principal Technical Artist
— DATE
2025
The Outer Worlds 2.
I was hired in 2020 to work on TOW2 as a Senior Technical Artist, and was later made Principal Technical Artist, despite my admitted inability to code. To alleviate my impostor syndrome, I made a lot of ambitious material effects and increasingly mad requests to our rendering engineers. In hindsight I was likely too ambitious, as my responsibilities also included asset optimization and pipeline planning, which left me constantly struggling to get everything done. I am, however, very proud of my contributions and glad to finally have a shipped title at Obsidian, a company I had wanted to join since college.
TOW2 is an Unreal Engine 5 game. The out-of-the-box Unreal volumetric clouds (and SkyAtmosphere) were an excellent head start in making a good-looking sky but, as anyone who has used UE5 will tell you, they require a lot of additional work to get them both looking good and performing well. The clouds still use Unreal's volumetric cloud actor, but their material is custom made by a certain person who is me. Over the course of TOW2's production I spent the majority of my time adding more and more visual effects and lighting tricks to create the full atmospheric system that shipped with the game.
The 'AtmoSys', as I called it, was a group effort.
The custom class and its core functionality, including support for applying weather states as data assets and controlling the celestial lights through animation curves, was created by programmer Matt Campbell, initially for use on Avowed. When the class was brought over to TOW2, further additions and code support were provided by programmer Michael Edwards. I planned and created all of the material and volumetric effects, and all of the additional blueprint logic controlling them. Later on, fellow technical artist Weston Mitchell had the unenviable task of nativizing my BP functions into C++ for the sake of CPU performance.
I owe major thanks and appreciation to Matt Campbell, Michael Edwards, and Weston Mitchell for all their help in making the system actually work.
This image is a test of the Atmosphere System’s twilight fake lighting, which covers the transition between directional lights when both are at intensity = 0. There is no active directional light in this scene: all lighting in the sky and clouds is a material effect, plugged into the emissive input of the cloud material, injected into the fog as an inscattering texture, and added by the skydome mesh.
Because shadows, both on the ground and on the clouds, are very expensive, only one directional light can be active at any given time. This meant that at exactly sunset, when the sun is at the horizon, all lighting in the sky must be faked for that period. To cover the transition, I use a series of low-resolution material render targets to keep track of all of the celestial lights and inject their desired color and brightness into the clouds, fog, and sky.
The underlighting in the clouds is informed by an incrementally updated render target array, using a material that calculates directional shadowing from the sun direction along with ambient occlusion; the results are stored in the R and G channels respectively. The cloud material then uses that texture array as a volume texture to apply the desired sun and sky color during twilight, when the actual sun and sky lights are completely dark because they have been shut off for the swap between sun and moon. An additive sky dome mesh with a matching sky color completes the effect.
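In shader terms, the per-slice update looks something like this. This is a minimal sketch assuming the cloud density is available as a 3D texture; the names, step counts, and density source are all illustrative rather than the shipped material code:

    // One slice of the incremental volume update, roughly as it would read
    // in material HLSL. R = sun shadowing, G = ambient (sky) occlusion.
    float2 UpdateUnderlightSlice(Texture3D CloudDensity, SamplerState Samp,
                                 float3 VoxelUVW, float3 SunDirUVW, int NumSteps)
    {
        float sunOcclusion = 0.0;
        float ambOcclusion = 0.0;
        float3 stepToSun = SunDirUVW / NumSteps;
        float3 stepUp    = float3(0, 0, 1.0 / NumSteps);

        for (int i = 1; i <= NumSteps; i++)
        {
            // Density between this voxel and the sun -> directional shadow term.
            sunOcclusion += CloudDensity.SampleLevel(Samp, VoxelUVW + stepToSun * i, 0).r;
            // Density straight up -> cheap ambient occlusion term.
            ambOcclusion += CloudDensity.SampleLevel(Samp, VoxelUVW + stepUp * i, 0).r;
        }

        return saturate(float2(sunOcclusion, ambOcclusion) / NumSteps);
    }

Updating only a slice or two per frame (the "incremental" part) is what would keep a volume like this affordable.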
I asked a programmer to add the option to use a 2D LatLong texture instead of a cubemap for the fog inscattering texture, which enabled me to use a small render target for that as well (material render targets cannot write to cubemap UVs, and the fog code was easier to change).
In the Paradise Island area of the game’s first act, the clouds move and the sun travels across the sky. Since it’s an island, the player can see all the way out to the horizon, and with basic-lit fog that makes the ocean vista really boring. Unreal’s native atmosphere and cloud shadowing was too expensive to maintain on lower-end machines, as well as being limited in art-direction options.
I used a combination of methods to get cloud shadowing on the fog/atmosphere, and to get the sun rays coming through the clouds at the right spots and the right angles at all times, even in the far distance over the sea.
For the dynamic sky lighting on the distant fog, I used a material render target that was a single row of 2048 pixels (the equivalent of 32x64). This rendered a material that multisampled another constantly updating 256x256 render target tracking the top-down cloud densities at any given time, allowing quick access to cloud coverage in any given direction without touching the actual clouds. The result is built up over many frames to amortize the cost, then saved to the fog’s inscattering texture, with some additional material work to match all the colors and intensities of the sun, moon, and sky.
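For the curious, the per-texel work is roughly this shape. A minimal sketch: the LatLong mapping is standard, but the names, sample counts, and the coverage-to-color mapping are all illustrative, not the shipped material:

    // Map a LatLong texel to a world direction, estimate cloud coverage in
    // that direction from the top-down density RT, and tint the fog accordingly.
    float3 FogInscatterTexel(Texture2D TopDownDensity, SamplerState Samp,
                             float2 LatLongUV, float3 SunColor, float3 SkyColor)
    {
        // LatLong UV -> direction (x = longitude, y = latitude).
        float lon = (LatLongUV.x * 2.0 - 1.0) * 3.14159265;
        float lat = (LatLongUV.y - 0.5) * 3.14159265;
        float3 dir = float3(cos(lat) * cos(lon), cos(lat) * sin(lon), sin(lat));

        // Multisample the density map outward along the horizontal component
        // of the direction to approximate coverage toward the horizon.
        float coverage = 0.0;
        const int kSamples = 8;
        for (int i = 1; i <= kSamples; i++)
        {
            float2 uv = 0.5 + dir.xy * 0.5 * (i / (float)kSamples);
            coverage += TopDownDensity.SampleLevel(Samp, uv, 0).r;
        }
        coverage = saturate(coverage / kSamples);

        // Clear directions pick up sun + sky; cloudy directions fall back to
        // a dimmer ambient so the vista reads as overcast there.
        return lerp(SunColor + SkyColor, SkyColor * 0.5, coverage);
    }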
The same cloud-density render target was used in the light function material to get accurate cloud shadows on the ground, with the UVs offset by the sun vector divided by that vector’s Z component. To avoid crazy tiling at lower angles, the Z component was also used to ramp up the mip bias on the texture and fade to a single value before the sun reaches the horizon.
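The light function math is simple enough to sketch in a few lines. Assume the sun vector and a mip bias scale come in as parameters; the names and constants here are illustrative:

    // Cloud shadow for the directional light function. The shadow UV is the
    // ground position projected along the sun vector onto the cloud layer.
    float CloudShadowMask(Texture2D CloudDensity, SamplerState Samp,
                          float2 GroundUV, float3 SunDir, float MipBiasScale)
    {
        float sunZ = max(SunDir.z, 1e-3);
        float2 shadowUV = GroundUV + SunDir.xy / sunZ;

        // Low sun angles stretch the projection, so ramp the mip bias up and
        // let the blur hide what would otherwise be crazy tiling.
        float mipBias = (1.0 - saturate(sunZ)) * MipBiasScale;
        float density = CloudDensity.SampleBias(Samp, shadowUV, mipBias).r;

        // Fade to a single flat value before the sun reaches the horizon.
        float horizonFade = saturate(sunZ * 10.0);
        return lerp(0.5, 1.0 - density, horizonFade); // 1 = lit, 0 = shadowed
    }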
The near-foreground sun rays are, of course, just Unreal’s volumetric lighting. The mid-distance sun rays were a separate feature added by Michael Edwards to the cloud pass to unify the scene a bit more, and they look great. But both methods are limited by sample distance and would be much too expensive to simply extend all the way out to the horizon.
So to complete the effect into the vista, I made the far-distant sun rays out of a big mesh that follows the player: several concentric cylinders with another custom material. The material reads the same low-resolution cloud-density render target used in the fog to determine whether any vertex, given its world position on the ground, the sun angle, and the cloud density, would or would not be in sunlight. If it would be in sunlight, it stretches up along the sun vector and brightens. If not, it collapses so you don’t pay the pixel cost for a ray that isn’t there.
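The vertex logic amounts to something like this. A sketch only: IsTopRing stands in for whatever mask (vertex color, UV, etc.) marks the verts that are allowed to stretch, and the names and constants are illustrative:

    // World position offset for the god-ray cylinders.
    float3 GodRayVertexOffset(Texture2D TopDownDensity, SamplerState Samp,
                              float2 GroundUV, float3 SunDir,
                              float RayHeight, float IsTopRing)
    {
        // Cloud coverage directly above this vertex's spot on the ground.
        float density = TopDownDensity.SampleLevel(Samp, GroundUV, 0).r;

        // A gap in the clouds means this column is sunlit.
        float sunlit = saturate(1.0 - density * 2.0);

        // Sunlit verts extrude along the sun vector to form the ray; occluded
        // columns collapse flat so they cost no pixel work.
        return SunDir * RayHeight * sunlit * IsTopRing;
    }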
The pixel cost of many overlapping rays is mitigated by their relatively small size on screen and by setting the pixel evaluation to 2x2. This cuts the cost by 75%, but can only be used in this case because the on-screen result is just a gradient, so the pixelation is not as noticeable as it would be if it were more detailed.
Not actually used in the game, but an initial request, given TOW2's multiple moons, was that the nighttime lighting be able to transition smoothly between two primary light sources, keeping in mind that only one directional light can ever be active at a given time.
The ability to assign a secondary additive skylight cubemap was added by Matt Campbell, and was later altered to allow a LatLong texture by Michael Edwards. This enabled me to use a low-resolution material render target to keep track of the desired lighting from all celestial light sources. As the brightness of the primary and secondary moons approach equality, both are ramped down and their remaining contribution is instead added to the skylight LatLong. At the exact moment the directional light is switched between them, both are at 0 and the scene is entirely sky-lit. This only lasts a short while to cover the swap. It worked well but was later found to be unnecessary.
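The handoff math can be sketched as a simple scalar ramp. This is one way it might work, not the exact curves used, and all names are illustrative:

    // As the two moons' intensities approach each other, the active directional
    // light ramps down and the difference is routed into the additive skylight.
    void MoonHandoff(float PrimaryIntensity, float SecondaryIntensity,
                     out float DirectionalIntensity, out float SkylightBoost)
    {
        float total = PrimaryIntensity + SecondaryIntensity;
        // 1 when one moon dominates, 0 when they are exactly equal.
        float dominance = total > 0.0 ? abs(PrimaryIntensity - SecondaryIntensity) / total : 0.0;

        DirectionalIntensity = max(PrimaryIntensity, SecondaryIntensity) * dominance;
        // Whatever the directional light gives up comes back via the skylight.
        SkylightBoost = total - DirectionalIntensity;
    }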
The LatLong render target is still in use, however, as it also provides lighting from the non-dominant moons in the clouds and on the ground, since the multiple moons remain so prominent in the sky at all times.
The various visual effects on the moons, as well as their materials, were all made by me, though the textures were made by environment artists.
The moon material is an unlit emissive that tracks the sun via a material parameter collection so it lights properly. The material is translucent so that a single mesh can handle both the solid surface and the atmosphere; and because it will only ever be in front of the stars and nothing else, it need not render to the pre-pass.
The cloud shadows are just the cloud map resampled a second time with a UV offset according to the sun vector, with its values inverted. The atmosphere height and density are all adjustable, as are its scattering color and amount.
The material supports light scattering on gas giants as well as rocky planets.
The planet rings are a separate mesh and material, also an unlit emissive. It uses hacky math to fake a shadow from the planet, and it also supports backscatter from the sun, as you can see in the shot below.
There is an additional additive ring mesh that always faces the camera. When the sun is behind it, the ring’s material logic expands its vertices along their tangents in world space to give a dramatic glow in front of the planet during an eclipse. The ring provides additional atmosphere blending when not used in an eclipse.
Dynamic eclipses are also supported.
I also did other things besides the AtmoSys.
There are two subtle but extremely important effects that must be present in human eyes in order to keep them from looking dead or unnatural: the tiny shadow cast on the eyeball by the upper eyelid, and the wet line underneath at the line of contact with the lower eyelid. Without both of these effects, the eyes will always look plastic, regardless of how realistic the shading or iris refraction is.
Both effects are problematic to create but, as seen in this comparison, they have a massive impact on believability and are therefore always required.
The industry-standard method of creating these effects – and the one still used by Epic's MetaHuman – is to have separate strips of rigged geometry adhere to the edges of the eyelids and stay as close as possible to the contact point without clipping through as the eye moves. This method is a huge pain in the ass for the artists who have to vert-snap this geometry for every base eye shape, the riggers who have to weight it to the various eye and face joints, and the animators who have to make sure it doesn’t break as the character blinks, speaks, and emotes.
To avoid all this additional work, I came up with a way to do both effects dynamically in the eyeball material itself. It is driven directly by the eye’s contact line with any intersecting geometry and therefore supports any eye shape at any time without additional mesh sections.
To function, the material effect has two code requirements and two art requirements. The first code change, made by Michael Edwards, took the custom depth buffer off the verboten list for opaque materials. Custom depth is its own independent pass and happens before the base pass, so there is no reason it can't be accessed by opaque materials. The second was to add the option for an opaque material to opt out of writing to custom depth, even if the actor has it enabled. The art requirements are that the eye socket must now be a cone rather than an empty cavity, and that the geo behind the eye renders to custom depth. With both requirements met, the opaque eye material can reference the screen depth of the eye socket behind it and mask out the area where they meet, as if it were a translucent material using depth-blend.
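The resulting mask is essentially a translucent-style depth fade, just done from an opaque material. A sketch of the idea, assuming the socket cone's custom depth and the pixel's own scene depth are available (names and the falloff are illustrative):

    // Contact mask between the eyeball and the socket geo behind it.
    float EyelidContactMask(float SocketCustomDepth, float EyePixelDepth,
                            float FadeDistance)
    {
        // 1 right at the contact line, fading to 0 with distance behind the
        // eye, exactly like a depth-blend on a translucent material.
        return saturate(1.0 - (SocketCustomDepth - EyePixelDepth) / FadeDistance);
    }

That mask then darkens the top of the eye for the lid shadow and drives the specular and normal adjustments at the bottom for the wet line.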
The rest of the effect was otherwise simple manipulations of diffuse color, AO, specular, and pixel normals.
Almost every light actor in the game is a spotlight, for performance reasons. This can leave light fixtures looking fake, as they then don't self-illuminate realistically. To add self-illumination, lighters will sometimes place additional point lights in the fixtures, which looks much better but is not at all optimal and is therefore forbidden and shameful. Point lights are exceptionally expensive to shadow and, even unshadowed, the number of overlapping lights is a major concern for large scenes.
To avoid the need for additional point lights, all the lighting fixtures have an additional material effect driven by the attached light's color and brightness to fake self-illumination. The masks are pre-baked, per fixture mesh, into vertex colors in Maya. The effect simply multiplies itself by the base color of the asset to give the illusion of being lit.
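The emissive term is about as simple as material effects get; a sketch, with illustrative names:

    // Fake self-illumination for light fixtures. The mask lives in vertex
    // color (baked in Maya); light color and intensity are fed in from the
    // attached light component.
    float3 FixtureSelfIllumination(float3 BaseColor, float VertexColorMask,
                                   float3 LightColor, float LightIntensity)
    {
        // Multiplying by base color makes the glow read as the surface being
        // lit, rather than the surface uniformly emitting light.
        return BaseColor * VertexColorMask * LightColor * LightIntensity;
    }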
In this image, the actual light component is a spotlight with a cone of limited width that does not shine on the fixture, and there is no second light component. The illumination of the shade and arm is faked in the material.
The game has a lot of screens, and an attempt was made, by me, to standardize their look. I made a material function that quantizes the input texture’s UVs to match up with a tiling RGB texture, faking the appearance of an image displayed on a real CRT with an explicit resolution, rather than just a texture with an arbitrary texel density on a card.
I also added several tracking and looping options for scrolling different parts of the input image, independent of the final UVs.
It was important that the UVs be quantized after all other manipulations so there would never be a pixel of the image that sat between two RGB pixels in the effect.
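The core of the function is just a floor. A sketch with illustrative names; the tracking and looping options would modify the UVs before this point:

    // Snap the image UVs to a virtual CRT grid so each image pixel lands on
    // exactly one RGB cell of the tiling phosphor texture.
    float3 CRTSample(Texture2D ScreenImage, Texture2D PhosphorRGB, SamplerState Samp,
                     float2 UV, float2 CRTResolution)
    {
        // Quantize after all other manipulations, sampling each virtual pixel
        // at its centre.
        float2 quantizedUV = (floor(UV * CRTResolution) + 0.5) / CRTResolution;
        float3 image = ScreenImage.SampleLevel(Samp, quantizedUV, 0).rgb;

        // The phosphor texture tiles once per virtual CRT pixel.
        float3 mask = PhosphorRGB.Sample(Samp, UV * CRTResolution).rgb;
        return image * mask;
    }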
Here the effect cycles through several screen resolutions, showing that, when applied correctly, the CRT effect never breaks.
I made the pointy crystal meshes in Houdini, and also made their fake-depth material effect in Unreal.
I had not used Houdini before, so this was a real learning experience. Shown here is the final system to generate all the different crystal variants.
By changing the random seed of just the very first scattering of the largest meshes, many different variants could be generated from a single collection of different-sized base meshes.
The crystal material multi-sampled a greyscale cracks texture with a UV offset informed by the camera angle to fake the depth of the cracks running further into the surface, even though the material is opaque.
The mip bias increases as the facing angle approaches 0, allowing the effect to work with only 3 texture samples, since the blurring of the mips fills in the gaps as the offset increases.
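The whole trick fits in a few lines. A sketch assuming a tangent-space view vector; names and constants are illustrative:

    // Fake-depth cracks: three offset samples of the greyscale crack texture,
    // each pushed further along the view direction and sampled at a higher mip.
    float CrystalCracks(Texture2D CrackTex, SamplerState Samp,
                        float2 UV, float3 ViewDirTS, float DepthScale)
    {
        // Facing angle: 1 head-on, 0 at grazing angles.
        float facing = saturate(ViewDirTS.z);
        float2 offsetPerLayer = ViewDirTS.xy * DepthScale;

        float cracks = 0.0;
        for (int i = 0; i < 3; i++)
        {
            // Deeper layers shift further and blur more; the mip blur fills
            // the gaps between only three samples.
            float mipBias = i * (1.0 - facing) * 2.0;
            cracks += CrackTex.SampleBias(Samp, UV + offsetPerLayer * i, mipBias).r;
        }
        return cracks / 3.0;
    }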
Here’s another angle. The textures were authored by environment artists; I just made the material.
This one is not green.