
Texturing Visual Effects

One of the key elements of making good visual effects is choosing the right assets for your effects, and one of the most important aspects of this is choosing among a wide variety of texture-based techniques. Depending on what you want to achieve, the process will be completely different. This chapter covers most of the common use cases involving textures and tries to explain in which cases they can be useful, or not.

Texture mechanics always account for a big part of your game engine and obey a relatively constant set of rules. Knowing these conventions will help you master the quality and the performance of what you are importing, and thus of what you will be rendering. It is mostly a matter of process, but in some cases knowing the underlying gears of textures in your engine can help you come up with new processes.

Authoring For Particles

Authoring textures for particles is delicate work that requires clever and subtle artistic knowledge in order to create reusable and efficient textures that avoid common pitfalls.

Software, Tools and Processes

Tools for particle authoring are quite varied and serve very different purposes, ranging from 2D painting software to DCC renderers and procedural generators. Most of the time we settle on a set of weapons of choice that we sharpen in order to cover most of our use cases.

2D Painting/Editing Software

Basically Photoshop or anything alike. Photoshop being the most commonly used in big studios, it is often a requirement for VFX jobs as many companies use it. Alternatives exist with different feature sets, prices and user bases. Among the free alternatives you can use GIMP or Krita (the latter having cool new features for painting normals or using particle brushes).

Editing in Krita

DCC Packages and Renderers

These will be used to render 3D objects to sprites, and to export point caches, meshes or even 3D textures, depending on the VFX capabilities of the package. Autodesk provides several alternatives for basic 3D authoring and rendering: 3DSMax / Maya (the latter being more fully featured regarding fluid, particle and dynamics simulations). Houdini is also a really good asset when it comes to working with VFX, because of the breadth of its tools and solvers and the extent of its rendering and compositing capabilities.

Free alternatives exist with more or less success, but Blender is, as of 2019, making a huge catch-up in terms of user experience and rendering quality.

Generators, painters and other software

Adobe Substance is another tool that can help you author your specific maps, especially if you work from other data sets. You can also use it to generate procedural masks, noises and shapes.

Particle Sprites & Texture Sheets

Deciding what the output of your rendering will be implies relatively different processes for making good effects. So it is a good idea to have a clear view of what can be achieved with each method, and to decide quickly which one will be the most effective for your needs.

Methods Overview

Simple Sprites

Simple sprites are the "low-tech" way of making particles and are often considered the fastest in terms of performance for your game. Designing effects by dealing only with textures can be a bit of a headache, especially if you are not used to this kind of process.

If you want to approach this kind of effect, I advise you to take a look at Torchlight and Torchlight II's effects. If you own either game, you can access the level editor and the particle editor. The tech is pretty simple, but the particle effects are really worth taking a look at.

Making good textures in this kind of pipeline is a bigger challenge: all sprites will look the same, so you will need to take extra care to avoid outstanding details that could ruin your effects.

Also, simple sprites tend to increase overdraw, as many of them are needed to build a really uniform mass. This can be balanced by the simplicity of a shader that only displays sprites, but it can still lead to excessive particle overdraw if you are not careful.

Advanced Shader Sprites

Using a shader to display your sprites is the "next-gen" way to deal with sprites. Instead of using a single texture for every particle, we use a shader that can combine none, one or many textures, randomize some deformation, and compute everything based on the artist's needs.

This method increases the pixel cost but can help solve many outstanding details issues as every particle can have a distinct and unique look and feel.

Flipbook Texture Sheet

Flipbook texture sheets are a way to bake complex simulations rendered in DCC packages directly into a sequence of images packed into a texture sheet. These textures are known to be really memory-consuming while delivering relatively low detail.
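To make this concrete, here is a minimal sketch, in Python, of the card selection a flipbook shader performs; the left-to-right, top-to-bottom layout and the top-left UV origin are assumptions for the example, as conventions vary per engine.

```python
# Minimal sketch: select one card of a flipbook sheet laid out as
# `cols` x `rows` frames, ordered left-to-right, top-to-bottom,
# with UV (0, 0) at the top-left.

def flipbook_uv(u, v, frame, cols, rows):
    """Remap a particle's local (u, v) in [0, 1] into card `frame`."""
    frame = int(frame) % (cols * rows)   # wrap around the sheet
    col = frame % cols
    row = frame // cols
    return ((col + u) / cols, (row + v) / rows)

# Example: frame 5 of a 4x4 sheet, sampled at the card's center.
print(flipbook_uv(0.5, 0.5, 5, 4, 4))   # (0.375, 0.375)
```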

Different Kinds of Texture Sheets

Flipbook Sprite Sheets

Random Packed Sprite Sheets

Usual Pitfalls

Alpha Coverage Efficiency

Alpha coverage of a particle sprite is one of the most underestimated performance factors. Bad coverage leads to overly expensive effects that require more particles, or larger particles, as the useful surface of each sprite decreases. Coverage can be optimized by cropping the flipbook cards efficiently, in order to keep as few transparent pixels around the sprite as possible. (Figure: comparison of different coverage ratios.)
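As an illustration, here is a minimal sketch of how coverage can be measured on a card's alpha channel; the threshold value is an arbitrary assumption.

```python
# Minimal sketch: measure how much of a sprite's quad actually
# contributes visible pixels. A low ratio means the GPU rasterizes and
# blends many fully transparent pixels for nothing.

def alpha_coverage(alpha_rows, threshold=0.01):
    """alpha_rows: 2D list of alpha values in [0, 1]."""
    total = sum(len(row) for row in alpha_rows)
    covered = sum(1 for row in alpha_rows for a in row if a > threshold)
    return covered / total

# Example: a 4x4 card where only the center 2x2 holds the sprite.
card = [[0.0, 0.0, 0.0, 0.0],
        [0.0, 0.8, 0.9, 0.0],
        [0.0, 0.7, 1.0, 0.0],
        [0.0, 0.0, 0.0, 0.0]]
print(alpha_coverage(card))  # 0.25 -> 75% of the quad is wasted fill
```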

Over-cropping can lead to visible seams on particles. To reduce this artifact, you can apply edge fading to each card in order to fade the alpha as it gets close to the edge of the quad. Moreover, bilinear filtering and mip-mapping can pull in pixels from the adjacent cards of the flipbook, which edge fading also helps to hide.

Outstanding details

Outstanding details are a nightmare for every VFX artist. These patterns are part of a single sprite texture and make that texture stand out as an individual element inside a mass of particles. They induce spatial tiling: a repetition of all the outstanding details that betrays the effect.

Temporal tiling artifacts.

Temporal tiling is the worst artifact that can happen to one of your flipbooks: it makes them stand out as repetitive and unnatural. In the video above, all the trails end in a circular shape that catches the eye in a really annoying manner. It is of utmost importance to remove this kind of shape, so the flipbook can be used on many particles without the viewer noticing.

Pre-baked lighting

Baking lighting into a single sprite texture can prevent it from being rotated, which would otherwise help reduce the spatial tiling of that texture. Depending on the hardness of the pre-baked lighting, the effect can afford more or less angle randomization. While angle randomization helps solve outstanding detail issues, some outstanding details can still show through the pre-baked lighting.

The best bet is not to bake any directional lighting at all, only self-occlusion and micro-shadowing, and to use a normal map to compute the directional lighting instead.

Textures for Shader Animation

Alpha Erosion Maps

Alpha Erosion Maps are used to dissolve a particle progressively instead of fading it uniformly: a grayscale erosion map is compared against a threshold that rises with the particle's age, so the darkest values disappear first and the sprite appears to erode or burn away.
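Here is a minimal sketch of the erosion math; the softness parameter and the exact remapping are illustrative assumptions rather than a particular engine's implementation.

```python
# Minimal sketch of alpha erosion: `erosion` is the grayscale map
# value for the current pixel, `life` the particle's normalized age
# (0.0 = birth, 1.0 = death).

def saturate(x):
    return min(max(x, 0.0), 1.0)

def eroded_alpha(sprite_alpha, erosion, life, softness=0.1):
    """Pixels whose erosion value sits below the rising threshold
    vanish first; `softness` widens the transition edge."""
    edge = saturate((erosion - life) / max(softness, 1e-5))
    return sprite_alpha * edge

# Example at mid-life: a bright area of the map is still visible while
# a dark area has already eroded away.
print(eroded_alpha(1.0, 0.8, 0.5))  # 1.0, still opaque
print(eroded_alpha(1.0, 0.3, 0.5))  # 0.0, fully eroded
```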

Progression Maps

Progression maps are textures containing progression gradients used to create animated masks. The principle is the following: at time T (evolving from 0.0 at the beginning of the animation to 1.0 at the end), only the pixels of the mask whose values are below T are drawn.

Figure showing an animation progress map

Warning: these maps need to be imported as linear data (like normal maps), as the grayscale gradients you see are in fact temporal data. Importing them as sRGB will create an acceleration effect while playing the animation.
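The sketch below illustrates both the masking principle and the sRGB pitfall; the decode formula is the standard sRGB-to-linear conversion a sampler applies to textures flagged as color data.

```python
# Minimal sketch of a progression-map mask, and why importing the map
# as sRGB distorts the timing. `t` is the normalized animation time.

def progress_mask(map_value, t):
    """Draw only the pixels whose map value is below t."""
    return 1.0 if map_value <= t else 0.0

def srgb_to_linear(c):
    # Decode applied by the sampler when the texture is flagged as sRGB.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

authored = 0.5                           # pixel meant to appear at T=0.5
print(progress_mask(authored, 0.5))      # 1.0: appears on time (linear)
print(srgb_to_linear(authored))          # ~0.214 if wrongly read as sRGB
print(progress_mask(srgb_to_linear(authored), 0.25))  # 1.0: drawn too early
```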

RGB(A) Maps

Sometimes we want to create different masks or maps to store various information for our particles (lighting, occlusion, deformation, temperature), and these maps often use only one or two channels.

Instead of keeping all of these in separate textures, it is often interesting to pack them inside RGB/RGBA maps, each mask living in the R, G, B or A channel of the texture. This method has advantages but also constraints, so every gain can hide a drawback.

Packing to reduce the number of texture reads.

The main advantage of packing data inside RGB(A) maps is reading 3 (or 4) different values in a single texture read. Reading textures inside a shader is not trivial and can quickly become expensive: every texture read ("I need this pixel from this texture to draw my current pixel") has a cost, and reading the same texture twice at different locations (through tiling/offset) costs roughly the same as reading two different textures. (For more information, see the Performance sections of this documentation.)

To sum up: if we read the texture once, at the same tiling/offset for every channel, we have a winner; otherwise RGBA packing will be less effective.
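Here is a minimal sketch contrasting the two situations; the texture contents and channel assignments are assumptions for the example.

```python
# Minimal sketch: one texel fetch returns four packed masks at once,
# as long as every channel is read at the same tiling/offset.

def sample(texture, u, v):
    """Stand-in for a GPU texture fetch: returns an (r, g, b, a) tuple."""
    x = min(int(u * len(texture[0])), len(texture[0]) - 1)
    y = min(int(v * len(texture)), len(texture) - 1)
    return texture[y][x]

# Packed: a single read yields all four masks.
packed = [[(0.9, 0.4, 0.2, 0.5)]]  # 1x1 RGBA texture for the example
smoke, occlusion, temperature, deform = sample(packed, 0.5, 0.5)

# Unpacked, the same data spread over four textures would cost four
# fetches; and if each mask needed its own tiling/offset, packing
# would not help, as four distinct fetch locations are needed anyway.
```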

Packing to reduce the texture count bound to the shader.

Another advantage is reducing the CPU overhead of binding textures. This is a slight optimization, but it can prove useful if applied on a regular basis. However, packing data inside the same texture means all your data will share the same resolution: if you only needed a 16x16 tiling mask, it is not necessarily a good idea to pack it inside a 256x256 channel.

Effects of DXT1/5 Compression on RGB(A) Maps

Compressing an RGBA map can be tough at times, especially if your channels combine into a really colorful composition. Every 4x4 block of pixels will then be hard to compress and will present artifacts. (To understand more about this, see the BC Compression section on this page.)

Most of the time the culprit will be a channel with high contrast and high-frequency variations. The best idea in this case is to swap it with the alpha channel, or to move it into the alpha channel if that channel was previously unused.

(Figure needed: a poorly compressed RGB map, compared with the same data after an alpha swap.)

What happens when moving data to a compressed alpha channel? Data compressed by BC (DXT) compression is compressed separately: RGB on one side, alpha on the other. So if a channel creates a perturbation in RGB, say the G channel, both R and B will be altered as well. This does not apply to the alpha channel, as it is compressed separately. Also, in DXT5 the alpha channel benefits from 8 grayscale values per 4x4 pixel block, instead of the 4 values per block it would get in an RGB channel.

Flow Maps

Flow maps are 2D textures containing 2D deformation data, intended to offset the UVs used to read other textures, or to feed screen-space deformation channels.
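Here is a minimal sketch of the principle; remapping the [0, 1] color range back to a [-1, 1] direction is the usual convention, while the strength parameter is an arbitrary assumption. Production shaders usually blend two time-offset samples to hide the periodic reset; this sketch keeps only the core offset.

```python
# Minimal sketch of a flow-map UV offset. The flow texture stores a
# 2D direction remapped into the [0, 1] color range.

def decode_flow(r, g):
    """Remap color channels in [0, 1] back to a direction in [-1, 1]."""
    return (r * 2.0 - 1.0, g * 2.0 - 1.0)

def flow_uv(u, v, flow_rg, time, strength=0.1):
    fx, fy = decode_flow(*flow_rg)
    # Push the sample point along the flow direction over time.
    return (u + fx * strength * time, v + fy * strength * time)

# Example: a pixel whose flow color is (0.75, 0.5) drifts to the right.
print(flow_uv(0.5, 0.5, (0.75, 0.5), time=1.0))  # (0.55, 0.5)
```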

Lookup Tables (LUTs)

Lookup tables are indirection textures that contain information to be looked up by another computation. The most common lookup tables in VFX are color mapping tables. Such textures are often laid out as 1xN 1D textures (the wider the texture, the more precise the lookup).
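A minimal sketch of the lookup follows; the fire-like color ramp is an invented example, and real hardware would filter between texels rather than snapping to the nearest one.

```python
# Minimal sketch of a 1xN color LUT: a grayscale input is used as the
# U coordinate into a color ramp (black -> red -> yellow -> white).

ramp = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
        (1.0, 1.0, 0.0), (1.0, 1.0, 1.0)]

def lut_lookup(value, lut):
    """Nearest-texel lookup into an Nx1 ramp."""
    index = min(int(value * len(lut)), len(lut) - 1)
    return lut[index]

# Example: remap a particle's grayscale intensity to the fire gradient.
print(lut_lookup(0.6, ramp))  # (1.0, 1.0, 0.0)
```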

Dithering Patterns


Texture Import Options

Correctly importing a texture is important, as it determines how color data or other data (normals, etc.) behaves in the engine. Here are some key points that cover the import options commonly found in game engines. You will find more information about some of these options and their consequences in the texture sampling section.

Texture size

Texture size rules in video games are very often the same across game engines:

  • Power-of-two sizes for mip-mapping (and sometimes mip streaming)
  • Multiples of 4 for DXT compression
  • Not necessarily square textures
  • XXX size limit (depending on the rendering API your engine is using)

While making effects you will abide by these rules in most cases, because of technical requirements that we will cover briefly:

  • Power-of-two textures are required by most engines to generate mip-maps. Moreover, textures have to be multiples of 4 to be compressed using DXT/BC compression on PC and console.

  • Non-square textures will also generate mipmaps, but the mip chain will stop based on the smaller of the two dimensions. For instance, a 1024x256 texture will end up with its lowest mipmap being 16x4 (see the sketch below).
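A minimal sketch of such a mip chain, assuming the chain stops once the smaller dimension reaches the 4-pixel DXT block size (engines differ on the exact limit):

```python
# Minimal sketch: list the mip chain of a (possibly non-square)
# texture, halving both dimensions per mip. The stop condition below
# assumes the 4-pixel minimum imposed by DXT block compression.

def mip_chain(width, height, min_size=4):
    mips = [(width, height)]
    while min(width, height) > min_size:
        width, height = max(width // 2, 1), max(height // 2, 1)
        mips.append((width, height))
    return mips

print(mip_chain(1024, 256))
# [(1024, 256), (512, 128), (256, 64), (128, 32),
#  (64, 16), (32, 8), (16, 4)]
```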

There is technically nothing (aside from import checks) preventing you from importing any texture size you want. The only downside of these unconventional imports is that your asset will be less optimized. So, in some cases, if you technically require a particular texture resolution and are ready to understand and take responsibility for the underlying implications, go for it.

Engines will try to resize your textures to the nearest power of two (unless you tell them not to). While this was historically required, it mainly serves the purpose of keeping a consistent mip chain.

Engines will also let you keep sizes within consistent budgets by setting maximum sizes, in order to respect the project's texture size guidelines.

Wrap Modes

Wrap modes let you make a texture tile, clamp, or repeat as a mirror when UVs go outside the 0..1 range. These modes can be set per axis, so one axis can be clamped while the other repeats.
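Here is a minimal sketch of the three behaviors applied to a single UV coordinate outside the 0..1 range:

```python
import math

# Minimal sketch of the three common wrap modes for one UV axis.

def wrap_repeat(u):
    return u - math.floor(u)             # tile: 1.3 -> 0.3

def wrap_clamp(u):
    return min(max(u, 0.0), 1.0)         # clamp: 1.3 -> 1.0

def wrap_mirror(u):
    t = math.fmod(abs(u), 2.0)           # mirror: 1.3 -> 0.7
    return 2.0 - t if t > 1.0 else t

for f in (wrap_repeat, wrap_clamp, wrap_mirror):
    print(f.__name__, f(1.3))
```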

Filtering Modes / Anisotropy

Filtering and anisotropy control the quality-versus-performance ratio of how the texture is read. Filtering smooths the transition from one pixel to another with more or less precision, while anisotropy uses extra texture reads to reduce blurriness in the distance and keep the visuals sharp, at the expense of extra performance. For more information see the sampling section.
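As an illustration of the filtering part, here is a minimal sketch of bilinear filtering, which blends the four nearest texels; hardware does this per channel and per mip level.

```python
# Minimal sketch of bilinear filtering: the sampled value is a
# weighted blend of the four nearest texels, which is what smooths
# the transition from one texel to the next.

def lerp(a, b, t):
    return a + (b - a) * t

def bilinear(c00, c10, c01, c11, fx, fy):
    """fx, fy: fractional position of the sample between the texels."""
    return lerp(lerp(c00, c10, fx), lerp(c01, c11, fx), fy)

# Example: sampling halfway between black and white texels yields gray.
print(bilinear(0.0, 1.0, 0.0, 1.0, 0.5, 0.5))  # 0.5
```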

Mip Generation

When importing a texture into the engine, the importer can generate mip-maps: lower-resolution variants of the texture. While these are still part of the texture, the engine automatically selects which mip is required based on the distance to the camera and the view angle. Mip-maps help reduce texture aliasing and save a lot of performance in cases where the texel density per screen pixel would otherwise be too high.

For more information see the sampling section.

Coding and Decoding Data on Multiple Channels

The Value/Multiplier Couple

Simple multiplier coding is a technique that involves at least two channels interpreted over different ranges. A common example is RGBM (RGB-Multiplier) encoding: roughly, the alpha channel serves as a multiplier applied to the RGB values, based on an arbitrary range. For example, deciding on a (1.0 .. 6.0) range for the black-to-white values of the alpha channel results in color values within the (0.0 .. 6.0) range.

More information about RGBM
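Here is a minimal sketch of decoding with the (1.0 .. 6.0) range described above; note that many RGBM variants simply use color = rgb * (alpha * maxRange) instead, so check your engine's convention.

```python
# Minimal sketch of RGBM decoding, assuming the alpha channel is
# remapped from black..white to a 1.0..6.0 multiplier as in the
# example above.

MAX_MULTIPLIER = 6.0

def rgbm_decode(r, g, b, m):
    mult = 1.0 + m * (MAX_MULTIPLIER - 1.0)   # 0..1 alpha -> 1..6
    return (r * mult, g * mult, b * mult)

# Example: a half-intensity red with a white alpha decodes to an
# HDR intensity of 3.0, well beyond the usual 0..1 range.
print(rgbm_decode(0.5, 0.0, 0.0, 1.0))  # (3.0, 0.0, 0.0)
```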

The High-Bits / Low-Bits Packing

When packing data inside 8-bit channels, you end up with (at most) 256 different values per channel. For instance, if we use an 8-bit mask for a progression map, we end up with only 256 interpolated states of the animation (256 frames). For longer animations, we need more animation steps. High-bits/low-bits packing solves this by introducing another channel whose 256 sub-steps provide higher-precision animation.
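A minimal sketch of the packing and unpacking math, splitting a 16-bit value across two 8-bit channels:

```python
# Minimal sketch of high-bits/low-bits packing: 65536 animation steps
# instead of 256, at the cost of a second channel.

def encode_16(value):
    """value in [0, 1] -> (high, low) 8-bit channel values."""
    v = round(value * 65535)
    return v // 256, v % 256

def decode_16(high, low):
    return (high * 256 + low) / 65535

high, low = encode_16(0.123456)
print(high, low, decode_16(high, low))  # round-trips within ~1/65535
```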

Texture Streaming

Texture streaming is a method of loading and freeing texture resources on demand to save video memory. It is used in some game engines and has behaviors that have to be known in order to work correctly with it.

Engine without Streaming

When loading textures, the simplest game engines will try to load all textures at all times so they are available whenever needed. This method is simple but has many drawbacks: first, the loading time of the editor or of the game level increases dramatically, especially on AAA-sized projects. Second, loading everything all the time can fill up and exceed the texture memory available in VRAM (the graphics memory used by your GPU), since all the textures of the game level stay loaded regardless of whether they are used.

While this is less of a problem nowadays on PC or PS4/Xbox One targets, it was really problematic on the 7th console generation (PS3/X360), when available video memory was around 160MB of VRAM.

Engines with texture Streaming

Engines with texture streaming provide a texture streaming pool that textures can be loaded into and unloaded from. This pool is a reserved area of GPU memory where texture data is streamed in and out by the engine.

When a mesh is displayed on screen, its shader/material requires textures to display. At that moment, the texture streaming system starts loading the texture from disk and fitting its data inside the texture pool. As textures can take up a lot of space, and thus a lot of time to load, the streaming system loads the mip-maps from the highest (smallest size) to the lowest (full size).

This process sometimes produces a blurry-to-sharp effect on the scene. Most of the time it is noticeable at level start, when all textures are needed at the same time but not yet loaded.

At other times, texture streaming will also induce this artifact when the viewpoint on an object changes:

  • If the object is viewed from a close viewpoint, texture streaming will try to load the lower mip-maps in order to display the most detail.
  • Then, when the object is displayed from afar, the lower mip-maps can be discarded as they are no longer needed (often, streaming out these low mip-maps happens after some delay, in order to avoid crazy in/out streaming).
  • If the viewpoint comes back close after that, the artifact can happen again as the mip-maps stream in again.

VFX and Texture Streaming

Texture streaming in VFX can be really efficient to save texture budget (which is often quite tight). However, automatic streaming systems have a major drawback when it comes to displaying an effect whose textures were not streamed in beforehand.

Most of the time this is referred to as the "shitty blurry crap effect": I need a majestic, unexpected explosion, and I end up with a pile of blurry sprites, because the effect is so sudden that texture streaming has not been able to load the mid-range mip-maps by the time the effect has ended.

To avoid this, some engines implement texture streaming hints to pre-stream textures in advance (when possible, for instance during a cutscene). This process tells the texture streaming system that one or more textures will be required shortly.

Nomenclature

So, we have entered pre-production, and the team is starting to throw around ideas for the project's file structure and file naming conventions. You are a part of this, and the decisions will be crucial throughout production, so you will have to decide what is best for you, in conjunction with what the rest of the team finds important for themselves.

As the nomenclature is decided collectively, it is not a good idea to come up with your own nomenclature regardless of the other teams' choices. It is also not a good idea to try to fit perfectly into another team's nomenclature: every job has its own requirements, and you do not want to add unnecessary complexity to your naming.

Directory Tree

Most of the time, your VFX assets will be sorted into a VFX directory where you will be the only master onboard, in charge of keeping your realm clean and accessible.

In some cases, some of your effects will have to live outside your hierarchy. This can happen, for instance, if gameplay programmers want all the assets (meshes, textures, sounds, VFX) associated with a gameplay element (e.g. a weapon) to reside in the same directory as the weapon's script or game object. In that case you will have to ensure that you can track these resources outside of your realm and access them easily.

To sort your VFX directory structure you have multiple choices, and you can use one or several criteria to create a powerful database. Such criteria can be asset type, theme, level, usage, etc.

  • Asset Type: the kind of asset used for the effect. Directory names could be Mesh, Particles, Shaders, Materials, Textures, Animations. Note that in this kind of structure it can be best to keep all the VFX at the root of the folder, and all the assets used in these effects in asset-type subfolders.
  • Theme: the theme of the effect. Can be Fire, Water, Blood, Explosions, Electric, Weather, ...
  • Level: the name of the level where the effects will be placed.
  • Usage: the main usage of the effect. Can be Gameplay, Cinematic, Environment, Scripted Events, ...

(Figure: example Effects directory tree.)

These criteria are the ones I have used most commonly throughout my projects, but if your game involves other criteria, such as teams, races, etc., feel free to use whatever is most relevant for you.

In the end, the best approach is to use a combination of these criteria: for instance, a Usage > Theme > Asset Type structure can be enough for most projects. In this case you can even use the levels as themes.

In this example, the Effects tree is divided among Cinematic, Gameplay and Generic effects. The Cinematic tree holds levels as themes, because they contain many specific effects that occur during each level's cutscenes. The Gameplay directory uses the effects' targets as themes: in this example I used prefixes (NPC_, WP_) to better identify each target's type. If you can, use the same names and prefixes as the game objects.

Finally, I use the Generic folder as a root for all themes (Fire, Explosions, and even Generic effects!) to hold all the assets used to make my effects, as well as generic effects that I can reuse throughout the game.

This directory serves as a base for all effects. If I have to create a specific asset, for instance an animated texture mask for a cinematic in Level 2, it would not go into the generic data structure but instead into the Cinematic/Level2/Textures directory.

Prefixes & Postfixes

These are your best companions, as you can use them as search patterns for finding and identifying resources. For instance, if your team uses the CH_, EN_ and FX_ prefixes to indicate a texture's domain of usage, it is really useful for filtering resources within your engine, provided it has a search field.

On the contrary, if the engine is able to tag resources, it may be a better idea to tag your resources with your job's name (Character, Environment, VFX, ...) instead of adding it as a prefix. However, this also depends on how your engine can print out a resource name (for debugging, for example): if the engine is only able to print the texture name (and not its location, for instance), you will have to be more precise (and thus use an FX_ prefix).

Another criterion to take into account is how your file system handles the length of your file names. Refer to your data manager and ask about the maximum length of a file name, a directory name, and a full path.

In my own opinion, I try to keep things as simple as they can be, using required prefixes to match the project's requirements or my own needs, but tending to keep most pre/postfixes optional so my asset names stay as simple as possible.

Here is a list of ideas you can use for prefixes and postfixes:

  • Job: describes the job that uses the resource. Examples: FX_, EN_, CH_, SD_, ...
  • Asset Type: short description of the asset type. Examples: T_ (texture), M_ (mesh), A_ (animation), FX_ (effect), S_ (sound)
  • Gameplay Usage: usage in gameplay. Examples: CIN_ (cinematic), WP_ (weapon), NPC_ (non-playable character), VH_ (vehicle), ...
  • Flipbook Info: number of columns and rows of the flipbook. Example: _NxM (N columns by M rows)