Using GPU Textures
While the idea behind textures is simple in principle (a grid of pixels stored in GPU memory), GPU Textures are surprisingly complex objects.
Capabilities
- Shaders can read (sample) from textures
- Textures can be set up as render targets (by attaching them to a framebuffer).
- Arrays - Textures can have multiple images (texture arrays, cube textures, 3d textures...), indexed by a depth parameter.
- Mipmaps - Each texture image can have a "pyramid" of "mipmap" images representing the texture at progressively lower resolutions.
Textures are supported by additional GPU objects:
- Samplers - The specifics of how shaders read from textures (interpolation methods, edge behaviors etc.) are controlled by GPU Sampler objects. luma.gl will create a default sampler object for each texture, but the application can override it if desired.
- TextureViews - A texture view specifies a subset of the images in a texture, enabling operations to be performed on such subsets. luma.gl will create a default TextureView for each texture, but the application can create additional TextureViews.
- Framebuffers - A framebuffer is a map of "attachment points" to one or more textures that can be used when creating RenderPasses, and made available to shaders.
Setting texture data from CPU data:
- There is a fast path for setting texture data from "images", which can also be used for 8-bit RGBA data (a sketch follows below).
- General data transfer is more complicated: it must go through a GPU Buffer and a CommandEncoder object.
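As a rough sketch of the image fast path (assuming a luma.gl `device` and that TextureProps accepts image-like data such as an ImageBitmap; confirm against the Texture API reference):

```ts
// Sketch: create a texture directly from an image (the "fast path").
const response = await fetch('texture.png');
const imageBitmap = await createImageBitmap(await response.blob());

const texture = device.createTexture({
  data: imageBitmap, // 8-bit RGBA image data
  format: 'rgba8unorm',
  width: imageBitmap.width,
  height: imageBitmap.height
});
```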
Notes:
- Some GPUs offer additional Texture-related capabilities (such as availability of advanced image formats, more formats being filterable or renderable etc).
- If you would like to take advantage of such features, check DeviceFeatures to discover which features are implemented by the current Device (i.e. the current WebGPU or WebGL environment / browser the application is running on), as sketched below.
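For example (a minimal sketch; the feature id used here is an assumption, see DeviceFeatures for the actual list of feature strings):

```ts
// Sketch: check a device feature before relying on it
if (device.features.has('texture-compression-bc')) {
  // Safe to create textures using BC compressed formats
}
```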
Texture Dimension
Most textures are two dimensional, however GPUs support additional configurations:
Dimension | WebGPU | WebGL2 | Layout | Description |
---|---|---|---|---|
1d | ✅ | ❌ | TextureData | Contains a one dimensional texture (for compute) |
2d | ✅ | ✅ | TextureData | Contains a "normal" image texture |
2d-array | ✅ | ✅ | TextureData[] | Holds an "array" of 2D textures. |
3d | ✅ | ✅ | TextureData[] | Holds a "stack" of textures which enables 3D interpolation. |
cube | ✅ | ✅ | TextureData[6] | Holds 6 textures representing sides of a cube. |
cube-array | ✅ | ❌ | TextureData[6][] | Holds an array where every 6 textures represent the sides of a cube. |
Sometimes a composite texture can be used as a unit, but sometimes it is necessary to address a specific subtexture (this is what TextureViews are for).
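A hedged sketch of creating a non-2d texture and addressing a subset of its images via a TextureView (prop names are assumptions based on the Texture and TextureView docs):

```ts
// Sketch: a 2d texture array with 6 layers...
const textureArray = device.createTexture({
  dimension: '2d-array',
  format: 'rgba8unorm',
  width: 256,
  height: 256,
  depth: 6 // number of array layers (prop name is an assumption)
});

// ...and a view selecting a single layer (prop names are assumptions)
const layerView = textureArray.createView({
  dimension: '2d',
  baseArrayLayer: 2,
  arrayLayerCount: 1
});
```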
Texture Formats
A "texture format" specifies which components (RGBA) are present in pixels, and how those pixels are stored in GPU memory. This is an important property of a GPU Texture which must be specified on Texture creation.
In luma.gl, texture formats are identified using string constants, and the TextureFormat type can be used to ensure texture format strings are valid.
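For example (a minimal sketch, assuming the TextureFormat type is exported from @luma.gl/core):

```ts
import type {TextureFormat} from '@luma.gl/core';

// The TextureFormat type catches invalid format strings at compile time
const format: TextureFormat = 'rgba8unorm';
const texture = device.createTexture({format, width: 4, height: 4});
```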
The following table lists all the texture format constants supported by luma.gl (ordered by how many bytes each pixel occupies).
Note that even though a GPU supports creating and sampling textures of a certain format, additional capabilities may need to be checked separately; see Texture Format Capabilities below.
Compressed Textures
Compressed textures are textures compressed in a way that can be decompressed performantly by the GPU during sampling. Such textures do not need to be fully decompressed (on either the CPU or the GPU) but can be uploaded directly to the GPU in their compressed format, and will remain compressed there. There are some considerations when using compressed textures: support varies between devices, so assets must typically be prepared in multiple formats.
For more information, see e.g. Compressed Textures in 2020.
For supported compressed texture formats, see Texture Formats.
Supercompressed Textures
Supercompressed textures solve the portability problem of compressed textures by defining a common super-compressed format which can be decoded after load into a supported compressed texture format.
To use Basis supercompressed textures in luma.gl, see the loaders.gl BasisLoader, which can extract compressed textures from a Basis encoded texture.
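A rough sketch of loading a Basis texture with loaders.gl (module names from loaders.gl; the file name and exact options are illustrative):

```ts
import {load} from '@loaders.gl/core';
import {BasisLoader} from '@loaders.gl/textures';

// BasisLoader transcodes the supercompressed data into a compressed
// texture format supported by the current device
const images = await load('texture.basis', BasisLoader);
```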
Texture Data
Texture data may not be completely packed in memory: each row of pixels can be padded to satisfy alignment requirements, so the number of bytes per row may be larger than width × bytes-per-pixel.
Texture creation
const texture = device.createTexture({});
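A slightly fuller sketch with explicit props (values are illustrative; see TextureProps for the full list):

```ts
// Sketch: create a small 2d texture from raw 8-bit RGBA data
const data = new Uint8Array([
  255, 0, 0, 255,   0, 255, 0, 255,   // red, green
  0, 0, 255, 255,   255, 255, 0, 255  // blue, yellow
]);

const texture = device.createTexture({
  format: 'rgba8unorm',
  width: 2,
  height: 2,
  data
});
```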
Sampling
A primary purpose of textures is to allow the GPU to read (sample) from them in shaders:
vec4 texel = texture(uTexture, uv);
Texture filtering
Texture formats that are filterable can be interpolated by the GPU.
Filtering is specified using a separate Sampler object.
Sampling is a sophisticated process:
- magnification
- minification
- anisotropy
The parameters used for sampling are specified in a Sampler object, as sketched below.
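A minimal sketch of creating a sampler (prop names follow the WebGPU-style SamplerProps; confirm against the Sampler docs):

```ts
// Sketch: a sampler with linear filtering and repeat wrapping
const sampler = device.createSampler({
  magFilter: 'linear',    // magnification filter
  minFilter: 'linear',    // minification filter
  mipmapFilter: 'linear', // how mip levels are blended
  addressModeU: 'repeat',
  addressModeV: 'repeat',
  maxAnisotropy: 4        // anisotropic filtering (assumed prop name)
});
```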
Mipmaps
To improve sampling quality, mipmaps can be included in a texture. Mipmaps are a pyramid of lower resolution images that are used when the texture is sampled at a distance. Mipmaps can be generated automatically by the GPU, or provided by the application.
Mipmap usage is controlled via SamplerProps.mipmapFilter:
mipmapFilter | minFilter | Description | Linearity | Speed |
---|---|---|---|---|
none | nearest | No filtering, no mipmaps | ||
none | linear | Filtering, no mipmaps | bilinear | slowest * |
nearest | nearest | No filtering, sharp switching between mipmaps | ||
nearest | linear | No filtering, smooth transition between mipmaps | ||
linear | nearest | Filtering, sharp switching between mipmaps | bilinear with mipmaps | fastest * |
linear | linear | Filtering, smooth transition between mipmaps | trilinear |
Performance typically depends on texture LOD, and perhaps surprisingly, mipmaps not only improve visual quality, but can also improve performance. When a scaled down, lower quality mipmap is selected it reduces memory bandwidth requirements.
Texture filtering is considered bilinear because it is a linear filter that is applied in two directions, sequentially. First, a linear filter is applied along the image's x axis (width), and then a second filter is applied along the y axis (height). This is why bilinear filtering is sometimes referred to as linear/bilinear.
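A hedged sketch combining the two (assuming the `device` and `imageBitmap` from earlier sketches, and assuming a `mipmaps` texture prop that requests automatic mipmap generation; confirm against TextureProps):

```ts
// Sketch: request mipmap generation when creating the texture...
const texture = device.createTexture({
  data: imageBitmap,
  format: 'rgba8unorm',
  mipmaps: true // assumed prop: ask luma.gl to generate the mip pyramid
});

// ...and sample it with trilinear filtering (linear/linear in the table above)
const sampler = device.createSampler({
  minFilter: 'linear',
  magFilter: 'linear',
  mipmapFilter: 'linear'
});
```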
Binding textures
Before textures can be sampled in a shader, the texture (and optionally a sampler) must be provided to the shader as a binding.
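For example, with the engine-level Model class (a sketch; setBindings is assumed from the Model API, and the binding names must match the shader's declarations):

```ts
// Sketch: expose a texture and sampler to the shader before drawing
model.setBindings({
  uTexture: texture, // name must match the shader's texture binding
  uSampler: sampler  // optional: override the texture's default sampler
});

const renderPass = device.beginRenderPass();
model.draw(renderPass);
renderPass.end();
```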
Texture Rendering (Writing to Textures on the GPU)
Texture formats that are renderable can be bound to framebuffer attachments so that shaders can write to them.
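A rough sketch of rendering into a texture via a Framebuffer (assuming a previously created `model`; prop names are assumptions based on the Framebuffer and RenderPass docs):

```ts
// Sketch: create a renderable texture and attach it to a framebuffer
const colorTexture = device.createTexture({
  format: 'rgba8unorm',
  width: 1024,
  height: 1024
});

const framebuffer = device.createFramebuffer({
  colorAttachments: [colorTexture]
});

// Render into the texture instead of the canvas
const renderPass = device.beginRenderPass({framebuffer});
model.draw(renderPass);
renderPass.end();
```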
Blending
Data Textures
In WebGPU/WGSL, textures can be used with compute shaders through storage bindings.
Copying Textures
Texture data can be copied between the GPU and the CPU via GPU Buffers. See CommandEncoder for more information.
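A heavily hedged sketch of reading texture data back through a Buffer (method and parameter names are assumptions to be checked against the CommandEncoder and Buffer documentation):

```ts
// Sketch: copy a texture's contents into a GPU buffer, then read it on the CPU
const byteLength = 256 * 256 * 4;
const buffer = device.createBuffer({byteLength});

const commandEncoder = device.createCommandEncoder();
commandEncoder.copyTextureToBuffer({
  source: texture, // parameter names are assumptions
  destination: buffer,
  width: 256,
  height: 256
});
commandEncoder.finish();

const pixels = await buffer.readAsync(); // assumed readback method
```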
Texture Format Capabilities
Even though a device allows a Texture to be created with a certain texture format, there may still be limitations in what operations can be done with that texture.
luma.gl provides Device methods to help applications determine the capabilities of a texture format.
Can textures with the format... | Check using |
---|---|
be created and sampled (using nearest filters)? | device.isTextureFormatSupported(format) |
be sampled using linear filtering? | device.isTextureFormatFilterable(format) |
be rendered into? (render targets / color attachments) | device.isTextureFormatRenderable(format) |
be used for storage bindings? | N/A |
be blended? | Yes, if sampler type float is supported |
support multisampling? | N/A |
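For example (a minimal sketch choosing between two float formats):

```ts
// Sketch: fall back to a 16-bit float format if 32-bit float textures
// cannot be sampled with linear filtering on this device
const format = device.isTextureFormatFilterable('rgba32float')
  ? 'rgba32float'
  : 'rgba16float';

const texture = device.createTexture({format, width: 512, height: 512});
```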
Remarks
- Mipmaps can only be auto created for formats that are both filterable and renderable.
- A renderable format is either a color renderable format, or a depth-or-stencil format.
- All depth/stencil formats are renderable.
- Samplers always read a "vec4" regardless of which texture format is used. For formats with fewer than 4 components, missing red, green and blue components are read as 0.0, and alpha as 1.0.
- Note that some formats are not mandated by the base standard but represent additional capabilities (e.g. a WebGL2 device running on top of an OpenGL ES 3.2 driver).