# Hello Cube

> The tutorial pages have not yet been updated for luma.gl v9.

Drawing a textured cube, rendered using the luma.gl `Model`, `CubeGeometry`, and `AnimationLoop` classes.
In this tutorial, we'll pull together several of the techniques we've looked at in the previous tutorials (and add a few new ones) to render a more complex scene: a rotating 3D cube. We'll use luma.gl's built-in geometry primitives to create a cube mesh and handle 3D math using math.gl.
math.gl can be installed by running `npm i @math.gl/core`.
As always, we'll start with our imports:
```js
import {AnimationLoop, Model, CubeGeometry} from '@luma.gl/engine';
import {clear, setParameters} from '@luma.gl/webgl';
import {Matrix4} from '@math.gl/core';
```
Our shaders are somewhat more involved than what we've seen before:
```js
const vs = `\
attribute vec3 positions;
attribute vec2 texCoords;

uniform mat4 uMVP;

varying vec2 vUV;

void main(void) {
  gl_Position = uMVP * vec4(positions, 1.0);
  vUV = texCoords;
}
`;

const fs = `\
precision highp float;

uniform sampler2D uTexture;

varying vec2 vUV;

void main(void) {
  gl_FragColor = texture2D(uTexture, vec2(vUV.x, 1.0 - vUV.y));
}
`;
```
The two biggest additions to the shaders we've seen before are transforming the positions to rotate our model and create the 3D perspective effect (via the `uMVP` matrix), and sampling a texture to color fragments (via the `texture2D` call).
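To make the `uMVP` math concrete: the matrix is the product of the projection, view, and model transforms, and `gl_Position = uMVP * vec4(positions, 1.0)` applies it to each vertex. Here is a rough, dependency-free sketch of what happens on the GPU, using plain arrays in WebGL's column-major layout; `multiply`, `perspective`, and `transform` are illustrative stand-ins for what math.gl's `Matrix4` does internally, not its actual API:

```javascript
// Column-major 4x4 matrices, as WebGL (and math.gl) store them:
// element (row r, column c) lives at index c * 4 + r.
function multiply(a, b) {
  const out = new Array(16).fill(0);
  for (let c = 0; c < 4; c++) {
    for (let r = 0; r < 4; r++) {
      for (let k = 0; k < 4; k++) {
        out[c * 4 + r] += a[k * 4 + r] * b[c * 4 + k];
      }
    }
  }
  return out;
}

// A standard OpenGL-style perspective projection matrix
// (the same shape of matrix Matrix4.perspective produces).
function perspective({fovy, aspect, near = 0.1, far = 100}) {
  const f = 1 / Math.tan(fovy / 2);
  return [
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (far + near) / (near - far), -1,
    0, 0, (2 * far * near) / (near - far), 0
  ];
}

// What `gl_Position = uMVP * vec4(positions, 1.0)` computes per vertex.
function transform(m, [x, y, z, w]) {
  return [0, 1, 2, 3].map(
    r => m[r] * x + m[4 + r] * y + m[8 + r] * z + m[12 + r] * w
  );
}
```

For example, a vertex 5 units in front of the eye (at z = -5 in eye space) ends up with a clip-space `w` of 5 after projection, which is what drives the perspective divide and makes distant geometry smaller.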
The setup to render in 3D involves a few extra steps compared to the triangles we've been drawing so far:
```js
onInitialize({device}) {
  setParameters(device, {
    depthTest: true
  });

  const texture = device.createTexture({
    data: 'vis-logo.png'
  });

  const eyePosition = [0, 0, 5];
  const viewMatrix = new Matrix4().lookAt({eye: eyePosition});
  const mvpMatrix = new Matrix4();

  const model = new Model(device, {
    vs,
    fs,
    geometry: new CubeGeometry(),
    uniforms: {
      uTexture: texture
    }
  });

  // Return the objects we'll need in onRender.
  return {model, viewMatrix, mvpMatrix};
}
```
Some of the new techniques we're leveraging here are:

- Using `setParameters` to set up depth testing and ensure surfaces occlude each other properly. Compared to setting these parameters directly, the `setParameters` function has the advantage of tracking state and preventing redundant WebGL calls.
- Creating a texture using the `device.createTexture` method. For our purposes, this is as simple as passing a URL to the image location (the image used in this tutorial is the vis.gl logo, but any JPEG or PNG image will do).
- Creating view and MVP matrices using math.gl's `Matrix4` class to store the matrices we'll pass to our shaders to perform the animation and perspective projection.
- Generating attribute data using the `CubeGeometry` class and passing it to our `Model` using the `geometry` property. The geometry will automatically feed vertex position data into the `positions` attribute and texture coordinates (or UV coordinates) into the `texCoords` attribute.
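To picture what the geometry feeds into those attributes, here is roughly what one face of the cube looks like as raw data. The values below are illustrative, not `CubeGeometry`'s actual vertex ordering: each vertex carries a 3-component position and a 2-component UV, and indices stitch the four corners into two triangles:

```javascript
// One face of a cube (the +z face), as the kind of data CubeGeometry
// supplies to the `positions` and `texCoords` attributes.
// The exact ordering and winding here is hypothetical.
const face = {
  positions: [
    -1, -1, 1, // bottom-left
     1, -1, 1, // bottom-right
     1,  1, 1, // top-right
    -1,  1, 1  // top-left
  ],
  texCoords: [
    0, 0,
    1, 0,
    1, 1,
    0, 1
  ],
  // Two triangles per face; six faces would need six such sets.
  indices: [0, 1, 2, 0, 2, 3]
};

const vertexCount = face.positions.length / 3; // 4 corners
const uvCount = face.texCoords.length / 2;     // one UV pair per corner
```

The key point is that the counts line up: one UV pair per position, and the index list references corners rather than duplicating them.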
Our `onRender` is similar to what we've seen before, with the extra step of setting up the transform matrix and passing it as a uniform to the `Model`:
```js
onRender({device, aspect, tick, model, mvpMatrix, viewMatrix}) {
  mvpMatrix
    .perspective({fovy: Math.PI / 3, aspect})
    .multiplyRight(viewMatrix)
    .rotateX(tick * 0.01)
    .rotateY(tick * 0.013);

  clear(device, {color: [0, 0, 0, 1]});
  model.setUniforms({uMVP: mvpMatrix}).draw();
}
```
We use `Matrix4`'s matrix operations to create our final transformation matrix, taking advantage of a few additional parameters that are passed to the `onRender` method:

- `aspect` is the aspect ratio of the canvas and is used to set up the perspective projection.
- `tick` is simply a counter that increments each frame. We use it to drive the rotation animation.
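Since `tick` increments once per frame, the rotation angles grow linearly with the frame count, and using two slightly different multipliers for X and Y keeps the cube tumbling instead of spinning around a single axis. A small sketch of the accumulation (`rotationAngles` is just an illustrative helper, not part of luma.gl):

```javascript
// Per-frame rotation angles (in radians), as derived from the frame counter
// in onRender: rotateX(tick * 0.01) and rotateY(tick * 0.013).
function rotationAngles(tick) {
  return {
    rotX: tick * 0.01,  // roughly 0.57 degrees per frame around X
    rotY: tick * 0.013  // roughly 0.74 degrees per frame around Y
  };
}
```

At 60 frames per second, a full revolution around X takes about 2π / 0.01 ≈ 628 frames, or roughly 10.5 seconds. One caveat of tick-based animation is that the rotation speed depends on the frame rate; time-based animation would use elapsed milliseconds instead.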
If all went well, you should see a rotating cube with the vis.gl logo painted on each side. The full source code is listed below for reference:
```js
import {AnimationLoop, Model, CubeGeometry} from '@luma.gl/engine';
import {clear, setParameters} from '@luma.gl/webgl';
import {Matrix4} from '@math.gl/core';

const vs = `\
attribute vec3 positions;
attribute vec2 texCoords;

uniform mat4 uMVP;

varying vec2 vUV;

void main(void) {
  gl_Position = uMVP * vec4(positions, 1.0);
  vUV = texCoords;
}
`;

const fs = `\
precision highp float;

uniform sampler2D uTexture;

varying vec2 vUV;

void main(void) {
  gl_FragColor = texture2D(uTexture, vec2(vUV.x, 1.0 - vUV.y));
}
`;

const loop = new AnimationLoop({
  onInitialize({device}) {
    setParameters(device, {
      depthTest: true
    });

    const texture = device.createTexture({data: 'vis-logo.png'});

    const eyePosition = [0, 0, 5];
    const viewMatrix = new Matrix4().lookAt({eye: eyePosition});
    const mvpMatrix = new Matrix4();

    const model = new Model(device, {
      vs,
      fs,
      geometry: new CubeGeometry(),
      uniforms: {
        uTexture: texture
      }
    });

    return {model, viewMatrix, mvpMatrix};
  },

  onRender({device, aspect, tick, model, mvpMatrix, viewMatrix}) {
    mvpMatrix
      .perspective({fovy: Math.PI / 3, aspect})
      .multiplyRight(viewMatrix)
      .rotateX(tick * 0.01)
      .rotateY(tick * 0.013);

    clear(device, {color: [0, 0, 0, 1]});
    model.setUniforms({uMVP: mvpMatrix}).draw();
  }
});

loop.start();
```