In the previous post, we explored the WebGL rendering pipeline, understanding how data flows from raw vertices through various stages to produce the final image on the screen. Now, it’s time to put that knowledge into practice. In this post, we’ll set up a WebGL context, create simple shaders, and render a point on the canvas.
To use WebGL, we need an HTML canvas element for rendering. Here’s a simple HTML template to get started:
<!DOCTYPE html>
<html>
  <body>
    <canvas id="example-canvas"></canvas>
    <script src="index.js"></script>
  </body>
</html>
Save this as index.html. Next, we’ll write index.js.
Our first task in JavaScript is to access the canvas and obtain the WebGL rendering context:
const canvas = document.getElementById("example-canvas");
const gl = canvas.getContext("webgl");
canvas.getContext("webgl") gives us the WebGLRenderingContext object gl, which we’ll use for all WebGL operations.
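Note that getContext returns null when the browser doesn’t support WebGL, so it’s worth guarding against that before going further. Here’s a small sketch of such a guard (the helper name is our own, and the fallback to "experimental-webgl" only matters for some older browsers):

```javascript
// Try to get a WebGL context, falling back to the older
// "experimental-webgl" name used by some legacy browsers.
// Throws if the browser supports neither.
function getWebGLContext(canvas) {
  const gl =
    canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
  if (!gl) {
    throw new Error("WebGL is not supported in this browser");
  }
  return gl;
}
```

With this helper, the setup above becomes `const gl = getWebGLContext(canvas);`.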
Building on our previous discussion about the rendering pipeline, we will now focus on shaders. Shaders are small programs written in GLSL (OpenGL Shading Language), which has a syntax similar to C. These programs execute on the GPU to control the programmable stages of the rendering process. For this example we’ll create two:
Vertex Shader: Processes each vertex.
void main() {
  gl_Position = vec4(0.0, 0.0, 0.0, 1.0); // Point at the center
  gl_PointSize = 100.0; // 100x100 pixel point
}
gl_Position is the vertex’s position, and gl_PointSize sets the point’s size.
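gl_Position is given in clip coordinates, where x and y each run from -1 to +1 across the canvas, so (0.0, 0.0) is the center. As a rough sketch of that mapping, here is a hypothetical helper (not part of WebGL itself) that converts a pixel coordinate to clip space:

```javascript
// Hypothetical helper: convert a pixel coordinate (origin at the
// top-left of the canvas, y growing downward) into WebGL clip-space
// coordinates, where (-1, -1) is the bottom-left and (1, 1) the
// top-right of the canvas.
function pixelToClipSpace(x, y, width, height) {
  return {
    x: (x / width) * 2 - 1,
    y: -((y / height) * 2 - 1), // flip y: clip-space y grows upward
  };
}
```

For the default 300x150 canvas, the pixel (150, 75) maps to clip coordinates (0, 0), which is exactly where our shader places the point.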
Fragment Shader: Processes each fragment. A fragment is a candidate pixel produced by the rasterization stage.
void main() {
  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // Red color
}
gl_FragColor sets the color of each fragment. Here, it’s red.
Now, let’s compile these shaders:
// Vertex shader: defines the point's position and size
const vertexShaderSource = `
  void main() {
    gl_Position = vec4(0.0, 0.0, 0.0, 1.0); // Point at the center
    gl_PointSize = 100.0; // 100x100 pixel point
  }
`;
// Fragment shader: sets the point color to red
const fragmentShaderSource = `
  void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // Red color
  }
`;
// Create and compile the vertex shader
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertexShaderSource);
gl.compileShader(vertexShader);
// Create and compile the fragment shader
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragmentShaderSource);
gl.compileShader(fragmentShader);
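One thing to be aware of: gl.compileShader doesn’t throw on a GLSL error; compilation just silently fails. During development it helps to check gl.COMPILE_STATUS and read the error log. Here’s a sketch of a helper that does this (the function name is our own):

```javascript
// Compile a shader of the given type, throwing with the GLSL error
// log if compilation fails (compileShader itself reports nothing).
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error("Shader compile error: " + log);
  }
  return shader;
}
```

With it, the two compile blocks above collapse to `compileShader(gl, gl.VERTEX_SHADER, vertexShaderSource)` and `compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource)`.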
To make the pipeline run, we combine our shaders into a program:
// Create a program, attach shaders, and link them
const program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
gl.useProgram(program);
This links our vertex and fragment shaders, preparing the GPU to process data through the pipeline.
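Like compilation, linking fails silently, so it’s worth checking gl.LINK_STATUS as well. A sketch of a helper in the same spirit as the compile check (again, the name is our own):

```javascript
// Link a vertex and fragment shader into a program, throwing with
// the info log if linking fails (linkProgram is also silent).
function createProgram(gl, vertexShader, fragmentShader) {
  const program = gl.createProgram();
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    const log = gl.getProgramInfoLog(program);
    gl.deleteProgram(program);
    throw new Error("Program link error: " + log);
  }
  return program;
}
```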
Now, let’s provide the “raw materials” (vertex data) and trigger the pipeline, as we discussed in the previous post. Since our vertex shader hardcodes the point’s position, we don’t need to supply vertex data manually yet; we just set up the canvas and draw:
// Clear the canvas with a white background
gl.clearColor(1.0, 1.0, 1.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
// Draw the point
gl.drawArrays(gl.POINTS, 0, 1); // Draw one point
gl.clearColor() and gl.clear() prep the framebuffer with a white background.
gl.drawArrays(gl.POINTS, 0, 1) kicks off the pipeline, telling the primitive assembly stage to treat our vertex as a point (remember gl.TRIANGLES or gl.LINES from last time?). The second and third arguments say to start at vertex 0 and process one vertex.
The pipeline then flows: the vertex shader positions the point, primitive assembly defines it as a point, rasterization turns it into fragments, and the fragment shader colors them red. The result is a red 100x100 pixel point in the center of the canvas.
Here’s the complete index.js, showing how every line ties to the pipeline:
const canvas = document.getElementById("example-canvas");
const gl = canvas.getContext("webgl");
// Vertex shader: defines the point's position and size
const vertexShaderSource = `
  void main() {
    gl_Position = vec4(0.0, 0.0, 0.0, 1.0); // Point at the center
    gl_PointSize = 100.0; // 100x100 pixel point
  }
`;
// Fragment shader: sets the point color to red
const fragmentShaderSource = `
  void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // Red color
  }
`;
// Create and compile the vertex shader
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertexShaderSource);
gl.compileShader(vertexShader);
// Create and compile the fragment shader
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragmentShaderSource);
gl.compileShader(fragmentShader);
// Create a program, attach shaders, and link them
const program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
gl.useProgram(program);
// Clear the canvas and draw the point
gl.clearColor(1.0, 1.0, 1.0, 1.0); // White background
gl.clear(gl.COLOR_BUFFER_BIT); // Clear the canvas
gl.drawArrays(gl.POINTS, 0, 1); // Draw one point
Save this, open index.html in a browser, and watch the pipeline render a red point.
Notice how this ties back to last time: this basic setup mirrors the pipeline foundation we explored in the previous post. We skipped extra vertex data and depth testing for simplicity, but we’ll build on both next time.
For now, tweak the shaders: move the point (gl_Position), resize it (gl_PointSize), or recolor it (gl_FragColor), and watch the pipeline respond.
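For example, shifting the point into the upper-right quadrant and recoloring it green takes only a couple of edited lines. The specific values below are just one possibility; swap these sources into index.js in place of the originals:

```javascript
// A tweaked vertex shader: move the point halfway toward the
// top-right corner of clip space and shrink it to 50x50 pixels.
const movedVertexShaderSource = `
  void main() {
    gl_Position = vec4(0.5, 0.5, 0.0, 1.0); // Upper-right quadrant
    gl_PointSize = 50.0; // 50x50 pixel point
  }
`;
// A tweaked fragment shader: recolor the point green.
const greenFragmentShaderSource = `
  void main() {
    gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // Green color
  }
`;
```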