Rendering Guide
This guide covers practical patterns for rendering with the @web-engine-dev/renderer package. For conceptual background, see Core Concepts: Rendering. For full API details, see the Renderer package documentation.
Setting Up the Renderer
Creating a Device
The renderer uses a unified GpuDevice interface with a WebGPU runtime backend. Use createDevice() with preferredBackend: 'auto' (kept for API compatibility):
import { createDevice } from '@web-engine-dev/renderer';
const canvas = document.querySelector('canvas')!;
const { device, backend } = await createDevice({
canvas,
preferredBackend: 'auto',
powerPreference: 'high-performance',
});
console.log(`Using ${backend} backend`); // always 'webgpu'
The returned backend is always 'webgpu'. If WebGPU is unavailable, device creation throws.
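If you prefer to show users a message instead of letting the exception propagate, wrap device creation in a try/catch. A minimal sketch:
import { createDevice } from '@web-engine-dev/renderer';
try {
  const { device } = await createDevice({ canvas, preferredBackend: 'auto' });
  // ... continue renderer setup with the device
} catch (error) {
  // createDevice throws when the browser has no WebGPU support
  document.body.textContent = 'This application requires a WebGPU-capable browser.';
  throw error;
}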
WebGPU-First Policy
All engine code must use preferredBackend: 'auto'. The renderer runtime is WebGPU-only.
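Because the runtime is WebGPU-only, it can also be useful to check for support up front with standard WebGPU feature detection before creating a device:
// Optional up-front check using the standard navigator.gpu entry point
const webgpuSupported = 'gpu' in navigator;
if (!webgpuSupported) {
  // Show fallback UI here instead of attempting device creation
  console.warn('WebGPU is not available in this browser.');
}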
Creating a Forward Renderer
The ForwardRenderer handles the main rendering pass:
import { ForwardRenderer } from '@web-engine-dev/renderer';
const renderer = new ForwardRenderer(device, {
clearColor: { r: 0.1, g: 0.1, b: 0.15, a: 1.0 },
msaa: 4,
});
Standard Render Pipeline
For a production-ready pipeline with forward rendering and post-processing, use createStandardRenderPipeline:
import { createStandardRenderPipeline } from '@web-engine-dev/renderer';
const { renderer, postProcess, effects } = createStandardRenderPipeline(device, {
quality: 'high',
antialiasing: 'fxaa',
shadows: true,
postProcessing: {
bloom: true,
bloomIntensity: 0.5,
ssao: true,
tonemapOperator: 'aces',
},
});
Quality presets ('low', 'medium', 'high', 'ultra') provide sensible defaults for all sub-settings, which you can override individually.
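For example, you can start from a preset and override only the settings you care about. A small sketch using the same options shown above:
import { createStandardRenderPipeline } from '@web-engine-dev/renderer';
// 'medium' preset as the baseline, with shadows forced on and bloom toned down
const { renderer, postProcess } = createStandardRenderPipeline(device, {
  quality: 'medium',
  shadows: true,
  postProcessing: {
    bloomIntensity: 0.25,
  },
});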
Creating Cameras
Standalone Camera
The Camera class handles view and projection matrices with automatic frustum extraction:
import { Camera, ProjectionType, Vec3 } from '@web-engine-dev/renderer';
// Perspective camera
const camera = new Camera({
position: new Vec3(0, 5, 10),
projectionType: ProjectionType.Perspective,
perspective: {
fov: Math.PI / 4, // 45 degrees
aspect: canvas.width / canvas.height,
near: 0.1,
far: 1000,
},
});
camera.lookAt(Vec3.zero());
// Access derived matrices
const viewProj = camera.viewProjectionMatrix;
const frustum = camera.frustum; // For frustum culling
// Coordinate transformations
const screenPos = camera.worldToScreen(worldPoint, canvasWidth, canvasHeight);
const ray = camera.screenToRay(mouseX, mouseY, canvasWidth, canvasHeight);
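screenToRay is convenient for mouse picking. A minimal sketch; intersecting the ray with your scene is left to application code:
// Convert a click into a world-space ray
canvas.addEventListener('click', (event) => {
  const ray = camera.screenToRay(
    event.offsetX,
    event.offsetY,
    canvas.clientWidth,
    canvas.clientHeight,
  );
  // Test `ray` against your scene's bounding volumes here
});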
// Orthographic camera
const orthoCamera = new Camera({
position: new Vec3(0, 10, 0),
projectionType: ProjectionType.Orthographic,
orthographic: {
left: -10,
right: 10,
bottom: -10,
top: 10,
near: 0.1,
far: 100,
},
});
ECS Camera Component
When using the ECS rendering integration, spawn camera entities with spawnCamera:
import {
spawnCamera,
CameraComponent,
ActiveCameraEntity,
ProjectionTypeValue,
} from '@web-engine-dev/renderer';
const cameraEntity = spawnCamera(world, {
position: { x: 0, y: 5, z: 10 },
fov: 60, // degrees
near: 0.1,
far: 1000,
active: true, // sets as the active camera
});
Materials
Materials define the visual appearance of meshes through shader properties, textures, and blend modes.
PBR Standard Material
The primary material follows the metallic-roughness PBR workflow, aligned with glTF 2.0:
import {
Material,
PBRMaterialDefinition,
BlendMode,
Color,
} from '@web-engine-dev/renderer';
const material = new Material(device, PBRMaterialDefinition);
material.setProperty('baseColor', new Color(1.0, 0.2, 0.2, 1.0));
material.setProperty('metallic', 0.8);
material.setProperty('roughness', 0.2);
material.setProperty('emissive', new Color(0.0, 0.0, 0.0, 1.0));
material.blendMode = BlendMode.Opaque;
Setting Textures
// Load and assign textures
material.setTexture('baseColorTexture', albedoTexture);
material.setTexture('normalTexture', normalMap);
material.setTexture('metallicRoughnessTexture', ormTexture);
material.setTexture('occlusionTexture', aoTexture);
material.setTexture('emissiveTexture', emissiveTexture);
PBR Advanced Material
For extended PBR features like clearcoat, transmission, sheen, and iridescence:
import { PBRAdvancedMaterialDefinition } from '@web-engine-dev/renderer';
const glassMaterial = new Material(device, PBRAdvancedMaterialDefinition);
glassMaterial.setProperty('baseColor', new Color(1.0, 1.0, 1.0, 1.0));
glassMaterial.setProperty('metallic', 0.0);
glassMaterial.setProperty('roughness', 0.0);
glassMaterial.setProperty('transmission', 1.0); // Full glass transmission
glassMaterial.setProperty('ior', 1.5); // Index of refraction
glassMaterial.blendMode = BlendMode.AlphaBlend;
Unlit Material
For UI elements, debug visualization, or stylized rendering:
import { UnlitMaterialDefinition } from '@web-engine-dev/renderer';
const unlitMaterial = new Material(device, UnlitMaterialDefinition);
unlitMaterial.setProperty('baseColor', new Color(0.0, 1.0, 0.0, 1.0));
unlitMaterial.setTexture('baseColorTexture', spriteTexture);
Blend Modes
Materials support multiple blend modes that control depth writing and render queue sorting:
| Blend Mode | Behavior |
|---|---|
| BlendMode.Opaque | Full depth write, front-to-back sorting (default) |
| BlendMode.AlphaTest | Depth write with alpha cutoff, alpha-to-coverage with MSAA |
| BlendMode.AlphaBlend | Transparent, back-to-front sorting, no depth write |
| BlendMode.Additive | Additive blending for effects like fire or glow |
| BlendMode.Multiply | Multiplicative blending |
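Additive blending is typically paired with unlit or emissive materials for glow-style effects; a small sketch using the material APIs shown above (the color values are arbitrary):
import { Material, UnlitMaterialDefinition, BlendMode, Color } from '@web-engine-dev/renderer';
// Additive glow material for effects like fire or energy beams
const glowMaterial = new Material(device, UnlitMaterialDefinition);
glowMaterial.setProperty('baseColor', new Color(1.0, 0.6, 0.1, 1.0));
glowMaterial.blendMode = BlendMode.Additive;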
// Foliage with alpha cutoff
const foliageMaterial = new Material(device, PBRMaterialDefinition);
foliageMaterial.blendMode = BlendMode.AlphaTest;
foliageMaterial.setProperty('alphaCutoff', 0.5);
// Double-sided rendering (no backface culling)
foliageMaterial.doubleSided = true;
Updating Materials
After changing properties, update the GPU uniform buffer before rendering:
material.setProperty('time', elapsed);
material.updateUniforms();
Geometry
Built-in Primitives
Create meshes from built-in geometric primitives:
import {
Mesh,
createBox,
createSphere,
createPlane,
createCylinder,
createCone,
createTorus,
} from '@web-engine-dev/renderer';
const boxGeometry = createBox({ width: 2, height: 1, depth: 1 });
const sphereGeometry = createSphere({ radius: 1, segments: 32 });
const planeGeometry = createPlane({ width: 10, height: 10, widthSegments: 1, heightSegments: 1 });
const cylinderGeometry = createCylinder({ radius: 0.5, height: 2, segments: 16 });
const coneGeometry = createCone({ radius: 0.5, height: 2, segments: 16 });
const torusGeometry = createTorus({ radius: 1, tube: 0.3, radialSegments: 16, tubularSegments: 32 });
// Create GPU mesh from geometry data
const boxMesh = Mesh.fromGeometry(device, boxGeometry);
const sphereMesh = Mesh.fromGeometry(device, sphereGeometry);
ECS Primitive Helpers
When using ECS integration, use spawnPrimitive for quick entity creation:
import { spawnPrimitive, createTransform } from '@web-engine-dev/renderer';
const cube = spawnPrimitive(world, device, {
type: 'box',
transform: createTransform(
{ x: 0, y: 1, z: 0 }, // position
{ x: 0, y: 0, z: 0 }, // rotation (Euler radians)
{ x: 1, y: 1, z: 1 }, // scale
),
});
Instanced Rendering
For drawing many copies of the same mesh efficiently:
import { InstancedMesh } from '@web-engine-dev/renderer';
const instancedMesh = new InstancedMesh(device, boxMesh, 1000); // capacity for up to 1000 instances
// worldMatrices: per-instance world transform matrices prepared by your code
instancedMesh.updateInstances(device, {
transforms: worldMatrices,
count: 500, // number of active instances this frame
});
Lighting
Directional Lights
Directional lights simulate distant light sources like the sun:
import { DirectionalLight, Vec3, Color } from '@web-engine-dev/renderer';
const sun = new DirectionalLight(
new Vec3(0.5, -1.0, 0.5), // direction
new Color(1.0, 0.95, 0.9, 1.0), // warm white color
1.5, // intensity
);
sun.castShadows = true;
sun.shadowCascades = 4;
Point Lights
Point lights emit light in all directions from a position:
import { PointLight } from '@web-engine-dev/renderer';
const torch = new PointLight(
new Vec3(0, 2, 0), // position
new Color(1.0, 0.8, 0.5, 1.0), // warm orange
2.0, // intensity
10.0, // range
);
torch.castShadows = true;
Spot Lights
Spot lights emit a cone of light:
import { SpotLight } from '@web-engine-dev/renderer';
const spotlight = new SpotLight(
new Vec3(0, 5, 0), // position
new Vec3(0, -1, 0), // direction
Math.PI / 6, // inner cone angle
Math.PI / 4, // outer cone angle
new Color(1.0, 1.0, 1.0, 1.0), // white
3.0, // intensity
);
Light Manager
The LightManager collects lights and uploads their data to the GPU:
import { LightManager } from '@web-engine-dev/renderer';
const lightManager = new LightManager(device);
lightManager.addLight(sun);
lightManager.addLight(torch);
lightManager.addLight(spotlight);
// Upload to GPU (only re-uploads when dirty)
lightManager.updateBuffer();
ECS Light Entities
When using ECS integration, spawn light entities directly:
import { spawnDirectionalLight, spawnPointLight } from '@web-engine-dev/renderer';
const sunEntity = spawnDirectionalLight(world, {
direction: { x: 0.5, y: -1.0, z: 0.5 },
color: { r: 1.0, g: 0.95, b: 0.9 },
intensity: 1.5,
castShadow: true,
cascadeCount: 4,
});
const torchEntity = spawnPointLight(world, {
position: { x: 0, y: 2, z: 0 },
color: { r: 1.0, g: 0.8, b: 0.5 },
intensity: 2.0,
range: 10.0,
});
Image-Based Lighting (IBL)
For realistic environment reflections and ambient lighting, set up IBL from an HDR environment map:
import {
EquirectToCubemapGenerator,
IBLManager,
HDRLoader,
TextureLoader,
} from '@web-engine-dev/renderer';
// Load HDR environment map
const hdrData = await HDRLoader.load('environment.hdr');
const hdrTexture = TextureLoader.createFromHDR(device, hdrData);
// Convert equirectangular to cubemap
const equirectGen = new EquirectToCubemapGenerator(device);
const cubemap = equirectGen.generate(hdrTexture, 512);
// Create IBL manager (generates irradiance, prefiltered, BRDF LUT)
const iblManager = new IBLManager(device);
await iblManager.setupFromCubemap(cubemap);
Shadows
Cascaded Shadow Maps (Directional Lights)
Directional lights use cascaded shadow maps (CSM) to provide high-quality shadows across large distances:
import { ShadowPass, ShadowAtlas, CascadeShadowCalculator } from '@web-engine-dev/renderer';
const shadowAtlas = new ShadowAtlas(device, { size: 4096 });
const shadowPass = new ShadowPass(device, {
cascadeCount: 4,
shadowMapSize: 2048,
shadowBias: 0.005,
});
Shadow Quality
Fine-tune shadow quality with filter modes and bias settings:
import { ShadowFilterMode, ShadowQuality } from '@web-engine-dev/renderer';
// Higher quality shadow filtering
shadowPass.filterMode = ShadowFilterMode.PCF;
shadowPass.shadowBias = 0.002;
Alpha-Test Shadows
Objects with alpha-test materials (foliage, fences) automatically cast cutout shadows using a dedicated shadow shader that samples the base color texture. Transparent (AlphaBlend) materials do not cast shadows by default, but this can be overridden:
import { createMaterialRenderable } from '@web-engine-dev/renderer';
const renderable = createMaterialRenderable({
mesh: foliageMesh,
material: foliageMaterial, // BlendMode.AlphaTest
worldMatrix: transform,
// Alpha-test shadows are automatic for AlphaTest materials
});
Post-Processing
The renderer includes a pipeline of screen-space effects applied after the main rendering pass.
Setting Up the Pipeline
import {
PostProcessPipeline,
BloomEffect,
TonemappingEffect,
FXAAEffect,
SSAOEffect,
DepthOfFieldEffect,
ColorGradingEffect,
VignetteEffect,
TonemapOperator,
} from '@web-engine-dev/renderer';
const postProcess = new PostProcessPipeline(device, {
hdrEnabled: true,
effects: [
new SSAOEffect(device),
new BloomEffect(device),
new TonemappingEffect(device),
new FXAAEffect(device),
],
});
Configuring Effects
Each effect can be configured individually:
// Bloom
const bloom = new BloomEffect(device);
bloom.threshold = 1.0;
bloom.intensity = 0.5;
bloom.radius = 0.85;
// Tonemapping
const tonemap = new TonemappingEffect(device);
tonemap.operator = TonemapOperator.ACES;
tonemap.exposure = 1.0;
// SSAO
const ssao = new SSAOEffect(device);
ssao.radius = 0.5;
ssao.intensity = 1.0;
ssao.bias = 0.025;
// Depth of Field
const dof = new DepthOfFieldEffect(device);
dof.focusDistance = 5.0;
dof.focalLength = 50.0;
dof.fStop = 2.8;
Available Effects
| Effect | Description |
|---|---|
| BloomEffect | Glow from bright areas (threshold, intensity, radius) |
| TonemappingEffect | HDR to LDR conversion (Reinhard, ACES, KhronosPbrNeutral) |
| FXAAEffect | Fast approximate anti-aliasing |
| TAAEffect | Temporal anti-aliasing with jitter |
| SSAOEffect | Screen-space ambient occlusion |
| SSGIEffect | Screen-space global illumination |
| DepthOfFieldEffect | Bokeh-based depth blur |
| ColorGradingEffect | LUT-based color correction |
| VignetteEffect | Darkened screen edges |
| CASEffect | Contrast adaptive sharpening |
| FSREffect | AMD FidelityFX Super Resolution |
| SSREffect | Screen-space reflections |
| MotionBlurEffect | Per-object velocity-based blur |
| ChromaticAberrationEffect | RGB channel separation |
| FilmGrainEffect | Film-like noise overlay |
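These effects follow the same pattern as the examples above: construct them with the device and add them to the pipeline's effects array (assumed here to run in the order listed). A sketch of a more stylized chain; per-effect settings are omitted:
import {
  PostProcessPipeline,
  SSAOEffect,
  BloomEffect,
  TonemappingEffect,
  ColorGradingEffect,
  VignetteEffect,
  FilmGrainEffect,
  FXAAEffect,
} from '@web-engine-dev/renderer';
const stylizedPipeline = new PostProcessPipeline(device, {
  hdrEnabled: true,
  effects: [
    new SSAOEffect(device),
    new BloomEffect(device),
    new TonemappingEffect(device),
    new ColorGradingEffect(device),
    new VignetteEffect(device),
    new FilmGrainEffect(device),
    new FXAAEffect(device),
  ],
});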
Loading 3D Models
glTF Loading
Load 3D models in glTF/GLB format using @web-engine-dev/gltf:
import { GLTFLoader } from '@web-engine-dev/gltf';
const loader = new GLTFLoader(device);
const gltf = await loader.load('model.glb');
// Access loaded data
const meshes = gltf.meshes; // GPU meshes
const materials = gltf.materials; // PBR materials
const scene = gltf.scenes[0]; // Scene hierarchy
The glTF loader handles:
- Meshes with multiple primitives
- PBR metallic-roughness materials (all standard textures)
- Skeleton and skinning data
- Morph targets (blend shapes)
- Animation clips
- Scene hierarchy (parent-child nodes)
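If you are using the ECS integration described later in this guide, one straightforward approach is to register the loaded meshes and materials and spawn entities from them. A rough sketch; the index-based mesh/material pairing and default transform are simplifications, and a real importer should walk gltf.scenes[0] to respect each node's material assignment and transform:
import {
  spawnMesh,
  createTransform,
  MeshRegistryResource,
  MaterialRegistryResource,
} from '@web-engine-dev/renderer';
const meshRegistry = world.getResource(MeshRegistryResource);
const materialRegistry = world.getResource(MaterialRegistryResource);
gltf.meshes.forEach((mesh, index) => {
  const meshId = meshRegistry.register(mesh);
  const materialId = materialRegistry.register(gltf.materials[index] ?? gltf.materials[0]);
  spawnMesh(world, {
    meshId,
    materialId,
    transform: createTransform({ x: 0, y: 0, z: 0 }),
  });
});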
Texture Loading
The glTF loader uploads texture data directly to the GPU via createImageBitmap() with premultiplyAlpha: 'none', preserving straight alpha values. This avoids the Canvas 2D premultiplication issue that can cause black rectangles in opaque regions of alpha-tested textures.
ECS Rendering Integration
The recommended way to use the renderer in a game is through ECS components and systems. This provides automatic transform propagation, frustum culling, and render submission.
Core Rendering Components
import {
Transform3D, // Position, rotation, scale (SoA layout)
LocalToWorld, // Computed 4x4 world matrix (auto-updated)
CameraComponent, // Camera projection settings
DirectionalLightComponent, // Directional light parameters
PointLightComponent, // Point light parameters
SpotLightComponent, // Spot light parameters
MeshHandle, // Reference to mesh in MeshRegistry
RenderFlags, // Visibility, shadow flags, render layer
} from '@web-engine-dev/renderer';
Transform3D stores position (px, py, pz), rotation as a quaternion (rx, ry, rz, rw), and scale (sx, sy, sz). The LocalToWorld component holds the computed 4x4 world matrix and is automatically updated by TransformPropagationSystem.
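You normally do not populate these fields by hand. The createTransform helper used throughout this guide takes position, Euler rotation, and scale and builds the transform data the spawn helpers expect; given the layout above, the Euler input is presumably converted to the quaternion fields internally:
import { createTransform } from '@web-engine-dev/renderer';
// position, Euler rotation in radians, scale
const transform = createTransform(
  { x: 0, y: 1, z: 0 },
  { x: 0, y: Math.PI / 2, z: 0 },
  { x: 1, y: 1, z: 1 },
);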
Setting Up Registries
GPU resources are stored in registries, which are ECS resources:
import {
MeshRegistryResource,
MaterialRegistryResource,
TextureRegistryResource,
RenderDeviceResource,
RendererResource,
createMeshRegistry,
createMaterialRegistry,
createTextureRegistry,
} from '@web-engine-dev/renderer';
// Insert registries as world resources
world.insertResource(MeshRegistryResource, createMeshRegistry());
world.insertResource(MaterialRegistryResource, createMaterialRegistry());
world.insertResource(TextureRegistryResource, createTextureRegistry());
world.insertResource(RenderDeviceResource, device);
world.insertResource(RendererResource, renderer);
Rendering Systems
The renderer provides systems that run in the ECS scheduler:
import {
TransformPropagationSystem, // Computes LocalToWorld from Transform3D + hierarchy
CameraUpdateSystem, // Updates camera matrices from CameraComponent
BoundsUpdateSystem, // Computes world-space bounding boxes
FrustumCullingSystem, // Sets visibility flags based on frustum
LightCollectionSystem, // Collects lights into LightManager
RenderSubmissionSystem, // Submits renderables to queues
ForwardRenderSystem, // Executes the forward rendering pass
PostProcessRenderSystem, // Applies post-processing effects
} from '@web-engine-dev/renderer';
// Add to scheduler in correct order
scheduler.addSystem(CoreSchedule.PreUpdate, TransformPropagationSystem);
scheduler.addSystem(CoreSchedule.PreUpdate, CameraUpdateSystem);
scheduler.addSystem(CoreSchedule.PostUpdate, BoundsUpdateSystem);
scheduler.addSystem(CoreSchedule.PostUpdate, FrustumCullingSystem);
scheduler.addSystem(CoreSchedule.PostUpdate, LightCollectionSystem);
scheduler.addSystem(CoreSchedule.PostUpdate, ForwardRenderSystem);
Using the RenderPlugin
The RenderPlugin from @web-engine-dev/engine handles all system registration automatically:
import { createRenderPlugin, RenderPlugin } from '@web-engine-dev/engine';
// Use the default RenderPlugin
engine.addPlugin(RenderPlugin);
// Or customize which systems are enabled
engine.addPlugin(createRenderPlugin({
enableCulling: true,
enableLights: true,
enableForwardRender: true,
enableSystems: true, // false = registries-only mode
}));
The RenderPlugin registers:
- MeshRegistryResource, MaterialRegistryResource, TextureRegistryResource
- TransformPropagationSystem and CameraUpdateSystem in PreUpdate
- FrustumCullingSystem and LightCollectionSystem in PostUpdate
- ForwardRenderSystem in PostUpdate
Registries-Only Mode
Set enableSystems: false to get only the resource registries without any systems. This is useful when you want to manage rendering manually (e.g., in demos that call systems directly) while still using the registry infrastructure.
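A minimal sketch of registries-only mode; the engine and world variables mirror the earlier examples:
import { createRenderPlugin } from '@web-engine-dev/engine';
import { MeshRegistryResource, MaterialRegistryResource } from '@web-engine-dev/renderer';
// Registries only: no transform, culling, light, or render systems are registered
engine.addPlugin(createRenderPlugin({ enableSystems: false }));
// The registries are still available as world resources for manual rendering code
const meshRegistry = world.getResource(MeshRegistryResource);
const materialRegistry = world.getResource(MaterialRegistryResource);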
Spawning Renderable Entities
Use helper functions to spawn entities with all required rendering components:
import {
spawnMesh,
spawnPrimitive,
spawnCamera,
spawnDirectionalLight,
spawnPointLight,
createTransform,
} from '@web-engine-dev/renderer';
// Register a mesh and material in the registries
const meshRegistry = world.getResource(MeshRegistryResource);
const materialRegistry = world.getResource(MaterialRegistryResource);
const meshId = meshRegistry.register(boxMesh);
const materialId = materialRegistry.register(pbrMaterial);
// Spawn a mesh entity
const cube = spawnMesh(world, {
meshId,
materialId,
transform: createTransform(
{ x: 0, y: 1, z: 0 }, // position
),
castShadow: true,
receiveShadow: true,
});
// Spawn a camera
const camera = spawnCamera(world, {
position: { x: 0, y: 5, z: 10 },
fov: 60,
near: 0.1,
far: 1000,
active: true,
});
// Spawn lights
const sun = spawnDirectionalLight(world, {
direction: { x: 0.5, y: -1.0, z: 0.5 },
color: { r: 1.0, g: 0.95, b: 0.9 },
intensity: 1.5,
castShadow: true,
});
Environment Setup
Configure scene environment settings through ECS resources:
import { setSceneEnvironment, setupFog } from '@web-engine-dev/renderer';
// Set scene environment (ambient light, skybox, IBL)
setSceneEnvironment(world, {
ambientColor: { r: 0.1, g: 0.1, b: 0.15 },
ambientIntensity: 0.3,
});
// Add fog
setupFog(world, {
mode: 'exponential',
color: { r: 0.7, g: 0.8, b: 0.9 },
density: 0.02,
});Resource Lifecycle
Always dispose GPU resources when they are no longer needed:
material.dispose(); // Uniform buffers, bind groups, default textures
mesh.dispose(); // Vertex and index buffers
lightManager.dispose(); // Light buffer
postProcess.dispose(); // All effect resources
Deferred Deletion
When GPU resources are replaced dynamically (e.g., switching environment maps), do not destroy the old resource immediately. Bind groups from the current frame may still reference it. Use deferred deletion (wait 2-3 frames) to let the GPU pipeline drain:
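One way to implement this is a small frame-counted queue that your frame loop flushes once per frame. A minimal sketch; the Disposable shape is an assumption (anything with a dispose() method works):
interface Disposable { dispose(): void; }
const pendingDisposals: { resource: Disposable; framesLeft: number }[] = [];
// Call this instead of disposing a replaced resource immediately
function deferDispose(resource: Disposable, frames = 3): void {
  pendingDisposals.push({ resource, framesLeft: frames });
}
// Call this once per frame, after the frame's GPU work has been submitted
function flushDeferredDisposals(): void {
  for (let i = pendingDisposals.length - 1; i >= 0; i--) {
    if (--pendingDisposals[i].framesLeft <= 0) {
      pendingDisposals[i].resource.dispose();
      pendingDisposals.splice(i, 1);
    }
  }
}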
// A simpler shortcut: dispose after a short delay instead of counting frames
setTimeout(() => oldEnvironmentMap.dispose(), 100);
Capability Checks
Use capability flags to gate optional features:
const { device } = await createDevice({ canvas, preferredBackend: 'auto' });
if (device.capabilities.supportsDeferredRendering) {
// Use GPU-driven rendering for large scenes
}
if (!device.capabilities.supportsTimestampQuery) {
// Disable timestamp-based profiling paths
}
Next Steps
- ECS Guide -- Learn the ECS patterns used by the rendering integration
- Renderer Package Documentation -- Full API reference
- Core Concepts: Rendering -- Conceptual overview