Rendering
The renderer (@web-engine-dev/renderer) provides a complete 3D rendering pipeline with a WebGPU-first API and a WebGPU-only runtime. It integrates with the ECS through data-driven components and systems.
Device Creation
The renderer uses a unified GpuDevice interface. Use createDevice() with preferredBackend: 'auto' (kept for API compatibility):
```ts
import { createDevice } from '@web-engine-dev/renderer';
const { device, backend } = await createDevice({
canvas: document.querySelector('canvas')!,
preferredBackend: 'auto',
powerPreference: 'high-performance',
});
console.log(`Using ${backend} backend`); // always 'webgpu'
```
The device provides GPU resource creation and management:
- `GpuBuffer` -- Vertex, index, uniform, and storage buffers
- `GpuTexture` / `GpuTextureView` -- Texture resources with mipmaps
- `GpuRenderPipeline` / `GpuComputePipeline` -- Shader pipelines
- `GpuBindGroup` / `GpuBindGroupLayout` -- Resource bindings for shaders
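For example, creating and filling a vertex buffer through the unified device might look like the sketch below; the descriptor shape and `writeBuffer` helper are assumptions modeled on WebGPU, not confirmed GpuDevice signatures:

```ts
// Sketch only: descriptor shape and writeBuffer are assumed, modeled on WebGPU.
const vertexData = new Float32Array([0, 0, 0, 1, 0, 0, 0, 1, 0]);
const vertexBuffer = device.createBuffer({
  size: vertexData.byteLength,
  usage: GPUBufferUsage.VERTEX | GPUBufferUsage.COPY_DST,
});
device.writeBuffer(vertexBuffer, 0, vertexData);
```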
WebGPU-First Policy
All engine demos and examples use preferredBackend: 'auto'. The runtime backend is WebGPU-only.
Camera System
Cameras handle view and projection matrices with automatic frustum extraction:
```ts
import { Camera, ProjectionType } from '@web-engine-dev/renderer';
const camera = new Camera({
position: new Vec3(0, 5, 10),
projectionType: ProjectionType.Perspective,
perspective: {
fov: Math.PI / 4,
aspect: canvas.width / canvas.height,
near: 0.1,
far: 1000,
},
});
camera.lookAt(Vec3.zero());
// Access derived data
const viewProj = camera.viewProjectionMatrix;
const frustum = camera.frustum; // For culling
```
Features include perspective and orthographic projections, cached matrices with dirty flags, frustum extraction for culling, and coordinate transformation helpers (`worldToScreen`, `screenToRay`).
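For example, `screenToRay` is the natural building block for mouse picking. This sketch assumes it takes canvas pixel coordinates and returns a ray with `origin` and `direction` fields; the exact signature may differ:

```ts
// Assumed shape: screenToRay(x, y) -> { origin: Vec3, direction: Vec3 }
canvas.addEventListener('pointerdown', (e) => {
  const ray = camera.screenToRay(e.offsetX, e.offsetY);
  // Intersect the ray against scene geometry or a physics query here.
  console.log('picking ray', ray.origin, ray.direction);
});
```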
Materials
Materials define the visual appearance of meshes through shader properties and textures.
PBR Materials
The primary material type follows the metallic-roughness PBR workflow, aligned with the glTF 2.0 specification:
```ts
import { Material, PBRMaterialDefinition, BlendMode } from '@web-engine-dev/renderer';
const material = new Material(device, PBRMaterialDefinition);
material.setProperty('baseColor', new Color(1, 0, 0, 1));
material.setProperty('metallic', 0.8);
material.setProperty('roughness', 0.2);
material.setTexture('baseColorTexture', albedoTexture);
material.blendMode = BlendMode.Opaque;
```
Built-in material definitions:
| Material | Description |
|---|---|
| `PBRMaterialDefinition` | Standard metallic-roughness PBR (base color, metallic, roughness, normal, occlusion, emissive) |
| PBR Advanced | Extended PBR with clearcoat, transmission, sheen, iridescence, and specular extensions |
| `UnlitMaterialDefinition` | Simple unlit rendering (base color + texture only) |
Blend Modes
Materials support multiple blend modes that control how they interact with the depth buffer and render queue:
- `BlendMode.Opaque` -- Full depth write, front-to-back sorting
- `BlendMode.AlphaTest` -- Depth write with alpha cutoff (alpha-to-coverage with MSAA)
- `BlendMode.AlphaBlend` -- Transparent, back-to-front sorting, no depth write
- `BlendMode.Additive` -- Additive blending for effects like fire or glow
- `BlendMode.Multiply` -- Multiplicative blending
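For instance, a glass-like material pairs `BlendMode.AlphaBlend` with a translucent base color, using the same Material API shown above:

```ts
const glass = new Material(device, PBRMaterialDefinition);
glass.setProperty('baseColor', new Color(0.8, 0.9, 1.0, 0.3)); // alpha < 1
glass.setProperty('roughness', 0.05);
glass.blendMode = BlendMode.AlphaBlend; // back-to-front sorted, no depth write
```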
Geometry
Create meshes from built-in primitives or custom geometry data:
```ts
import { Mesh, InstancedMesh, createBox, createSphere } from '@web-engine-dev/renderer';
// Built-in primitives
const boxGeometry = createBox({ width: 2, height: 1, depth: 1 });
const sphereGeometry = createSphere({ radius: 1, segments: 32 });
const mesh = Mesh.fromGeometry(device, boxGeometry);
// Instanced rendering for many copies
const instancedMesh = new InstancedMesh(device, mesh, 1000);
instancedMesh.updateInstances(device, { transforms, count: 100 });
```
Standard vertex layouts:
| Layout | Size per vertex | Contents |
|---|---|---|
| `Position` | 12 bytes | Position only |
| `PositionNormal` | 24 bytes | Position + normals |
| `PositionNormalUV` | 32 bytes | Position + normals + UVs |
| `PositionNormalTangentUV` | 48 bytes | Full PBR (tangents for normal mapping) |
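Custom geometry goes through the same `Mesh.fromGeometry` path. The sketch below assumes a geometry object carrying per-attribute arrays and indices matching the `PositionNormalUV` layout; the field names are illustrative, not confirmed:

```ts
// Field names are assumptions -- match them to the engine's geometry type.
const triangle = {
  layout: 'PositionNormalUV',
  positions: new Float32Array([0, 1, 0, -1, -1, 0, 1, -1, 0]),
  normals: new Float32Array([0, 0, 1, 0, 0, 1, 0, 0, 1]),
  uvs: new Float32Array([0.5, 1, 0, 0, 1, 0]),
  indices: new Uint16Array([0, 1, 2]),
};
const triangleMesh = Mesh.fromGeometry(device, triangle);
```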
Lighting
Three light types are supported, managed through a LightManager that uploads data to the GPU:
```ts
import {
DirectionalLight,
PointLight,
SpotLight,
LightManager,
} from '@web-engine-dev/renderer';
const sun = new DirectionalLight(
new Vec3(0.5, -1, 0.5), // direction
new Color(1, 0.95, 0.9, 1), // color
1.5, // intensity
);
sun.castShadows = true;
sun.shadowCascades = 4;
const lightManager = new LightManager(device);
lightManager.addLight(sun);
lightManager.addLight(new PointLight(new Vec3(0, 2, 0)));
lightManager.addLight(new SpotLight(
new Vec3(0, 5, 0), // position
new Vec3(0, -1, 0), // direction
Math.PI / 6, // inner angle
Math.PI / 4, // outer angle
));
// Upload light data to GPU (only re-uploads when dirty)
lightManager.updateBuffer();
```
Image-Based Lighting (IBL)
For realistic environment reflections and ambient lighting, the renderer provides a full IBL pipeline:
- Equirectangular HDR maps are converted to cubemaps
- Irradiance maps provide diffuse ambient light (cosine-weighted hemisphere sampling)
- Prefiltered environment maps provide specular reflections at varying roughness (GGX importance sampling)
- BRDF LUT encodes the split-sum approximation for real-time IBL
The IBL pipeline includes anti-firefly techniques (K=4 LOD bias, soft HDR compression) aligned with Google Filament's approach, and multi-scatter compensation (Fdez-Aguera 2019).
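End to end, IBL setup follows the four steps above. The helper names in this sketch (`loadHDR`, `iblGenerator`) are hypothetical stand-ins for the package's actual entry points:

```ts
// Hypothetical helpers -- substitute the renderer's real IBL entry points.
const hdr = await loadHDR('environment.hdr'); // equirectangular HDR source
const envCubemap = iblGenerator.equirectToCubemap(hdr);
const irradianceMap = iblGenerator.computeIrradiance(envCubemap); // diffuse term
const prefilteredMap = iblGenerator.prefilterEnvironment(envCubemap); // specular mips
const brdfLut = iblGenerator.computeBrdfLut(); // split-sum lookup table
```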
Shadows
The renderer supports shadow mapping for all three light types.
Cascaded Shadow Maps (Directional Lights)
Directional lights use cascaded shadow maps (CSM) to provide high-quality shadows across large distances:
```ts
import { ShadowPass, ShadowAtlas } from '@web-engine-dev/renderer';
const shadowAtlas = new ShadowAtlas(device, { size: 4096 });
const shadowPass = new ShadowPass(device, {
cascadeCount: 4,
shadowMapSize: 2048,
shadowBias: 0.005,
});
```
Point and Spot Shadows
Point lights use omnidirectional shadow maps (cubemap rendering). Spot lights use single-direction perspective shadow maps.
Alpha-Test Shadows
Objects with alpha-test materials (e.g., foliage, fences) cast cutout shadows using a dedicated shadow shader that samples the base color texture and discards fragments below the alpha threshold. Transparent (alpha-blend) materials do not cast shadows by default.
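Opting a material into cutout shadows means combining `BlendMode.AlphaTest` with a cutoff value; the `alphaCutoff` property name below is an assumption based on the glTF-aligned PBR definition:

```ts
const foliage = new Material(device, PBRMaterialDefinition);
foliage.setTexture('baseColorTexture', leafTexture);
foliage.blendMode = BlendMode.AlphaTest;
foliage.setProperty('alphaCutoff', 0.5); // property name assumed (glTF-style)
```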
Post-Processing
The renderer includes a pipeline of screen-space effects applied after the main rendering pass:
```ts
import {
PostProcessPipeline,
BloomEffect,
TonemappingEffect,
FXAAEffect,
} from '@web-engine-dev/renderer';
const postProcess = new PostProcessPipeline(device, {
hdrEnabled: true,
effects: [
new BloomEffect(device),
new TonemappingEffect(device),
new FXAAEffect(device),
],
});
```
Available effects include:
| Effect | Description |
|---|---|
| Bloom | Glow from bright areas (threshold + intensity control) |
| Tonemapping | HDR to LDR conversion (Reinhard, ACES, and more) |
| FXAA | Fast approximate anti-aliasing |
| TAA | Temporal anti-aliasing with jitter |
| SSAO | Screen-space ambient occlusion |
| SSGI | Screen-space global illumination |
| Depth of Field | Bokeh-based depth blur |
| Color Grading | LUT-based color correction |
| Vignette | Darkened screen edges |
| CAS / FSR | Contrast adaptive sharpening / AMD FidelityFX |
| Motion Blur | Per-object velocity-based blur |
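Individual effects can typically be tuned at construction time. The option names below (`threshold`, `intensity`, `mode`) are illustrative rather than confirmed; check each effect's constructor typings:

```ts
const tunedPipeline = new PostProcessPipeline(device, {
  hdrEnabled: true,
  effects: [
    new BloomEffect(device, { threshold: 1.0, intensity: 0.6 }), // options assumed
    new TonemappingEffect(device, { mode: 'aces' }), // option name assumed
    new FXAAEffect(device),
  ],
});
```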
ECS Integration
The renderer provides components and systems that integrate directly with the ECS. This is the recommended way to use the renderer in a game.
Rendering Components
```ts
import {
Transform3D, // Position, rotation, scale (SoA layout)
LocalToWorld, // Computed world matrix (set by system)
CameraComponent, // Camera projection settings
DirectionalLightComponent, // Directional light parameters
PointLightComponent, // Point light parameters
SpotLightComponent, // Spot light parameters
MeshHandle, // Reference to mesh in registry
RenderFlags, // Visibility, shadow casting, etc.
} from '@web-engine-dev/renderer';
```
The `Transform3D` component stores position (px, py, pz), rotation as a quaternion (rx, ry, rz, rw), and scale (sx, sy, sz) in SoA layout. The `LocalToWorld` component holds the computed 4x4 world matrix (16 f32 fields) and is automatically updated by the transform propagation system.
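Wiring up a renderable entity then means attaching these components; `spawn` and `addComponent` below are assumed world methods, shown for shape rather than as the confirmed ECS API:

```ts
// `spawn`/`addComponent` are assumed method names -- adapt to the ECS world API.
const entity = world.spawn();
world.addComponent(entity, Transform3D, {
  px: 0, py: 1, pz: 0, // position
  rx: 0, ry: 0, rz: 0, rw: 1, // identity rotation quaternion
  sx: 1, sy: 1, sz: 1, // uniform scale
});
world.addComponent(entity, LocalToWorld); // filled in by the transform system
world.addComponent(entity, MeshHandle, { id: boxMeshId }); // handle from the MeshRegistry
world.addComponent(entity, RenderFlags, { visible: true, castShadows: true }); // field names assumed
```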
Rendering Resources
Registries store GPU resources and map ECS handles to them:
```ts
import {
MeshRegistry,
MaterialRegistry,
TextureRegistry,
MeshRegistryResource,
MaterialRegistryResource,
TextureRegistryResource,
RenderDeviceResource,
RendererResource,
} from '@web-engine-dev/renderer';
// Initialize registries as world resources
world.insertResource(MeshRegistryResource, new MeshRegistry());
world.insertResource(MaterialRegistryResource, new MaterialRegistry());
world.insertResource(TextureRegistryResource, new TextureRegistry());
world.insertResource(RenderDeviceResource, device);
```
Rendering Systems
The renderer provides systems that run in the ECS scheduler:
- `TransformPropagationSystem` -- Computes `LocalToWorld` from `Transform3D` and the parent-child hierarchy
- `FrustumCullingSystem` -- Sets visibility flags based on camera frustum intersection
- ForwardRenderSystem -- Executes the main forward rendering pass
```ts
import {
TransformPropagationSystem,
FrustumCullingSystem,
ForwardRenderSystem,
} from '@web-engine-dev/renderer';
scheduler.addSystem(CoreSchedule.Update, TransformPropagationSystem);
scheduler.addSystem(CoreSchedule.Update, FrustumCullingSystem);
scheduler.addSystem(CoreSchedule.Update, ForwardRenderSystem);
```
Shader System
The renderer uses WGSL (WebGPU Shading Language) for runtime shaders. @web-engine-dev/shader-compiler remains available for preprocessing, variants, manifests, and optional transpilation workflows.
Bind Group Layout
The renderer uses a 4-group bind group layout for stable update frequency boundaries:
| Group | Purpose | Update Frequency |
|---|---|---|
| 0 | Camera (view/projection matrices, position) | Per-frame |
| 1 | Model (world matrix, normal matrix) | Per-object |
| 2 | Material (properties, textures, samplers) | Per-material |
| 3 | Lighting (lights, shadows, IBL, fog) | Per-frame |
Draw calls are sorted by material (group 2) to minimize bind group changes.
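A common way to implement this ordering is a packed per-draw sort key with material bits above depth bits, so material changes dominate the sort; a generic sketch, not engine API:

```ts
// Generic material-major draw sorting -- illustration, not engine API.
interface Draw {
  materialId: number;
  depth: number; // view-space depth, >= 0
}

function sortKey(d: Draw): number {
  const depthBits = Math.min(0xffff, Math.floor(d.depth * 64)); // quantized depth
  return d.materialId * 0x10000 + depthBits; // material-major, depth-minor
}

function sortDraws(draws: Draw[]): Draw[] {
  return draws.slice().sort((a, b) => sortKey(a) - sortKey(b));
}
```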
Shader Variants
The shader system supports compile-time feature composition through defines and template markers:
```ts
import { ShaderVariantCache, ShaderFeature } from '@web-engine-dev/renderer';
const cache = new ShaderVariantCache();
const shader = cache.getOrCreate(device, {
template: PBRTemplate,
features: [ShaderFeature.NormalMapping, ShaderFeature.Skinning],
defines: { MAX_LIGHTS: 16, SHADOW_CASCADES: 4 },
});
```
GPU-Driven Rendering
For scenes with many objects, the renderer supports GPU-driven indirect rendering with compute shader-based culling:
```ts
import { GPUDrivenRendererSoA } from '@web-engine-dev/renderer';
const gpuRenderer = new GPUDrivenRendererSoA(device, {
maxInstances: 100000,
maxBatches: 1000,
});
```
GPU-driven rendering moves frustum culling and draw call generation to the GPU, enabling scenes with hundreds of thousands of objects.
Resource Lifecycle
Always dispose GPU resources when they are no longer needed:
```ts
material.dispose(); // Uniform buffers, bind groups, default textures
mesh.dispose(); // Vertex and index buffers
texture.dispose(); // Texture and views
lightManager.dispose();
```
Deferred Deletion
When GPU resources (textures, buffers) are replaced dynamically, do not destroy the old resource immediately. Bind groups from the current frame may still reference it. Use deferred deletion (2-3 frames delay) to let the GPU pipeline drain before destroying the old resource.
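A minimal deletion queue that implements this pattern: resources are held for a fixed number of frames after release, then destroyed. The `destroy()` method name is a placeholder; adapt it to `dispose()` where applicable:

```ts
// Generic deferred-deletion queue: destroy resources N frames after release.
const FRAMES_IN_FLIGHT = 3;

interface Destroyable {
  destroy(): void;
}

class DeferredDeleter {
  private pending: { resource: Destroyable; releasedAt: number }[] = [];
  private frame = 0;

  release(resource: Destroyable): void {
    this.pending.push({ resource, releasedAt: this.frame });
  }

  // Call once per frame, after GPU work for the frame has been submitted.
  tick(): void {
    this.frame++;
    while (
      this.pending.length > 0 &&
      this.frame - this.pending[0].releasedAt >= FRAMES_IN_FLIGHT
    ) {
      this.pending.shift()!.resource.destroy();
    }
  }
}
```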
Capability Checks
The renderer exposes device capability flags so you can branch features safely:
```ts
const { device } = await createDevice({ canvas, preferredBackend: 'auto' });
if (device.capabilities.supportsDeferredRendering) {
// Enable deferred pipeline path
}
if (!device.capabilities.supportsTimestampQuery) {
// Disable timestamp-based GPU profiling features
}
```