# Operators API

Neural network operators implemented as WebGPU compute shaders.

## Operator Base Class

All operators extend the base `Operator` class:
```typescript
abstract class Operator {
  constructor(context: GPUContext);
  abstract computeOutputShape(inputShape: TensorShape, params?: OperatorParams): TensorShape;
  abstract forward(inputs: Tensor[], params?: OperatorParams): Promise<Tensor>;
  destroy(): void;
}
```
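To illustrate the contract, here is a minimal sketch of a hypothetical element-wise operator. The `ScaleOperator` name is illustrative only and not part of this API; the local `TensorShape` alias stands in for the library's type, and the GPU dispatch is replaced by a CPU reference so the shape and compute logic are visible.

```typescript
// Minimal sketch of the Operator contract for a hypothetical element-wise
// scale op. TensorShape is stubbed locally; the WGSL dispatch a real
// subclass would issue through GPUContext is replaced by a CPU loop.
type TensorShape = number[];

class ScaleOperator {
  // Element-wise operators never change the tensor's shape.
  computeOutputShape(inputShape: TensorShape): TensorShape {
    return [...inputShape];
  }

  // CPU reference for what the compute shader would do per element.
  forwardReference(data: Float32Array, factor: number): Float32Array {
    const out = new Float32Array(data.length);
    for (let i = 0; i < data.length; i++) out[i] = data[i] * factor;
    return out;
  }
}
```

A real subclass would extend `Operator`, create its pipeline in the constructor, and release GPU resources in `destroy()`.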
## Conv2dOperator

2D convolution operator.

### Parameters
```typescript
interface Conv2dParams {
  kernelSize: [number, number]; // [kernelHeight, kernelWidth]
  stride?: [number, number];    // Default: [1, 1]
  padding?: [number, number];   // Default: [0, 0]
  useBias?: boolean;            // Default: true if bias provided
}
```
### Usage
```typescript
const conv2d = new Conv2dOperator(context);
const output = await conv2d.forward([input, weight, bias], {
  kernelSize: [3, 3],
  stride: [1, 1],
  padding: [1, 1]
});
```
### Shapes
| Tensor | Shape | Description |
|---|---|---|
| input | [N, C, H, W] | Input feature map (NCHW) |
| weight | [K, C, kH, kW] | Convolution kernels |
| bias | [K] | Bias vector (optional) |
| output | [N, K, outH, outW] | Convolved feature map |
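The output spatial dimensions in the table follow the standard convolution arithmetic. A sketch of that calculation, assuming no dilation (the helper name is illustrative, not part of the API):

```typescript
// Output-shape arithmetic implied by the shape table above (NCHW, no
// dilation): outH = floor((H + 2*padH - kH) / strideH) + 1, and likewise
// for width. The batch size passes through; K kernels give K channels.
function conv2dOutputShape(
  input: [number, number, number, number],  // [N, C, H, W]
  weight: [number, number, number, number], // [K, C, kH, kW]
  stride: [number, number] = [1, 1],
  padding: [number, number] = [0, 0],
): [number, number, number, number] {
  const [n, , h, w] = input;
  const [k, , kH, kW] = weight;
  const outH = Math.floor((h + 2 * padding[0] - kH) / stride[0]) + 1;
  const outW = Math.floor((w + 2 * padding[1] - kW) / stride[1]) + 1;
  return [n, k, outH, outW];
}
```

For example, a 3×3 kernel with stride 1 and padding 1 (as in the usage snippet above) preserves the spatial size.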
## MaxPoolOperator

2D max pooling operator.

### Parameters
```typescript
interface MaxPoolParams {
  poolSize: [number, number]; // [poolHeight, poolWidth]
  stride?: [number, number];  // Default: same as poolSize
}
```
### Usage
```typescript
const maxpool = new MaxPoolOperator(context);
const output = await maxpool.forward([input], {
  poolSize: [2, 2],
  stride: [2, 2]
});
```
### Shapes
| Tensor | Shape | Description |
|---|---|---|
| input | [N, C, H, W] | Input feature map (NCHW) |
| output | [N, C, outH, outW] | Pooled feature map |
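The pooled dimensions follow the same windowed arithmetic as convolution, without padding. A sketch, assuming valid (unpadded) pooling — the helper name is illustrative:

```typescript
// Output-shape arithmetic implied by the table above (valid pooling, no
// padding): outH = floor((H - poolH) / strideH) + 1. Stride defaults to
// the pool size, matching the MaxPoolParams default.
function maxPoolOutputShape(
  input: [number, number, number, number], // [N, C, H, W]
  poolSize: [number, number],
  stride: [number, number] = poolSize,
): [number, number, number, number] {
  const [n, c, h, w] = input;
  const outH = Math.floor((h - poolSize[0]) / stride[0]) + 1;
  const outW = Math.floor((w - poolSize[1]) / stride[1]) + 1;
  return [n, c, outH, outW];
}
```

The channel count is unchanged: pooling operates independently on each channel.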
## ReLUOperator

ReLU activation function: `output = max(0, input)`

### Usage
```typescript
const relu = new ReLUOperator(context);
const output = await relu.forward([input]);
```
### Shapes

Input and output have identical shapes.
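For reference, the per-element computation the shader performs can be sketched on the CPU (the function name is illustrative, not part of the API):

```typescript
// CPU reference for the ReLU shader's per-element computation:
// each element is clamped below at zero; the layout is untouched.
function reluReference(data: Float32Array): Float32Array {
  const out = new Float32Array(data.length);
  for (let i = 0; i < data.length; i++) out[i] = Math.max(0, data[i]);
  return out;
}
```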
## SoftmaxOperator

Softmax activation function with numerical stability.

### Parameters
```typescript
interface SoftmaxParams {
  axis?: number; // Default: -1 (last axis)
}
```
### Usage
```typescript
const softmax = new SoftmaxOperator(context);
const output = await softmax.forward([input], { axis: -1 });
```
### Limitations

Currently only supports softmax along the last axis (`axis: -1`).

### Properties

- Output values are in [0, 1]
- Values sum to 1 along the specified axis
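The "numerical stability" mentioned above is the standard max-subtraction trick. A CPU sketch over one row of the last axis (the function name is illustrative):

```typescript
// Numerically stable softmax over one row: subtracting the row max before
// exponentiating keeps exp() from overflowing for large inputs, and the
// shift cancels out in the normalization, so the result is unchanged.
function softmaxRow(row: number[]): number[] {
  const max = Math.max(...row);
  const exps = row.map((v) => Math.exp(v - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}
```

Without the subtraction, inputs like `1000` would produce `Infinity` from `Math.exp` and the result would be `NaN`.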
## DenseOperator

Fully connected (dense) layer.

### Parameters
```typescript
interface DenseParams {
  units: number;
  useBias?: boolean; // Default: true
}
```
### Usage
```typescript
const dense = new DenseOperator(context);
const output = await dense.forward([input, weight, bias], { units: 128 });
```
### Shapes
| Tensor | Shape | Description |
|---|---|---|
| input | [N, inputSize] | Input features |
| weight | [units, inputSize] | Weight matrix |
| bias | [units] | Bias vector (optional) |
| output | [N, units] | Dense output |
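Given the `[units, inputSize]` weight layout in the table, each output element is a dot product of an input row with a weight row. A CPU sketch of that math (the function name is illustrative):

```typescript
// CPU reference matching the shape table above:
// output[n][u] = sum_i input[n][i] * weight[u][i] (+ bias[u] if given).
function denseReference(
  input: number[][],  // [N, inputSize]
  weight: number[][], // [units, inputSize]
  bias?: number[],    // [units]
): number[][] {
  return input.map((row) =>
    weight.map((wRow, u) =>
      wRow.reduce((acc, w, i) => acc + w * row[i], bias ? bias[u] : 0),
    ),
  );
}
```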
## FlattenOperator

Flatten tensor to 2D (zero-copy operation).

### Usage
```typescript
const flatten = new FlattenOperator(context);
const output = await flatten.forward([input]);
// [N, C, H, W] -> [N, C*H*W]
```
### Shapes
| Input | Output |
|---|---|
| [N, C, H, W] | [N, C*H*W] |
| [N, D1, D2, ...] | [N, D1*D2*...] |
Note: This is a zero-copy operation (creates a view).
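The shape arithmetic behind the view can be sketched as follows (the helper name is illustrative, not part of the API):

```typescript
// Flatten-shape arithmetic: keep the batch dimension, fold every
// remaining dimension into one. No data moves; only the shape changes.
function flattenShape(shape: number[]): [number, number] {
  return [shape[0], shape.slice(1).reduce((a, b) => a * b, 1)];
}
```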
## Conv2dBiasReLUOperator

Fused Conv2d + Bias + ReLU operator for optimized performance.

### Parameters

Same as `Conv2dParams` (bias is required).

### Usage
```typescript
const fused = new Conv2dBiasReLUOperator(context);
const output = await fused.forward([input, weight, bias], {
  kernelSize: [3, 3],
  stride: [1, 1],
  padding: [1, 1]
});
```
### Performance

3× memory bandwidth reduction compared to sequential execution:

```
Sequential: 6 memory ops (Read → Conv → Write → Read → Bias → Write → Read → ReLU → Write)
Fused:      2 memory ops (Read → Conv+Bias+ReLU → Write)
```
## Common Patterns

### Chaining Operations
```typescript
// Sequential execution
let x = input;
x = await conv2d.forward([x, w1, b1], params1);
x = await relu.forward([x]);
x = await maxpool.forward([x], { poolSize: [2, 2] });
x = await flatten.forward([x]);
x = await dense.forward([x, w2, b2], { units: 10 });
x = await softmax.forward([x]);
```
### Using Fused Operators
```typescript
// Instead of:
// x = await conv2d.forward([x, w, b], params);
// x = await relu.forward([x]);
// Use fused:
x = await fusedConvReLU.forward([x, w, b], params);
```
## Error Handling

All operators throw descriptive errors for invalid inputs:
```typescript
try {
  const output = await conv2d.forward([invalidInput], params);
} catch (error) {
  console.error('Operator error:', error.message);
  // Example: "Conv2d requires input with shape [N, C, H, W]"
}
```