Technical Briefing
How Sketch Pulse Works
Canvas doodles classified by MobileNet
Sketch Pulse captures freehand drawing on a canvas, then passes the canvas element directly to a MobileNet classifier to produce its top guesses.
Data Pathway
Canvas Data -> MobileNet -> Prediction
1) Canvas drawing input
Mouse and touch events are normalized into canvas coordinates, and the resulting strokes are rendered with a stylized brush.
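The normalization step can be sketched as a small pure helper. The function name and the CSS-scaling detail below are illustrative assumptions, not taken from the Sketch Pulse source:

```javascript
// Map a pointer event's client coordinates into the canvas bitmap's
// coordinate space, accounting for any CSS scaling of the element.
// (toCanvasPoint is a hypothetical helper name.)
function toCanvasPoint(rect, bitmapWidth, bitmapHeight, clientX, clientY) {
  const scaleX = bitmapWidth / rect.width;
  const scaleY = bitmapHeight / rect.height;
  return {
    x: (clientX - rect.left) * scaleX,
    y: (clientY - rect.top) * scaleY,
  };
}

// In a browser event handler:
// const rect = canvas.getBoundingClientRect();
// const point = toCanvasPoint(rect, canvas.width, canvas.height,
//                             event.clientX, event.clientY);
```

Working in bitmap coordinates keeps strokes aligned with the pixels MobileNet eventually reads, even when the canvas is displayed at a different CSS size.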
Brush stroke rendering
// Neon brush style: rounded caps and joins with a magenta glow.
context.lineCap = "round";
context.lineJoin = "round";
context.strokeStyle = "#19d8ff";
context.lineWidth = 10;
context.shadowBlur = 16;
context.shadowColor = "rgba(255, 60, 172, 0.8)";
// Extend the current path (begun on pointerdown) to the new point.
context.lineTo(point.x, point.y);
context.stroke();

2) Classify directly from canvas
MobileNet's classify method accepts the canvas element directly, so no intermediate export step (such as toDataURL) is required.
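The briefing's snippet calls a getSketchModel helper that is not shown. A plausible implementation, assuming the standard @tensorflow-models/mobilenet package, is a lazy, cached loader; the loader parameter here is purely illustrative so the caching logic can be exercised without a network fetch:

```javascript
// A minimal lazy-loading cache for the MobileNet model (an assumed
// implementation of the briefing's getSketchModel helper).
let modelPromise = null;

async function defaultLoader() {
  // Dynamic import keeps the heavy model package out of the initial bundle.
  const mobilenet = await import("@tensorflow-models/mobilenet");
  return mobilenet.load({ version: 2, alpha: 1.0 });
}

function getSketchModel(loader = defaultLoader) {
  if (!modelPromise) {
    modelPromise = loader();
  }
  return modelPromise;
}
```

Caching the promise (rather than the resolved model) means concurrent callers share one download instead of racing to load twice.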
Canvas inference
// Ensure the TensorFlow.js backend is initialized before classifying.
await tf.ready();
const model = await getSketchModel();
// classify accepts the canvas element; request the top 3 classes.
const result = await model.classify(canvas, 3);
setPredictions(result);

3) Confidence-first output
The interface lists top classes and probabilities instead of a single guess, making uncertainty visible.
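A minimal way to surface those probabilities, assuming the prediction shape returned by @tensorflow-models/mobilenet ({ className, probability } pairs); the helper name is illustrative:

```javascript
// Turn MobileNet predictions into "label: pct%" display strings.
// Input shape assumed: [{ className, probability }, ...].
function formatPredictions(predictions) {
  return predictions.map(
    (p) => `${p.className}: ${(p.probability * 100).toFixed(1)}%`
  );
}
```

Showing all top-3 lines side by side lets users see when the model is torn between classes rather than hiding that behind a single answer.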
Mission Debrief
Simple pipeline: draw -> classify -> display top 3.
Client-side model inference only.
Great for rapid experimentation with human input.