AI Proctoring Systems: Privacy-First with Computer Vision
In 2025, AI proctoring has reached a regulatory inflection point. The EU AI Act classifies biometric systems in educational contexts as high-risk AI systems, imposing strict obligations for transparency, documentation, and human oversight. GDPR adds further constraints on processing biometric data. Platforms that ignore these requirements risk fines of up to 4% of global annual revenue.
This article builds a proctoring system that treats privacy as a first principle, not an afterthought. We will implement face detection, gaze tracking, and anomaly detection while keeping biometric data processed locally on the student's device, with data minimization and full GDPR compliance.
What You Will Learn
- Privacy-first architecture: on-device processing vs cloud processing
- Face detection with MediaPipe FaceMesh: 468 landmarks in real time
- Gaze tracking: estimating gaze direction from standard webcams
- Anomaly detection: audio, tab switching, multi-face detection
- GDPR compliance: data minimization, consent, retention policies
- Human oversight: mandatory flag and human review system
1. Privacy-First Design Principles
A privacy-first proctoring system inverts the traditional "collect everything, filter later" paradigm. Core principles:
- Data minimization: no continuous video recording. Capture only anomalous events
- On-device processing: biometric processing in the browser via WebAssembly. The server never sees raw video
- Pseudonymization: irreversible hashes for student IDs in the proctoring system
- Purpose limitation: data usable only for the specific exam session
- Mandatory human oversight: no binding automatic decisions — AI flags, humans decide
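The pseudonymization principle above can be sketched as a one-way hash of the student ID combined with a per-session salt. This is a minimal illustration, not part of any proctoring library; the function name `pseudonymizeStudentId` is ours:

```typescript
import { createHash, randomBytes } from 'node:crypto';

// Derive an irreversible pseudonym for a student ID.
// The salt is generated per exam session and stored separately from
// the ID, so the hash cannot be mapped back to the original student.
function pseudonymizeStudentId(studentId: string, sessionSalt: string): string {
  return createHash('sha256')
    .update(`${sessionSalt}:${studentId}`)
    .digest('hex');
}

const salt = randomBytes(16).toString('hex');
const pseudonym = pseudonymizeStudentId('student-42', salt);
console.log(pseudonym.length); // 64 hex characters (SHA-256)
```

Because the salt changes per session, the same student produces different pseudonyms across exams, which also enforces the purpose-limitation principle.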
```typescript
// Privacy-First Architecture: on-device processing with MediaPipe
// No biometric data transmitted to the server - only anonymized event metadata
import * as faceMesh from '@mediapipe/face_mesh';
import * as camera from '@mediapipe/camera_utils';

interface ProctoringEvent {
  type: 'gaze_away' | 'face_missing' | 'multiple_faces' | 'tab_switch' | 'audio_anomaly';
  timestamp: number;
  duration?: number;
  confidence: number;
  // NO raw biometric data - metadata only
}

class PrivacyFirstProctor {
  private faceMeshInstance: faceMesh.FaceMesh | null = null;
  private events: ProctoringEvent[] = [];

  private readonly CONFIG = {
    GAZE_AWAY_THRESHOLD_DEG: 30,
    GAZE_AWAY_DURATION_MS: 3000,
    FACE_MISSING_DURATION_MS: 2000,
    MAX_EVENTS_PER_SESSION: 100, // GDPR data minimization limit
  };

  async initialize(videoElement: HTMLVideoElement): Promise<void> {
    // MediaPipe processes LOCALLY in the browser (WASM); raw frames never leave the device
    this.faceMeshInstance = new faceMesh.FaceMesh({
      locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${file}`,
    });

    this.faceMeshInstance.setOptions({
      maxNumFaces: 2,        // a second detected face triggers the multiple_faces event
      refineLandmarks: true, // adds iris landmarks, which improve gaze estimation
      minDetectionConfidence: 0.5,
      minTrackingConfidence: 0.5,
    });

    this.faceMeshInstance.onResults((results) => this.handleResults(results));

    // camera_utils pumps webcam frames into FaceMesh - still entirely on-device
    const cam = new camera.Camera(videoElement, {
      onFrame: async () => {
        await this.faceMeshInstance!.send({ image: videoElement });
      },
      width: 640,
      height: 480,
    });
    await cam.start();
  }

  // Minimal example handler: record metadata-only events, never raw landmarks.
  private handleResults(results: faceMesh.Results): void {
    if (this.events.length >= this.CONFIG.MAX_EVENTS_PER_SESSION) return;
    if (results.multiFaceLandmarks && results.multiFaceLandmarks.length > 1) {
      this.events.push({ type: 'multiple_faces', timestamp: Date.now(), confidence: 0.9 });
    }
  }
}
```
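To show how thresholds like `GAZE_AWAY_THRESHOLD_DEG` and `GAZE_AWAY_DURATION_MS` turn noisy per-frame gaze angles into a single flagged event, here is a small framework-free debouncer. It is a sketch; the `GazeAwayDetector` class and its `update` method are illustrative names, not part of MediaPipe:

```typescript
interface GazeEvent {
  type: 'gaze_away';
  timestamp: number;
  duration: number;
  confidence: number;
}

// Debounce per-frame gaze angles into a single gaze_away event:
// the event fires only after gaze has stayed beyond the angle threshold
// for the full duration threshold, so brief glances are never flagged.
class GazeAwayDetector {
  private awaySince: number | null = null;
  private fired = false;

  constructor(
    private thresholdDeg = 30,  // mirrors GAZE_AWAY_THRESHOLD_DEG
    private durationMs = 3000,  // mirrors GAZE_AWAY_DURATION_MS
  ) {}

  // Called once per frame with the estimated horizontal gaze angle in degrees.
  update(gazeAngleDeg: number, nowMs: number): GazeEvent | null {
    if (Math.abs(gazeAngleDeg) < this.thresholdDeg) {
      this.awaySince = null; // gaze back on screen - reset
      this.fired = false;
      return null;
    }
    if (this.awaySince === null) this.awaySince = nowMs;
    const elapsed = nowMs - this.awaySince;
    if (elapsed >= this.durationMs && !this.fired) {
      this.fired = true; // at most one event per continuous look-away
      return { type: 'gaze_away', timestamp: nowMs, duration: elapsed, confidence: 0.8 };
    }
    return null;
  }
}
```

The `fired` latch is the data-minimization detail: one sustained look-away produces exactly one metadata event, not one per frame, which keeps the session well under `MAX_EVENTS_PER_SESSION`.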