Circadify is a hybrid client-server platform for contactless vital signs measurement using remote photoplethysmography (rPPG). The client performs real-time face detection and on-device preprocessing; our inference engine runs the rPPG model on the preprocessed RGB tensor and returns vital signs in the API response. No health data is stored on Circadify’s side.
The platform has three primary layers:
Client Layer
The @circadify/web-sdk runs in the browser. It captures video via the camera, detects facial landmarks, extracts regions of interest (ROIs), and encodes them into a structured tensor for upload. No raw video leaves the device.
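As a rough sketch of what this preprocessing stage amounts to (the function names, ROI shape, and tensor layout below are illustrative assumptions, not the SDK's internals): averaging each frame's ROI pixels yields one mean RGB sample per frame, and stacking those samples gives a compact frames × 3 tensor in place of raw video.

```typescript
// Illustrative only — the SDK does this internally; names and shapes here
// are assumptions, not the SDK's actual API.

type RGB = [number, number, number];

// Average the pixels inside a region of interest (ROI) of one RGBA frame,
// producing a single mean RGB sample for that frame.
function meanRoiRgb(
  frame: Uint8ClampedArray, // RGBA pixel data, e.g. from canvas getImageData
  width: number,
  roi: { x: number; y: number; w: number; h: number },
): RGB {
  let r = 0, g = 0, b = 0;
  const count = roi.w * roi.h;
  for (let y = roi.y; y < roi.y + roi.h; y++) {
    for (let x = roi.x; x < roi.x + roi.w; x++) {
      const i = (y * width + x) * 4; // 4 bytes per pixel (RGBA)
      r += frame[i];
      g += frame[i + 1];
      b += frame[i + 2];
    }
  }
  return [r / count, g / count, b / count];
}

// Stacking one mean sample per frame yields a (frames x 3) tensor — a
// compact, non-identifiable structure instead of raw video.
function buildTensor(samples: RGB[]): Float32Array {
  const out = new Float32Array(samples.length * 3);
  samples.forEach(([r, g, b], i) => out.set([r, g, b], i * 3));
  return out;
}
```

Because only per-frame channel averages leave the device, the upload carries no recoverable image of the user's face.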
API Layer
Serverless API endpoints handle authentication, rate limiting, session management, and secure upload URL generation. Session state is short-lived and encrypted. Developer accounts, API keys, and usage data are stored in a managed database.
Inference Engine
The inference engine processes the uploaded RGB tensor through our rPPG signal processing model, extracts physiological signals, and returns calibrated vital signs in the API response. The tensor is discarded after inference; no health data is stored.
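To build intuition for what "extracts physiological signals" means (this toy DFT scan is a stand-in for illustration, not the production model): the pulse shows up as the dominant frequency of the ROI colour signal within the plausible heart-rate band.

```typescript
// Toy illustration of the core rPPG idea: heart rate corresponds to the
// dominant frequency of the ROI colour signal. The real inference engine
// is a calibrated signal-processing model, not this naive scan.

// Power of `signal` at frequency `freq` (Hz) via a direct DFT projection.
function powerAt(signal: number[], freq: number, fps: number): number {
  let re = 0, im = 0;
  signal.forEach((v, n) => {
    const phase = (2 * Math.PI * freq * n) / fps;
    re += v * Math.cos(phase);
    im -= v * Math.sin(phase);
  });
  return re * re + im * im;
}

// Scan the plausible heart-rate band (40–180 BPM) for the strongest peak.
function estimateHeartRate(signal: number[], fps: number): number {
  const mean = signal.reduce((a, b) => a + b, 0) / signal.length;
  const centred = signal.map((v) => v - mean); // remove DC offset
  let bestBpm = 40, bestPower = -1;
  for (let bpm = 40; bpm <= 180; bpm += 1) {
    const p = powerAt(centred, bpm / 60, fps);
    if (p > bestPower) { bestPower = p; bestBpm = bpm; }
  }
  return bestBpm;
}
```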
You can integrate Circadify in two ways: using the SDK (recommended) or calling the REST API directly.
The SDK handles the entire pipeline — camera access, face detection, preprocessing, upload, and result polling — in a single async call:
import { CircadifySDK } from '@circadify/web-sdk';
const sdk = new CircadifySDK({
  apiKey: 'ck_live_your_key_here',
  onProgress: (p) => console.log(`${p.phase}: ${p.percent}%`),
});

const result = await sdk.measureVitals({
  videoElement: document.getElementById('preview') as HTMLVideoElement,
});

console.log('Heart Rate:', result.heartRate, 'BPM');
console.log('SpO2:', result.spo2, '%');

The SDK is ~38 KB (ESM). WASM dependencies (~12 MB total) are lazy-loaded and cached by the browser.
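Low-quality measurements come back with a confidence score of 0.0 rather than an error, so gate on it before using the vitals. A minimal guard, assuming a numeric confidence field on the result (the exact result type and the 0.5 threshold are application-level assumptions, not SDK defaults):

```typescript
// Treat a low-confidence measurement as unusable. The `confidence` field is
// documented behaviour; the threshold is an application choice, not an SDK
// default, and the interface below is a simplified assumption.
interface VitalsResult {
  heartRate: number;
  spo2: number;
  confidence: number; // 0.0 means the signal was too poor to trust
}

function usableVitals(result: VitalsResult, minConfidence = 0.5): boolean {
  return result.confidence >= minConfidence;
}
```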
If you cannot use the npm package (e.g., native mobile, server-side, or custom preprocessing), you can call the REST API directly. You are responsible for camera access, preprocessing, and upload:
# 1. Start a session
curl -X POST https://api.circadify.com/sdk/session/start \
  -H "X-API-Key: ck_live_your_key_here" \
  -H "Content-Type: application/json"
# Response: { "session_id": "...", "upload_url": "...", "expires_at": ... }
# 2. Upload preprocessed tensor to the secure upload URL
curl -X PUT "$UPLOAD_URL" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @tensor.bin
# 3. Notify the backend that upload is complete
curl -X POST https://api.circadify.com/sdk/session/$SESSION_ID/upload-complete \
  -H "X-API-Key: ck_live_your_key_here"
# 4. Poll for results
curl https://api.circadify.com/sdk/session/$SESSION_ID/result \
  -H "X-API-Key: ck_live_your_key_here"

Low-quality measurements are returned with confidence: 0.0 rather than an error, so the client always gets a response. Always check the confidence score.

All SDK and API requests are authenticated with API keys in the format ck_live_{hex}. Keys are passed via the X-API-Key header. Each request is validated before processing.
Rate limit headers are returned on every response:
X-RateLimit-Limit: 300
X-RateLimit-Remaining: 297
X-RateLimit-Reset: 1712000000

All measurements return all six vitals: heart rate, respiratory rate, HRV, SpO2, systolic BP, and diastolic BP.
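A small client-side helper can turn these headers into a backoff decision. The header names are from the docs; mirroring them into a Map and waiting until the reset timestamp is an application-level policy, not an API requirement:

```typescript
// Decide how long to wait before the next request, given the documented
// rate-limit headers. X-RateLimit-Reset is a Unix timestamp in seconds.
function retryDelayMs(headers: Map<string, string>, nowSec: number): number {
  const remaining = Number(headers.get('X-RateLimit-Remaining') ?? '1');
  if (remaining > 0) return 0; // quota left, no need to wait
  const reset = Number(headers.get('X-RateLimit-Reset') ?? String(nowSec));
  return Math.max(0, (reset - nowSec) * 1000); // wait until the window resets
}
```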