They're cheating. You're losing.
We're ending it.
OpticGuard tracks gaze patterns in real time, exposing AI-assisted cheating during live interviews before it costs the honest candidate the role.
Every computation runs locally in your browser and is discarded the moment your session ends. No server, no database, no transmission of any kind. Ever. The entire codebase is open source, so you don't have to take our word for it.
The companies hiring right now. The candidates they deserve.
While honest candidates prepare for months, others sit down with an AI whispering every answer. OpticGuard detects the gaze patterns that give it away, in real time, with no plugins required.
Hit the button and pick the active Meet tab or screen from the browser picker. Takes five seconds.
The shared screen fills the window. OpticGuard's panel stays visible on top, running continuously.
Live confidence score, gaze history, and a plain-language verdict update every 12 milliseconds.
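As a rough illustration of how a panel like this could work, here is a minimal sketch: a function mapping the in-memory confidence score to a plain-language verdict, and a refresh loop on roughly the 12-millisecond cadence mentioned above. The function names and thresholds are invented for illustration; OpticGuard's real cutoffs and wording may differ.

```javascript
// Hypothetical sketch: map the in-memory confidence score to a
// plain-language verdict for the panel. Thresholds are illustrative
// assumptions, not OpticGuard's actual values.
function verdictFor(score) {
  if (score >= 0.8) return "Strong reading-pattern signal detected";
  if (score >= 0.5) return "Possible off-screen reading";
  return "Gaze consistent with natural conversation";
}

// The panel could refresh on a short timer, e.g. every 12 ms.
// `getScore` returns the latest in-memory score; `render` draws the panel.
function startPanelLoop(getScore, render) {
  return setInterval(() => {
    const score = getScore();
    render(score, verdictFor(score));
  }, 12);
}
```

Stopping the loop is just `clearInterval` on the returned handle; nothing about the score is kept once the panel stops rendering it.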
Google Meet, Zoom, and most major platforms actively block third-party tools from accessing live video streams through browser extensions. It's a deliberate restriction, not an oversight. OpticGuard sidesteps this entirely by using the browser's native screen-share API, which needs no extension permissions at all, just the standard screen-picker prompt you already approve, and works today, out of the box.
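The screen-share approach boils down to one standard browser call. Here is a minimal sketch under that assumption; `startSession` is a hypothetical name, and the second parameter exists only so the capture source can be stubbed for testing:

```javascript
// Minimal sketch of the screen-share approach: the standard
// getDisplayMedia API shows the browser's own picker, so the user
// chooses the Meet tab or screen themselves. No extension permissions
// are involved. `startSession` is an illustrative name, not a real API.
async function startSession(
  videoEl,
  getMedia = (constraints) => navigator.mediaDevices.getDisplayMedia(constraints)
) {
  const stream = await getMedia({
    video: true,   // the shared tab or screen
    audio: false,  // gaze analysis does not need audio
  });
  videoEl.srcObject = stream;  // fill the window with the shared feed
  await videoEl.play();
  return stream;               // stop later via stream.getTracks().forEach(t => t.stop())
}
```

In a real page you would call `startSession(document.querySelector("video"))` from a click handler, since the browser requires a user gesture before showing the picker.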
If OpticGuard is adopted at the organizational level, it can absolutely be shipped as a verified Chrome extension or a native Zoom integration. That version would still run entirely locally, remain fully open source, and carry the exact same zero-data guarantee. We're building toward that. The screen-share approach is the version that works for everyone, right now.
You don't need to trust us. That's the point. Open your browser's DevTools Network tab and run a session. You will not see a single outbound request. No fetch calls, no WebSocket connections, no beacons. The entire source is publicly available, so you can read exactly what happens to the gaze coordinates: they're computed into a confidence score, displayed, and discarded. Nothing persists. Nothing is sent.
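The DevTools check described above can also be scripted. This is a hypothetical helper, not part of OpticGuard's codebase: it counts resource-timing entries (fetches, XHRs, beacons) that left the page's own origin, which should be zero during a session.

```javascript
// Hypothetical audit helper for the DevTools console. Counts resource
// entries whose URL does not start with the page's own origin, i.e.
// outbound requests. During an OpticGuard session this should return 0.
function countOutboundRequests(entries, ownOrigin) {
  return entries.filter((e) => !e.name.startsWith(ownOrigin)).length;
}

// In the browser console you would run something like:
//   countOutboundRequests(
//     performance.getEntriesByType("resource"),
//     location.origin
//   );
```

Note that `performance.getEntriesByType("resource")` reflects requests the page has made so far, so run it after the session has been active for a while.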
If you're deploying this for an organization and need formal assurance, any developer can audit the codebase in under an hour and confirm the same.
Only gaze-direction coordinates, derived from your webcam feed during an active session. These are processed locally to produce a single reading-pattern confidence score: a measure of whether eye movement is consistent with reading off-screen text. The raw coordinates and the computed score exist only in memory, only for the duration of the session. When you stop, they are gone.
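The in-memory lifecycle described above can be sketched as a small session object. Everything here is an illustrative assumption: the names and the toy scoring heuristic (reading off-screen text tends to produce horizontal sweeps) stand in for whatever the real model computes.

```javascript
// Sketch of the in-memory lifecycle: coordinates and the score exist
// only inside this closure, and stop() discards them. The heuristic is
// a toy stand-in, not OpticGuard's actual scoring model.
function createSession() {
  let samples = [];  // raw gaze coordinates, memory only
  return {
    addSample(x, y) {
      samples.push({ x, y });
    },
    // Fraction of frame-to-frame movements that are predominantly
    // horizontal, as a crude proxy for reading off-screen text.
    score() {
      if (samples.length < 2) return 0;
      let sweeps = 0;
      for (let i = 1; i < samples.length; i++) {
        const dx = Math.abs(samples[i].x - samples[i - 1].x);
        const dy = Math.abs(samples[i].y - samples[i - 1].y);
        if (dx > dy) sweeps++;
      }
      return sweeps / (samples.length - 1);
    },
    stop() {
      samples = [];  // nothing persists past this line
    },
    size() {
      return samples.length;
    },
  };
}
```

Because the samples live only in a local variable, ending the session really does erase them: there is no store to clear and no payload to delete.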
OpticGuard does not record video, capture screenshots, store conversation content, or retain anything beyond the current session's in-memory signal.
OpticGuard. Because opportunity should be earned.