Ambient Context
Passively records the apps you use, their window titles, open URLs, and your dwell time, all locally. Combined with MCP, your AI tools can ask “what was I working on yesterday?” and get a real answer.
No screenshots. No keylogging. No cloud sync. Just a local timeline of your work context that makes your voice data and AI tools dramatically more useful.
Your work timeline
Ambient context builds a timeline as you work. Which app was active, what was in the window title, which URL was open, and how long you stayed. All captured via the macOS Accessibility API — no screen recording, no pixel capture.
This timeline feeds into your auto-generated daily journal and is queryable by your AI tools via MCP. When Claude asks Resonant “what was I working on?”, the ambient timeline is how it knows.
Daily Standup
auth-service/routes.ts
github.com/org/repo/pull/247
#eng-team
webhook-retry/handler.ts
grafana.internal/d/api-latency
auth-service/middleware.ts
What it captures
Which application is in the foreground and for how long.
VS Code — 2h 14m today
The title of the active window — file names, document titles, chat recipients.
auth-service/routes.ts — VS Code
The URL of the active browser tab (Chrome, Safari, Arc, Firefox).
github.com/org/repo/pull/247
How long you spend in each app and window, aggregated by session.
Slack: 45m across 12 sessions
The sequence of app transitions throughout the day, with timestamps.
VS Code → Chrome → Slack → VS Code
Text copied to clipboard during work (optional, disabled by default).
const token = jwt.sign(...)
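The dwell-time capture above amounts to folding a stream of focus events into per-app totals and session counts. A minimal sketch of that aggregation, where the event shape and field names are illustrative assumptions rather than Resonant's actual schema:

```typescript
// Hypothetical focus event, roughly what an Accessibility API observer
// might emit each time the foreground app changes (names are assumed).
interface FocusEvent {
  app: string;          // e.g. "Slack"
  windowTitle: string;  // e.g. "#eng-team"
  start: number;        // epoch ms when the app gained focus
  end: number;          // epoch ms when focus moved away
}

interface AppUsage {
  totalMs: number;   // summed dwell time
  sessions: number;  // number of distinct focus sessions
}

// Aggregate a day's focus events into per-app totals and session counts,
// the shape behind summaries like "Slack: 45m across 12 sessions".
function aggregateByApp(events: FocusEvent[]): Map<string, AppUsage> {
  const usage = new Map<string, AppUsage>();
  for (const e of events) {
    const entry = usage.get(e.app) ?? { totalMs: 0, sessions: 0 };
    entry.totalMs += e.end - e.start;
    entry.sessions += 1;
    usage.set(e.app, entry);
  }
  return usage;
}
```
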
MCP integration
Three MCP tools expose ambient context to any AI agent — Claude, Codex, and others. The data stays local — only the text your AI tool includes in its prompt leaves your machine.
“What was I working on yesterday afternoon?”
ambient_timeline()
VS Code (auth-service) 12:00–14:30, Slack (eng-team) 14:30–14:45, Chrome (Grafana) 14:45–15:20, VS Code (webhook-retry) 15:20–17:00.
“How much time did I spend in Slack today?”
ambient_app_usage()
Slack: 1h 23m across 18 sessions. Peak usage: 14:00–15:00 (34 min continuous).
“What am I looking at right now?”
get_context()
VS Code — auth-service/middleware.ts:47 — validateToken function. Browser tab: GitHub PR #247.
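To give a feel for the agent side of these tools, here is a hypothetical helper that renders an ambient_timeline() result into the one-line summary shown above. The entry shape is an assumption for illustration only; the real field names come from the MCP tool schemas Resonant exposes:

```typescript
// Assumed shape of one ambient_timeline() entry (illustrative, not the
// actual tool schema).
interface TimelineEntry {
  app: string;      // "VS Code"
  context: string;  // "auth-service" (window title / project)
  from: string;     // "12:00"
  to: string;       // "14:30"
}

// Render entries as "App (context) from–to, ..." so an agent can drop the
// timeline straight into its prompt.
function renderTimeline(entries: TimelineEntry[]): string {
  return entries
    .map((e) => `${e.app} (${e.context}) ${e.from}–${e.to}`)
    .join(", ");
}
```
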
Context-aware dictation
Ambient context doesn't just build a timeline — it makes your dictation smarter. Every time you dictate, Resonant captures the active app, window title, browser URL, selected text, and visible text snippets.
This context is attached to the dictation and used by cloud cleanup to produce app-appropriate output. Dictating in VS Code produces technical text. Dictating in Gmail produces email-formatted text. Dictating in Slack produces casual messages. Same voice, different output.
You say: “add a try catch around the token validation and log the error with the request ID”
try {
  await validateToken(req.headers.authorization);
} catch (err) {
  logger.error('Token validation failed', { requestId: req.id, error: err });
}
You say: “hey Sarah the migration PR is up can you take a look when you get a chance I added the fallback validator like we discussed”
Hey Sarah, The migration PR is up — can you take a look when you get a chance? I added the fallback validator like we discussed. Thanks!
You say: “just pushed the JWT migration PR heads up it changes the token validation order so if you see test failures that's probably why”
just pushed the JWT migration PR — heads up it changes the token validation order, so if you see test failures that's probably why
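The app-appropriate cleanup above can be pictured as a routing step: the captured app name selects an output style before the transcript is rewritten. A toy sketch of that routing, where the app-to-style mapping is invented for illustration and Resonant's real logic lives in cloud cleanup:

```typescript
type OutputStyle = "code" | "email" | "chat" | "plain";

// Map the frontmost app (from ambient context) to a cleanup style.
// This mapping is a guess for illustration, not Resonant's actual table.
function styleForApp(app: string): OutputStyle {
  switch (app) {
    case "VS Code":
    case "Xcode":
      return "code";   // technical, code-shaped output
    case "Mail":
    case "Gmail":
      return "email";  // greeting, full punctuation, sign-off
    case "Slack":
    case "Messages":
      return "chat";   // casual, lightly punctuated
    default:
      return "plain";
  }
}
```

Same voice in, different style out: the router only picks the target register, and the cleanup model does the rewriting.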
Privacy
Ambient context collects workspace metadata, not content. It knows you were in VS Code editing a file called routes.ts — it doesn't read the file. It knows you were on a GitHub PR — it doesn't scrape the page.
All ambient data is stored locally in Resonant's encrypted database. You can view, search, and delete it at any time. In HIPAA mode, ambient capture can be disabled entirely.
Free. Local. Yours to control.
Ambient workspace memory. Queryable by your AI tools. All on your Mac.
Requires macOS 14+ · Apple Silicon