Resonant

MCP Server — 11 tools

Your voice becomes
AI context.

Resonant exposes an MCP server that lets any AI agent — including Claude and Codex — query everything you've said, done, and seen. Your voice history becomes a knowledge base your AI assistant can search.

No competitor does this. Your dictations, meetings, memos, app usage, and daily journal — all queryable by your AI tools, all stored locally on your Mac.

What is MCP

A protocol that lets
AI tools ask questions
about your work.

Model Context Protocol (MCP) is an open standard that connects AI assistants to external data sources. Instead of you copy-pasting context into prompts, your AI tool queries the source directly.

Resonant implements an MCP server that exposes your entire voice workspace — dictations, meetings, memos, ambient context, and journal entries — as structured, searchable data. Your AI assistant doesn't guess what you've been working on. It knows.
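Under the hood, MCP tool calls are JSON-RPC 2.0 messages. The sketch below shows roughly what a client sends when it invokes a tool like the ones in the examples that follow — the tool name and arguments mirror this page's `search` example, but the exact envelope a given client emits is an illustration of the protocol, not a capture of Resonant's traffic.

```typescript
// Sketch of an MCP tool call on the wire (JSON-RPC 2.0 over stdio).
// Arguments mirror the Resonant `search` example; illustrative only.

interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

// The client (Claude Code, Cursor, ...) writes one JSON message to the
// server's stdin and reads the result from its stdout.
const request: JsonRpcRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search",
    arguments: { query: "standup", type: "meeting", limit: 1 },
  },
};

console.log(JSON.stringify(request));
```

The server replies with a matching `id` and a `result` payload, which the client folds into its prompt context.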

Without MCP

“In my standup this morning I mentioned the auth migration — I think I said we'd move to JWTs by Thursday? Can you check... actually let me find the notes first...”

Context lost. Time wasted searching. AI can't help because it doesn't know what you said.

With Resonant MCP

“What did I commit to in this morning's standup?”

Claude queries your meeting transcript, finds the JWT migration commitment, and gives you the exact quote with timestamp.

In practice

Your AI assistant queries
your voice workspace.

Claude Code
Meeting recall
You

What did I commit to in this morning’s standup?

MCP tool call

search({ query: "standup", type: "meeting", limit: 1 })

AI

From your 9:30am standup (Zoom, 12 min): You committed to finishing the JWT migration by Thursday, mentioned the rate limiter needs load testing before deploy, and asked Sarah to review the webhook retry PR.

Cursor
Voice-to-spec
You

I described an API design earlier today — find it and use it as the spec for this endpoint.

MCP tool call

search({ query: "API design", type: "dictation", limit: 5 })

AI

Found a dictation from 2:14pm in VS Code (auth-service/routes.ts): "The endpoint should accept a Bearer token, validate against the JWKS endpoint, return a 401 with a retry-after header if the token is expired, and rate limit to 100 requests per minute per org..."

Claude Code
Ambient recall
You

What was I working on yesterday afternoon?

MCP tool call

ambient_timeline({ date: "2026-03-28", start: "12:00", end: "18:00" })

AI

Yesterday afternoon: VS Code (auth-service) 12:00–14:30, Slack (eng-team) 14:30–14:45, Chrome (Grafana dashboard) 14:45–15:20, back to VS Code (webhook-retry) 15:20–17:00. 3 dictations and 1 memo recorded during this window.

Cursor
Memo as context
You

I recorded a memo about the notification system architecture — use it as context for this refactor.

MCP tool call

search({ query: "notification architecture", type: "memo" })

AI

Found memo from Mar 26 (3 min 42s): "Three event types — user actions, system events, scheduled digests. Different retention policies. Leaning toward three separate tables over polymorphic. Fan-out for digests should be async via a job queue, not inline..."

Tool reference

11 tools. Everything queryable.

Each tool is a structured API that your AI assistant can call. Search across your entire voice history, pull meeting transcripts, read journal entries, and understand your ambient work context.

Search & Discovery

Find anything across your voice history

search

Full-text search across all dictations, meetings, and memos. Returns matched items with timestamps, app context, and surrounding text.

Try

"search for mentions of JWT migration"

recent_activity

Returns a feed of recent voice activity — dictations, memos, meetings — ordered by time. Configurable limit and type filters.

Try

"what have I been working on today?"

Ambient Context

Understand your work environment

ambient_timeline

Returns the ambient context timeline — a chronological record of app switches, window titles, URLs, and dwell time throughout the day.

Try

"what was I doing between 2pm and 4pm?"

ambient_app_usage

Aggregated app usage statistics from ambient context. Shows which apps you used, for how long, and when — grouped by time window.

Try

"how much time did I spend in Slack today?"

get_context

Returns the current ambient context snapshot — active app, window title, URL, selected text, and visible screen content.

Try

"what am I looking at right now?"

Workspace Data

Access specific items and metadata

get_journal

Retrieves the auto-generated daily journal for a specific date. Includes timeline, highlights, and task blocks grouped by app.

Try

"show me my journal from yesterday"

get_dictation

Retrieves a specific dictation by ID with full metadata — raw transcript, cleaned text, app context, window title, URL, and duration.

Try

"get the details of that dictation I did in VS Code"

get_meeting

Retrieves a full meeting transcript with speaker labels, timestamps, duration, and the detected meeting app (Zoom, Meet, Teams, etc.).

Try

"pull the transcript from my 10am meeting"

get_memo

Retrieves a voice memo with its auto-generated title, AI summary, transcript, and duration.

Try

"find that memo I recorded about the API redesign"

get_stats

Returns voice workspace statistics — total words, dictation count, meeting hours, streak data, and progression level.

Try

"how many words have I dictated this week?"

get_config

Returns the current Resonant configuration — active model, language, trigger key, cloud cleanup status, and enabled features.

Try

"what model is Resonant using?"

Compatibility

Works with your AI tools.

Claude Code

Auto-discovers

Cursor

Auto-discovers

VS Code

Manual config

Windsurf

Manual config

Any MCP-compatible client works. Resonant uses the standard stdio transport.

Setup

Three steps. Zero config for most tools.

01

Install Resonant

Download and run Resonant on your Mac. The MCP server starts automatically in the background.

02

Open your AI tool

Claude Code and Cursor auto-discover the MCP server. For VS Code and others, add the server config to your MCP settings.
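For clients that need manual registration, the entry typically looks like the sketch below. The `mcpServers` key is the convention several MCP clients use, but the exact key and file location vary by client, and the `resonant-mcp` command name here is hypothetical — check Resonant's settings for the actual server command.

```json
{
  "mcpServers": {
    "resonant": {
      "command": "resonant-mcp",
      "args": []
    }
  }
}
```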

03

Ask a question

"What did I discuss in my last meeting?" "What was I working on yesterday?" Your AI tool queries Resonant and gets real answers.

Privacy

All data stays on
your Mac.

The MCP server runs locally inside Resonant. Your AI tool communicates with it over local stdio — no network requests, no cloud relay, no API keys. The voice data that feeds these tools never leaves your machine.

When Claude or Cursor queries your meeting transcript, the query and the response both stay on your device. The only thing that reaches the AI provider is the text your tool decides to include in its prompt — the same text you'd paste manually.

MCP doesn't create a new privacy surface. It automates the copy-paste you were already doing.


Free. Local. Always.

Give your AI tools
the memory they lack.

Everything you say, structured and searchable. No subscription. No cloud. Just your voice, your Mac, and your AI assistant.

Requires macOS 14+ · Apple Silicon