Resonant for Research
Literature notes during reading. Experiment logs with gloves on. Analysis narration while the data is on screen. Your best research thinking happens faster than you can type. Resonant captures it.
Press a hotkey. Speak the insight. Structured text appears wherever your cursor is. On-device. Private. As local as your data.
The problem
You notice a pattern in the data while the plot is still rendering. A methodological concern surfaces mid-paragraph in a paper. The connection between two findings clicks during a seminar. These are the moments that advance your research.
By the time you sit down to write, the nuance is gone. The specific phrasing that captured the idea has been replaced by a vaguer reconstruction. The energy of the original thinking has cooled into summary.
Research is thinking made visible. Voice captures the thinking at its source — before it fades, before it flattens, before it becomes “I had a thought about this but I can't remember the details.”
Where it fits
React to the paper while you read it. The argument, the gap, the connection to your own work. Spoken while the thinking is fresh, structured on arrival.
Hands are gloved. Equipment is running. Dictate the observation, the deviation from protocol, the unexpected result. Timestamped and clean.
Talk through what the data is showing you. The outlier you noticed, the pattern forming, the hypothesis shifting. Capture the reasoning, not just the result.
The specific aims section you rewrote in your head on the drive home. The significance framing that clicked during a seminar. Speak the draft before it dissolves.
Read the manuscript and respond aloud. Methodological concerns, missing references, the question the authors should have asked. Fast, thorough, structured.
Rehearse the narrative arc. Dictate speaker notes for each slide. Capture the phrasing that works and discard what doesn't. Iterate out loud.
Voice memos
Record a voice memo from anywhere on your Mac. Resonant transcribes locally and structures the output. The hypothesis you captured in the lab. The observation from the field site. The critique that formed while reading.
“Sample B is showing fluorescence at 420nm which is 30nm off from what the model predicted. Could be contamination from the buffer prep or the excitation wavelength needs recalibrating. Running the blank now. If the blank is clean I think we have a genuine spectral shift which would support the conformational change hypothesis from the Chen paper.”
“Site 4 transect, second pass. Vegetation density is notably higher on the north-facing slope compared to last season. Three new saplings in the clearcut zone. Soil moisture feels higher too, probably from the late spring melt. Need to cross-reference with the weather station data when I get back to the lab.”
“This meta-analysis claims the effect size is 0.3 but they included the two studies from 2019 that used the old assay protocol. If you drop those the effect probably shrinks to nonsignificant. This is exactly the methodological concern we raised in our response letter. I should cite this in the discussion section but note the limitation.”
Ambient context
When you dictate, Resonant records which papers were open, which tools were running, which datasets were on screen. Not the content — just the context. The metadata that makes your notes findable months later.
Your AI tools can query this context via MCP. Ask Claude “what papers was I reading when I had that idea about the binding assay?” and get an answer.
Context that would take 10 minutes to reconstruct from memory is captured automatically, locally, in the background.
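As a sketch of what the MCP integration could look like: if Resonant exposes its context store as a local MCP server, registering it with Claude Desktop would follow the standard `claude_desktop_config.json` shape. The server name and binary path below are illustrative assumptions, not documented Resonant details.

```json
{
  "mcpServers": {
    "resonant-context": {
      "command": "/Applications/Resonant.app/Contents/MacOS/resonant-mcp",
      "args": []
    }
  }
}
```

With an entry like this in place, Claude can call the server's tools to answer questions such as "what papers was I reading when I recorded that memo?" without the underlying notes ever leaving the machine.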
Auto journal
Every evening, Resonant generates a structured journal entry from your dictations and ambient context. What you worked on, what you observed, what you said about it.
No more blank lab notebook pages. No more reconstructing what you did from git commits and browser history. The research log writes itself from the record of your actual work.
Searchable. Queryable by AI. The kind of documentation that makes reproducibility possible and progress reviews painless.
Morning — Literature review
Reviewed Chen et al. 2024 on conformational dynamics. Noted potential conflict with our spectral data. Recorded memo outlining the spectral shift hypothesis as an alternative explanation.
Midday — Data analysis
Ran spectral comparison in RStudio. Sample B showing 420nm fluorescence (30nm off prediction). Blank was clean, supporting genuine spectral shift. Dictated preliminary interpretation.
Afternoon — Writing
Drafted results paragraph in Overleaf. Dictated discussion section connecting spectral findings to Chen framework. Identified two additional citations needed.
Privacy
Unpublished findings. Preliminary results. Grant ideas before submission. Research data demands a higher standard of privacy than most work. Resonant meets it by keeping everything local.
Audio is processed on-device using the Apple Neural Engine. No cloud transcription. No audio uploads. No third-party access. Works offline, in the field, in secure facilities, on air-gapped networks.
A tool that never sends data off the machine keeps the conversation with your IRB short.
Free. Local. Private.
Capture the thinking at its source. Literature notes, experiment logs, analysis narration, grant drafts — all by voice, all on your Mac.
Requires macOS 14+ · Apple Silicon