The way I write code has changed. These days I spend more time talking to AI agents than typing code myself. Cursor, Windsurf, Claude Code, and similar tools do the actual implementation while I describe what I want.
Voice input fits this workflow surprisingly well.
Typing speed used to matter. Now it's about how fast you can explain what you want. I talk at 150 words per minute. I type at maybe 60. That's a 2.5x gap on every prompt.
## Why Voice Works Here

### Speed
I can describe a feature in 30 seconds by voice. Typing the same prompt takes two minutes. When you're doing this dozens of times a day, it adds up.
### Better Prompts
When I type, I write terse prompts. When I talk, I naturally give more context. The AI produces better code on the first try.
### Less Fatigue
Long prompting sessions wear you out. Talking is easy. Typing the same level of detail gets exhausting.
### Code Review
Reviewing code is conversational. Saying "this looks good, but make the error handling more specific" feels natural.
## The Voice + Agent Workflow

### 1. Initial Direction
“Create a REST API for managing a todo list. Use Express with TypeScript. Include endpoints for CRUD operations, input validation with Zod, error handling middleware, and Jest tests for each endpoint.”
[AI agent plans and executes the multi-file implementation]
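To make this concrete, here's a minimal sketch of the kind of code a prompt like that tends to produce, assuming Express, Zod, and an in-memory store; the schema and route names are illustrative, not an agent's actual output:

```typescript
// Hypothetical excerpt of the generated todo API (illustrative names throughout).
import express from "express";
import { z } from "zod";

const app = express();
app.use(express.json());

// Zod schema validates request bodies before they reach the handler.
const TodoSchema = z.object({
  title: z.string().min(1),
  completed: z.boolean().default(false),
});

type Todo = z.infer<typeof TodoSchema> & { id: number };
const todos: Todo[] = [];
let nextId = 1;

app.post("/todos", (req, res) => {
  const parsed = TodoSchema.safeParse(req.body);
  if (!parsed.success) {
    return res.status(400).json({ error: parsed.error.flatten() });
  }
  const todo: Todo = { id: nextId++, ...parsed.data };
  todos.push(todo);
  res.status(201).json(todo);
});

app.get("/todos", (_req, res) => res.json(todos));

app.listen(3000);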
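```

A real run would also cover the remaining CRUD endpoints, the middleware, and the Jest tests. The point is how much of that spec fits in one spoken sentence.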
### 2. Course Correction

“The error handling middleware should return structured JSON responses with an error code, message, and optional details field. Don't use the generic Express error handler.”
[Agent adjusts its approach based on your feedback]
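As a sketch of what that correction asks for, on the same Express/TypeScript stack; `ApiError` and the error code strings are invented for illustration:

```typescript
// Structured error middleware: { code, message, details? } instead of
// the generic Express error handler.
import { Request, Response, NextFunction } from "express";

class ApiError extends Error {
  constructor(
    public status: number,
    public code: string,
    message: string,
    public details?: unknown,
  ) {
    super(message);
  }
}

// Express identifies error middleware by its four-argument signature.
function errorHandler(err: Error, _req: Request, res: Response, _next: NextFunction) {
  if (err instanceof ApiError) {
    return res.status(err.status).json({
      code: err.code,
      message: err.message,
      // Only include details when the error actually carries them.
      ...(err.details !== undefined && { details: err.details }),
    });
  }
  // Fall back to a structured 500 for unexpected errors.
  res.status(500).json({ code: "INTERNAL_ERROR", message: "Unexpected error" });
}
```

Express treats any four-argument middleware as an error handler, so this gets registered with `app.use(errorHandler)` after the routes.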
### 3. Code Review by Voice

“This looks good, but the database connection should use a connection pool instead of creating a new connection per request. Also, add a graceful shutdown handler.”
[Agent modifies the implementation accordingly]
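A minimal sketch of that change, assuming the `pg` driver; the `DATABASE_URL` variable and `todos` table are illustrative:

```typescript
// One shared pool instead of a new connection per request.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Handlers borrow a connection from the pool for each query.
export async function listTodos() {
  const { rows } = await pool.query("SELECT * FROM todos ORDER BY id");
  return rows;
}

// Graceful shutdown: drain the pool before the process exits.
process.on("SIGTERM", async () => {
  await pool.end();
  process.exit(0);
});
```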
### 4. Approval and Commit

“This implementation looks correct. Commit it with the message: Add todo API with CRUD endpoints and validation.”
[Agent commits with the specified message]
## Setting Up for Agentic Voice Coding

### Install Whisperer
Get it from the Mac App Store and set up Code Mode for your IDE.
### Set Up Per-App Profiles
Use natural language mode for agent prompts and code mode for direct code editing.
### Use a Good Microphone
Clarity matters when giving complex instructions to AI agents.
### Enable Streaming Preview
This lets you verify your instructions as you speak them, before sending.
## Tools That Work Well with Voice
| Tool | Voice Use Case |
|---|---|
| Cursor | Speak Cmd+K prompts, chat with AI, describe refactors |
| Claude Code | Voice-driven terminal sessions, speak complex instructions |
| Windsurf | Cascade prompts via voice, multi-file edits |
| GitHub Copilot Chat | Explain code, ask questions, request changes |
| VS Code | Direct code dictation with Code Mode |
## Where This Is Going
More coding happens through conversation with AI, less through direct typing. Voice fits that shift. I'm not saying everyone needs to dictate their code. But if you're spending hours a day prompting AI tools, it's worth trying.
Related: Voice Dictation for Vibe Coding, Voice Coding Guide, Code Mode, Developer Productivity. See pricing.
Ready to try voice dictation on your Mac?
Free download. No account required. 100% offline.
Download on the Mac App Store