- DiggerInsights
Neuralink Just Gave a Man His Voice Back
A brain chip restored speech to an ALS patient who hadn't spoken in years. Here's what happened, how it works, and what it means.

A brain chip just gave an ALS patient his voice back. Not his typing. Not his cursor control. His actual voice, reconstructed from thought alone.
Elon Musk posted the results on X yesterday and 7 million people watched. But the real story isn't the viral clip. It's what the technology implies for the 30,000+ people worldwide who lose their voices to ALS every year.
What Actually Happened
Brad Smith, an Arizona father of three with late-stage ALS, became the first nonverbal person to receive Neuralink's brain implant. He can't move his limbs. He can't speak. Before the implant, he communicated using an eye-tracking device that only worked in dark rooms.
His own words: "I was basically Batman. I was stuck in a dark room."
Now Smith types on a laptop by thinking about moving his tongue and jaw. The implant reads those motor signals and translates them into cursor movements. Here's the part that hits hardest: when he types, the system speaks in his own voice, cloned by AI from recordings made before he lost the ability to talk.
He attended his kid's soccer game. Gave a talk at his church. Left the house for something other than a medical appointment for the first time in five years.
"Neuralink has given me freedom, hope, and faster communication," Smith said in a video he created and edited himself using the device.
How It Actually Works
Think of your brain like a keyboard that's still plugged in, even when the monitor is off. ALS destroys the connection between your brain and your muscles. But the brain is still firing signals, still "pressing keys." It just has nowhere to send them.
Neuralink's implant sits in the motor cortex, the part of your brain that plans movement. When Smith thinks about moving his tongue to form a word, the chip picks up those electrical patterns. Machine learning models decode them into phonemes and stitch them into sentences.
Right now it works through typing: think, move cursor, select letters. But the VOICE trial is pushing toward direct speech decoding at 120 to 150 words per minute. That's roughly how fast you're reading this sentence out loud.
Why This Is Different
Most BCI headlines over the past two years have been about cursor control. Moving a mouse with your mind. Playing chess. Browsing Reddit. Impressive, sure. But fundamentally, it's still a pointing device.
Speech restoration changes the category entirely. You're not controlling a tool. You're recovering a basic human function. The difference between typing a sentence and speaking it out loud is the difference between texting someone "I love you" and saying it to their face.
Stanford researchers published work in Cell showing they can now decode "inner speech" from the motor cortex. Not attempted speech, not mouthing words. Just thinking them. Neuralink's VOICE trial builds on the same principle, with a wireless implant small enough to be invisible under the scalp.
The Catch
Let's be honest about what we don't know yet.
Neuralink has roughly 20 participants across all its clinical trials. The VOICE trial for speech restoration is even smaller. Brad Smith's results are promising, but this is still early. One patient is an anecdote, not a dataset.
The FDA granted Neuralink Breakthrough Device Designation for speech restoration, which accelerates regulatory review. But "accelerated" in FDA terms still means years, not months. Full commercial approval for a brain implant is a long road with a lot of safety data still required.
Smith himself noted that the system needs continuous calibration as his brain activity changes over time. It's not plug-and-play. And it required surgery to implant a chip in his skull. The risk calculus is very different for someone with late-stage ALS versus someone with mild speech difficulties.
Signal
Valuation: Neuralink is now valued at roughly $8.6 billion after raising $1.29B total, with a $650M Series E in June 2025. Some secondary market estimates push it toward $14.9B.
Competitors are moving fast. Paradromics just got cleared to start its own speech trial. Synchron has a less invasive approach (no open brain surgery) and is already in human trials. Blackrock Neurotech has been in the BCI space longer than Neuralink. And Sam Altman is co-founding Merge Labs, a new BCI startup that hasn't said much yet but has deep pockets.
What to watch: the race isn't just to decode speech. It's to decode speech wirelessly, reliably, and at scale. Whoever cracks the manufacturing and regulatory path for a consumer-grade speech BCI will own a market that touches ALS, stroke, traumatic brain injury, and cerebral palsy patients worldwide.
ALS alone affects roughly 10.55 per 100,000 people globally, and that number is projected to climb to 12.5 per 100,000 by 2040. About 60% of patients with bulbar-onset ALS lose the ability to speak within 18 months of their first symptoms.
The Close
A man thought about words and his old voice said them out loud. That sentence would have been science fiction three years ago. It's still early, still small, still messy. But the signal is loud enough to hear.
💼 Jobs
Neuralink — Neural Signal Processing Engineer (Fremont, CA): Build the algorithms that turn brain signals into speech. Yes, that's the actual job. neuralink.com/careers
Synchron — Clinical Trial Manager (New York, NY): Run the human trials for a BCI that goes through a blood vessel instead of a hole in the skull. synchron.com/careers
Paradromics — Hardware Design Engineer (Austin, TX): Design the physical chip that sits on someone's brain and doesn't fail for a decade. paradromics.com/careers
Blackrock Neurotech — ML Research Scientist (Salt Lake City, UT): Decode neural signals into usable data for patients with paralysis. blackrockneurotech.com/careers
Dug up with a headlamp and genuine alarm ⛏️