
Featured

Richard Hammond explains what he experienced during his coma | 310mph Crash | Insight into non-local consciousness

Richard Hammond, a presenter on the popular car show "Top Gear," was involved in a serious car crash while filming in 2006 and spent weeks in a coma following the accident. He has since recovered and continues to work as a television presenter and journalist. In the video below, he discusses his experience of non-local consciousness during the coma: while his doctors were predicting a poor outcome and saying his case was hopeless, his wife kept the faith. The video is short, and it's a great story.

Breakthrough Brain Implant Decodes Inner Speech
Stanford Team Unveils Device to Translate Inner Monologue for Paralyzed Patients

Researchers at Stanford University have introduced a pioneering brain implant that can decode and vocalize a person’s inner speech—the words silently imagined in their mind. This innovation promises life-changing communication for individuals with paralysis so severe that they cannot even attempt to speak aloud.

Technology: From Movement Signals to Silent Thought

Until now, brain-computer interface (BCI) technologies helped speech-disabled individuals by analyzing signals from the motor cortex as they attempted physical speech movements, even if no sound was produced. The Stanford team, however, bypassed the need for any attempted physical speech. The implant records neural activity in real time as users either try to speak or just imagine saying specific words.

Lead researcher Erin Kunz said, "This is the first time we've managed to understand what brain activity looks like when you just think about speaking. For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally." [1]

Study Participants: Real-World Impact

Four severely paralyzed participants—living with conditions such as ALS or suffering from brainstem strokes—were recruited. For one, communication was only possible through limited eye movements. After receiving electrode arrays as part of the BrainGate consortium, they were prompted to either try speaking or simply imagine a set of words.

AI models were trained to recognize the neural patterns of individual phonemes and assemble them into sentences. The system achieved up to 74% real-time accuracy in decoding silent speech—a notable achievement for a vocabulary of up to 125,000 words. [4]
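The pipeline described above—classify neural activity into phonemes, then assemble phonemes into words—can be sketched in miniature. This is a hypothetical illustration, not the Stanford team's actual model: the `classify_window` stand-in, the tiny lexicon, and the greedy matcher are all invented for clarity, and the real system uses trained neural networks and a 125,000-word vocabulary.

```python
# Hypothetical sketch of the phoneme-decoding pipeline: a classifier maps
# windows of neural features to phonemes, and a (here trivial) language
# model assembles the phoneme stream into words.

PHONEMES = ["HH", "EH", "L", "OW", "W", "ER", "D"]  # illustrative ARPAbet-style labels

def classify_window(window_index):
    """Stand-in for a neural-network phoneme classifier: in this toy,
    each 'window' is already the index of its most likely phoneme."""
    return PHONEMES[window_index]

# Toy lexicon mapping phoneme sequences to words (invented for this sketch).
LEXICON = {
    ("HH", "EH", "L", "OW"): "hello",
    ("W", "ER", "L", "D"): "world",
}

def assemble(phoneme_stream):
    """Greedily match the incoming phoneme stream against the lexicon."""
    words, buf = [], []
    for p in phoneme_stream:
        buf.append(p)
        if tuple(buf) in LEXICON:
            words.append(LEXICON[tuple(buf)])
            buf = []
    return " ".join(words)

windows = [0, 1, 2, 3, 4, 5, 2, 6]  # indices standing in for neural-feature windows
phonemes = [classify_window(w) for w in windows]
print(assemble(phonemes))  # -> hello world
```

In the real system, the classifier stage is a deep network trained on each participant's recorded neural activity, and the assembly stage is a statistical language model that scores candidate sentences rather than a fixed lookup table.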

Frank Willett, an assistant professor of neurosurgery at Stanford, explained: "For people with paralysis, attempting to speak can be slow and fatiguing. Decoding inner speech could make communication quicker, more comfortable, and even more natural." Participants confirmed that using inner speech demanded less effort than attempting physical speech, especially when breath control or muscle strength was compromised. [3]

Privacy and Password Protection

Interestingly, researchers found the BCI could sometimes pick up inner speech that participants did not intend to share—such as numbers counted silently during a task. In response, the Stanford group added a privacy safeguard: a "password" that users imagine in their mind to unlock the BCI for decoding. In the study, the phrase "chitty chitty bang bang" blocked unintended decoding with 98% reliability. [2]
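The imagined-password safeguard amounts to gating the decoder's output behind an unlock phrase. A minimal sketch, assuming a simple exact-match rule (the real system matches the imagined phrase probabilistically from neural data; the class and method names here are invented):

```python
# Minimal sketch of the imagined-password safeguard: decoded text is
# suppressed until the user has "imagined" the unlock phrase.

PASSWORD = "chitty chitty bang bang"  # the unlock phrase used in the study

class GatedDecoder:
    def __init__(self, password=PASSWORD):
        self.password = password
        self.unlocked = False

    def feed(self, decoded_phrase):
        """Return decoded text only after the password has been detected;
        before that, everything stays private (returns None)."""
        if not self.unlocked:
            if decoded_phrase == self.password:
                self.unlocked = True
            return None  # inner speech stays private
        return decoded_phrase

bci = GatedDecoder()
print(bci.feed("one two three"))            # None: still locked
print(bci.feed("chitty chitty bang bang"))  # None: this is the unlock event
print(bci.feed("hello there"))              # hello there
```

The design choice mirrors the study's finding: unintended inner speech (like silent counting) never leaves the device unless the user has deliberately opted in by imagining the password first.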

Industry Context: Private Investment and Future Challenges

The BCI field is rapidly expanding, both academically and privately. Investment is expected to rise as Sam Altman, CEO of OpenAI, backs the Merge Labs neurotech venture to compete directly with Neuralink, Elon Musk's BCI company. These efforts are reshaping the sector, bringing new attention—and ethical concerns—to neural data privacy and mental sovereignty. [5]

Looking Forward

While the implant presently cannot decode free-form internal monologues with perfect accuracy, ongoing advances in hardware and machine learning promise even better results. Researchers envision a future in which people with profound paralysis can converse fluently and comfortably using only their thoughts.

As Dr. Willett concluded: "This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech." [1]

  1. https://www.eurekalert.org/news-releases/1093888
  2. https://neurosciencenews.com/bci-inner-speech-decoding-29574/
  3. https://www.statnews.com/2025/08/14/can-a-computer-turn-our-internal-monologue-into-speech/
  4. https://www.insideprecisionmedicine.com/topics/informatics/decoding-inner-speech-in-real-time-with-ai-and-brain-computer-interfaces/
  5. https://www.ainvest.com/news/frontier-human-machine-symbiosis-openai-bci-venture-reshaping-ai-hardware-investment-2508/
  6. https://futurism.com/openai-sam-altman-neuralink-competitor
  7. https://coincentral.com/sam-altman-confirms-backing-brain-computer-interface-startup-rivaling-neuralink/
  8. https://www.euronews.com/next/2025/08/15/a-brain-computer-chip-can-read-peoples-minds-with-up-to-74-accuracy
  9. https://www.nytimes.com/2025/08/14/science/brain-neuroscience-computers-speech.html
  10. https://www.scientificamerican.com/article/new-brain-device-is-first-to-read-out-inner-speech/
  11. https://www.science.org/content/article/brain-device-reads-inner-thoughts-aloud-inspires-strategies-protect-mental-privacy