When we listen to spoken words, the sound enters our ears and is converted into electrical signals. Those signals then travel through the brainstem and into the auditory processing regions of the brain. Christina Zhao at I-LABS and colleagues at UC Berkeley and Johns Hopkins traced that path using EEG electrodes placed on listeners' scalps. Participants listened to 3,000 repetitions of a single English speech sound, and the brain waves in the resulting EEG recordings closely followed the actual acoustics of that sound.
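In EEG terms, "closely followed" is usually quantified by averaging over the thousands of repetitions and then correlating the averaged response with the sound itself. Here is a minimal sketch of that kind of analysis on synthetic data; the sampling rates, epoch counts, and the choice of the amplitude envelope as the acoustic reference are illustrative assumptions, not details reported by the study.

```python
import numpy as np
from scipy.signal import hilbert, resample

# Illustrative assumptions (not details from the study): 3,000 EEG epochs,
# each 0.5 s at 1,000 Hz, time-locked to the repeated speech sound.
fs_eeg = 1_000
n_trials, n_samples = 3_000, 500
rng = np.random.default_rng(0)
epochs = rng.normal(size=(n_trials, n_samples))  # stand-in for real EEG data

# Averaging across trials suppresses brain activity that is not time-locked
# to the sound, leaving the evoked response that tracks the stimulus.
evoked = epochs.mean(axis=0)

# Stand-in for the stimulus waveform (same 0.5 s duration, audio sample rate).
fs_audio = 16_000
stimulus = rng.normal(size=int(0.5 * fs_audio))

# One common acoustic reference is the sound's amplitude envelope,
# resampled to the EEG rate so the two signals align sample by sample.
envelope = resample(np.abs(hilbert(stimulus)), n_samples)

# How closely the brain wave "follows" the acoustics, as a correlation;
# with real recordings, a high r indicates faithful neural tracking.
r = np.corrcoef(evoked, envelope)[0, 1]
print(f"evoked-response/envelope correlation: r = {r:.3f}")
```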
The researchers then fed the same recording of the English sound through an unsupervised neural network, an AI system that had been trained to learn English.
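The article does not name the network, so purely as an illustration: wav2vec 2.0, a self-supervised speech model available through torchaudio, is a well-known example of this kind of system, and passing a recording through one looks roughly like this (the file name is a hypothetical placeholder).

```python
import torch
import torchaudio

# wav2vec 2.0 is one widely used self-supervised (unsupervised) speech model;
# the study's actual network is an assumption here, chosen for illustration.
bundle = torchaudio.pipelines.WAV2VEC2_BASE
model = bundle.get_model()

waveform, sr = torchaudio.load("english_speech_sound.wav")  # hypothetical file
if sr != bundle.sample_rate:
    waveform = torchaudio.functional.resample(waveform, sr, bundle.sample_rate)

# Run the recording through the network and keep its internal activations,
# one tensor per layer with shape (batch, time frames, hidden units).
with torch.inference_mode():
    features, _ = model.extract_features(waveform)
print(f"{len(features)} layers, e.g. layer 1 activations: {tuple(features[0].shape)}")
```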
Comparing the brain activity with the AI system's responses showed that AI systems can process speech signals in a way that is remarkably similar to how the brain interprets speech.
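The article does not spell out how the two signals were compared. One standard approach is to reduce the network's activations to a time course, resample it to the EEG timescale, and correlate it with the averaged brain response; the sketch below assumes that method and uses synthetic placeholders for both signals.

```python
import numpy as np
from scipy.signal import resample

rng = np.random.default_rng(1)

# Placeholder for the trial-averaged EEG response (500 samples at 1,000 Hz).
evoked = rng.normal(size=500)

# Placeholder for one layer of network activations over the same 0.5 s sound:
# self-supervised speech models typically emit a frame every 10-20 ms.
n_frames, n_units = 25, 768
activations = rng.normal(size=(n_frames, n_units))

# Collapse the hidden units to a single time course (here, their mean),
# then resample it to the EEG timescale so the two signals align.
model_trace = resample(activations.mean(axis=1), evoked.size)

# Brain-model similarity as a correlation; with real data, a high value
# would mean the network's response rises and falls with the brain's.
r = np.corrcoef(evoked, model_trace)[0, 1]
print(f"brain/model similarity: r = {r:.3f}")
```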
In cognitive science, one of the primary goals is to build mathematical models that resemble human cognition as closely as possible. The newly documented similarities between the brain waves and the AI system's signals offer a benchmark for how close researchers are to meeting that goal.