Some neural networks appear to learn language much as humans do, according to a study by computational linguist Gašper Beguš of the University of California, Berkeley, Quanta Magazine reports.
Together with his colleagues, he compared the brain waves of people listening to a simple sound with the signal produced by a neural network analyzing the same sound. The two signals were uncannily alike. Crucially, the researchers tested networks built from general-purpose neurons suitable for a variety of tasks, not architectures specialized for speech.
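The comparison described above can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the study's actual data or pipeline: both "signals" are simulated as noisy copies of a shared waveform, and Pearson correlation stands in for whatever similarity measure the researchers actually used.

```python
import numpy as np

# Hypothetical illustration: compare a simulated brain-wave trace with a
# simulated network-layer activation recorded for the same sound.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.5, 500)                        # 500 ms at 1 kHz sampling
shared = np.sin(2 * np.pi * 8 * t) * np.exp(-4 * t)   # a damped oscillation

# Both traces share the underlying waveform plus independent noise.
brain_wave = shared + 0.3 * rng.standard_normal(t.size)  # stand-in for an EEG trace
net_signal = shared + 0.3 * rng.standard_normal(t.size)  # stand-in for a layer activation

# Pearson correlation as a simple measure of how alike the two traces are.
r = np.corrcoef(brain_wave, net_signal)[0, 1]
print(f"correlation: {r:.2f}")
```

Because the two traces share most of their variance, the correlation comes out well above zero; with unrelated signals it would hover near zero.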
“They show that even very, very general networks, which don’t have any evolved biases for speech or any other sounds, nevertheless show a correspondence to human neural coding,” emphasized psychologist Gary Lupyan of the University of Wisconsin–Madison, who was not involved in the work.
The researchers also discovered another intriguing parallel between people and machines: both “heard” the same sounds differently depending on whether they had been exposed to English or to Spanish.
The results not only help solve the mystery of how artificial neural networks (ANNs) learn, but also suggest that the human brain may not come equipped with dedicated “hardware and software” for language.
“The paper definitely provides evidence against the notion that speech requires special built-in machinery and other distinctive features,” said linguist Vsevolod Kapatsinski of the University of Oregon.
In the 1950s, linguist Noam Chomsky proposed that people are born with an innate, uniquely human capacity to understand language, an ability he argued is literally built into the human brain.