Have you also been hearing about Throw Trucks With Your Mind? It’s a Kickstarter project for a game that, according to the LA Times report on its developer and a couple of testers, allows gamers to forgo the keyboard and joystick by — literally! — plugging their minds into the game via a headset that measures brainwaves.
Baby, we’ve come a long, long way.
No surprise, the science and technology that make this possible are also affecting the language industry. Synapses, wavelengths, magnetic fields, electrical signals: how do these rapid brain firings become words? A host of new scientific research and devices may finally be fleshing that out. Developments around the globe are getting to the bottom of how the brain processes language, and the findings may lead to myriad new applications that directly benefit from this knowledge.
From Nonoichi to Abu Dhabi
Scientists and engineers at Japan’s Kanazawa Institute of Technology recently built a rare brain scanner for a neuroscience language laboratory in Abu Dhabi, according to a report from Gulf News. The scanner, known as a magnetoencephalography (MEG) machine, was custom-made for the lab, which is run in partnership with New York University. According to the report, “the MEG system is one of the first in the Gulf and is among a few in the world.” Researchers there hope to gain “scientific knowledge on how humans neurologically process language — from the basics of constructing words from their root forms to the more complex task of composing sentences.”
This singular machine “will be able to analyse language processes in the brain faster and more efficiently than current neuroscience technology,” according to an article posted by The National. The MEG machine allows scientists “to measure extremely small magnetic fields generated by the electric activity in the brain,” which correlate with language and cognitive abilities.
The University of Washington is also making important strides toward understanding language development with the MEG machine. The Institute for Learning & Brain Sciences asserts that the scanner “is a completely safe and quiet brain-imaging device that is appropriate for the youngest children.” Researchers there are trying to unravel why children learn new languages more quickly than adults, work that may “lead to second language learning programs that work for people of all ages.”
It’s Not Just the MEG
Studying electrical brain activity has also led to other language-related advances. Researchers at the University of California, San Francisco report that their neurological inquiries have “implications for developing computer-brain interfaces for artificial speech communication and for the treatment of speech disorders.” Their research into speech and language development centers on the tiny brain regions that control our lips, jaw, tongue and larynx. By observing the electrical activity in this part of the brain, researchers have already found universal patterns that “linguists have observed in languages around the world.”
Neuroscientists at the University of California at Berkeley report that they are also up to their ears in brain imaging and making strides toward understanding “the imagined speech of a patient unable to speak due to stroke or paralysis.”
This is remarkable. To see what impact such technology could have, I encourage you to read The Man Who Lost His Language, which tells the story of the much-admired late Sir John Hale, a British Renaissance historian, translator, editor, and university professor, after a stroke left him with aphasia, the loss of language. The book traces his gradual recovery over the seven years after the stroke: his writing abilities partially returned, his spoken repertoire extended only marginally, and his brilliant intellect remained unaffected by the disability throughout.
Understanding how the brain processes language has implications beyond the neuroscience lab. These developments and others will almost certainly affect businesses, not least in the healthcare sector. For instance, the University of Houston reported that a device developed by its students provides a translating platform that turns sign language gestures into audible words. As more research sheds light on the brain’s ability to process language, it is reasonable to expect more devices designed to help people with language-related impairments, and to imagine considerable demand for them.
Look again to the example of the gaming industry. According to Science Daily, a recent study published in the journal Psychophysiology demonstrated that electroencephalography (EEG) readings can predict a person’s video game aptitude. Researchers asserted that “by measuring your brain waves the very first time you play the game, we can predict how fast you’ll learn over the next month.” This gaming aptitude depends on the brain’s “alpha waves” and is associated with an area of the brain involved in “decision-making, attention and self-control.” Developers may tap into this valuable data as they produce new gaming devices and games.
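To make the alpha-wave idea concrete: what such studies typically quantify is band power, the energy an EEG signal carries within a frequency range (alpha is roughly 8–12 Hz). The sketch below is a toy illustration with a synthetic signal, not real EEG analysis; the sampling rate, signal composition, and band limits are illustrative assumptions rather than the study’s actual setup.

```python
import math
import cmath

def band_power(signal, fs, lo, hi):
    """Sum the power of the DFT bins whose frequency falls in [lo, hi] Hz."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2):
        freq = k * fs / n  # frequency of bin k
        if lo <= freq <= hi:
            # Naive DFT coefficient for bin k
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2 / n ** 2
    return power

fs = 128                                  # sampling rate in Hz (assumed)
t = [i / fs for i in range(fs * 2)]       # two seconds of samples
alpha = [math.sin(2 * math.pi * 10 * x) for x in t]        # strong 10 Hz "alpha" rhythm
beta = [0.3 * math.sin(2 * math.pi * 20 * x) for x in t]   # weaker 20 Hz "beta" rhythm
signal = [a + b for a, b in zip(alpha, beta)]

# The alpha band (8-12 Hz) carries more power than the beta band (13-30 Hz)
print(band_power(signal, fs, 8, 12) > band_power(signal, fs, 13, 30))  # prints True
```

In practice researchers use fast Fourier transforms over many electrodes and trials rather than this naive DFT, but the underlying measurement is the same: how much of the signal’s energy sits in the alpha range.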
While considerable research already surrounds language and how we process it, these brain-imaging techniques are making groundbreaking progress. If machines can be built to decipher our imagined speech from brain waves, or to translate our brain’s electrical firings into words, the implications will be enormous for science, healthcare, and even education. More than that: wouldn’t the sales and marketing departments of every known industry just love to get a read on our brain waves, too?