The Prospects of Interfacing Computers with the Human Brain
Linking computers and the human brain has long been considered the realm of science fiction writers. Since the 1970s, however, Brain Computer Interfaces (BCIs) have been a major focus of the Defense Advanced Research Projects Agency (DARPA) and, subsequently, of other research facilities, which now run a variety of ongoing projects aimed at addressing physiological issues affecting humans. Examples from DARPA’s research include facilitating the retrieval of existing memories in patients with dementia and delivering naturalistic sensations to amputees.
These tasks involve interfacing the brain with an electronic device that is either implanted (invasive) or placed externally (non-invasive). The cochlear implant, a surgically implanted electronic device, is an example of an invasive BCI that provides a sense of sound to a person who is profoundly deaf. It does this by delivering sound signals to the brain, replacing the work of damaged parts of the inner ear.
Researchers at the Wyss Centre for Bio and Neuroengineering in Geneva have recently found a way to decipher the thoughts of patients suffering from Locked-In Syndrome, a condition in which a patient is aware but cannot move or communicate verbally due to total motor paralysis.
By measuring blood oxygenation and the electrical activity of the brain, a computer can monitor blood-flow changes in response to yes-or-no questions posed by the researchers. After feeding the computer enough data, the researchers trained it to recognise the patterns associated with each answer and to classify new responses accurately.
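The pattern-recognition step described above can be sketched as a simple supervised classifier. The following is only an illustrative toy, not the Wyss Centre's actual method: the feature values, the two-number feature vectors and the nearest-centroid rule are all assumptions chosen for brevity.

```python
# Illustrative sketch: classify "yes"/"no" answers from invented
# blood-oxygenation features with a nearest-centroid rule. Real
# fNIRS/EEG decoding pipelines are far more involved.

def centroid(samples):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(vals) / n for vals in zip(*samples)]

def train(labelled):
    """labelled: dict mapping label -> list of feature vectors."""
    return {label: centroid(vecs) for label, vecs in labelled.items()}

def classify(model, features):
    """Return the label whose centroid is closest (Euclidean distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c)) ** 0.5
    return min(model, key=lambda label: dist(model[label]))

# Hypothetical training data: [mean oxygenation change, response latency]
training = {
    "yes": [[0.8, 1.2], [0.9, 1.1], [0.7, 1.3]],
    "no":  [[0.2, 2.0], [0.1, 2.2], [0.3, 1.9]],
}
model = train(training)
print(classify(model, [0.85, 1.15]))  # prints "yes"
```

The point of the sketch is the workflow the article describes: collect enough labelled responses first, then let the computer associate new measurements with the closest learned pattern.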
The restoration of particular biological functions for people with limited capabilities has been the primary catalyst driving BCI research forward. The hope, from a medical perspective, is that the technology will not only help sufferers of Locked-In Syndrome but will also have wider applications in monitoring and alleviating other neurological conditions. Inspired by these successes in rehabilitation and medical care, we can already witness BCIs being prototyped for other fields.
Facebook are currently recruiting a brain-computer interface engineer for their new innovation team. The secretive ‘Building 8’ team is being assembled to be at the forefront of augmented and virtual reality, artificial intelligence, connectivity and other important breakthrough areas.
This raises the question: what kinds of social media platforms will be possible in the future? A new mind-reading social media experience could be on the horizon, a vision backed up by comments from Mark Zuckerberg, who believes that one day we’ll be able to send thoughts to each other directly through technology: “You’ll be able to think of something and friends will be able to experience it too, if you’d like.”
Tim Mullen, the founder of neurotechnology company Qusp Technologies, is working on a commercial platform to digitally link a person to the cloud, and eventually any internet-connected device. According to Mullen, the ultimate goal of a brain computer interface isn’t just to connect and interact with your immediate surroundings but to accomplish direct brain to brain communication, something he describes as a type of mediated telepathy.
We might not be too far away from being able to control our tech with our thoughts alone. So far, development has produced technology that allows researchers to test binary tasks such as turning a light on and off, changing the channel and adjusting the volume on a television. This research will eventually trickle into everyday use, meaning that controlling home environments through a combination of the Internet of Things and brain control is entirely feasible in the foreseeable future.
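Once a BCI has classified an intent into a discrete command, wiring it to a smart home reduces to a lookup table. The sketch below assumes hypothetical device names and command labels; no real IoT API is implied.

```python
# Illustrative sketch: routing decoded binary "thought commands" to
# smart-home actions. Command labels and device names are invented;
# the `send` callback stands in for whatever IoT transport is used.

ACTIONS = {
    "light_on":    ("light", "on"),
    "light_off":   ("light", "off"),
    "channel_up":  ("tv", "channel+1"),
    "volume_down": ("tv", "volume-1"),
}

def dispatch(command, send):
    """Look up a decoded command and hand it to a transport callback."""
    if command not in ACTIONS:
        raise ValueError(f"unrecognised command: {command}")
    device, action = ACTIONS[command]
    return send(device, action)

# Stand-in transport that just records what would be sent.
log = []
dispatch("light_on", lambda device, action: log.append((device, action)))
print(log)  # [('light', 'on')]
```

Keeping the decoder and the transport separate like this is what makes the same binary classifier reusable across lights, televisions or any other connected device.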
The evolution of alphanumeric typing through thought control is another commercial prospect that has seen recent research and development. As opposed to sitting at a desk and typing away, the idea is to move the user away from the keyboard altogether and instead have them focus their mind on the letters and words they want to produce as these appear on-screen.
Researchers at the Wadsworth Centre in Albany have developed a system, created for people suffering from paralysis or brain injury, that allows users to do just that. The system displays a unique flashing pattern for each letter of the alphabet; when the user’s attention is focused on a particular letter, their brain activity mirrors that letter’s flashing pattern. The computer then registers which part of the flashing matrix has caught the user’s attention, and the letter appears on-screen.
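The matching step can be sketched in a few lines: each letter flashes at its own set of time steps, and the system picks the letter whose flash schedule best overlaps the moments at which a brain response was detected. The schedules and detected responses below are invented toy data, not the Wadsworth Centre's actual protocol.

```python
# Illustrative sketch of the flashing-matrix idea: pick the letter
# whose flash times line up best with detected brain responses.

def best_match(schedules, responses):
    """schedules: letter -> set of flash time steps.
    responses: set of time steps where a brain response was detected.
    Returns the letter with the largest overlap."""
    return max(schedules, key=lambda letter: len(schedules[letter] & responses))

flash_schedule = {
    "A": {1, 5, 9},
    "B": {2, 6, 10},
    "C": {3, 7, 11},
}
detected = {2, 6, 10}  # responses aligned with B's flashes
print(best_match(flash_schedule, detected))  # prints "B"
```

In a real speller, rows and columns of the matrix flash rather than single letters, and the "detected response" comes from averaging many noisy EEG epochs; the overlap principle, though, is the same.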
One of the obstacles researchers face when monitoring brain activity for BCIs is isolating the signals that indicate meaningful intent on the part of the user from non-useful electrical background “noise”. One example, from a TEDx talk by Jose del R. Millan, involved a person on stage mentally controlling a robot with his thoughts. The controller could see the robot on a webcam and focused on directing it around a room in a desired direction.
It didn’t go as smoothly as planned, and the robot repeatedly crashed (link for video). Controlling the robot with precision was clearly a tough task, and the lack of control suggested that background noise in the controller’s brain was interfering with the process. That talk was over four years ago, and a Boston-based start-up, Neurable, have now developed software that addresses this problem. It fills the gaps when the desired signal is too weak, providing a kind of autopilot mode in which artificial intelligence predicts the desired action whenever there is too much background interference.
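The "fill the gaps" idea can be sketched as a confidence threshold: act on the brain-decoded command only when the decoder is sure enough, and otherwise fall back to a prediction. The threshold value and the simple "keep going the same way" predictor below are assumptions for illustration, not Neurable's actual algorithm.

```python
# Illustrative sketch of an "autopilot" fallback: trust the decoded
# command when confidence is high, otherwise use an AI prediction.
# Threshold and predictor are invented for this example.

def choose_action(decoded, confidence, predict, threshold=0.7):
    """Use the brain-decoded action if confident, else the prediction."""
    if confidence >= threshold:
        return decoded
    return predict()

# Stand-in predictor: keep moving in the last confident direction.
last = {"direction": "forward"}

def predict():
    return last["direction"]

print(choose_action("left", 0.9, predict))   # confident: "left"
print(choose_action("right", 0.3, predict))  # noisy: falls back to "forward"
```

The effect is exactly what the article describes: when background interference drowns out the signal, the system coasts on a prediction instead of acting on noise, so the robot stops crashing into walls.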
The demonstrator in the video was also using the least invasive kind of BCI, a device known as an electroencephalograph (EEG): a set of electrodes attached to the scalp using a cap. The electrodes can read brain signals, but the skull blocks much of the electrical signal and distorts what does get through. To get a higher-resolution signal, scientists instead use invasive BCIs, where the electrodes are implanted directly into the grey matter of the brain itself.
There is a strong focus on evolving these systems, and we are getting closer and closer to linking brains with computers in ways once unimaginable. The rate of scientific and technological progress in neurotechnology is accelerating rapidly, partly thanks to the quest for increased functionality for those suffering from paralysis. This in turn has laid the foundation for similar interfaces to be developed, and in the coming years you can expect to see a multitude of new products integrating the brain with technology.