“It’s going to blow your minds,” said Elon Musk in September last year when he unveiled Neuralink, his brain-computer interface project. It involves a digital device implanted into the brain that can interpret the electrical signals made by neurons, gather data and pass them on to an external computer. He’s described it as a “Fitbit in your skull”.
It’s just one of many such projects, as this Wired article sets out. It highlights NextMind, a small, noninvasive neural interface worn on the head that translates brain waves into data for controlling compatible software, and a wrist-worn device from CTRL-Labs that picks up the signals the motor cortex sends to the hand.
All very sci-fi but what’s it got to do with education?
Quite a lot, according to John Galloway, a specialist in the use of technology to improve educational opportunities for children and young people with special educational needs. For people with physical disabilities, as the distance between our minds and our machines shrinks, the opportunities that technology opens up grow accordingly.
He notes that when we first started using personal computers, our interaction was mediated by a keyboard and a mouse, so what we could do was limited by our dexterity with them. More recently we’ve become quite used to operating touch screens, which broaden the range of operational commands: we can now tap and swipe, and we no longer need the same fine control that a keyboard and mouse demand.
“There have always been alternatives to keyboard and mouse use around, for instance switches, big buttons that you tap,” says John. “Typically a light will scan across the screen and when it comes to the point on the screen you want to activate, say a letter on a keyboard, you hit the button. Stephen Hawking was a switch user. He used, for a while, a system known as ‘sip and puff.’ Basically, he sucked and he blew and that made a light scan his keyboard then he selected what he wanted. And that’s how he wrote many of his books and delivered his lectures.”
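The scanning approach John describes can be sketched as a simple loop: a highlight steps through the keys at a fixed interval, and a single switch press selects whichever key is currently highlighted. Here is a minimal illustration in Python; the key layout, timing, and `switch_pressed` callback are all hypothetical, not taken from any particular product:

```python
import time

# Hypothetical on-screen keyboard: the highlight cycles through these keys.
KEYS = list("ABCDEFGH")
SCAN_INTERVAL = 0.6  # seconds each key stays highlighted (assumed value)

def scan_select(switch_pressed, interval=SCAN_INTERVAL, keys=KEYS):
    """Cycle a highlight over the keys; return the key that is
    highlighted at the moment the user activates the switch."""
    i = 0
    while True:
        key = keys[i % len(keys)]
        # A real system would redraw the UI highlight here.
        if switch_pressed(key):
            return key
        time.sleep(interval)
        i += 1

# Simulated user: activates the switch when the highlight reaches 'D'.
chosen = scan_select(lambda key: key == "D", interval=0.0)
print(chosen)  # D
```

The user only ever makes one binary action, the switch press; the scanning loop supplies all the rest of the selection logic, which is what makes this usable with very limited movement.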
From voice control to mind control
Using our mouths to operate machines is now commonplace, albeit in a slightly different way. We talk to our devices, we give them commands, we tell them to remember things for us, we ask them questions, we tell them to connect us to a friend on the phone. And so the distance between the computer and what we are thinking about has shrunk to the gap between our mouths and our brains.
John points out that, even more recently, we’ve developed technologies where we control the device with our eyes. We look at the screen and, when we find the area we want to activate, we blink: possibly the smallest motion the human body can make. In fact, we don’t even have to blink. We can simply hold our gaze on one spot, and the computer understands that’s what we want to activate and ‘clicks’ it for us. So now the distance is down to between the eyes and the mind.
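That hold-your-gaze behaviour is usually called dwell selection: if the gaze stays within a small radius of one point for long enough, the system treats it as a click. The sketch below illustrates the idea; the thresholds and the sample format are assumptions for the example, not the parameters of any real eye tracker:

```python
import math

DWELL_TIME = 1.0   # seconds the gaze must hold still (assumed threshold)
RADIUS = 30.0      # pixels of allowed gaze jitter (assumed threshold)

def dwell_click(gaze_samples, dwell_time=DWELL_TIME, radius=RADIUS):
    """gaze_samples: iterable of (timestamp, x, y) gaze points.
    Return the (x, y) where the gaze dwelled long enough, else None."""
    anchor_t = anchor_x = anchor_y = None
    for t, x, y in gaze_samples:
        if anchor_t is None or math.hypot(x - anchor_x, y - anchor_y) > radius:
            # Gaze moved away: restart the dwell timer at the new position.
            anchor_t, anchor_x, anchor_y = t, x, y
        elif t - anchor_t >= dwell_time:
            return (anchor_x, anchor_y)  # dwell completed: 'click' here
    return None

# Simulated gaze: glances elsewhere, then settles near (200, 150).
samples = [(0.0, 50, 50), (0.2, 200, 150), (0.5, 205, 148), (1.3, 198, 152)]
print(dwell_click(samples))  # (200, 150)
```

The jitter radius matters because eyes never hold perfectly still; without it, natural micro-movements would keep resetting the timer and the ‘click’ would never fire.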
And so, we come to the notion of mind control, with brain implants: literally we think, and our technology does what we are thinking about. It is making our thoughts tangible.
Shrinking the distance
“So the way in which technology – the way in which our ingenuity with technology – is shrinking that distance no longer means that technology is, literally, at arm’s length, now we can work from inside our own heads,” John concludes. “Many of those disabilities disappear. And the opportunities that our devices offer us are available to them in a way that they haven’t been previously.”
Watch the video (3m:47s): Inclusion and accessibility – How technology supports learners with physical disabilities with John Galloway