“We call them ‘flourishes’,” says Aldo Faisal. “Kevin has this flourish where, because he can rotate his wrist 360 degrees either way, when he reaches for or passes you stuff he will add this flourish just for the fun of it. There’s no need for him to do it, but he does it anyway. It’s like if you pick up your teacup and you stretch out your pinkie finger… He doesn’t think about it: he just does it.”
Faisal, senior lecturer in Neurotechnology at Imperial College London, is talking to WIRED about one of his ‘Pilots’: the term given to disabled athletes taking part in the 2016 Cybathlon in Switzerland. Outwardly, the Cybathlon looks something like the Paralympics. All the competitors – or Pilots – struggle with some degree of disability, be it amputation, paralysis or restricted movement as the result of spinal cord injury. But there’s a critical difference: where Paralympians are professional athletes, the point of the Cybathlon is to show that the next generation of prosthetics can work for anyone.
Faisal’s team – which he compares to Formula One, with Pilots as the drivers and the Imperial College London researchers and engineers as the pit crew – took home a silver medal in the functional electrical stimulation bike race. Kevin made it to the final stage of the powered arm prosthesis race, which tests the grip and speed with which Pilots can manipulate objects with their prosthetic hands.
“In 2015 we were contacted by the organisers of the Cybathlon, asking if we were interested in participating,” says Faisal. “We immediately said yes. [It was appealing because it provided the opportunity for] empowerment of disabled people – paralysed people, amputees – and to showcase that empowerment to a broader public.”
From day one, Faisal has included students and end users – the paralysed athletes – in his research. But Kevin’s flourish was something unexpected, and emblematic of the work Faisal and his team at Imperial College London are doing. By reverse-engineering the processes that the brain uses to control movement, his team’s mission is to create prosthetics that almost totally bypass the learning stage patients often experience with cybernetic limbs – creating plug-and-play prosthetics, if you like.
“We’re interested in understanding what algorithms the human brain uses to control movement, to make decisions,” he says. “And of course, if we can reverse-engineer these algorithms, then the pathway to implementation is more straightforward than if you’re trying to do artificial intelligence from scratch.
“For example, if you have standard prosthetics and you want to grab a cup, then you need to concentrate and send brain impulses to your prosthetic hand, so it rotates right by 30 degrees and opens the hand by 4cm, and then you have to think about how each finger closes around it and so forth.
“But that’s not how we actually think about movement. When you want to grab a cup, you just think, ‘I want to grab that cup,’ and you grab it. So, while we are thinking at a much more cognitive level, a lot of the prosthetics and rehabilitation technology out there operates at a much lower level. We want to bring a lot of this cognitive intelligence into the machines themselves, so your prosthetic hand can grab a cup if you just tell the hand which cup to grab, and all these little finicky details – how much to move each joint of the fingers, how much to rotate your wrist – [the prosthetic] takes care of that.”
Kevin’s subconscious wrist flourish, then, is more than just an idiosyncrasy – it is, in its small way, a proof-of-concept for how totally integrated the next generation of prosthetics might be.
Faisal’s lab at Imperial is split into two parts. On the one side are the neuroscientists, trying to decode the processes that allow for instinctive movements. The other half of the lab takes those findings and tests them out with prosthetic prototypes.
There are hurdles to overcome. For instance, finding the best way of interfacing the prosthetics with the patients “so that they don’t have to learn to operate [their] prosthetic; they just use it intuitively”. Power will also become an issue. Faisal explains: “A prosthetic arm needs to carry all its power [internally]. You don’t want to hook yourself up to a socket just to use your hand.”
In the past, cost has also been a hurdle. But the workarounds that Faisal’s team have implemented have been ingenious. Using eye-tracking technology hacked together from off-the-shelf components (an early prototype even used camera peripherals from a videogame console), Faisal’s lab created an interface that allowed wheelchair-bound volunteers to navigate around obstacles simply by looking at the path they wished to take. A second breakthrough came with a replacement for the myoelectric technology used to translate muscle signals into movement in artificial limbs. Faisal calls this technology “myoacoustics”.
“We developed a novel way of recording muscle signals. We can make it about 100 times cheaper than conventional methods of measuring muscle activity. And for that we only need simple microphones… When muscles contract, they vibrate like a string. And you can hear these vibrations with conventional microphones.
“What we published so far shows that we can record [muscle signals] with a better signal-to-noise ratio [than myoelectric devices], with the added advantage that we don’t have an electrical interface with the skin, which means you can wear this [device] all day long. It doesn’t matter whether you sweat or not – the contact remains the same; it’s just sound.”
The technology is so effective, Faisal claims, patients can even use it in the shower.
But the technologies coming out of Faisal’s lab don’t start and finish with disabled patients. Because the devices the team creates do not require surgery to be implanted, Faisal foresees a big market for consumer products as well – what he refers to as augmentative rather than restorative devices. The big question here is not whether the technology could work in theory, but – as anyone who was beaten up for being an early adopter of Google Glass will tell you – how much privacy consumers are willing to give up for the convenience of new, wearable augmentations.
“I think the question is whether our acceptance of a lack of privacy is going to continue or not,” says Faisal. “If it continues… you’re going to live in a space – in a city and a country – that is aware of you, and can predict your future actions based on the data it has collected about you, to make your life easier. You would probably see advertising that’s targeted to you – not just [to your tastes], but at the right moment, catching the right moodswing to entice you to shop. You’re going to have to think less about what you want to do, but just say, ‘Yes, OK, yes, OK,’ as proposals [made by your wearables] become much better.”
Faisal’s vision of the future, then, is not one of devices solely driven by the needs of patients, but of wearable integrations with both our minds and our bodies. Pilots like Kevin and the rest of the Cybathlon team are living proof that these technologies work. The real question is how much of our thoughts and feelings we will be willing to share with these new devices – medical or otherwise.