The science-fiction vision of robotic prosthetic limbs that can be controlled by the brain and provide sensory feedback is coming closer. Stuart Nathan looks at progress in the UK.
There is a recurring theme in engineering of trying to match or copy nature. It’s hardly surprising. The world and its biological systems have had millions of years to evolve solutions to the various problems posed by the environment; civilisation, by contrast, has had mere centuries. It’s always a challenge, and humanity’s successes in matching nature are relatively rare.
One of the biggest challenges comes in healthcare, where engineers literally have to match nature. Engineering some device that will have to fulfil the same function as a natural part of the body or coordinate with natural processes is about as difficult as it gets. And replacing missing or lost limbs provides some of the most striking examples of the progress we have made.
Archaeologists have found examples of replacement body parts from ancient Egypt, Greece and Rome. These range from the crude — wooden peg legs and strap-on toes — to primitive, but still impressive attempts at limbs with hinged joints. Fast forward to the 19th century, and we find fully articulated prosthetic hands, which might not have been particularly effective but certainly look impressive.
An iron artificial arm, 1560-1600, once thought to have belonged to a German knight
Credit: Science Museum; Wellcome Images; Creative Commons
Fictional visions
Today, our expectations have been raised – unfairly – by science fiction. The 1970s television series The Six Million Dollar Man introduced us to a triple amputee whose legs and arm were replaced with robotic limbs that gave him superhuman abilities (running at 60mph, lifting impossibly heavy weights, and seeing acutely with an implanted electronic eye); the series’ enduring legacy is in popularising the term “bionic” for a motorised prosthetic. A decade later, and we saw the right hand of Star Wars hero Luke Skywalker lopped off and replaced with a cybernetic hand that was visually and functionally indistinguishable from his natural extremity, even down to reflexes and sensation.
Luke Skywalker’s hand remains the model for a prosthetic
Credit: Lucasfilm; Disney Studios
But neither the Bionic Man nor Luke is a realistic reflection of what is possible with prosthetics. We still talk of “a hand like Luke Skywalker’s” when we want to evoke an advanced prosthetic, but under examination today’s devices still fall well short in functionality, no matter how impressive they look. So, 40 years after we learned to talk about bionics, what is the shape of prosthetics to come?
There are two main challenges involved in developing prosthetics. The first is designing the mechanical limb itself. With increasing miniaturisation of electric motors and advances in computing power, this is becoming less of a challenge than the second, still-towering difficulty: finding ways to interface the machine with the amputee’s body. How can somebody who has lost a limb control a prosthetic? Is it possible to think about moving a prosthetic arm and move it with brain power alone; or, to get even closer to the natural condition, to move it with barely any conscious thought? Can the sense of touch be replicated by a machine, even with today’s advanced sensors? And what about the sense we rely on but is so fundamental that we are barely aware of it: proprioception — knowing exactly where our limbs and extremities are without having to look? How close can an amputee be returned to natural function with technology? And how is that technology likely to develop in the coming decades?
State of the art
It’s useful here to look at the present state of the art. Current prosthetics have sockets that are made to fit precisely onto the amputee’s stump by a specialist prosthetist. It is absolutely vital that the fit is precise, and most prosthetics have to be adjusted regularly, which is, like all custom-making processes, expensive, time-consuming and often inconvenient. There aren’t that many prosthetists, and travel to clinics is a problem (this is, of course, even more acute in the developing world and conflict zones, where amputation is disproportionately common and debilitating). Even the best-adjusted sockets are not ideal: the stump can slip against the surface, become sweaty and uncomfortable, and prolonged wear can be painful. This is particularly a problem for lower limb prostheses. As the body’s weight bears down onto the socket, sores and resulting infections are a constant danger.
The most advanced prostheses available today do have some degree of mental control, but no sensory feedback. Control is achieved thanks to a phenomenon called myoelectricity. The remaining muscles of the stump still respond when the user “moves” the missing limb, resulting in electrical signals on the skin that can be detected by sensors installed into the socket. Although these signals may not correspond exactly to the movements the missing limb would have made, the user can learn how to make the prosthetic move in the desired fashion.
Myoelectric sensors are quite inexpensive, and the signals can be processed by off-the-shelf chips and sent to motors in the prosthetic. Companies such as Open Bionics, which The Engineer has covered, use such technology in their prosthetic arms and hands, which are designed to be open source and can be built from parts made on commercial 3D printers.
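To make the idea concrete, the sketch below shows roughly how such a myoelectric control loop could work: rectify and smooth the raw muscle signal, compare it against a rest threshold, and map the result onto a grip command. It is a minimal illustration under assumed names and values, not Open Bionics’ actual firmware.

```python
# Minimal, illustrative sketch of a myoelectric control loop.
# Not production firmware: the sensor hook, threshold and motor
# interface are hypothetical placeholders.

import time

REST_THRESHOLD = 0.15   # normalised EMG envelope below which the hand stays open
WINDOW = 50             # number of samples in the smoothing window


def emg_envelope(samples):
    """Rectify and average a window of raw EMG samples."""
    return sum(abs(s) for s in samples) / len(samples)


def control_loop(read_emg, set_grip):
    """Map muscle activity on the stump to a grip command.

    read_emg(n) -> list of n raw samples from a skin-surface sensor (assumed)
    set_grip(fraction) -> drives the hand motors, 0.0 = open, 1.0 = closed (assumed)
    """
    while True:
        envelope = emg_envelope(read_emg(WINDOW))
        if envelope > REST_THRESHOLD:
            # Stronger contraction -> tighter grip, clamped to the motor range.
            set_grip(min(1.0, (envelope - REST_THRESHOLD) / (1.0 - REST_THRESHOLD)))
        else:
            set_grip(0.0)
        time.sleep(0.02)  # ~50 Hz update rate
```

In practice the mapping is learned by the user rather than hard-coded, which is why, as noted below, the placement of the sensors matters so much.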
Myoelectric control depends very strongly on the fit between stump and prosthetic, because the sensors that detect the muscle signal have to be precisely placed on the correct area of the skin.
Moreover, this technology is best suited to arms and hands. Legs present a different set of problems: the movements of knees, feet and ankles in normal walking are more autonomous and less conscious than those of hands, arms and fingers, and they also have to deal with different types of stress and perform a more mechanical and supportive function. Because of this, the prosthetics field is, in general, sharply divided between upper and lower limb specialisms.
Lower limb prosthetics tend to be built around more passive systems: mechanical joints whose stiffness, in the most advanced cases, can be adjusted automatically during walking. These most advanced joints, known as active joints, often use pneumatics controlled by electronic actuators to help create realistic movements of knees and ankles.
The most advanced lower limb available is generally accepted to be the Linx system, produced by UK company Blatchford, whose joints adjust automatically to changes in posture and which can be used even on soft and uneven surfaces.
The Linx smart lower-limb prosthetic is believed to be the most advanced prosthetic leg available
Costing around £20,000 per unit, the Linx is, ironically, not currently available on the National Health Service in England because the equipment purchasing policy only takes into account the initial cost. In Scotland, where through-life costs are considered, the system has recently become available.
This reflects an unfortunate fact faced by lower limb amputees: because of the unbalanced gait that results from using a prosthetic leg and the stresses this imposes on the skeleton, many amputees eventually have to undergo a replacement of the hip on the opposite side to the missing limb. The cost of this operation, together with post-surgical care and monitoring, will in most cases outweigh the extra cost of purchasing an expensive prosthetic leg (even allowing for the fact that the Linx needs regular attention from a prosthetist).
New ground or improvement?
Development of prosthetics largely divides into two camps: those working to refine current socket-based technology and those working on new systems more directly integrated into the body. The most basic requirement of the latter is some system that is grafted onto the skeleton using a process known as osseointegration. This requires developing metal systems that can be inserted into or attached to the shaft of a bone, whereupon the body’s innate healing processes grow living bone directly onto and into the metal. 3D printing and advanced coating techniques have helped develop the technology considerably in recent years, as they allow the custom manufacture of textures and shapes suited for bone tissue to grow through.
Indeed, prostheses using such technology, such as hip and knee implants, have become relatively common. The important thing about these is that they remain entirely inside the body. For a replacement limb, by contrast, a section of the implant would have to protrude through the skin. Breaking the skin permanently is potentially dangerous, because it could create a pathway for infection. Until relatively recently, the accepted wisdom was that very few amputees would even consider the risk of a protruding implant.
Prof Noel Fitzpatrick’s amputation prostheses for domestic animals may have helped shift opinions on implants
This perception may now be beginning to change, and the difference has come from a surprising source: veterinary science. Readers in the UK may be familiar with Prof Noel Fitzpatrick, an Irish vet whose clinic in Surrey has been featured in a number of TV programmes which show off his specialism in replacing lost paws of small animals with protruding prostheses. Socketed prostheses are not practical for animals, but regular viewers will be familiar with Fitzpatrick’s frequent struggles to encourage the skin of amputated limbs to adhere to his custom-made implants and the fight against resulting infections. Fitzpatrick is, however, an advocate of these “amputation prostheses” for humans, and works with surgeons on advancing the technology into human clinical practice.
The power of the brain
Kianoush Nazarpour, a bioengineer from Newcastle University, is one of those researching ways of improving existing technology. It is understandable that amputees wouldn’t want to risk implantation, especially when this technology is not fully developed, he told The Engineer. “By definition, if you need an amputation, you’ve already had a very traumatic experience, and the surgery to remove a damaged limb is even more trauma and risk. You can see why people wouldn’t want to expose themselves to another extreme procedure when they might end up with something no better, or even not as good, as something they can already have, and that’s before you consider the risk of infection.”
Nazarpour is an upper limb specialist, and all his work follows one philosophy. “We try not to overcomplicate the prosthetic itself, especially with on-board computing,” he explained. The thinking behind this is that the human brain can already outperform any kind of synthetic processor, and its potential has not been fully explored. “Think of a blind man with a walking stick,” he said. “Does that stick restore his sight? No. But the simple sensory feedback he obtains by tapping it in learned ways allows his brain to form a relatively sophisticated impression of his surroundings; or at least the small part of his surroundings that he needs to understand to take his next step safely.”
Sensory feedback relayed to the stump may help trick the brain
Nazarpour’s research, in which he is working in collaboration with Imperial College London and the universities of Leeds, Keele, Essex and Southampton, is focused on giving prosthesis users sensory feedback. For this, he uses relatively simple sensors in the fingers of the prosthetic to detect temperature, pressure and shear (the last of these is detected by a sensor that responds to force lateral to the surface rather than perpendicular). Their output is translated into small electrical currents that are applied to the stump’s skin. “Everybody might feel the sensations differently,” he said. “For some people, it might feel like tickling, to others scratching. The sensor density cannot possibly be as great as that on a real hand, and the feedback isn’t as rich. But the brain can learn to interpret the sensation on the remaining flesh as though it were on the hand.”
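As a rough illustration of the principle Nazarpour describes, the readings from the fingertip sensors might be blended and scaled into a small, bounded stimulation current along the following lines. The weights, ranges and current limit here are assumptions for the sake of the sketch, not values from the Newcastle research.

```python
# Illustrative sketch: translate fingertip sensor readings into a small
# stimulation current applied to the skin of the stump. All numbers are
# assumed for illustration only.

MAX_CURRENT_MA = 2.0  # assumed upper bound for skin-surface stimulation


def feedback_current(pressure_n, shear_n, temperature_c):
    """Combine pressure, shear and temperature into one stimulation level (mA)."""
    pressure_term = min(pressure_n / 10.0, 1.0)       # saturate at 10 N of grip force
    shear_term = min(abs(shear_n) / 5.0, 1.0)         # saturate at 5 N of lateral force
    temp_term = max(0.0, min((temperature_c - 20.0) / 30.0, 1.0))
    # Weighted blend; the user's brain learns to interpret whatever
    # pattern of sensation this produces on the remaining flesh.
    level = 0.6 * pressure_term + 0.3 * shear_term + 0.1 * temp_term
    return level * MAX_CURRENT_MA
```

However the mapping is chosen, the point is that the brain, not the electronics, does the hard work of turning these crude signals into a usable sense of touch.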
Similar research in the blind has had some success in devices that stimulate the skin of the back in response to the output of a forward-facing camera, he added. “In these people, the sensations on the back are translated into an impression of what is in front of them through the brain’s learning process.”
Part of this, he added, results from neuroplasticity: the brain’s ability to develop new connections between neurons, effectively “rewiring” itself to develop new functions. “It’s not fast or easy,” he admitted. “People who get myoelectric limbs can typically start to learn to control them in about five minutes, because the visual impact of being able to see what your hand is reaching for, for example, is very powerful. Learning to interpret sensory input is an order of magnitude more difficult, and takes correspondingly longer.”
One intriguing direction the research has taken is in integrating machine vision into prosthetic hands. An off-the-shelf camera is attached near the wrist facing the fingers, and when the user moves the hand towards an object a processing algorithm assesses how best to position the fingers to grip the object. “It’s not a difficult algorithm to decide whether a tripod grip or forefinger and thumb would be best, so as the hand approaches the object the fingers move into the best position. All the user has to do is close the hand when it reaches the object.”
This is a transition technology, Nazarpour added, but is achievable with current equipment. “The point is that we shouldn’t be afraid to use different sorts of inputs if that will help us,” he said.
A similar system could conceivably be used on a prosthetic leg, he added; a camera monitoring ahead of the foot could manoeuvre the prosthetic foot into the best position to help the user climb steps, for example.
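The decision the vision system has to make is, as Nazarpour says, not a difficult one. A toy version of that grip-selection logic might look like the following; the size thresholds and the hardware hooks are invented for illustration, not taken from the research.

```python
# Toy version of the grip-selection decision described above. Object
# measurements would come from the wrist-mounted camera's vision pipeline;
# the thresholds here are invented for illustration.

def choose_grip(object_width_mm, object_height_mm):
    """Pick a pre-shaped grip for the approaching object."""
    if object_width_mm < 30 and object_height_mm < 30:
        return "pinch"    # forefinger and thumb for small items
    if object_width_mm < 80:
        return "tripod"   # thumb plus two fingers for mid-sized items
    return "power"        # whole-hand wrap for anything larger


def on_approach(detect_object, preshape_fingers):
    """As the hand nears an object, move the fingers into the chosen grip.

    detect_object() and preshape_fingers() are assumed hardware hooks;
    the user still closes the hand themselves once it reaches the object.
    """
    width, height = detect_object()
    preshape_fingers(choose_grip(width, height))
```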
A USB for the body
Cambridge Bio-Augmentation Systems (CBAS) is one of the most ambitious of the new technology school of prosthetic development. CBAS is developing a standardised interface that could be surgically implanted into the stump of an arm or a leg, where it would integrate with bone and also connect directly to nerves. A robotic limb would then plug in to the interface, and also clamp securely onto the section protruding from the body to fix it into position. “Think of it as a USB port for the body,” explained co-founder Oliver Armitage.
Oliver Armitage with a mannequin sporting a prototype prosthetic interface device. Below, a schematic of CBAS’s prosthetic interface device with a robot hand
CBAS is focusing on developing the interface rather than the limb, Armitage said. The system would be open source to allow robotics specialists to develop the prostheses themselves. “It gives us the best chance of developing the technology, reducing the cost and letting other experts play their role,” Armitage said.
Armitage is a bioengineer specialising in the junctions between dissimilar tissue such as bone and tendon, which has led him to work on how synthetic materials can be integrated into the body. One innovation he has been working on is a method to avoid the risk of infection. As well as the bone implant encouraging natural bone to grow onto and into the metal, he is developing a blend of elastomers and other soft materials into which skin can grow, to help create a waterproof, airtight seal between the skin and the protruding part of the implant. His fellow co-founder, Emil Hewage, is a specialist in neuroscience and machine learning.
While Armitage is looking at methods and materials that can connect nerves to the section of the interface inside the body, Hewage is looking at methods of interpreting the spiking electrical signals produced by nerves into forms that motor controllers can understand. This would work in two directions: signals from the motor nerves would be sent to the motors controlling the joints and fingers of the prosthetic, while the output of electronic sensors in the device would be fed into the sensory nerves.
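In the crudest terms, that two-way translation might be sketched as below: firing rates on motor nerves decoded into motor commands, and sensor readings encoded as stimulation pulse rates on sensory nerves. Real neural decoding relies on machine-learning models of the kind Hewage works on; these linear mappings and their constants are placeholders, not CBAS’s method.

```python
# Very simplified sketch of the two-way translation described above.
# The normalisation constants and linear mappings are assumptions.

MAX_FIRING_HZ = 200.0   # assumed peak motor-nerve firing rate used for normalisation
MAX_STIM_HZ = 150.0     # assumed ceiling for sensory stimulation pulse rate


def decode_motor(spike_rate_hz):
    """Turn a motor-nerve firing rate into a joint velocity command (0..1)."""
    return min(spike_rate_hz / MAX_FIRING_HZ, 1.0)


def encode_sensory(pressure_n, max_pressure_n=20.0):
    """Turn a fingertip pressure reading into a stimulation pulse rate (Hz)."""
    return min(pressure_n / max_pressure_n, 1.0) * MAX_STIM_HZ
```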
Attaching directly to the skeleton has a variety of advantages, Armitage and Hewage said. “You have a fixed connection, so there’s no slippage and no risk of sores developing on the skin of the stump,” Armitage said. “The stresses of movement are passed directly into the skeleton, which has evolved to cope with them. Neural connection is already being done, and the technology we would use is similar to that used for cochlear implants or deep brain stimulation in treatment of Parkinson’s disease, but connecting to the peripheral nervous system rather than in the brain.”
Another advantage, Hewage explained, is that direct attachment exploits the existing sense of proprioception. “If the prosthetic moves precisely with the skeleton then it’s fulfilling what the brain naturally expects to be there, and we just tap into that.”
Hewage agrees that the sensory input from a synthetic system can’t match the richness of a natural extremity. “But we can send and receive information at the same speed the nervous system works in a non-amputee,” he said. “And the brain is very good at filling in gaps. We don’t perceive the world in anything like the detail that we think we do, either from our eyes or from our sense of touch. Our brains essentially use sophisticated processing tricks to fill in what our senses are not perceiving from moment to moment.”
CBAS is not a large company, having about a dozen permanent research staff and around 30 regular collaborators in clinical and academic institutions in the UK and around the world. However, the company has been undertaking preclinical trials, and Armitage says that it hopes to proceed to early clinical trials in humans in 2018. The ambition is to develop a standardised interface that would cost around £10,000 per unit, and could be incorporated into upper or lower limb implants.
The Six Million Dollar Man is still a science fiction pipe-dream. But Luke Skywalker’s hand, or at least a close approximation, might be closer than we think.