Biosensor armband could control prosthetics

Engineers in the US have developed an AI-driven biosensor armband that can detect hand gestures and could be used to control prosthetics or video games.  

Built at the University of California, Berkeley, the system recognises hand gestures from 64 channels of electrical signals picked up in the forearm. These signals are fed into a chip programmed with an AI algorithm that translates them into the corresponding movements of the hand and arm. As well as potential applications in prosthetics, the device could also be used as a tool for humans to interact with computers, according to the Berkeley team. The work is published in Nature Electronics.
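
The article stops at this high-level description, but the signal path it implies — many channels of forearm activity, summarised into a compact feature vector before classification — can be sketched roughly as follows. This is a minimal illustration in Python: the 64-channel count comes from the article, while the window length and the mean-absolute-value feature are assumptions made for the sake of the example, not details from the study.

```python
import numpy as np

NUM_CHANNELS = 64      # forearm electrode channels, per the article
WINDOW_SAMPLES = 256   # assumed length of one analysis window

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarise a (WINDOW_SAMPLES, NUM_CHANNELS) signal window into one
    feature per channel (mean absolute value), giving a 64-element vector
    that a gesture classifier (see the sketch further down) could consume."""
    assert window.shape == (WINDOW_SAMPLES, NUM_CHANNELS)
    return np.mean(np.abs(window), axis=0)

# Example with random numbers standing in for real forearm recordings.
rng = np.random.default_rng(0)
fake_window = rng.standard_normal((WINDOW_SAMPLES, NUM_CHANNELS))
features = extract_features(fake_window)
print(features.shape)  # (64,)
```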

“Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers,” said Ali Moin, a doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences.

“Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”

The team was able to teach the algorithm to recognise 21 individual hand gestures, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers. They used a type of advanced AI called a hyperdimensional computing algorithm, which is capable of updating itself with new information. For example, if the electrical signals associated with a specific hand gesture change because a user’s arm gets sweaty, or they raise their arm above their head, the algorithm can incorporate this new information into its model. Furthermore, all of the computation is done locally on the chip, helping to address privacy concerns around personal data.
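
The article describes the algorithm only at this high level, but the core idea of hyperdimensional computing — encode each input as a very long vector, sum ("bundle") training examples into one prototype per gesture, classify by similarity, and fold new samples into the prototypes to adapt — can be sketched as below. Everything here (the 10,000-dimensional hypervectors, the random-projection encoder, the class structure) is an illustrative assumption, not the Berkeley team's implementation.

```python
import numpy as np

DIM = 10_000        # hypervector dimensionality (typical in HD computing; assumed)
NUM_FEATURES = 64   # one feature per forearm channel

rng = np.random.default_rng(42)
# A fixed random projection encodes a 64-element feature vector into a
# bipolar (+1/-1) hypervector of dimension DIM.
projection = rng.standard_normal((DIM, NUM_FEATURES))

def encode(features: np.ndarray) -> np.ndarray:
    return np.sign(projection @ features)

class HDGestureClassifier:
    """Toy hyperdimensional classifier: one prototype hypervector per gesture."""

    def __init__(self):
        self.prototypes = {}  # gesture label -> accumulated hypervector

    def train(self, features: np.ndarray, label: str) -> None:
        # Bundling: sum encoded samples into the class prototype.
        hv = encode(features)
        self.prototypes[label] = self.prototypes.get(label, np.zeros(DIM)) + hv

    def predict(self, features: np.ndarray) -> str:
        # Compare the encoded query against every prototype and return the
        # gesture whose prototype is most similar (cosine similarity).
        hv = encode(features)
        return max(
            self.prototypes,
            key=lambda lbl: np.dot(hv, self.prototypes[lbl])
            / (np.linalg.norm(self.prototypes[lbl]) + 1e-9),
        )

    def update(self, features: np.ndarray, label: str) -> None:
        # On-device adaptation: fold a fresh sample (say, one recorded with a
        # sweaty or raised arm) into the existing prototype rather than
        # retraining a model from scratch.
        self.train(features, label)
```

Because training, prediction and adaptation are all just vector additions and dot products, a scheme of this kind can plausibly run entirely on a small embedded chip, which is the local-computation point the researchers emphasise.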

“When Amazon or Apple creates their algorithms, they run a bunch of software in the cloud that creates the model, and then the model gets downloaded onto your device,” said Jan Rabaey, Professor of Electrical Engineering at UC Berkeley and senior author of the paper. “The problem is that then you’re stuck with that particular model. In our approach, we implemented a process where the learning is done on the device itself.”

Although the biosensor armband is still a prototype, Rabaey believes all the elements are there for it to evolve into a commercial product.

“Most of these technologies already exist elsewhere,” said the professor, “but what’s unique about this device is that it integrates the biosensing, signal processing and interpretation, and artificial intelligence into one system that is relatively small and flexible and has a low power budget.”

https://www.theengineer.co.uk