Human hands perform many different tasks every day and are among the most versatile parts of our body, with around 30 muscles and 25 joints. Human hands also contain roughly 17,000 receptors and nerve endings, which help us perform highly complex tasks. But what happens when an accident leaves you with a prosthetic arm? In 2022, Sarah de Lagarde lost her right arm in an accident and was offered a prosthetic arm by the UK's National Health Service. But that prosthetic arm helped only a little and could never match a natural arm: it had a single joint at the elbow, while the hand itself was static. After nine months of struggling with it, she was given a battery-powered bionic arm that uses artificial intelligence (AI) to anticipate the movement she wants to make from the electrical signals in her muscles.
Every movement we make with our hands, from simple to complex, requires tight integration of motor control and sensory feedback, all handled by the brain. Engineers and medical professionals have struggled for years to build something as capable as a human hand, and with advances in AI and robotics we are getting close to prosthetics that can mimic it. But questions remain about whether an AI-driven machine can truly match the intricate and complex abilities of a human hand. Even though there is still a long way to go, these early stages of human-like dexterity suggest the goal may be reached sooner than expected.
The development of AI hands mirrors the development of dexterity in human babies. A newborn can only perform involuntary reflexes, like gripping someone's finger, and its control improves as it matures. As the baby grows, it develops motor skills through trial, error and sensory feedback. In much the same way, AI-powered robotic hands perceive their environment, try to make sense of it and then react to it. One example is the DEX-EE robot, developed by Google DeepMind and the Shadow Robot Company. It is a three-fingered robot that uses fingertip sensors to handle delicate objects, and it could be extremely helpful in fields like healthcare where precise control is needed.
Others are also developing AI robots for dexterous and dangerous tasks: researcher Rustam Stolkin is building robots to clean up nuclear waste, while Boston Dynamics is developing robots for tasks like organising and packing. One of the biggest challenges for robots right now is handling objects with varied textures and irregular shapes. Fruit-picking robots from Cambridge's Dogtooth Technologies, along with Tesla's Optimus, are showing promising results and learning to pick fruit without damaging it. The future of AI in prosthetics looks bright, but some limitations still need to be addressed, which could take at least five more years.