Welcome to our site! AILIMB aims to develop the first fully integrated smart bionic prosthesis, combining machine learning, computer vision, and artificial sensory feedback inside a single wearable socket. With expertise in prosthetic control, AI, and human-centered design, the AILIMB team pushes the boundaries of assistive technology to improve the quality of life of prosthesis users.
This research was supported by the Science Fund of the Republic of Serbia, #17997, AI Chip Based Closed Loop Control of Prosthetic Limbs – AILIMB.
Rejection rates for myoelectric prostheses remain at 25–40%. The real bottleneck is not mechatronics — it's human-machine interfacing.
The loss of a hand is a debilitating event with long-lasting physical, psychological, and social consequences. While modern bionic limbs are sophisticated systems, intuitive control methods that would allow users to effectively exploit their capabilities are still lacking.
AILIMB advocates a semi-autonomous approach: the prosthesis is equipped with additional sensors and AI so that it can perform some functions automatically, effectively helping the user accomplish tasks with minimal cognitive load. The system combines myoelectric control for voluntary commands, artificial sensory feedback to restore tactile sensations, and automatic control powered by computer vision — all integrated inside a standard prosthesis socket.
AILIMB directly addresses this interfacing bottleneck by enabling intuitive, low-effort control through intelligent sensor fusion and AI.
Machine learning classification and regression to recognize user motion intention from EMG data. We designed and tested methods for collecting and analyzing EMG signals, evaluated several pattern recognition algorithms, and obtained initial results showing high accuracy in distinguishing different hand movements and contraction levels. → Success rate >95% for 5 movement classes.
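As an illustration of this kind of pipeline, the sketch below extracts standard time-domain EMG features (mean absolute value, waveform length, zero crossings) from signal windows and classifies them with a minimal linear discriminant. The synthetic signals, window length, channel count, and feature set are illustrative assumptions, not the project's exact configuration.

```python
import numpy as np

def emg_features(window):
    """Per-channel time-domain features: MAV, waveform length, zero crossings."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])

class SimpleLDA:
    """Minimal linear discriminant classifier (pooled class covariance)."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        cov = sum(np.cov(X[y == c].T) for c in self.classes) / len(self.classes)
        self.prec = np.linalg.pinv(cov + 1e-6 * np.eye(X.shape[1]))
        return self
    def predict(self, X):
        # Linear score per class: x' S^-1 mu - 0.5 mu' S^-1 mu
        scores = X @ self.prec @ self.means.T - 0.5 * np.sum(
            (self.means @ self.prec) * self.means, axis=1)
        return self.classes[np.argmax(scores, axis=1)]

# Synthetic demo: two movement classes, 8-channel EMG windows of 200 samples
rng = np.random.default_rng(0)
def make_windows(offset, n=40):
    return [offset + rng.standard_normal((200, 8)) for _ in range(n)]
X = np.array([emg_features(w) for w in make_windows(0.0) + make_windows(0.8)])
y = np.array([0] * 40 + [1] * 40)
clf = SimpleLDA().fit(X, y)
print((clf.predict(X) == y).mean())  # training accuracy on separable classes
```

In a real pipeline the same feature vectors could feed an SVM or a regression stage for proportional control; the LDA here mirrors the classical pattern-recognition baseline.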
Vibrotactile interface to convey the full state of the Michelangelo prosthesis. We developed vibrotactile feedback methods to convey the full state of the hand, including grasping force, hand aperture, and wrist rotation, and tested them in able-bodied individuals. The results suggest these methods successfully provide intuitive tactile feedback. → User recognition success rate >90%.
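One simple way to realize such an encoding, shown here as a hedged sketch rather than the project's actual mapping, is to let wrist rotation select which motor in a forearm ring vibrates (spatial encoding) while grasping force and aperture set its intensity. The motor count, ring layout, and weighting constants below are illustrative assumptions.

```python
def encode_state(aperture, force, wrist_deg, n_motors=8):
    """Map prosthesis state to per-motor vibration intensities (0..1).

    Spatial encoding: the wrist angle picks the active motor around the
    forearm ring. Intensity modulation: grasping force dominates the
    amplitude, and aperture adds a baseline so movement without contact
    is still perceptible. All gains are illustrative, not calibrated.
    """
    idx = int(round((wrist_deg % 360) / 360 * n_motors)) % n_motors
    levels = [0.0] * n_motors
    levels[idx] = min(1.0, 0.2 + 0.3 * aperture + 0.5 * force)
    return levels

# Example: half-open hand, full grasp force, wrist rotated 90 degrees
print(encode_state(aperture=0.5, force=1.0, wrist_deg=90))
```

Keeping a single active motor at a time makes the spatial code easy to localize, which is one plausible reason such encodings reach high recognition rates in users.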
Depth camera + IMU for automatic hand preshaping. Using Time-of-Flight depth sensors, we collected point clouds to evaluate the recognition of small and medium-sized objects. Our experiments confirm that compact ToF sensors can reliably estimate object properties even from relatively sparse point clouds. → Object size estimation with <10% relative error.
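A minimal sketch of this kind of geometry pipeline, under assumptions that do not necessarily match the project's implementation: a RANSAC plane fit separates the supporting surface from the object, and an axis-aligned bounding box of the remaining points estimates object size. The synthetic "table plus cube" scene, thresholds, and iteration count are all illustrative.

```python
import numpy as np

def ransac_plane(points, n_iter=200, thresh=0.01, rng=None):
    """Fit a dominant plane by RANSAC; return the boolean inlier mask."""
    rng = rng or np.random.default_rng(0)
    best = None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:        # skip degenerate (collinear) samples
            continue
        normal /= norm
        dist = np.abs((points - sample[0]) @ normal)
        inliers = dist < thresh
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return best

def object_size(points, plane_inliers):
    """Axis-aligned bounding box of the off-plane (object) points.
    Assumes a roughly camera-aligned frame; an oriented box would be
    needed for arbitrarily rotated objects."""
    obj = points[~plane_inliers]
    return obj.max(axis=0) - obj.min(axis=0)

# Synthetic demo: flat table plane plus a 6 cm cube of points above it
rng = np.random.default_rng(1)
table = np.c_[rng.uniform(-0.3, 0.3, (400, 2)), np.zeros(400)]
cube = rng.uniform(0, 0.06, (200, 3)) + np.array([0.0, 0.0, 0.02])
cloud = np.vstack([table, cube])
mask = ransac_plane(cloud)
print(object_size(cloud, mask))  # each axis ≈ 0.06 m
```

Even with only 200 object points, the size estimate lands within a few percent of the true 6 cm extent, consistent with the idea that sparse ToF clouds suffice for preshaping decisions.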
Fusion of volitional EMG commands and automatic control for a semi-autonomous prosthesis. Semi-autonomous fusion combines user-driven EMG control with intelligent automation, ensuring that voluntary commands always take priority. The system enables natural, real-time interaction without explicit mode switching, while automatic control assists by adapting grasp parameters to the target object.
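The priority rule described above can be sketched as a simple arbitration function. This is an illustrative toy, not the AILIMB controller: the activation threshold and the scalar "aperture command" abstraction are assumptions made for demonstration.

```python
def fuse_commands(emg_aperture, auto_aperture, emg_activity, threshold=0.15):
    """Arbitrate between volitional and automatic aperture commands.

    emg_activity is a normalized muscle-activation level (0..1); the
    threshold (an illustrative value) decides when the user is actively
    commanding. Voluntary commands always take priority; otherwise the
    vision-based controller pre-shapes the hand for the target object.
    """
    if emg_activity >= threshold:
        return emg_aperture, "user"   # volitional command wins
    return auto_aperture, "auto"      # automation assists when the user is idle

# The user is contracting (activity 0.4), so their command overrides automation
print(fuse_commands(emg_aperture=0.2, auto_aperture=0.8, emg_activity=0.4))
```

Because arbitration is driven by measured muscle activity rather than a button or gesture, no explicit mode switching is needed, which matches the low-cognitive-load goal stated above.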
All components are integrated into a wearable socket prototype, with comprehensive assessment planned in both lab and home environments.
AILIMB integrates three key technological pillars into a single, wearable prosthetic socket with a standard connection interface.
Multichannel EMG electrode array inside the socket captures muscle signals from the residual limb. Pattern classification and regression algorithms decode user motion intention for both sequential and simultaneous multi-DoF control.
→ ML: LDA, SVM, Deep Learning, Ridge Regression
A miniature RGB/depth camera and inertial measurement unit mounted on the prosthesis perceive the target object and user approach strategy. Hardware-efficient algorithms enable real-time, embedded inference for automatic pre-shaping.
→ Object recognition, pose estimation, RANSAC geometry
An array of vibration motors around the forearm conveys the full state of the Michelangelo hand — aperture size, wrist rotation, and grasping force — through intuitive spatial and intensity encoding without requiring visual attention.
→ Spatial encoding, intensity modulation, low-power integration
Innovative fusion of volitional and automatic command streams. We investigate three paradigms (traded autonomy, function-split autonomy, and blended autonomy) to determine the optimal allocation of control between the user and the AI controller.
→ Shared control, traded/blended autonomy paradigms
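In contrast to traded autonomy, where control switches discretely between user and machine, blended autonomy mixes both commands continuously. The sketch below shows one plausible blending rule; the activity bounds and linear weighting are assumptions for illustration, not the paradigm's specified form.

```python
def blended_aperture(emg_aperture, auto_aperture, emg_activity,
                     lo=0.1, hi=0.5):
    """Blended autonomy: authority shifts smoothly toward the user as
    muscle activity rises. Below `lo` the automation acts alone; above
    `hi` the user has full authority (lo/hi are illustrative bounds)."""
    w = min(1.0, max(0.0, (emg_activity - lo) / (hi - lo)))
    return w * emg_aperture + (1.0 - w) * auto_aperture

# Moderate activity (0.3) gives an even mix of user and automatic commands
print(blended_aperture(emg_aperture=1.0, auto_aperture=0.0, emg_activity=0.3))
```

A smooth weight avoids the abrupt handovers of traded autonomy, at the cost of the user never being fully decoupled from the automation at intermediate effort levels, which is exactly the trade-off these paradigm comparisons are designed to measure.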
Custom-designed electronics with standard prosthesis attachment interface. All sensors, processors, and actuators fit inside the socket — making the smart control fully wearable and adaptable to any commercial myoelectric hand.
→ Michelangelo hand interface; universal socket design
The second fully integrated prototype includes data logging hardware (SD card) for monitoring prosthesis use in the home environment. Two amputees will use the system daily for two months with periodic lab assessments.
→ SHAP, BBT, CPRT, embodiment questionnaires
7 work packages organized across a 24-month project duration.
An interdisciplinary consortium combining expertise in embedded systems, AI, and prosthetics research.
AILIMB is funded by the Science Fund of the Republic of Serbia through the SFRS–DIJASPORA 2023 program, #17997. The project is open to collaboration with clinical centers, prosthetics companies, and amputee organizations.