SFRS – DIJASPORA 2023

AI Chip Based Closed Loop Control of Prosthetic Limbs

Welcome to our site! AILIMB combines machine learning, computer vision, and artificial sensory feedback inside a single wearable socket to develop the first fully integrated smart bionic prosthesis. With expertise in prosthetic control, AI, and human-centered design, AILIMB aims to push the boundaries of assistive technology and improve the quality of life of prosthesis users.

This research was supported by the Science Fund of the Republic of Serbia, #17997, AI Chip Based Closed Loop Control of Prosthetic Limbs – AILIMB.

[System diagram: the AILIMB controller linking EMG control, computer vision, tactile feedback, AI/ML processing, IMU/sensors, and the bionic hand]
€200K Total Budget
24M Project Duration
3 Partner Institutions
10–20 Amputee Participants
Overview

The Problem We Solve

Rejection rates for myoelectric prostheses remain at 25–40%. The real bottleneck is not mechatronics — it's human-machine interfacing.

The loss of a hand is a debilitating event with long-lasting physical, psychological, and social consequences. While modern bionic limbs are sophisticated systems, the control methods that allow users to effectively exploit their capabilities are still missing.

AILIMB advocates a semi-autonomous approach: the prosthesis is equipped with additional sensors and AI so that it can perform some functions automatically, effectively helping the user accomplish tasks with minimal cognitive load. The system combines myoelectric control for voluntary commands, artificial sensory feedback to restore tactile sensations, and automatic control powered by computer vision — all integrated inside a standard prosthesis socket.

AILIMB directly addresses this interfacing bottleneck by enabling intuitive, low-effort control through intelligent sensor fusion and AI.

TO1 — EMG Recognition

Machine learning classification and regression to recognize user motion intention from EMG data. We designed and tested methods for collecting and analyzing EMG signals, evaluated several pattern recognition algorithms, and obtained initial results showing high accuracy in distinguishing different hand movements and contraction levels. → Success rate >95% for 5 movement classes.
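As a rough illustration of the pattern-recognition pipeline behind this result, the sketch below classifies synthetic EMG windows using classic time-domain features and a nearest-centroid rule. The signals, the two-class setup, and the feature set are simplified stand-ins for illustration, not the project's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def emg_features(window):
    """Classic time-domain EMG features (simplified Hudgins set)."""
    mav = np.mean(np.abs(window))                # mean absolute value
    wl = np.sum(np.abs(np.diff(window)))         # waveform length
    zc = np.sum(np.diff(np.sign(window)) != 0)   # zero crossings
    return np.array([mav, wl, zc], dtype=float)

def make_windows(scale, n=20, length=200):
    """Synthetic stand-in for EMG windows of one movement class."""
    return [scale * rng.standard_normal(length) for _ in range(n)]

# Two illustrative classes differing in contraction intensity
train = {"rest": make_windows(0.1), "grasp": make_windows(1.0)}

# Nearest-centroid classifier in feature space
centroids = {label: np.mean([emg_features(w) for w in ws], axis=0)
             for label, ws in train.items()}

def classify(window):
    f = emg_features(window)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

print(classify(1.0 * rng.standard_normal(200)))  # prints: grasp
```

In practice the project evaluates stronger classifiers (LDA, SVM, deep networks) over multichannel recordings, but the window-features-classifier structure is the same.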

TO2 — Tactile Feedback

Vibrotactile interface conveying the full state of the Michelangelo prosthesis. We developed methods to encode grasping force, hand aperture, and wrist rotation through vibrotactile feedback and tested them in able-bodied individuals. The results suggest these methods successfully provide intuitive tactile feedback. → User recognition success rate >90%.

TO3 — Computer Vision

Depth camera + IMU for automatic hand preshaping. Using Time-of-Flight depth sensors, we collected point clouds to evaluate the recognition of small and medium-sized objects. Our experiments confirm that compact ToF sensors can estimate object properties effectively, even from relatively sparse point clouds. → Object size estimation with <10% relative error.
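To give a feel for how object size can be read off a sparse point cloud, the sketch below fits a synthetic cylindrical object (a stand-in for a bottle seen by a ToF sensor) with PCA: the principal axis recovers the cylinder axis, and the spread in the orthogonal plane gives a diameter estimate. The object dimensions and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic point cloud: noisy points on the surface of a cylinder
n = 500
theta = rng.uniform(0, 2 * np.pi, n)
radius, height = 0.035, 0.12          # 7 cm diameter, 12 cm tall (assumed)
pts = np.column_stack([
    radius * np.cos(theta),
    radius * np.sin(theta),
    rng.uniform(0, height, n),
]) + rng.normal(0, 0.001, (n, 3))     # ~1 mm sensor noise

# PCA via SVD: the top singular vector is the direction of largest
# variance, i.e. the cylinder axis for this elongated object
centered = pts - pts.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
axis = vt[0]

# Remove the axial component; the residual norm approximates the radius
radial = centered - np.outer(centered @ axis, axis)
diameter_est = 2 * np.mean(np.linalg.norm(radial, axis=1))

print(f"estimated diameter: {diameter_est * 100:.1f} cm")  # close to 7 cm
```

Even with only a few hundred points and millimetre-level noise, the relative error stays well under the 10% target, which is consistent with the result quoted above.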

TO4 — Semi-Autonomous Fusion

Fusion of volitional EMG commands and automatic control for a semi-autonomous prosthesis. Semi-autonomous fusion combines user-driven EMG control with intelligent automation, ensuring that voluntary commands always take priority. The system enables natural, real-time interaction without explicit mode switching, while automatic control assists by adapting grasp parameters to the target object.
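The priority rule at the heart of this fusion can be sketched in a few lines: while the EMG channel carries an intentional signal, the user's command is passed through; when the user is idle, the vision-based command drives the hand. The command structure, the MAV-threshold intention detector, and the numbers below are illustrative assumptions, not the project's controller.

```python
from dataclasses import dataclass

@dataclass
class Command:
    aperture: float   # desired hand opening: 0.0 closed .. 1.0 open
    source: str       # "user" (EMG) or "auto" (vision)

def emg_active(mav, threshold=0.1):
    """Crude intention detector: mean absolute EMG value above a rest threshold."""
    return mav > threshold

def arbitrate(user_cmd, auto_cmd, mav):
    """Voluntary commands always take priority; automation fills the idle gaps."""
    return user_cmd if emg_active(mav) else auto_cmd

user = Command(aperture=0.2, source="user")
auto = Command(aperture=0.8, source="auto")
print(arbitrate(user, auto, mav=0.5).source)   # prints: user
print(arbitrate(user, auto, mav=0.02).source)  # prints: auto
```

Because arbitration runs every control cycle, the hand switches between streams continuously and no explicit mode-switch gesture is needed.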

TO5–TO6 — Integration & Validation

All components are integrated into a wearable socket prototype, with comprehensive assessment planned in both lab and home environments.

Technical Components

Core Technologies

AILIMB integrates three key technological pillars into a single, wearable prosthetic socket with a standard connection interface.

Myoelectric Control

Multichannel EMG electrode array inside the socket captures muscle signals from the residual limb. Pattern classification and regression algorithms decode user motion intention for both sequential and simultaneous multi-DoF control.

→ ML: LDA, SVM, Deep Learning, Ridge Regression
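The ridge-regression entry in that list can be sketched as a closed-form decoder from EMG channel amplitudes to degree-of-freedom velocities. Channel count, the linear ground truth, and the noise level below are invented for illustration; the real decoder is trained on recorded multichannel EMG.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic mapping: 8 EMG channel amplitudes -> 2 DoF velocities
# (e.g. hand open/close and wrist rotation), linear plus noise
n, ch, dof = 300, 8, 2
X = rng.uniform(0, 1, (n, ch))
W_true = rng.standard_normal((ch, dof))
Y = X @ W_true + 0.05 * rng.standard_normal((n, dof))

# Closed-form ridge regression: W = (X^T X + lambda I)^-1 X^T Y
lam = 0.1
W = np.linalg.solve(X.T @ X + lam * np.eye(ch), X.T @ Y)

# Decoded velocities for a new EMG frame
frame = rng.uniform(0, 1, ch)
vel = frame @ W
print(vel.shape)  # (2,)
```

Regression outputs a continuous velocity per DoF, which is what makes simultaneous multi-DoF control possible, whereas classification alone only selects one movement at a time.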

Computer Vision + IMU

A miniature RGB/depth camera and inertial measurement unit mounted on the prosthesis perceive the target object and user approach strategy. Hardware-efficient algorithms enable real-time, embedded inference for automatic pre-shaping.

→ Object recognition, pose estimation, RANSAC geometry
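The RANSAC step mentioned above is typically used to find dominant geometry in a depth frame, for example the tabletop plane, so the object sitting on it can be segmented out. The sketch below runs plain 3-point RANSAC on a synthetic scene (a noisy plane plus outlier points standing in for the object); the scene and parameters are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Points on the plane z = 0 (a tabletop) plus outliers (the object)
n_in, n_out = 400, 100
inliers = np.column_stack([rng.uniform(-0.5, 0.5, (n_in, 2)),
                           rng.normal(0, 0.002, n_in)])
outliers = rng.uniform(-0.5, 0.5, (n_out, 3)) + [0, 0, 0.3]
pts = np.vstack([inliers, outliers])

def ransac_plane(pts, iters=100, tol=0.01):
    """Fit a plane by repeated 3-point sampling; keep the hypothesis
    with the most points within `tol` of it."""
    best_mask, best = None, -1
    for _ in range(iters):
        p0, p1, p2 = pts[rng.choice(len(pts), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        nn = np.linalg.norm(normal)
        if nn < 1e-9:
            continue                      # degenerate (collinear) sample
        normal /= nn
        dist = np.abs((pts - p0) @ normal)
        mask = dist < tol
        if mask.sum() > best:
            best, best_mask = mask.sum(), mask
    return best_mask

mask = ransac_plane(pts)
print(mask.sum())  # most of the 400 tabletop points
```

Everything outside the recovered plane is a candidate object cluster, which then feeds the size and pose estimation used for pre-shaping.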

Artificial Tactile Feedback

An array of vibration motors around the forearm conveys the full state of the Michelangelo hand — aperture size, wrist rotation, and grasping force — through intuitive spatial and intensity encoding without requiring visual attention.

→ Spatial encoding, intensity modulation, low-power integration

Shared / Semi-Autonomous Control

Innovative fusion of volitional and automatic command streams. Three paradigms are investigated — traded, function-split, and blended autonomy — to determine the optimal allocation of control between the user and the AI controller.

→ Shared control, traded/blended autonomy paradigms
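To make the blended-autonomy paradigm concrete: instead of switching between streams, the two commands are mixed continuously, weighted by how confidently the EMG decoder reads user intent. The function below is a minimal sketch of that idea, with the confidence signal and aperture values assumed for illustration.

```python
def blended_aperture(user_ap, auto_ap, emg_confidence):
    """Blended autonomy: a confidence-weighted mix of the two command streams.

    With a strong, unambiguous EMG signal the user dominates; as the
    signal fades, the vision-based estimate takes over smoothly,
    without any explicit mode switch.
    """
    a = min(max(emg_confidence, 0.0), 1.0)
    return a * user_ap + (1.0 - a) * auto_ap

print(blended_aperture(0.2, 0.8, 1.0))   # full user authority -> 0.2
print(blended_aperture(0.2, 0.8, 0.0))   # full automation -> 0.8
```

Traded autonomy corresponds to the limiting case where the weight is either 0 or 1; function-split autonomy instead assigns whole functions (e.g. aperture vs. force) to different controllers.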

Embedded Platform

Custom-designed electronics with standard prosthesis attachment interface. All sensors, processors, and actuators fit inside the socket — making the smart control fully wearable and adaptable to any commercial myoelectric hand.

→ Michelangelo hand interface; universal socket design

Home Use Validation

The second fully integrated prototype includes data logging hardware (SD card) for monitoring prosthesis use in the home environment. Two amputees will use the system daily for two months with periodic lab assessments.

→ SHAP, BBT, CPRT, embodiment questionnaires

Work Plan

Work Packages

7 work packages organized across a 24-month project duration.

[Gantt chart: WP1–WP7 timelines across months M1–M24]
WP1 — Coordination and Management (FTNUNS)
Financial, scientific, and technological management. Consortium agreement, IP tracking, data management plan, and ethical oversight.
M1 → M24  |  5 person-months
WP2 — Automatic Control (IVI)
Computer vision and IMU-based automatic control for prosthesis pre-shaping. Object recognition, geometry estimation, and user behavior interpretation.
M1 → M16  |  20 person-months
WP3 — Myoelectric Control (FTNUNS)
Development of classification and regression pipelines for sequential and simultaneous volitional control of all Michelangelo hand degrees of freedom.
M1 → M16  |  11 person-months
WP4 — Artificial Feedback (HSTAAU)
Vibrotactile stimulation interface encoding full prosthesis state (aperture, wrist rotation, grasping force) for intuitive non-visual sensory feedback.
M1 → M16  |  11 person-months
WP5 — Smart AI-Empowered Prosthesis (HSTAAU)
Integration of all components into the embedded platform. Investigation of shared control paradigms. Two prototypes: external (M12) and fully integrated socket (M18).
M8 → M18  |  20 person-months
WP6 — Assessment and Validation (FTNUNS)
Clinical tests in able-bodied subjects, laboratory assessment in 5–10 amputees, and home use study with 2 amputees. SHAP, BBT, CPRT, and embodiment questionnaires.
M4 → M24  |  10 person-months
WP7 — Dissemination and Exploitation (IVI)
Open-access publications, workshops for end users and industry, stakeholder analysis, business plan, and collaboration with Otto Bock for commercialization.
M1 → M24  |  3.4 person-months
Research Team

Key Investigators

An interdisciplinary consortium combining expertise in embedded systems, AI, and prosthetics research.

FTNUNS — Novi Sad, Serbia
Prof. Nikola Jorgovanović
Principal Investigator
Senior researcher in biomedical engineering, advanced control systems in prosthetics, electrotactile and vibrotactile feedback, and biomedical electronic systems. Founder of Biomedical Engineering studies at FTN. Coordinator of H2020 SixthSense and SmartStim projects. Collaboration with Otto Bock on somatosensory feedback.
HSTAAU — Aalborg, Denmark
Prof. Strahinja Došen
Project Coordinator from Diaspora
Full Professor at Aalborg University. Pioneer of the semi-autonomous prosthesis control concept. Leader of the Neurorehabilitation Systems group. PI for multiple EU projects (Tactility, Wearplex, SixthSense, SimBionics). Over €3M in secured external funding.
FTNUNS — Novi Sad, Serbia
Prof. Vojin Ilić
WP Lead — Embedded Electronics
Full Professor specializing in mixed-signal electronics design, microcontroller systems, medical electronics, DSP, and electrophysiological signal recording. Key contributor to embedded platform design in H2020 SixthSense and SmartStim projects.
IVI — Serbia
Dr. Milovan Medojević
WP Lead — AI & Computer Vision
Research Associate at the Institute for Artificial Intelligence Research and Development of Serbia. Expert in deep learning, IoT systems, and energy-based behavioral analysis. International research experience across Slovakia, Spain, Denmark, and Thailand.
Consortium

Partner Institutions

FTNUNS
Faculty of Technical Sciences, University of Novi Sad
🇷🇸 Serbia — Project Lead
The largest faculty in Serbia with over 17,000 students. Regional leader in knowledge and technology transfer. Extensive experience in biomedical engineering, embedded systems, and prosthetics research. Holds the Michelangelo prosthesis donated by Otto Bock.
IVI
Institute for Artificial Intelligence Research and Development of Serbia
🇷🇸 Serbia
Serbia's national AI research institute covering industrial AI, language understanding, health applications, and mathematical ML foundations. Acts as an incubator for AI-driven startups. Key contributor to computer vision and deep learning in AILIMB.
HSTAAU
Dept. of Health Science and Technology, Aalborg University
🇩🇰 Denmark — Diaspora Partner
Internationally recognized leader in prosthesis control and feedback. Pioneer of the smart/semi-autonomous prosthesis concept. Holds state-of-the-art equipment including Michelangelo and Mia hands, motion capture, and multichannel EMG systems.
Get in Touch

Contact the Project

AILIMB is funded by the Science Fund of the Republic of Serbia through the SFRS–DIJASPORA 2023 program, #17997. The project is open to collaboration with clinical centers, prosthetics companies, and amputee organizations.

PI Prof. Nikola Jorgovanović
Institution FTN, University of Novi Sad
Diaspora PI Prof. Strahinja Došen, AAU Denmark
Program SFRS – DIJASPORA 2023
Duration 24 months
Total Project Budget
€ 200,000
Science Fund of the Republic of Serbia