
Intelligent Prosthetic Arm (IPARM)

IPARM concept

Intelligent Prosthetic Arm Concept. One challenge with an upper limb prosthesis is the effective control of the numerous possible degrees of freedom as its functionality approaches that of an able-bodied arm. Limited actuation input from the human, coupled with the complexity of control needed for dexterous tasks, requires a solution that can interpret the human's intended action and allow the arm to perform dexterous tasks with a degree of autonomy conditioned on that interpreted intent. This motivates the IPARM project, which seeks to provide a prosthesis with autonomy that is transparent to the human wearer's intended action, ultimately allowing them to perform complex, dexterous tasks.

IPArm example

Prosthetic arm users often struggle to control their prosthesis, and accuracy is a skill developed over a long period, if it is achieved at all. Can a robotic prosthetic leverage situational awareness and prediction to shorten the learning curve and improve accuracy? Can input from the human (EMG and mixed reality, i.e., gaze tracking and information presentation) be used to determine the desired action? Can 'natural' arm motions and task/object affordances be used to confirm the human's intended action?
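
As a rough illustration of the first two questions, one way to combine input channels is to fuse per-channel intent evidence (e.g., from separate EMG and gaze classifiers) into a single estimate of the desired action. The sketch below is a minimal, hypothetical naive-Bayes fusion in Python; the action names, likelihood values, and classifier outputs are placeholders rather than part of the IPARM system.

    import numpy as np

    # Hypothetical candidate actions the wearer might intend.
    ACTIONS = ["grasp_mug", "grasp_phone", "rest"]

    def fuse_intent(p_emg, p_gaze, prior=None):
        """Naive-Bayes fusion of per-channel intent evidence.

        p_emg and p_gaze hold likelihoods P(observation | action) for each
        action, e.g. produced by separate EMG and gaze classifiers
        (placeholder values here). Returns a posterior over ACTIONS."""
        prior = np.ones(len(ACTIONS)) / len(ACTIONS) if prior is None else np.asarray(prior)
        posterior = prior * np.asarray(p_emg) * np.asarray(p_gaze)
        return posterior / posterior.sum()

    # Example: gaze dwells on the mug while EMG shows a weak grasp pattern.
    print(dict(zip(ACTIONS, fuse_intent(p_emg=[0.5, 0.3, 0.2],
                                        p_gaze=[0.7, 0.2, 0.1]))))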

IPArm interfacing
Human Control Diagram

When developing an intelligent, wearable robot, the first question to consider is which ability the robot is enhancing or replacing.

IPARM: prosthesis ability substitution

An individual who has lost the majority of their arm would benefit from an intelligent prosthesis capable of observing the environment and taking the desired, dexterous action on behalf of the person. A first application of such a prosthetic would be common activities of daily living (ADL):

IPARM- ADL example

As we move closer to a platform that is technologically ready to interact with end users, can we first pilot our algorithms in a manner that allows all individuals to contribute to their development? One way to do this is to use virtual tasks with virtual prosthetics (or external substitutes, such as a tabletop manipulator) to pilot the developing approaches.

IPARM teleoperation

Research Objectives

  1. Decrease the time necessary for a user to reach the full dexterity potential of a prosthetic arm (achieved through intent estimation)
  2. Increase the overall maximum dexterity potential of a prosthetic arm (achieved through situational awareness of the task(s); see the sketch below)
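
As a minimal sketch of the second objective (and of the affordance question raised earlier), the snippet below gates a predicted action by the affordances of the detected target object, so the prosthesis acts autonomously only when the intent estimate is consistent with the scene. The object names, affordance sets, and function are hypothetical placeholders.

    # Hypothetical scene model: each detected object carries the set of
    # actions it affords, used to confirm (or reject) an intent estimate.
    AFFORDANCES = {
        "mug": {"grasp_side", "grasp_top", "pour"},
        "phone": {"grasp_side", "pinch"},
    }

    def confirm_intent(predicted_action, target_object):
        """Confirm the predicted action only if the target object affords it."""
        return predicted_action in AFFORDANCES.get(target_object, set())

    # A side grasp of the mug is consistent with its affordances, so the arm
    # may proceed; "pouring" the phone is rejected and deferred to the wearer.
    print(confirm_intent("grasp_side", "mug"))   # True
    print(confirm_intent("pour", "phone"))       # False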


Current Students:

Publications

Here we demonstrate a virtual prosthetic arm displayed on the HoloLens 2 platform. This virtual testbed will be used to perform wearer intent prediction, making control easier when performing dexterous tasks, and serves as a proving ground for our algorithms before they are deployed on physical systems for teleoperation and prosthetic arm control. The virtual prosthetic arm is a virtual form of the Modular Prosthetic Limb (MPL) from Johns Hopkins University's Applied Physics Laboratory. The SDF model is from the Gazebo models database: https://github.com/osrf/gazebo_models/tree/master/mpl_right_arm
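
As a small, hedged example of inspecting that model, the snippet below lists the joints (degrees of freedom) defined in the MPL SDF using only the Python standard library. It assumes the mpl_right_arm model has been cloned locally from the repository linked above; the file path shown is a hypothetical local checkout location.

    import xml.etree.ElementTree as ET

    # Hypothetical local path to the cloned mpl_right_arm model's SDF file.
    MODEL_SDF = "gazebo_models/mpl_right_arm/model.sdf"

    root = ET.parse(MODEL_SDF).getroot()

    # Print every joint name and type; these are the degrees of freedom the
    # virtual prosthesis exposes for intent-conditioned control.
    for joint in root.iter("joint"):
        print(joint.get("name"), joint.get("type"))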