Abstract
With an increasing number of collaborative robots or “co-robots” entering human environments, there is a growing need for safe, intuitive, and efficient (physical) Human-Machine Interfaces. Unlike industrial robots, co-robots operate in cluttered and dynamic working spaces where accidental collisions may occur. To minimize interaction forces, co-robots are usually lightweight and compliant. However, this makes the robot dynamics highly nonlinear and therefore difficult to control. In addition, the control loop must incorporate feedback from integrated sensors. Future systems under development are covered with force-sensing robot skin consisting of thousands of multi-modal sensors, creating the need for efficient robot sensor calibration and processing.
During this work, a neuroadaptive (NA) controller was developed and validated for safe and stable physical interaction. To achieve intuitive physical Human-Robot Interaction (pHRI), the robot error dynamics were modified to behave equivalently to a simple admittance model by extending the NA controller with Prescribed Error Dynamics (PED). A second approach to modifying the robot error dynamics used an inner/outer-loop structure, in which an admittance model in the outer loop generates a model trajectory that the inner loop follows. This admittance model was implemented as an autoregressive-moving-average (ARMA) filter, tuned with recursive least squares against a prescribed task model. Experiments showed that the developed framework offers a high degree of generality and adaptability to different human preferences, tasks, robots, and sensors. It also offers a novel algorithm for adaptive calibration of robot skins by directly tuning admittance models that map sensor voltages into desired robot motion. Finally, it was suggested that pHRI can be made more efficient by reducing the human effort during a collaborative task: the force a human must exert on the robot to achieve a desired pose can be minimized by predicting and then executing the intended human motion. Several Human Intent Estimators (HIEs) were proposed, including a neural-network-based estimator.
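As a rough illustration of the outer-loop idea, the sketch below discretizes a second-order admittance model M·x'' + D·x' + K·x = f into an ARMA filter and identifies its coefficients with recursive least squares against the output of a prescribed reference model. All numeric values (M, D, K, dt, the constant 5 N push) are invented for the example and are not parameters from this work; in the actual framework the driving signal would come from measured interaction forces or skin-sensor voltages.

```python
import numpy as np

def admittance_coeffs(M=1.0, D=8.0, K=0.0, dt=0.01):
    """Backward-difference discretization of M*x'' + D*x' + K*x = f,
    giving the ARMA recursion x[k] = a1*x[k-1] + a2*x[k-2] + b*f[k]."""
    den = M + D * dt + K * dt**2
    return np.array([(2 * M + D * dt) / den,   # a1
                     -M / den,                 # a2
                     dt**2 / den])             # b

def rls_update(theta, P, phi, y):
    """One recursive-least-squares step: adjust theta so phi @ theta ~ y."""
    Pphi = P @ phi
    gain = Pphi / (1.0 + phi @ Pphi)
    theta = theta + gain * (y - phi @ theta)
    P = P - np.outer(gain, Pphi)
    return theta, P

# Treat a reference admittance model as the "prescribed task model" and
# identify its ARMA parameters from the trajectory it generates under a
# constant 5 N push (noise-free, purely for illustration).
theta_ref = admittance_coeffs()
theta, P = np.zeros(3), 1e3 * np.eye(3)
x = [0.0, 0.0]                                 # x[k-2], x[k-1]
for _ in range(300):
    phi = np.array([x[-1], x[-2], 5.0])        # regressor [x[k-1], x[k-2], f[k]]
    y = float(theta_ref @ phi)                 # prescribed model output x[k]
    theta, P = rls_update(theta, P, phi, y)
    x.append(y)

err = abs(float(theta @ phi) - y)              # one-step prediction error
```

With K = 0 the discretized model has a1 + a2 = 1 (a pure integrator pole), so a constant force produces a steadily growing displacement, and the tuned filter's one-step prediction error shrinks as the RLS estimate converges.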
Biography
Sven Cremer was born near Köln, Germany and grew up in Göteborg, Sweden. He received his B.Sc. degree in Engineering Physics and Applied Mathematics from the University of the Pacific, California, in 2010. Because of his passion for robotics, he pursued a Ph.D. degree in Electrical Engineering with a focus on control systems at The University of Texas at Arlington. In 2013, he joined the Next Generation Systems (NGS) group headed by Professor Dan O. Popa and worked on projects involving human-machine interfaces, assistive robotics, neuroadaptive control, and robot skin.