Speakers

Sami Haddadin
Technical University of Munich
Munich School of Robotics and Machine Intelligence

Physical Human-Robot Interaction

Sami Haddadin is Director of the Munich School of Robotics and Machine Intelligence at the Technical University of Munich (TUM) and holds the Chair of Robotics Science and Systems Intelligence. His research interests include intelligent robot design, robot learning, collective intelligence, human-robot interaction, nonlinear control, real-time planning, optimal control, human neuromechanics and robot safety. From 2014 to 2018, Sami Haddadin was Full Professor and Director of the Institute of Automatic Control at Gottfried Wilhelm Leibniz Universität Hannover, Germany. Prior to that, he held various positions as a research associate at the German Aerospace Center (DLR). He received degrees in Electrical Engineering, Computer Science and Technology Management from the Technical University of Munich and the Ludwig-Maximilians-Universität München. He received his doctorate with high distinction from RWTH Aachen. He has presented his work at conferences and published more than 130 scientific articles in international journals. He has been honored with numerous prestigious awards and prizes for his scientific work.

 

Andrea Cherubini
University of Montpellier
Laboratory of Computer Science, Robotics and Microelectronics of Montpellier

From perception to inter-action

Traditionally, heterogeneous sensor data has been fed to fusion algorithms (e.g., Kalman- or Bayesian-based) to provide state estimates for modeling the environment. However, since robot sensors generally measure different physical phenomena, it is preferable to use them directly in the low-level servo controller rather than to apply multi-sensory fusion or to design complex state machines. This idea, originally proposed in the hybrid position-force control paradigm, brings new challenges to the control design when extended to multiple sensors: challenges related to the task representation and to the sensor characteristics (synchronization, hybrid control, task compatibility, etc.). The rationale behind our work has precisely been to use sensor-based control as a means to facilitate the physical interaction between robots and humans. In particular, we have used vision, proprioceptive force, touch and distance to address case studies, targeting four main research axes: teach-and-repeat navigation of wheeled mobile robots, collaborative industrial manipulation with safe physical interaction, force and visual control for interacting with humanoid robots, and shared robot control. Each of these axes will be presented here, before concluding with a general view of the issues at stake and of the research projects that we plan to carry out in the upcoming years.

Andrea Cherubini received an M.Sc. in Mechanical Engineering in 2001 from the University of Rome La Sapienza and a second M.Sc. in Control Systems in 2003 from the University of Sheffield, U.K. In 2008, he received a Ph.D. in Control Systems from the University of Rome La Sapienza. From 2008 to 2011, he was a postdoc at Inria Rennes, France. Andrea has co-authored over 70 papers in international peer-reviewed conferences and journals and is currently Associate Professor in Robotics at the University of Montpellier, France.

 

Nathanaël Jarrassé
CNRS – Sorbonne Université
Institute of Intelligent Systems and Robotics

On the use of sensorimotor coordinations for intuitive and ecological robotic assistance to gesture

Robots are promising tools for assisting human gesture, whether that of an impaired user or of an operator in industry. While good progress has been made in the last decades on the hardware of these robotic devices, offering users intuitive and ecological control over their robot-assisted body remains a critical challenge. In this talk, I will introduce some of the research we are conducting on the characterization of natural motor coordinations and their reorganization caused by impairment or by interaction with robots, and show how this knowledge can be used to develop better control of both rehabilitation and assistive robots.

Nathanaël Jarrassé is a permanent research scientist (CR1 CNRS, Section 7, CID 53) at the Institute of Intelligent Systems and Robotics of Sorbonne Université, Paris, working in the AGATHE team. He previously worked as an associate researcher at the Human Robotics Group, Dept. of Bioengineering, Imperial College London, under the supervision of Prof. E. Burdet. He received his PhD from UPMC in 2011, under the supervision of Prof. G. Morel, and, previously, an M.Eng. in Mechanical Engineering from Arts et Métiers ParisTech and an M.Sc. in Robotics from UPMC, both in 2006. His research projects aim at understanding and improving physical Human-Robot interaction (pHRi) for neuromotor rehabilitation and assistance applications (especially for the upper limb), to further the embodiment of technological devices. He is especially interested in the natural control of wearable or interacting mechatronic devices (exoskeletons, prosthetics, instrumented interfaces, cobots), the physical coupling between robotic devices and the human body, and the analysis of human sensorimotor control and interactive behaviours. He is also interested in the Ethical, Legal and Societal (ELS) issues in medical and assistive robotics.

 

Ilana Nisky
Ben Gurion University of the Negev
Department of Biomedical Engineering

Modeling human sensorimotor control for better control of surgical robots

Robot-assisted minimally invasive surgery (RAMIS), where a surgeon manipulates a pair of joysticks that teleoperate both instruments and a camera inside a patient’s body, requires precise control of movement, object and tissue manipulation, and perception. Despite many advantages for both the patient and the surgeon, the full potential of RAMIS and other teleoperation applications is yet to be realized. Two of the major progress-impeding gaps, the lack of touch feedback and limited knowledge of how to measure skill and optimize training, could be bridged by applying models of surgeons’ sensorimotor control. I will present our recent findings in an effort to answer basic and applied questions in human sensorimotor control, focusing on manual interaction with virtual and real objects in the context of RAMIS.

Ilana Nisky received the B.Sc. (summa cum laude), M.Sc. (summa cum laude), and Ph.D. in Biomedical Engineering from the Department of Biomedical Engineering, Ben-Gurion University of the Negev, Israel. She is currently a senior lecturer in the Department of Biomedical Engineering, Ben-Gurion University of the Negev, where she heads the Biomedical Robotics Lab. She is also the head of the Israel-Italy Virtual Lab on Artificial Somatosensation for Humans and Humanoids. She was previously a postdoctoral research fellow in the Department of Mechanical Engineering, Stanford University. She is the recipient of the 2019 IEEE RAS Early Academic Career Award, the prestigious Alon Fellowship for young faculty from the Israeli Council for Higher Education, and the Marie Curie International Outgoing Fellowship from the European Commission. Her research interests include human motor control, haptics, robotics, human and machine learning, teleoperation, and robot-assisted surgery, and her lab is supported by competitive grants from the Israeli Science Foundation, the Israel-US Binational Science Foundation, and the Ministry of Science and Technology. Nisky has authored more than 60 scientific publications in peer-reviewed journals and conference proceedings, and numerous abstracts in international conferences. She is an Associate Editor of the IEEE Transactions on Haptics and the IEEE Robotics and Automation Letters journals, a member of the BGU ABC Robotics Initiative, and serves on the steering committee of the Zlotowski Center for Neuroscience. She served as an executive committee member of the EuroHaptics Society from 2014 to 2018, and is a board member of the Israeli Society for Medical and Biological Engineering. She is a Senior Member of IEEE and a member of the Society for the Neural Control of Movement, the Society for Neuroscience, the Technical Committee on Haptics, and the American Physiological Society.

 

Neville Hogan
Massachusetts Institute of Technology
Departments of Mechanical Engineering and Brain & Cognitive Sciences

Quantitative models of human performance facilitate physical collaboration with robots

Humans are remarkably agile and dexterous despite profound limitations of the neuro-mechanical system. I contend that this is accomplished by composing behavior from ‘building block’ dynamic behaviors: oscillations and stereotyped submovements for forward-path control of motions; and mechanical impedance for physically interactive dynamics. Composing behavior in this way confers advantages but also implies fundamental limitations of human performance. Path curvature and speed are coupled in a way that is difficult to overcome, even with training. Moving slowly and smoothly is hard for humans. Mechanical impedance is profoundly influenced—but also limited—by musculo-skeletal geometry. This is particularly evident in the behavior of wrists and ankles, which determine the mechanical impedance of our principal means of interacting with the world, hands and feet. Both exhibit profound directional weakness that is not offset by muscle activation. To ignore these limitations of motion and interactive behavior is to risk compromising human-robot physical collaboration.

Neville Hogan is Sun Jae Professor of Mechanical Engineering and Professor of Brain and Cognitive Sciences at the Massachusetts Institute of Technology. He earned a Diploma in Engineering (with distinction) from Dublin Institute of Technology and M.S., Mechanical Engineer, and Ph.D. degrees from MIT. He joined MIT’s faculty in 1979 and presently directs the Newman Laboratory for Biomechanics and Human Rehabilitation. He co-founded Interactive Motion Technologies, now part of Bionik Laboratories. His research includes robotics, motor neuroscience, and rehabilitation engineering, emphasizing the control of physical contact and dynamic interaction. Awards include: Honorary Doctorates from Delft University of Technology and Dublin Institute of Technology; the Silver Medal of the Royal Academy of Medicine in Ireland; the Henry M. Paynter Outstanding Investigator Award and the Rufus T. Oldenburger Medal from the American Society of Mechanical Engineers, Dynamic Systems and Control Division; and the Academic Career Achievement Award from the Institute of Electrical and Electronics Engineers, Engineering in Medicine and Biology Society.

 

Daniel Ferris
University of Florida
Department of Biomedical Engineering

Comprehensive physiological assessment of human-robot interactions

Daniel Ferris received a B.S. in Mathematics Education from the University of Central Florida, an M.S. in Exercise Physiology from the University of Miami, and a Ph.D. in Human Biodynamics from the University of California, Berkeley. His research focuses on the biomechanics and neural control of human locomotion, in particular on human-machine interactions (mechanical and electrical). Projects include both technology development and basic research using mobile brain imaging, robotic lower limb exoskeletons, and bionic lower limb prostheses. The general goal is to identify principles of how humans control their movements and how they learn to use robotic assistance. The results provide guidance for designing robotic devices to assist human walking and running. His laboratory has created several different robotic lower limb exoskeletons to determine how assistance at the ankle, knee, and hip can reduce the energetic cost of locomotion and make walking easier for humans. His laboratory has also translated these technologies to develop a bionic lower limb prosthesis under proportional myoelectric control. Dr. Ferris and his group are also pioneering the use of high-density electroencephalography (EEG) to perform mobile brain imaging with high temporal resolution. This last effort includes both new hardware and software innovations to facilitate removal of motion and muscle artifacts from EEG during walking and running.

 

Joao Ramos
University of Illinois at Urbana-Champaign
Department of Mechanical Science and Engineering

Whole-Body Teleoperation of Humanoid Robots via Bilateral Feedback for Dynamic Physical Interactions

Autonomous humanoid robots are still far from matching the sophistication and adaptability of human perception and motor control. To address this issue, I investigate the use of human whole-body motion to command a remote humanoid robot in real time, while providing the operator with physical feedback from the robot’s actions. In this talk, I will present the challenges of virtually connecting the human operator with a remote machine in a way that allows the operator to apply innate motor intelligence to control the robot’s interaction with the environment. I will also present pilot experiments in which an operator controls a humanoid robot to perform power manipulation tasks, such as swinging a firefighter axe to break a wall, and dynamic locomotion behaviors, such as walking and jumping.

Joao Ramos is currently an Assistant Professor at the University of Illinois at Urbana-Champaign. He previously worked as a Postdoctoral Associate at the Biomimetic Robotics Laboratory at the Massachusetts Institute of Technology. He received his PhD from the Department of Mechanical Engineering at MIT in 2018. During his doctoral research, he developed teleoperation systems and strategies to dynamically control a humanoid robot using human whole-body motion via bilateral feedback. His research focuses on the design and control of robotic systems that experience large forces and impacts, such as the MIT HERMES humanoid, a prototype platform for disaster response. Additionally, his research interests include human-machine interfaces, legged locomotion dynamics, and actuation systems.