Lecture Review: From Automation to Autonomy: Advances in Machine Learning driving Next-gen Robotics
On June 17, 2020, AIRS invited Professor Sethu Vijayakumar from the University of Edinburgh and The Alan Turing Institute, UK, to deliver an online lecture entitled “From Automation to Autonomy: Advances in Machine Learning driving Next-gen Robotics”. Professor Vijayakumar is a world-renowned roboticist who has pioneered the use of Machine Learning in the adaptive control of complex, multi-degree-of-freedom robotic platforms, having held influential research positions on three continents, in Japan, the USA and the UK. He holds a Microsoft-Royal Academy of Engineering Chair in Robotics at Edinburgh, is the founding Director of the Edinburgh Centre for Robotics, and helps shape and drive the UK national Robotics and Autonomous Systems (RAS) agenda in his role as Programme Director at The Alan Turing Institute, the UK’s national institute for data science and AI.
Professor Vijayakumar began by thanking AIRS for the invitation and mentioned how excited he was to start a new collaborative project with AIRS, whose motto of developing AI for the good of society resonates closely with his own goals and those of the institutions he represents in the UK.
Professor Vijayakumar then explained the emphasis of his talk: enabling robotics to move from mere automation to significant autonomy and devolved decision-making by exploiting advances in Machine Learning to deal with dynamic situations.
He argued that waiting for ‘full’ autonomous capabilities may be neither economically feasible nor ethically desirable, given the nature of the new application domains, and that the next challenge is to deliver Shared Autonomy capabilities that can seamlessly move between teleoperation and fully autonomous behaviour depending on the demands of the task, while still allowing human-in-the-loop control.
Professor Vijayakumar structured his talk around four use-case scenarios, presented as capability demonstrations, to explain the scientific advances in perception, representation, planning, compliant actuation and optimal control that enabled new and reactive capabilities in his work – the first being capabilities for remote operations with punctuated autonomy.
Professor Vijayakumar described both off-the-shelf sensing technologies and new techniques for model-based tracking in real time, fusing multi-modal sensor information for the planning and control of complex robotic platforms.
He introduced topology-based representations (interaction meshes, writhes, relational tangent planes) for dealing with complex planning problems, and explained the use of the open-source EXOTica motion-planning framework for whole-body motion planning and testing, allowing scaling to very complex problems in a cohesive fashion.
He went on to discuss another domain where complex environments, dynamic situations and safety are critical: the use of legged robots for the inspection, maintenance and repair of assets. He focused on compliant actuation technology and its importance in ensuring safe and accurate behaviour. His group is a world leader in this technology, having been awarded the 2013 IEEE-TRO Best Paper Award for its work on variable impedance technology.
Professor Vijayakumar then explained the details of algorithmically treating variable impedance (stiffness, damping) modulation as an optimisation problem, and how techniques from graphical models and probabilistic inference can be used to tackle the problem efficiently. He showed several practical implementations on real hardware where this approach was used to ensure robust task achievement while minimising energy consumption.
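The essence of this formulation can be illustrated with a toy example. The sketch below is not Professor Vijayakumar's actual algorithm – the 1-DoF system, cost weights and optimiser choice are all illustrative assumptions – but it shows the same structure: impedance parameters (here, a stiffness profile over time) become the decision variables of an optimisation that trades off task achievement against an energy proxy.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1-DoF point mass tracking a step reference under variable stiffness.
# Decision variables: stiffness k_t at each of T timesteps (damping fixed).
# Cost: tracking error + energy proxy, illustrating the trade-off between
# robust task achievement and energy consumption.

def simulate(k, x_ref, dt=0.05, m=1.0, d=2.0):
    """Forward-simulate x'' = (k*(x_ref - x) - d*x') / m with Euler steps."""
    x, v = 0.0, 0.0
    traj = []
    for kt, r in zip(k, x_ref):
        a = (kt * (r - x) - d * v) / m
        v += a * dt
        x += v * dt
        traj.append(x)
    return np.array(traj)

def cost(k, x_ref, w_energy=1e-3):
    traj = simulate(k, x_ref)
    tracking = np.sum((traj - x_ref) ** 2)   # task term
    energy = w_energy * np.sum(k)            # crude proxy: stiffer costs more
    return tracking + energy

T = 40
x_ref = np.ones(T)                # step target
k0 = 10.0 * np.ones(T)            # initial stiffness profile
res = minimize(cost, k0, args=(x_ref,), bounds=[(0.0, 100.0)] * T,
               method="L-BFGS-B")
k_opt = res.x                     # optimised time-varying stiffness
```

In practice far richer formulations are used (full rigid-body dynamics, probabilistic inference over trajectories rather than a generic gradient-based solver), but the pattern of optimising an impedance schedule against a task-plus-energy cost is analogous.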
Professor Vijayakumar also briefly touched upon the need for learning technologies (machine learning for incremental dynamics estimation) that can be combined with optimal control to adaptively deal with changes in the environment.
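A minimal sketch of what "incremental" means here, assuming a simple recursive least squares (RLS) estimator rather than any specific method from the lecture: each new observation updates the model in place, without re-fitting from scratch, so a controller can track drifting dynamics online.

```python
import numpy as np

class RecursiveLeastSquares:
    """Incremental linear estimator: y ≈ phi @ w, updated one sample at a time."""

    def __init__(self, dim, forgetting=0.99):
        self.w = np.zeros(dim)          # parameter estimate
        self.P = np.eye(dim) * 1e3      # inverse-covariance-like matrix (large = uncertain)
        self.lam = forgetting           # <1 discounts stale data, enabling adaptation

    def update(self, phi, y):
        """Incorporate one (features, target) observation."""
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        err = y - phi @ self.w          # prediction error before the update
        self.w += gain * err
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam
        return err

# Usage: recover a linear model y = 2*x1 - 1*x2 from streaming samples.
rng = np.random.default_rng(0)
rls = RecursiveLeastSquares(dim=2)
true_w = np.array([2.0, -1.0])
for _ in range(200):
    phi = rng.normal(size=2)
    rls.update(phi, true_w @ phi)
```

The forgetting factor is what makes such estimators suitable for adaptive control: when the true dynamics change, old evidence decays and the estimate re-converges to the new regime.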
He then explained his latest project (begun in 2016), a collaboration with the NASA Johnson Space Center on the Valkyrie humanoid robot, which is being prepared for unmanned robotic pre-deployment missions to Mars. He focused on whole-body motion planning algorithms as well as uneven-terrain footstep planning and dyadic collaborations – all developed using a bi-level hybrid optimisation framework that optimises contacts and timing and is capable of significant adaptation to changing environments.
Finally, on the scientific front, he touched upon another domain in which his group conducts world-leading research: shared control for upper- and lower-limb prosthetics. He showed videos of upper-limb prosthetics being tested with NHS amputees, as well as pilot studies on the use of exoskeletons to assist patients with lower-limb injuries or stroke.
Professor Vijayakumar concluded the scientific part of the presentation by highlighting key areas he could not cover, including questions relating to the safe, ethical, verifiable and explainable development of robotics and AI technologies. He explained that these are the questions he is tackling as Programme Director of The Alan Turing Institute.
He described living labs and public-private partnerships spanning academia, industry and government as an excellent model for responsible innovation in Robotics and Autonomous Systems, one he has been advocating through his leadership roles.
Professor Vijayakumar finished the talk by highlighting the new collaborative project he has initiated with AIRS (along with details of its three scientific pillars) and the opportunity for bright scientists in China to get involved through the CUHK-AIRS research programmes.
Further details of Professor Vijayakumar’s work, publications and research output can be accessed via his webpage and social media:
TED talk: https://youtu.be/kj4NZrdGQhs