A Class of High Order Tuners for Adaptive Systems

IEEE CONTROL SYSTEMS LETTERS, VOL. 5, NO. 2, APRIL 2021

Joseph E. Gaudio, Graduate Student Member, IEEE, Anuradha M. Annaswamy, Fellow, IEEE, Michael A. Bolender, Eugene Lavretsky, Fellow, IEEE, and Travis E. Gibson

Abstract—Parameter estimation algorithms using higher order gradient-based methods are increasingly sought after in machine learning. Such methods, however, may become unstable when regressors are time-varying. Inspired by techniques employed in adaptive systems, this letter proposes a new variational perspective to derive four higher order tuners with provable stability guarantees. This perspective includes concepts based on higher order tuners and normalization, and allows stability to be established for problems with time-varying regressors. The stability analysis builds on a novel technique, stemming from symplectic mechanics, that links Lagrangians and Hamiltonians to the underlying Lyapunov stability analysis, and is provided for common linear-in-parameter models.

Index Terms—Adaptive systems, uncertain systems.

I. INTRODUCTION

Modifications to gradient-based parameter update methods have been actively researched within both the machine learning and adaptive systems communities for decades, for optimization and control in the presence of uncertainties. Of particular note is the seminal higher order gradient method proposed by Nesterov [1], which has received significant attention not only in the optimization community [2] but also in the neural network learning community [3] due to its potential for accelerated learning. Variants of Nesterov's higher order method have become the standard for training deep neural networks [3]. To gain insight into Nesterov's

Manuscript received March 10, 2020; revised May 12, 2020; accepted June 1, 2020. Date of publication June 16, 2020; date of current version June 30, 2020. This work was supported in part by the Air Force Research Laboratory, Collaborative Research and Development for Innovative Aerospace Leadership, Thrust 3—Control Automation and Mechanization under Grant FA 8650-16-C-2642, and in part by the Boeing Strategic University Initiative. Recommended by Senior Editor M. Guay. (Corresponding author: Joseph E. Gaudio.) Joseph E. Gaudio and Anuradha M. Annaswamy are with the Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, MA 02139 USA (e-mail: jegaudio@mit.edu; aanna@mit.edu). Michael A. Bolender is with the Autonomous Control Branch, Air Force Research Laboratory, Wright-Patterson AFB, OH 45433 USA (e-mail: michael.bolender@us.af.mil). Eugene Lavretsky is with the BR&T, The Boeing Company, Huntington Beach, CA 92647 USA (e-mail: eugene.lavretsky@boeing.com). Travis E. Gibson is with the Department of Pathology, Harvard Medical School, Boston, MA 02115 USA (e-mail: tegibson@bwh.harvard.edu). Digital Object Identifier 10.1109/LCSYS.2020.3002513

method, which is a difference equation, several recent results have leveraged a variational approach showing that, in continuous time, there exists a broad class of higher order methods with fast convergence rates [4]. In all of the aforementioned work, while the parameter update algorithm is time-varying, the regressors in the problem statement are assumed to be constant.

While almost all problems in adaptive control have time-varying regressors, machine learning research has focused, by and large, on constant regressors (see [4] and references therein). Any application of machine learning techniques to safety-critical problems will necessarily require the consideration of time-varying regressors, which arise whenever input features vary with time, as in time-series prediction, recurrent networks with time-varying inputs, and online learning and optimization [5], [6]. While much of the adaptive systems community has focused on first order parameter update laws [7]–[9], one notable exception is the "high-order tuner" proposed by Morse [10], which has been useful in providing stable algorithms for time-delay systems [11] in the adaptive control setting.

The algorithms we develop are based on these high-order tuners, are applicable to both machine learning and adaptive control problems, and use a variational perspective to unite the algorithm derivation and Lyapunov stability analysis. We begin with a discussion of algorithm parameterization, where a regressor-based parameterization is proposed in place of the time-based parameterization common in machine learning methods [4]. Two higher order tuner algorithms, one of which is considered in [10], are then derived from a unified Lagrangian approach which relates the potential, kinetic, and damping characteristics of the algorithm.
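As a generic point of reference for the higher order methods discussed above, a minimal sketch of a momentum-style second order update is shown below: the damped "heavy-ball" ODE theta'' + beta*theta' + grad L(theta) = 0 is written as two first order ODEs and integrated numerically. This is an illustrative example only, not the letter's tuner; the function names, the quadratic loss, and all gain values are assumptions chosen for the sketch.

```python
import numpy as np

def semi_implicit_euler(grad, theta0, beta=1.0, h=0.1, steps=200):
    """Integrate theta'' + beta*theta' + grad(theta) = 0 by splitting it
    into two first order ODEs (theta' = v, v' = -beta*v - grad(theta))
    and stepping with semi-implicit Euler: update v first, then use the
    new v to update theta. Gains here are illustrative, not tuned."""
    theta = np.asarray(theta0, dtype=float)
    v = np.zeros_like(theta)
    for _ in range(steps):
        v = v + h * (-beta * v - grad(theta))  # momentum (velocity) update
        theta = theta + h * v                  # parameter update with new v
    return theta

# Quadratic loss L(theta) = 0.5 * ||theta||^2, so grad(theta) = theta;
# the iterate converges toward the minimizer at the origin.
theta_final = semi_implicit_euler(lambda th: th, [5.0, -3.0])
```

With a constant regressor (here a fixed quadratic loss) this second order dynamic converges smoothly; the stability questions addressed in this letter arise precisely when the loss, through its regressor, varies with time.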
We proceed to a discussion of the novelty of the proposed implementation of the derived algorithms, obtained by splitting a second order ordinary differential equation (ODE) into two first order ODEs and relating Lagrangians to Hamiltonians. The Hamiltonian perspective allows for a discussion of symplectic forms of the equations, which in turn enables the design of two additional higher order tuners amenable to symplectic discretization techniques [12], [13]. A detailed stability analysis follows, in which loss functions and Lyapunov functions are connected to the variational perspectives; the analysis is provided for two classes of adaptive systems. The main contributions of this letter are (i) the derivation of a class of high-order tuners (HT) that are proved to be

2475-1456 © 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications/rights/index.html for more information.


