The paper presents a Lyapunov-based approach to derive weight adaptation laws for a deep residual neural network (ResNet)-based adaptive controller. The key highlights are:
Motivation: Deep neural network (DNN)-based controllers can compensate for unstructured uncertainties, but existing methods either use static DNN models or require offline training of inner-layer weights. This motivates the need for a ResNet-based adaptive controller with real-time weight adaptation.
Approach: The ResNet is expressed as a composition of building blocks involving a shortcut connection across a fully-connected DNN. A constructive Lyapunov-based approach is provided to derive weight adaptation laws for the ResNet using the gradient of each DNN building block.
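To make the building-block structure concrete, here is a minimal NumPy sketch of one block with an identity shortcut across a small two-layer inner network, plus a hand-derived gradient-based update of the form W_dot = gamma * (dPhi/dW)^T e. The tanh activation, the two-layer inner network, the scalar gain gamma, and the Euler discretization are illustrative assumptions; the paper's continuous-time laws and exact block structure differ.

```python
import numpy as np

def block_forward(x, W1, W2):
    """One ResNet building block: identity shortcut across an inner DNN,
    Phi(x) = x + W2 @ tanh(W1 @ x)."""
    return x + W2 @ np.tanh(W1 @ x)

def adaptation_step(x, e, W1, W2, gamma=0.1, dt=1e-3):
    """Euler step of illustrative adaptation laws W_dot = gamma * (dPhi/dW)^T e,
    with the block gradients computed by hand via the chain rule."""
    z = np.tanh(W1 @ x)                   # inner-layer features
    dz = 1.0 - z ** 2                     # elementwise tanh derivative
    W2_dot = gamma * np.outer(e, z)       # dPhi_i/dW2_jk = delta_ij * z_k
    W1_dot = gamma * np.outer(dz * (W2.T @ e), x)  # chain rule through tanh
    return W1 + dt * W1_dot, W2 + dt * W2_dot

# Usage: evaluate the block and adapt its weights from the tracking error.
rng = np.random.default_rng(0)
n, h = 2, 8                               # state and hidden dimensions
W1 = 0.1 * rng.standard_normal((h, n))
W2 = 0.1 * rng.standard_normal((n, h))
x = rng.standard_normal(n)                # current state
e = rng.standard_normal(n)                # tracking error (placeholder)
y = block_forward(x, W1, W2)
W1, W2 = adaptation_step(x, e, W1, W2)
```

A full ResNet composes several such blocks, with an analogous update for each block's weights obtained from that block's gradient.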
Novelty: This is the first result on Lyapunov-derived adaptation laws for ResNets, which pose additional mathematical challenges compared to fully-connected DNNs due to the shortcut connections.
Analysis: A nonsmooth Lyapunov-based analysis is provided to guarantee asymptotic tracking error convergence. The analysis ensures the system state remains within a compact domain where the universal function approximation property of the ResNet holds.
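For intuition on how such guarantees are typically obtained, the following is a representative Lyapunov candidate for adaptive-weight designs (illustrative; the paper's exact candidate and its nonsmooth analysis differ in detail), where e is the tracking error, \tilde{W} = W^* - \hat{W} the weight estimation error, and \Gamma a positive-definite adaptation gain:

```latex
V(e,\tilde{W}) = \tfrac{1}{2} e^{\top} e
  + \tfrac{1}{2}\operatorname{tr}\!\big(\tilde{W}^{\top}\Gamma^{-1}\tilde{W}\big),
\qquad \Gamma = \Gamma^{\top} \succ 0.
% Selecting the adaptation law \dot{\hat{W}} = \Gamma\,(\partial\Phi/\partial W)^{\top} e
% cancels the weight-error cross terms in \dot{V}, leaving a bound of the form
\dot{V} \le -\alpha \lVert e \rVert^{2}, \qquad \alpha > 0,
% so e \to 0 follows from a nonsmooth LaSalle--Yoshizawa-type corollary,
% provided the state remains in the compact set where the ResNet's
% approximation bound holds.
```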
Simulations: Comparative Monte Carlo simulations demonstrate that the ResNet-based adaptive controller provides a 64% improvement in tracking and function approximation performance compared to an equivalent fully-connected DNN-based adaptive controller.
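As a rough sketch of how such a comparison could be scored, the snippet below computes the percentage reduction in mean RMS tracking error across trials; run_trial is a hypothetical placeholder for simulating one closed-loop trial from a random initialization, and the paper's exact metrics and trial setup may differ.

```python
import numpy as np

def percent_improvement(rms_resnet, rms_dnn):
    """Percentage reduction in mean RMS tracking error of the ResNet
    controller relative to the DNN baseline (illustrative metric)."""
    return 100.0 * (1.0 - np.mean(rms_resnet) / np.mean(rms_dnn))

# Hypothetical usage over N Monte Carlo trials with shared random seeds:
# rms_resnet = [run_trial("resnet", seed=i) for i in range(N)]
# rms_dnn    = [run_trial("dnn", seed=i) for i in range(N)]
# print(f"{percent_improvement(rms_resnet, rms_dnn):.0f}% improvement")
```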
Key Advantages: The ResNet architecture mitigates the vanishing gradient problem that hampers deep fully-connected DNNs, enabling faster weight adaptation and better compensation of system uncertainties.
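A one-line chain-rule calculation illustrates the claim (generic notation, not the paper's): for a ResNet composed of k blocks Phi_j(x) = x + f_j(x),

```latex
\frac{\partial (\Phi_k \circ \cdots \circ \Phi_1)}{\partial x}
  = \prod_{j=k}^{1}\left( I + \frac{\partial f_j}{\partial x} \right),
% versus \prod_{j=k}^{1} \partial f_j / \partial x for a fully-connected DNN.
% Each residual factor contains the identity, so the product does not
% collapse toward zero when the individual \partial f_j / \partial x are
% small, keeping gradient-based adaptation signals well-scaled with depth.
```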
Key insights distilled from source content by Omkar Sudhir... at arxiv.org, 04-12-2024
https://arxiv.org/pdf/2404.07385.pdf