Structure-preserving deep learning

Date
2021-10
Publisher
Cambridge University Press
Rights
CC BY 4.0
Abstract
Over the past few years, deep learning has risen to the foreground as a topic of massive interest, mainly as a result of successes obtained in solving large-scale image processing tasks. There are multiple challenging mathematical problems involved in applying deep learning: most deep learning methods require the solution of hard optimisation problems, and a good understanding of the trade-off between computational effort, amount of data and model complexity is required to successfully design a deep learning approach for a given problem. Much of the progress made in deep learning has been based on heuristic exploration, but there is a growing effort to mathematically understand the structure in existing deep learning methods and to systematically design new deep learning methods that preserve certain types of structure. In this article, we review a number of these directions: some deep neural networks can be understood as discretisations of dynamical systems; neural networks can be designed to have desirable properties such as invertibility or group equivariance; and new algorithmic frameworks based on conformal Hamiltonian systems and Riemannian manifolds have been proposed to solve the optimisation problems. We conclude our review of each of these topics by discussing some open problems that we consider to be interesting directions for future research.
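To illustrate the first of these directions, the abstract's statement that some deep neural networks can be understood as discretisations of dynamical systems, the following is a minimal sketch (not code from the paper) of the standard observation that a residual layer x_{k+1} = x_k + h f(x_k; W_k, b_k) is one explicit Euler step of the ODE dx/dt = f(x; theta(t)). The tanh activation, step size h, layer width and the function names are illustrative assumptions.

```python
import numpy as np

def residual_block(x, W, b, h=0.1):
    """One residual layer, read as one forward Euler step of size h."""
    return x + h * np.tanh(W @ x + b)

def resnet_forward(x0, weights, biases, h=0.1):
    """Composing residual layers = integrating the ODE over len(weights) steps."""
    x = x0
    for W, b in zip(weights, biases):
        x = residual_block(x, W, b, h)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, depth = 4, 10
    weights = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(depth)]
    biases = [np.zeros(d) for _ in range(depth)]
    x0 = rng.standard_normal(d)
    print(resnet_forward(x0, weights, biases))
```

Under this reading, structure-preserving choices of the vector field f or of the discretisation (for example, symplectic or invertible schemes) translate into architectural properties of the network, which is the viewpoint the article reviews.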
Keywords
Deep learning, ordinary differential equations, optimal control, structure-preserving methods
Citation
Celledoni, E., Ehrhardt, M. J., Etmann, C., McLachlan, R. I., Owren, B., Schönlieb, C.-B., & Sherry, F. (2021). Structure-preserving deep learning. European Journal of Applied Mathematics, 32(5), 888-936.