10–12 May 2023
Gran Sasso Science Institute, L'Aquila
Europe/Rome timezone

Training of Stable Neural Ordinary Differential Equations

Not scheduled
20m
Gran Sasso Science Institute, L'Aquila
Via Michele Iacobucci 2, 67100, L'Aquila
Poster

Speaker

Arturo De Marinis (Gran Sasso Science Institute)

Description

Neural ordinary differential equations (neural ODEs) are a new family of deep neural networks.
Essentially, a neural ODE is a differential equation whose vector field is a neural network.
Including a neural ODE as part of a machine learning model can make the model more memory-efficient than a standard one. Indeed, the neural ODE block of the model can be trained with the adjoint sensitivity method, which computes the gradients for gradient descent while avoiding the memory cost of storing the intermediate activations required by classical backpropagation.
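As an illustrative sketch (our notation, not taken from the poster), the neural ODE and the adjoint equations of [1] can be written as
\[
\dot{x}(t) = f_\theta(x(t), t), \qquad x(0) = x_0, \qquad t \in [0, T],
\]
where the vector field $f_\theta$ is a neural network with parameters $\theta$. For a loss $L(x(T))$, the adjoint state $a(t) = \partial L / \partial x(t)$ solves the backward equation
\[
\dot{a}(t)^\top = -\, a(t)^\top \frac{\partial f_\theta}{\partial x}(x(t), t), \qquad a(T) = \frac{\partial L}{\partial x(T)},
\]
and the parameter gradient is recovered as
\[
\frac{dL}{d\theta} = -\int_T^0 a(t)^\top \frac{\partial f_\theta}{\partial \theta}(x(t), t)\, dt,
\]
so the whole gradient is obtained by solving ODEs, without storing the forward trajectory.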
Our contribution to this field is a study of the stability and contractivity of the neural ODE block, viewed as a differential equation, with the aim of designing training strategies that make the overall machine learning model robust and stable against adversarial attacks.
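For illustration (again our notation, not taken from the poster), a classical sufficient condition for contractivity of such an ODE is a nonpositive logarithmic norm of the Jacobian of the vector field,
\[
\mu\!\left(\frac{\partial f_\theta}{\partial x}(x, t)\right) \le 0 \qquad \text{for all } x \text{ and } t,
\]
in which case any two solutions satisfy $\|x(t) - \tilde{x}(t)\| \le \|x(0) - \tilde{x}(0)\|$ for all $t \ge 0$, so small (e.g. adversarial) perturbations of the input cannot be amplified by the neural ODE block.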
This poster is based on [1], [2] and [3].

[1] Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt and David K. Duvenaud, Neural Ordinary Differential Equations, Advances in Neural Information Processing Systems 31, 2018.
[2] Eldad Haber and Lars Ruthotto, Stable Architectures for Deep Neural Networks, Inverse Problems 34(1), 2018.
[3] Ian J. Goodfellow, Jonathon Shlens and Christian Szegedy, Explaining and Harnessing Adversarial Examples, International Conference on Learning Representations, 2015.

Primary authors

Arturo De Marinis (Gran Sasso Science Institute)
Nicola Guglielmi (Gran Sasso Science Institute)
Anton Savostianov (Gran Sasso Science Institute)
Francesco Tudisco (GSSI)
Emanuele Zangrando (Gran Sasso Science Institute)
