Speaker
Description
Neural ordinary differential equations (neural ODEs) are a recent family of deep neural networks.
Essentially, a neural ODE is an ordinary differential equation whose vector field is parametrized by a neural network.
Including a neural ODE block in a machine learning model can make training more memory-efficient than in a standard architecture: the block can be trained with the adjoint sensitivity method, which obtains the gradients needed for gradient descent by solving a second ODE backwards in time, avoiding the memory cost of storing intermediate activations as in classical backpropagation.
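To make the definition above concrete, here is a minimal sketch of the forward pass of a neural ODE block, assuming a toy one-layer vector field and a fixed-step explicit Euler integrator (the function and parameter names are hypothetical; practical implementations use adaptive ODE solvers and the adjoint method for the backward pass):

```python
import numpy as np

def neural_vector_field(h, t, W, b):
    # A tiny one-layer "neural network" vector field f(h, t; theta),
    # with trainable parameters W and b (toy illustrative setup).
    return np.tanh(W @ h + b)

def neural_ode_block(h0, W, b, t0=0.0, t1=1.0, steps=100):
    # Forward pass: integrate dh/dt = f(h, t; theta) from t0 to t1
    # with fixed-step explicit Euler; the final state h(t1) plays the
    # role of the layer output.
    h = h0.copy()
    dt = (t1 - t0) / steps
    for k in range(steps):
        t = t0 + k * dt
        h = h + dt * neural_vector_field(h, t, W, b)
    return h

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 4))
b = np.zeros(4)
h0 = rng.standard_normal(4)
h1 = neural_ode_block(h0, W, b)
```

Note that the state keeps the same dimension throughout the integration, which is one structural difference between a neural ODE block and a generic stack of layers.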
Our contribution to this field is a study of the stability and contractivity of the neural ODE block, viewed as a differential equation, with the aim of designing training strategies that make the overall machine learning model robust and stable against adversarial attacks.
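One stability-oriented design in this spirit, proposed in [2], constrains the weight matrix of the vector field to be antisymmetric, so that its eigenvalues are purely imaginary and the linearized dynamics neither blow up nor decay. A small sketch of this check (the variable names are illustrative, not from the poster):

```python
import numpy as np

rng = np.random.default_rng(1)
K = rng.standard_normal((4, 4))
W = K - K.T  # antisymmetric by construction: W.T == -W

# Eigenvalues of a real antisymmetric matrix lie on the imaginary axis,
# which promotes stable forward propagation of dh/dt = tanh(W h + b).
eigs = np.linalg.eigvals(W)
```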
This poster is based on [1], [2] and [3].
[1] Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt and David K. Duvenaud, Neural Ordinary Differential Equations, Advances in Neural Information Processing Systems 31, 2018.
[2] Eldad Haber and Lars Ruthotto, Stable Architectures for Deep Neural Networks, Inverse Problems 34(1), 2018.
[3] Ian J. Goodfellow, Jonathon Shlens and Christian Szegedy, Explaining and Harnessing Adversarial Examples, International Conference on Learning Representations, 2015.