Robustness in ML
The robustness of a model is a measure of its stability under perturbations of the input. We investigate and review recent developments in the field that affect the training, evaluation, and deployment of ML systems.
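As a minimal illustration of this definition (a sketch, not code from the series), one can estimate a model's stability empirically by sampling small input perturbations and recording the largest change in the output. The `model` below is a hypothetical stand-in, a fixed linear map, chosen only so the example is self-contained.

```python
import numpy as np

def model(x):
    # Hypothetical stand-in model: a fixed linear map.
    W = np.array([[2.0, -1.0], [0.5, 1.5]])
    return W @ x

def robustness_gap(f, x, eps=0.1, n_samples=100, seed=0):
    """Empirical robustness estimate: the largest output change observed
    under random input perturbations with L-infinity norm at most eps."""
    rng = np.random.default_rng(seed)
    base = f(x)
    worst = 0.0
    for _ in range(n_samples):
        delta = rng.uniform(-eps, eps, size=x.shape)
        worst = max(worst, float(np.max(np.abs(f(x + delta) - base))))
    return worst

x = np.array([1.0, -2.0])
gap = robustness_gap(model, x, eps=0.05)
```

Random sampling gives only a lower bound on the worst-case output change; adversarial methods search for perturbations that maximize it.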