Posts by Collection

portfolio

publications

Towards Automated Satellite Conjunction Management with Bayesian Deep Learning

Published in AI4EarthSciences, NeurIPS 2020 Workshop, 2020

After decades of space travel, low Earth orbit is a junkyard of discarded rocket bodies, dead satellites, and millions of pieces of debris from collisions and explosions. At orbital speeds of around 28,000 km/h, collisions can generate fragments and potentially trigger a cascade of further collisions known as the Kessler syndrome. As commercial entities place mega-constellations of satellites in orbit, the burden on operators conducting collision avoidance manoeuvres will increase. For this reason, the development of automated tools that predict potential collision events (conjunctions) is critical. We introduce a Bayesian deep learning approach to this problem.
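
The paper's full pipeline is specific to conjunction data messages, but the core ingredient is a neural network that reports uncertainty alongside its prediction. The sketch below illustrates that idea using Monte Carlo dropout, one common approximation to Bayesian deep learning; the feature count, network sizes, and predicted quantity are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: Monte Carlo dropout as an approximate Bayesian neural
# network for conjunction risk regression. Inputs and sizes are illustrative.
import torch
import torch.nn as nn

class MCDropoutRegressor(nn.Module):
    def __init__(self, n_features: int, p: float = 0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(), nn.Dropout(p),
            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p),
            nn.Linear(64, 1),  # e.g. a predicted collision risk metric
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def predict_with_uncertainty(model, x, n_samples: int = 100):
    """Keep dropout active at test time and average repeated forward passes."""
    model.train()  # enables dropout; in practice, freeze any batch-norm layers
    samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)  # predictive mean and spread

# Usage with dummy feature vectors standing in for conjunction data messages
model = MCDropoutRegressor(n_features=16)
x = torch.randn(8, 16)
mean, std = predict_with_uncertainty(model, x)
```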

Recommended citation: Francesco Pinto, Giacomo Acciarini, Sascha Metz, Sarah Boufelja, Sylvester Kaczmarek, Klaus Merz, Jose A. Martinez-Heras, Francesca Letizia, Christopher Bridges, and Atılım Gunes Baydin (2020). "Towards Automated Satellite Conjunction Management with Bayesian Deep Learning." AI for Earth Sciences Workshop, NeurIPS 2020. https://arxiv.org/abs/2012.12450

Spacecraft Collision Risk Assessment with Probabilistic Programming

Published in MLPS Workshop, NeurIPS 2020 Workshop, 2020

We build a novel physics-based probabilistic generative model for synthetically generating conjunction data messages, calibrated using real data. By conditioning on observations, we use the model to obtain posterior distributions via Bayesian inference. We show that the probabilistic programming approach to conjunction assessment can help in making predictions and in finding the parameters that explain the observed data in conjunction data messages, thus shedding more light on key variables and orbital characteristics that are more likely to lead to conjunction events. Moreover, our technique enables the generation of physically accurate synthetic datasets of collisions, answering a fundamental need of the space and machine learning communities working in this area.
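
The paper's generative model is physics-based and far richer than anything reproducible here; the toy below only illustrates the probabilistic-programming pattern the abstract describes: write down a generative model, condition on an observation, and recover a posterior by Bayesian inference. All variables, distributions, and numbers are illustrative stand-ins.

```python
# Toy condition-then-infer example using plain importance sampling.
import numpy as np

rng = np.random.default_rng(0)

OBS_NOISE_KM = 0.5  # assumed measurement noise on the reported miss distance

def sample_prior(n):
    """Prior over a latent 'true miss distance' in km (illustrative)."""
    return rng.exponential(scale=5.0, size=n)

def log_likelihood(observed_km, true_km):
    """Gaussian observation model: reported value = true value + noise."""
    return -0.5 * ((observed_km - true_km) / OBS_NOISE_KM) ** 2

def posterior_mean(observed_km, n_samples=200_000):
    """Condition on one observed miss distance via importance weighting."""
    samples = sample_prior(n_samples)
    log_w = log_likelihood(observed_km, samples)
    w = np.exp(log_w - log_w.max())          # stabilised importance weights
    return float(np.sum(w * samples) / np.sum(w))

print(posterior_mean(observed_km=1.2))       # posterior estimate given the data
```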

Recommended citation: Francesco Pinto, Giacomo Acciarini, Sascha Metz, Sarah Boufelja, Sylvester Kaczmarek, Klaus Merz, Jose A. Martinez-Heras, Francesca Letizia, Christopher Bridges, and Atılım Gunes Baydin (2020). "Spacecraft Collision Risk Assessment with Probabilistic Programming." MLPS Workshop, NeurIPS 2020. https://arxiv.org/pdf/2012.10260.pdf

Mix-MaxEnt: Improving Accuracy and Uncertainty Estimates of Deterministic Neural Networks

Published in DistShift, NeurIPS 2021 Workshop, 2021

We propose an extremely simple approach to regularize a single deterministic neural network to obtain improved accuracy and reliable uncertainty estimates. Our approach, on top of the cross-entropy loss, simply adds an entropy-maximization regularizer on the predictive distribution in the regions of the embedding space between the class clusters. This is achieved by synthetically generating between-cluster samples via the convex combination of two images from different classes and maximizing the entropy on these samples. Such data-dependent regularization guides the maximum likelihood estimation to prefer a solution that (1) maps out-of-distribution samples to high-entropy regions (creating an entropy barrier); and (2) is more robust to superficial input perturbations. We empirically demonstrate that Mix-MaxEnt consistently provides much improved classification accuracy, better-calibrated probabilities for in-distribution data, and reliable uncertainty estimates when exposed to situations involving domain shift and out-of-distribution samples.
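
A minimal sketch of the loss under my reading of the abstract: cross-entropy on the clean batch plus an entropy-maximization term on convex combinations of images from different classes. The Beta mixing distribution and the loss weight are illustrative choices, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def mix_maxent_loss(model, x, y, alpha=1.0, weight=1.0):
    # Standard cross-entropy on the original batch.
    ce = F.cross_entropy(model(x), y)

    # Pair each image with a shuffled partner and keep only cross-class pairs.
    perm = torch.randperm(x.size(0), device=x.device)
    cross = y != y[perm]
    n_mix = int(cross.sum())
    if n_mix == 0:
        return ce

    # Convex combination of the two images: the between-cluster samples.
    lam = torch.distributions.Beta(alpha, alpha).sample((n_mix,)).to(x.device)
    lam = lam.view(-1, *([1] * (x.dim() - 1)))
    x_mix = lam * x[cross] + (1.0 - lam) * x[perm][cross]

    # Maximize predictive entropy on the mixed samples
    # (i.e. subtract the entropy term from the total loss).
    log_p = F.log_softmax(model(x_mix), dim=1)
    entropy = -(log_p.exp() * log_p).sum(dim=1).mean()
    return ce - weight * entropy
```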

Recommended citation: Francesco Pinto, Harry Yang, Ser-Nam Lim, Philip H.S. Torr, and Puneet K. Dokania (2021). "Mix-MaxEnt: Improving Accuracy and Uncertainty Estimates of Deterministic Neural Networks." DistShift Workshop, NeurIPS 2021. https://openreview.net/pdf?id=hlVgM8XcssV

Are Vision Transformers Always More Robust Than Convolutional Neural Networks?

Published in DistShift, NeurIPS 2021 Workshop, 2021

Since Transformer architectures were popularised in Computer Vision, several papers have analysed their properties in terms of calibration, out-of-distribution detection, and data-shift robustness. Most of these papers conclude that Transformers, due to some intrinsic properties (presumably the lack of restrictive inductive biases and the computationally intensive self-attention mechanism), outperform Convolutional Neural Networks (CNNs). In this paper we question this conclusion: we show that CNNs pre-trained on large amounts of data are expressive enough to achieve robustness superior to that of current Transformers. Moreover, in some relevant cases, CNNs trained with a pre-training and fine-tuning procedure similar to the one used for Transformers exhibit competitive robustness. Our evidence suggests that, to fully understand this behaviour, researchers should focus on the interaction between pre-training, fine-tuning, and the specific inductive biases of the considered architectures. For this reason, we present some preliminary analyses that shed light on the impact of pre-training and fine-tuning on out-of-distribution detection and data-shift robustness.
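
For readers unfamiliar with how such comparisons are typically run, the sketch below shows two quantities commonly used in this line of work: expected calibration error (ECE) on in-distribution data and a max-softmax confidence score for out-of-distribution detection. The binning scheme is an illustrative choice, not necessarily the paper's exact evaluation protocol.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    """probs: (N, C) softmax outputs; labels: (N,) integer class targets."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            # Gap between average accuracy and average confidence in the bin,
            # weighted by the fraction of samples that fall into it.
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return ece

def max_softmax_ood_score(probs):
    """Lower maximum softmax confidence is treated as more likely OOD."""
    return -probs.max(axis=1)
```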

Recommended citation: Francesco Pinto, Philip H.S. Torr, and Puneet K. Dokania (2021). "Are Vision Transformers Always More Robust Than Convolutional Neural Networks?" DistShift Workshop, NeurIPS 2021. https://openreview.net/pdf?id=CSXa8LJMttt

talks

Spacecraft Collision Avoidance with Bayesian Deep Learning

Published:

With the number of objects in low Earth orbit increasing over time, managing potential collisions (conjunctions) is becoming more and more difficult. For this reason, the European Space Agency is investigating the possibility of applying machine learning to help automate the process of satellite conjunction management. In this talk, we present the work performed at the Frontier Development Lab on applying Bayesian deep learning to this problem.

teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.