I like ML, in particular deep learning optimization, efficiency, and benchmarks!
Here you can find a copy of my CV.
You can also find me on:
- October, 2024. 📄 Our paper on Loss Landscape Characterization of Neural Networks has been accepted to NeurIPS 2024! We introduce a new function class that better captures neural network loss landscapes, ensures convergence for several SGD-based algorithms, and applies across a range of deep learning tasks!
- August, 2024. 🎉 Our submission to AlgoPerf placed third 🥉 in the inaugural benchmark results, and first among non-industry submissions! Check out the MLCommons blog post and our submissions in the official repo.
- July, 2024. 📂 Released plainLM, a minimal open-source repository for pre-training Transformers on language modeling. It is written in plain PyTorch, supports distributed training, and contains a minimal Transformer implementation with RoPE, RMSNorm, and GLU.
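For a flavor of the building blocks mentioned above, here is a minimal RMSNorm layer in plain PyTorch. This is an illustrative sketch of the generic technique, not necessarily the exact code in plainLM.

```python
import torch
import torch.nn as nn

class RMSNorm(nn.Module):
    """Root-mean-square layer norm: rescale by the RMS of the features, no mean subtraction."""

    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))  # learned per-feature scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize over the last (feature) dimension, then apply the learned scale.
        rms = torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + self.eps)
        return self.weight * (x * rms)

# Hypothetical usage on a batch of token embeddings of shape (batch, sequence, hidden).
x = torch.randn(2, 16, 512)
y = RMSNorm(512)(x)
```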
Conformal Prediction Bands for Two-Dimensional Functional Time Series
Ajroldi, Diquigiovanni, Fontana, Vantini, Computational Statistics & Data Analysis, 2023.
We develop algorithms to forecast time-evolving surfaces and quantify their prediction uncertainty. We introduce estimation techniques for functional autoregressive models and revisit distribution-free uncertainty quantification methods for this setting.
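As background, the distribution-free bands above build on the split-conformal recipe: calibrate a quantile of held-out forecast errors and widen the point forecast by it. The sketch below shows the scalar version only; the function names and data are hypothetical, and the paper's method for two-dimensional functional data is considerably more involved.

```python
import numpy as np

def split_conformal_halfwidth(cal_residuals: np.ndarray, alpha: float = 0.1) -> float:
    """Half-width of a (1 - alpha) prediction band from held-out absolute residuals."""
    n = len(cal_residuals)
    # Conformal quantile: the ceil((n + 1) * (1 - alpha))-th smallest calibration residual.
    rank = int(np.ceil((n + 1) * (1 - alpha)))
    if rank > n:
        return float("inf")  # too few calibration points for this coverage level
    return float(np.sort(cal_residuals)[rank - 1])

# Hypothetical usage: widen a point forecast into a band with ~90% coverage.
rng = np.random.default_rng(0)
residuals = np.abs(rng.normal(size=200))   # |y - y_hat| on a calibration set
q = split_conformal_halfwidth(residuals, alpha=0.1)
forecast = 3.2                             # some point forecast
band = (forecast - q, forecast + q)
```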
Continuous and early prediction of Acute Kidney Injury in critically ill patients
Alfieri, Ancona, Tripepi, Rubeis, Ajroldi, Finazzi, Cauda, Fagugli, PLOS ONE, 2023.
We propose a novel ML model to continuously predict Acute Kidney Injury episodes in Intensive Care Units using routinely available data. The model is tested through a multi-center, multi-national external validation procedure.