Our new collaborative paper (Godaz et al., 2021) on an exciting, if somewhat theoretical, method for speeding up a common optimization algorithm has been accepted to ACML 2021.

RLBFGS is a widely used quasi-Newton method for optimization on Riemannian manifolds, but it is computationally expensive. In this paper we show how altering two mappings in the algorithm (the vector transport and its adjoint) yields a significant computational speed-up with very little loss in accuracy.
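For readers unfamiliar with the method, here is a minimal sketch of the classic Euclidean L-BFGS two-loop recursion that RLBFGS generalizes. This is not the paper's algorithm: it is standard textbook L-BFGS, included only to show where the expensive Riemannian mappings enter. In the Riemannian variant, every stored curvature pair must be carried into the current tangent space by a vector transport, and the adjoint of that transport appears in the update, which is exactly the cost the paper targets.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Euclidean L-BFGS two-loop recursion: approximates H_k @ grad,
    where H_k is the inverse-Hessian estimate built from the stored
    curvature pairs (s_i, y_i).

    In RLBFGS, each stored (s_i, y_i) pair would first have to be
    vector-transported into the current tangent space (and the adjoint
    transport applied in the update), which dominates the per-iteration
    cost; here, in flat Euclidean space, no transport is needed.
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Standard initial scaling of the inverse-Hessian estimate
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return r
```

With an empty history the recursion reduces to (scaled) gradient descent; each stored pair refines the curvature estimate. The Riemannian version interleaves a transport step before both loops, so cheapening or removing that mapping directly reduces per-iteration cost.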

ACML is the Asian Conference on Machine Learning; the 2021 edition received 378 submissions and had an acceptance rate of around 30% under double-blind review.