Very pleased to share this one! Federated learning, that is, the fitting of statistical or machine learning models on disparate datasets whilst preserving privacy by never having the individual datasets leave their source, is an area of massive potential. Preprint Link : #machinelearning #statistics #bayesian

Very happy to announce that our paper "Transport Reversible Jump Proposals" has been accepted to this year's #aistats conference! This one was a lot of fun; here's the brief story of the idea that Laurence Davies and I dreamed up.

Transdimensional sampling in Bayesian statistics/machine learning is an elegant approach to averaging over a set of candidate models, yielding uncertainty quantification not only over model predictions, but also over the models themselves. This is a conceptually very appealing approach, but it is notoriously hard to get working in practice, and specially tailored approaches often need to be developed on a case-by-case basis.

Our paper proposes the generally applicable idea of using a central "world" through which to travel when jumping between models, applying ideas from deep generative models (specifically, normalising flows) to "warp" each model's distribution to the shared world. Some nice maths about why this is a good idea (in an idealised setting where you know how to "warp" perfectly, it is in fact an optimal approach) and simulation examples also feature!
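To make the "central world" idea concrete, here is a minimal, hypothetical sketch in Python. It is not the paper's implementation: the two "models" are toy Gaussians, and the perfect "warps" are simple whitening maps standing in for learned normalising flows; all names (`to_ref`, `from_ref`, the model parameters) are illustrative assumptions.

```python
# Sketch of a transport reversible-jump proposal between two Gaussian
# "models" of different dimension, via a shared standard-normal "world".
import numpy as np

rng = np.random.default_rng(0)

# Toy posteriors: model 1 is 1-D, model 2 is 2-D (illustrative values).
mu1, s1 = np.array([2.0]), np.array([0.5])
mu2, s2 = np.array([-1.0, 3.0]), np.array([1.0, 2.0])

# For Gaussians, the perfect "warp" to the reference world is just a
# whitening map; in general this role is played by a normalising flow.
def to_ref(x, mu, s):   return (x - mu) / s    # model space -> world
def from_ref(z, mu, s): return mu + s * z      # world -> model space

# Jump from model 1 up to model 2:
x1 = np.array([2.3])                  # current state in model 1
z = to_ref(x1, mu1, s1)               # travel to the shared world
z_aux = rng.standard_normal(1)        # fill the missing dimension
z_full = np.concatenate([z, z_aux])
x2 = from_ref(z_full, mu2, s2)        # land in model 2's space

# The reverse jump simply drops the auxiliary coordinate in the world,
# so the move is exactly reversible.
x1_back = from_ref(to_ref(x2, mu2, s2)[:1], mu1, s1)
assert np.allclose(x1_back, x1)
```

Because both warps are exact here, the proposed point is already distributed according to model 2's posterior, which is the idealised setting in which the paper shows the approach is optimal.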