News

  •     Infinite Infimal Convolution Regularisation -- preprint

    Together with Kristian Bredies, Marcello Carioni, Martin Holler, and Carola Schönlieb, we propose a regularisation functional based on an infimal convolution of a continuum of convex functionals and use it, for example, to learn dominant anisotropy directions in an image. The preprint can be found here: arXiv:2304.08628
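
    For background, the classical infimal convolution of two convex functionals $F$ and $G$ on a Banach space $X$ is

    $$ (F \,\Box\, G)(u) = \inf_{v \in X} \big\{ F(v) + G(u - v) \big\}. $$

    Very loosely, and with notation that is only indicative of the preprint, we replace the two summands by a whole family $\{F_\theta\}_{\theta \in \Theta}$ (indexed, e.g., by anisotropy directions) and the pairwise infimum by one over decompositions of $u$ into contributions from the entire family; the precise definition is in the preprint.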

  •     Quondam Fellowship of Hughes Hall, University of Cambridge

    I was elected a Quondam Fellow of Hughes Hall, a Cambridge college where I spent the last four years as a Research Fellow. This is a lifelong fellowship that will allow me to stay in touch and remain involved in college matters. I am very glad to continue playing a part!

  •     Joining the University of Bath

    I am very excited to start my post as Lecturer in Mathematics and Data Science at the University of Bath on 1st September! Looking forward to joining a friendly and forward-looking department. Thank you to everyone who helped me!

  •     Analytic and Geometric Approaches to Machine Learning Symposium

    I presented my paper on approximation properties of vector-valued neural networks at the Analytic and Geometric Approaches to Machine Learning Symposium in Bath (online).
    Slides are available here.

  •     Neural networks with values in a Banach space

    I submitted a preprint on two-layer neural networks that take values in a Banach space. It extends the results of Bach (2017), E, Ma and Wu (2019), E and Wojtowytsch (2020), and others on approximation rates of infinitely wide scalar-valued two-layer neural networks and establishes Monte-Carlo rates in Bochner spaces. The most unexpected result (for me) is that in the vector-valued case, continuity of such neural networks can only be established with respect to the weak$^*$ topology in the target space. This turns out to be a significant restriction for networks with the ReLU activation function.
    The preprint can be found here: arXiv:2105.02095
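
    To fix ideas (the notation here is a sketch rather than exactly that of the preprint): a finite two-layer network with values in a Banach space $Y$ has the form $f_N(x) = \frac{1}{N}\sum_{i=1}^N a_i\, \sigma(\langle w_i, x\rangle + b_i)$ with outer coefficients $a_i \in Y$, and its infinitely wide counterpart can be written as an expectation

    $$ f(x) = \mathbb{E}_{(a,w,b)\sim\mu}\big[\, a\, \sigma(\langle w, x\rangle + b) \,\big]. $$

    A Monte-Carlo rate then means that $f_N$ approximates $f$ at the rate $N^{-1/2}$, measured in a Bochner norm such as $L^2(\rho; Y)$.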

  •     Inverse problems, Banach lattices and general fidelity functions

    Our paper with Leon Bungert, Martin Burger and Carola Schönlieb on Variational regularisation for inverse problems with imperfect forward operators and general noise models has appeared in Inverse Problems. We analyse inverse problems where the operator contains errors that can be described by an interval in a Banach lattice (e.g., kernel of an integral operator with pointwise errors) and the data are corrupted by noise that can be described using some data fidelity function. Our results apply to, e.g., Gaussian, salt-and-pepper, Poisson and mixed noise.
    The paper is open access and can be found here: DOI:10.1088/1361-6420/abc531
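
    Schematically, the setting is a variational problem of the form

    $$ \min_{u} \; \Phi\big(A u, y^\delta\big) + \alpha J(u), $$

    where $\Phi$ is a general data fidelity adapted to the noise model, $J$ is a regulariser, and the forward operator $A$ is only known to lie in an order interval $A_{\rm l} \leq A \leq A_{\rm u}$ of a Banach lattice (e.g., pointwise bounds on the kernel of an integral operator); the precise formulation and assumptions are in the paper.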

  •     Deep learning and non-linear spectral analysis - NeurIPS 2020

    Our paper with Tamara Grossmann, Guy Gilboa and Carola Schönlieb on Deeply Learned Spectral Total Variation Decomposition has been accepted to NeurIPS 2020. Congratulations, Tamara, on your great work!
    We compute a non-linear Total Variation decomposition of an image 10,000 times faster than classical methods.
    Check it out here: arXiv:2006.10004
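
    For background on what the network learns: in Gilboa's spectral TV framework one evolves the TV gradient flow and, roughly, reads off a spectral component at each scale $t$ from its second time derivative,

    $$ \partial_t u(t) = -p(t), \quad p(t) \in \partial\,\mathrm{TV}(u(t)), \quad u(0) = f, \qquad \phi(t) = t\, \partial_{tt} u(t), $$

    with $f$ recovered (up to its mean) by integrating $\phi(t)$ over all scales. Solving the flow is what makes the classical computation slow; the network is trained to produce the decomposition directly.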

  •     EPSRC Postdoctoral Fellowship

    I was awarded an EPSRC Fellowship! The title is “Regularisation theory in the data driven setting”. My goal is to extend regularisation theory to the setting where there is no direct access to the forward operator at the time of solving the inverse problem and only input-output training pairs are available. Such pairs can be either collected experimentally or obtained from a computationally expensive model prior to solving the inverse problem. The latter scenario is relevant for time-sensitive applications where near real-time reconstructions are required.
    Furthermore, the model-free setting is the natural habitat of neural networks, and my long-term goal is to better understand their regularisation properties in the context of ill-posed inverse problems in infinite dimensions.
    The fellowship is due to start in April 2021.

  •     Data driven regularisation

    Our paper with Andrea Aspri and Otmar Scherzer on Data Driven Regularization by Projection has appeared in Inverse Problems! We show that regularisation can be defined and rigorously studied in the setting where there is no numerical access to the forward operator and the operator is given only via input-output training pairs. Such pairs can be either collected experimentally or obtained from a computationally expensive model prior to solving the inverse problem (which may be useful in time-sensitive applications). We show connections to classical regularisation methods such as regularisation by projection and variational regularisation, study the stability of the reconstructions, and present numerical experiments on data-driven inversion of the Radon transform.
    The paper is open access and can be found here: DOI:10.1088/1361-6420/abb61b
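
    As a toy illustration of the access model (not the method analysed in the paper; all names and sizes below are made up): if the operator is only available through training pairs $y_i = A x_i$, a naive projection-type reconstruction expands new data $y$ in the span of the $y_i$ and maps the coefficients back to the $x_i$.

        import numpy as np

        def reconstruct_from_pairs(X_train, Y_train, y, reg=1e-6):
            """Toy data-driven reconstruction: given training pairs y_i = A x_i
            (columns of Y_train and X_train) and new data y, find coefficients c
            minimising ||Y_train c - y||^2 + reg ||c||^2 and return X_train c."""
            G = Y_train.T @ Y_train + reg * np.eye(Y_train.shape[1])  # regularised Gram matrix
            c = np.linalg.solve(G, Y_train.T @ y)                     # expansion coefficients
            return X_train @ c                                        # pull back to signal space

        rng = np.random.default_rng(0)
        A = rng.standard_normal((50, 30))                # "unknown" operator, used only to generate data
        X_train = rng.standard_normal((30, 20))          # 20 training signals (columns)
        Y_train = A @ X_train                            # corresponding measurements
        x_true = X_train @ rng.standard_normal(20)       # a signal in the span of the training data
        y = A @ x_true + 0.01 * rng.standard_normal(50)  # noisy measurement
        x_rec = reconstruct_from_pairs(X_train, Y_train, y)
        print(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))

    The paper is about when and in what sense such data-driven schemes define convergent regularisation methods; the sketch above only illustrates the type of information that is available.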

  •     SIAM Imaging Science 2020

    I gave a talk about our paper on variational regularisation in Banach lattices at the SIAM Imaging Science conference. Thank you, Nicolai Riis, for inviting me to your mini-symposium! And many thanks to the organisers, Michael Elad and Stacey Levine, for running the conference under the current difficult circumstances.
    Slides are available here.

  •     Deep learning and non-linear spectral analysis

    Our preprint with Tamara Grossmann, Guy Gilboa and Carola Schönlieb on Deeply Learned Spectral Total Variation Decomposition is online! We compute a non-linear Total Variation decomposition of an image 10,000 times faster than classical methods.
    Check it out here: arXiv:2006.10004

  •     Inverse problems, Banach lattices and general fidelity functions

    Our preprint with Leon Bungert, Martin Burger and Carola Schönlieb on Variational regularisation for inverse problems with imperfect forward operators and general noise models is now online! We analyse inverse problems where the operator contains errors that can be described by an interval in a Banach lattice (e.g., kernel of an integral operator with pointwise errors) and the data are corrupted by noise that can be described using some data fidelity function. Our results apply to, e.g., Gaussian, salt-and-pepper, Poisson and mixed noise.
    Check it out here: arXiv:2005.14131

  •     Computing distance functions using gradient flows

    Our preprint with Leon Bungert and Martin Burger on L-infinity variational problems and relations to distance functions is now online! We show that the distance function is the unique ground state of a certain L-infinity variational problem and devise an efficient numerical algorithm for computing distance functions on graphs.
    Check it out here: arXiv:2001.07411
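
    For context (a standard fact, not a result of the preprint): the distance function $d(x) = \mathrm{dist}(x, \Omega)$ to a closed set $\Omega$ satisfies, in the viscosity sense, the eikonal equation

    $$ |\nabla d| = 1 \ \text{ in } \Omega^c, \qquad d = 0 \ \text{ on } \Omega. $$

    The preprint gives an alternative characterisation, as the unique ground state of an $L^\infty$ variational problem, which is what leads to the gradient-flow-based algorithm on graphs.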
