Yury Korolev – Inverse problems, calculus of variations, mathematical imaging, mathematics of machine learning
Infinite Infimal Convolution Regularisation – preprint (24 April 2023)

Together with Kristian Bredies, Marcello Carioni, Martin Holler, and Carola Schönlieb, we propose a regularisation functional based on an infimal convolution of a continuum of convex functionals and use it, e.g., to learn dominant anisotropy directions in an image.
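Schematically, and in my own simplified notation rather than that of the preprint: the classical infimal convolution of two convex regularisers splits its argument into two parts, while the infinite version splits it into a continuum of parts indexed by a parameter $\theta$ (for instance, an anisotropy direction) and weighted by a measure $\mu$ that is optimised alongside the decomposition,

$$(R_1 \,\square\, R_2)(u) = \inf_{u_1 + u_2 = u} R_1(u_1) + R_2(u_2), \qquad R(u) = \inf\Big\{ \int_\Theta R_\theta(u_\theta)\,\mathrm{d}\mu(\theta) \;:\; \int_\Theta u_\theta\,\mathrm{d}\mu(\theta) = u \Big\}.$$

The precise definition, and the conditions under which it is well posed, are in the preprint.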
The preprint can be found here:
arXiv:2304.08628

Quondam Fellowship of Hughes Hall, University of Cambridge (6 December 2022)

I was elected a Quondam Fellow of Hughes Hall, a Cambridge college where I spent the last four years as a Research Fellow. This is a lifelong fellowship that will allow me to stay in touch and remain involved in college matters.
I am very glad to continue playing a part!

Joining the University of Bath (18 August 2022)

I am very excited to start my post as Lecturer in Mathematics and Data Science at the University of Bath on 1st September! I am looking forward to joining a friendly and forward-looking department. Thank you to everyone who helped me!

Analytic and Geometric Approaches to Machine Learning Symposium (29 July 2021)

I presented my paper on approximation properties of
vector-valued neural networks at the Analytic and Geometric Approaches to Machine Learning Symposium in Bath (online).
Slides are available here.

Neural networks with values in a Banach space (6 May 2021)

I submitted a preprint on two-layer neural networks that take values in a Banach space.
It extends the results of
Bach (2017),
E, Ma and Wu (2019),
E and Wojtowytsch (2020)
and others on approximation rates of infinitely wide scalar-valued two-layer neural networks and establishes
Monte-Carlo rates in Bochner spaces. The most unexpected result (for me) is that in the vector-valued case
continuity of such neural networks can only be established with respect to the weak$^*$ topology in the
target space. This turns out to be a significant restriction for networks with the ReLU activation function.
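Schematically, and again in my own notation rather than the preprint's: an infinitely wide two-layer network with values in a Banach space $Y$, and its $m$-neuron discretisation obtained by sampling the parameters $\theta_i$ from the underlying measure $\mu$, read

$$f(x) = \int_{\Theta} a(\theta)\,\sigma\big(\langle w(\theta), x\rangle + b(\theta)\big)\,\mathrm{d}\mu(\theta), \qquad f_m(x) = \frac{1}{m}\sum_{i=1}^{m} a(\theta_i)\,\sigma\big(\langle w(\theta_i), x\rangle + b(\theta_i)\big),$$

with $Y$-valued output weights $a(\theta)$; the Monte-Carlo rate then takes the form $\|f - f_m\|_{L^2(\rho;\,Y)} \lesssim m^{-1/2}$ in a Bochner space with respect to the data distribution $\rho$.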
The preprint can be found here:
arXiv:2105.02095

Inverse problems, Banach lattices and general fidelity functions (28 October 2020)

Our paper with Leon Bungert, Martin Burger and Carola Schönlieb on
Variational regularisation for inverse problems with imperfect forward
operators and general noise models has appeared in Inverse Problems. We analyse inverse problems
where the operator contains errors that can be described by an interval in a
Banach lattice (e.g., kernel of an integral operator with pointwise errors)
and the data are corrupted by noise that can be described using some data
fidelity function. Our results apply to, e.g., Gaussian, salt-and-pepper,
Poisson and mixed noise.
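The two ingredients are, schematically and in notation of my own choosing, an order interval for the imperfect operator and a Tikhonov-type functional with a general fidelity $\Phi$,

$$A^{\mathrm{l}} \le A \le A^{\mathrm{u}} \quad \text{in the Banach lattice}, \qquad u_\alpha \in \operatorname*{arg\,min}_{u}\; \Phi(Au, f) + \alpha\, J(u),$$

where, for instance, a squared $L^2$ distance corresponds to Gaussian noise, an $L^1$ distance to salt-and-pepper noise and a Kullback–Leibler divergence to Poisson noise; how the operator uncertainty enters the variational problem is made precise in the paper.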
The paper is open access and can be found here:
DOI:10.1088/1361-6420/abc531

Deep learning and non-linear spectral analysis – NeurIPS 2020 (27 October 2020)

Our paper with Tamara Grossmann, Guy Gilboa and Carola Schönlieb on Deeply Learned Spectral Total Variation Decomposition has been accepted to NeurIPS 2020. Congratulations Tamara on your great work! We compute a non-linear Total Variation decomposition of an image 10,000 times faster than classical methods.
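For context, and from memory of Gilboa's spectral TV framework rather than quoting the paper: the decomposition is built from the total variation flow started at the image $f$, whose scaled second time derivative gives a spectral response that reconstructs the image when integrated,

$$\partial_t u(t) \in -\partial\,\mathrm{TV}\big(u(t)\big), \quad u(0) = f, \qquad \phi(t) = t\,\partial_{tt} u(t), \qquad f = \int_0^\infty \phi(t)\,\mathrm{d}t + \bar f,$$

with $\bar f$ the mean of $f$; the learned network produces the decomposition directly instead of evolving the flow, which is what makes such a speed-up possible.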
Check it out here:
arXiv:2006.10004

EPSRC Postdoctoral Fellowship (23 September 2020)

I was awarded an EPSRC Fellowship! The title is “Regularisation theory in the data driven setting”. My goal is to
extend regularisation theory to the setting where there is no direct access to the forward operator at the time of
solving the inverse problem and only input-output training pairs are available. Such pairs can be either collected
experimentally or obtained from a computationally expensive model prior to solving the inverse problem. The latter
scenario is relevant for time-sensitive applications where near real-time reconstructions are required.
Furthermore, the model free setting is the natural habitat of neural networks, and my long-term goal is to better
understand their regularisation properties in the context of ill-posed inverse problems in infinite dimensions.
The fellowship is due to start in April 2021.

Data driven regularisation (8 September 2020)

Our paper with Andrea Aspri and Otmar Scherzer on Data Driven Regularization by Projection has appeared in Inverse Problems!
We show that regularisation can be defined and rigorously studied in the setting when there is no numerical access to
the forward operator and the operator is given only via input-output training pairs. Such pairs can be either
collected experimentally or obtained from a computationally expensive model prior to solving the inverse problem (which
may be useful in time-sensitive applications). We show connections to classical regularisation methods such as
regularisation by projection and variational regularisation, study stability of the reconstructions and present numerical
experiments on data driven inversion of the Radon transform.
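As a toy illustration, in notation of my own choosing and not the algorithm analysed in the paper: given training pairs $(x_i, y_i)$ with $y_i \approx A x_i$ and new data $y$, if the reconstruction is sought in the span of the training inputs, $u = \sum_i c_i x_i$, then $Au \approx \sum_i c_i y_i$ is available from the training outputs alone, so a regularised least-squares problem can be posed without ever forming the operator,

$$\min_{c \in \mathbb{R}^m}\; \Big\| \sum_{i=1}^{m} c_i\, y_i - y \Big\|^2 + \alpha\, \Big\| \sum_{i=1}^{m} c_i\, x_i \Big\|^2, \qquad u_\alpha = \sum_{i=1}^{m} c_i\, x_i.$$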
The paper is open access and can be found here:
DOI:10.1088/1361-6420/abb61b

SIAM Imaging Science 2020 (15 July 2020)

I gave a talk about our paper on variational regularisation in Banach lattices at the SIAM Imaging Science conference. Thank you Nicolai Riis for inviting me to your mini-symposium! And many thanks to the organisers, Michael Elad and Stacey Levine, for putting the conference together under the current difficult circumstances.

Slides are available here.