Recovery Guarantees of Unsupervised Neural Networks for Inverse Problems trained with Gradient Descent - GREYC
Preprint / working paper. Year: 2024


Abstract

Advanced machine learning methods, most prominently neural networks, have become standard for solving inverse problems in recent years. However, theoretical recovery guarantees for such methods remain scarce and difficult to obtain. Only recently were unsupervised methods such as the Deep Image Prior (DIP) equipped with convergence and recovery guarantees for generic loss functions, when trained through gradient flow with an appropriate initialization. In this paper, we extend these results by proving that the guarantees still hold when using gradient descent with an appropriately chosen step size (learning rate). We also show that the discretization affects the overparametrization bound of a two-layer DIP network only by a constant, so that the various guarantees derived for gradient flow carry over to gradient descent.
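The setting studied in the abstract can be sketched as follows: a small two-layer DIP-style network maps a fixed random input to a signal, and plain gradient descent with a fixed step size minimizes the measurement misfit. This is only an illustrative sketch; the dimensions, the step size `eta`, and the choice to train only the first layer are assumptions for the example, not the paper's exact setup or bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper):
# input dim d, hidden width m, signal dim n, measurement dim p
d, m, n, p = 10, 200, 20, 15

A = rng.normal(size=(p, n)) / np.sqrt(n)   # linear forward operator of the inverse problem
x_true = rng.normal(size=n)                # unknown signal
y = A @ x_true                             # noiseless measurements

z = rng.normal(size=d)                     # fixed random network input (DIP style)
W = rng.normal(size=(m, d)) / np.sqrt(d)   # first layer, the only trained parameters here
V = rng.normal(size=(n, m)) / np.sqrt(m)   # second layer, kept fixed in this sketch

relu = lambda u: np.maximum(u, 0.0)

def loss(W):
    """Least-squares measurement misfit 0.5 * ||A G(W) - y||^2."""
    r = A @ (V @ relu(W @ z)) - y
    return 0.5 * r @ r

eta = 0.01                                 # hypothetical fixed step size (learning rate)
losses = [loss(W)]
for _ in range(2000):
    u = W @ z
    h = relu(u)
    r = A @ (V @ h) - y                    # residual in measurement space
    grad_h = V.T @ (A.T @ r)               # backprop through A and V
    grad_W = np.outer(grad_h * (u > 0), z) # ReLU mask, then outer product with input
    W = W - eta * grad_W                   # one gradient-descent step
    losses.append(loss(W))
```

With a sufficiently wide hidden layer and a small enough step size, the loss decreases steadily, mirroring (informally) the discrete-time guarantees discussed in the paper; too large an `eta` makes the iteration diverge, which is why the admissible step size matters.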
Main file: DIP_Discrete_arxiv-2.pdf (592.94 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04496435 , version 1 (08-03-2024)

Identifiers

  • HAL Id : hal-04496435 , version 1

Cite

Nathan Buskulic, Jalal Fadili, Yvain Quéau. Recovery Guarantees of Unsupervised Neural Networks for Inverse Problems trained with Gradient Descent. 2024. ⟨hal-04496435⟩
