Understanding image structure and noise distribution is pivotal for paired training methods. Nevertheless, these approaches struggle to generalize to images with unknown noise distributions. To address this challenge, we propose a two-stage image denoising method designed to improve generalization performance. In the first stage, we introduce a preliminary denoiser based on a multilayer perceptron (MLP). This denoiser uses an implicit structural prior to predict a preliminary denoising result that aligns with the clean image structure, and it requires training on only a small set of paired low-noise images. In the second stage, we use the preliminary denoising result as a guiding condition for the Denoising Diffusion Null-space Model (DDNM); this lets us harness the generative diffusion prior and markedly improves the denoising model's generalization capability. Because we employ a pretrained unconditional diffusion model, no additional training or network optimization is required, keeping the overall training cost of the method low. Extensive experiments across diverse datasets, noise types, and noise levels show that our method consistently outperforms alternative image denoising techniques in both denoising performance and generalization ability.
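The two-stage pipeline can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the MLP weights, the degradation operator `A`, and the diffusion sample are all hypothetical stand-ins. What it does show concretely is the DDNM-style range/null-space decomposition, where the range-space component is pinned to the stage-1 preliminary result and the null-space component is filled in by an unconditional diffusion sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_preliminary_denoise(noisy, W1, b1, W2, b2):
    # Stage 1 (hypothetical tiny MLP): predicts a preliminary
    # denoising result that preserves the clean-image structure.
    h = np.maximum(0.0, noisy @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2

def ddnm_combine(x_diff, guide, A):
    # Stage 2: DDNM-style null-space decomposition
    #   x = A^+ A * guide  +  (I - A^+ A) * x_diff
    # The range-space part is taken from the guide (preliminary result);
    # the null-space part comes from the unconditional diffusion sample.
    A_pinv = np.linalg.pinv(A)
    I = np.eye(A.shape[1])
    return A_pinv @ (A @ guide) + (I - A_pinv @ A) @ x_diff

# Toy usage with made-up shapes and random weights.
d, h = 16, 32
W1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=(h, d)); b2 = np.zeros(d)

noisy = rng.normal(size=d)
guide = mlp_preliminary_denoise(noisy, W1, b1, W2, b2)

A = rng.normal(size=(8, d))      # hypothetical linear degradation operator
x_diff = rng.normal(size=d)      # stand-in for an unconditional diffusion sample
x = ddnm_combine(x_diff, guide, A)

# Range-space consistency: A x == A guide, since A A^+ A = A.
print(np.allclose(A @ x, A @ guide))
```

The consistency check at the end holds by the pseudoinverse identity `A A⁺ A = A`, which is what guarantees that the combined result stays faithful to the guiding condition in the observed (range) space while the diffusion prior is free to act in the null space.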