👁️🔍 Perceive-IR:
Learning to Perceive Degradation Better for All-in-One Image Restoration

¹Wuhan University
²Horizon Robotics
³Guangdong University of Technology
Under Peer Review

Equal Contribution
📧 Corresponding Author

TL;DR: A Degradation-Quality-Aware All-in-One Image Restoration Framework

Performs Better in All-in-One Image Restoration


PSNR comparisons with state-of-the-art methods across two common scenarios.
* denotes results obtained under the “Noise+Haze+Rain+Blur+Low-light” training setting, while unmarked results are from the “Noise+Haze+Rain” training setting.

Abstract

The limitations of task-specific and general image restoration methods for specific degradations have prompted the development of all-in-one image restoration techniques. However, the diversity of patterns among multiple degradations, along with the significant uncertainty in the mapping between degraded images of different severities and their undistorted counterparts, poses substantial challenges to all-in-one restoration. To address these challenges, we propose Perceive-IR, an all-in-one image restorer designed to achieve fine-grained quality control so that restored images more closely resemble their undistorted counterparts, regardless of the type or severity of degradation. Specifically, Perceive-IR contains two stages: (1) a prompt learning stage and (2) a restoration stage. In the prompt learning stage, we leverage prompt learning to acquire a fine-grained quality perceiver capable of distinguishing three-tier quality levels by constraining the prompt-image similarity in the CLIP perception space. Subsequently, this quality perceiver and a difficulty-adaptive perceptual loss are integrated as a quality-aware learning strategy to realize fine-grained quality control in the restoration stage. For the restoration stage, a semantic guidance module (SGM) and compact feature extraction (CFE) are proposed to further promote the restoration process by exploiting robust semantic information from pre-trained large-scale vision models and distinguishing degradation-specific features. Extensive experiments demonstrate that our Perceive-IR outperforms state-of-the-art methods in all-in-one image restoration tasks and exhibits superior generalization ability when dealing with unseen tasks.
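The prompt learning stage hinges on scoring an image against three quality-level prompts in CLIP's embedding space. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: it uses OpenAI's CLIP package, plain-text prompts in place of learned soft prompts, and an assumed three-tier label tensor tier_labels for supervision.

import torch
import torch.nn.functional as F
import clip  # OpenAI CLIP: pip install git+https://github.com/openai/CLIP.git

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)  # frozen CLIP backbone

# Plain-text stand-ins for the learnable prompts; a soft-prompt (CoOp-style)
# implementation would optimize prompt tokens instead (assumption, not shown).
quality_prompts = clip.tokenize([
    "a photo of high quality",    # tier 1: close to the undistorted image
    "a photo of medium quality",  # tier 2: mildly degraded
    "a photo of low quality",     # tier 3: severely degraded
]).to(device)

def quality_logits(images):
    """Similarity between image embeddings and the three quality prompts."""
    image_feat = F.normalize(model.encode_image(images), dim=-1)
    text_feat = F.normalize(model.encode_text(quality_prompts), dim=-1)
    return model.logit_scale.exp() * image_feat @ text_feat.t()

def clip_aware_loss(images, tier_labels):
    """Cross-entropy over the three-tier similarity scores (hypothetical form)."""
    return F.cross_entropy(quality_logits(images), tier_labels)

At inference, the argmax (or softmax) over quality_logits acts as the fine-grained quality perceiver that feeds the restoration stage.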

Two-stage Framework

Quality-aware Learning Strategy


(a) The CLIP-aware loss;
(b) The difficulty-adaptive perceptual loss.
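As a rough illustration of panel (b), the sketch below weights a standard VGG-feature perceptual loss by a per-sample difficulty score derived from the quality perceiver's three-tier prediction. The weighting scheme, the VGG layer cut-off, and the names used here are assumptions for illustration, not the paper's exact formulation.

import torch
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights

# Frozen VGG-16 feature extractor (layer choice is an assumption;
# ImageNet normalization of inputs is omitted for brevity).
vgg_feat = vgg16(weights=VGG16_Weights.DEFAULT).features[:16].eval()
for p in vgg_feat.parameters():
    p.requires_grad_(False)

def difficulty_adaptive_perceptual_loss(restored, target, tier_probs):
    """tier_probs: softmax over (high, medium, low) quality from the perceiver."""
    # Hypothetical difficulty score: expected tier index, rescaled to [0, 1],
    # so more severely degraded samples receive a stronger perceptual constraint.
    tiers = torch.arange(3, device=tier_probs.device, dtype=tier_probs.dtype)
    difficulty = (tier_probs * tiers).sum(dim=-1) / 2.0
    per_sample = F.l1_loss(vgg_feat(restored), vgg_feat(target),
                           reduction="none").mean(dim=(1, 2, 3))
    return (difficulty * per_sample).mean()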

Visual Comparisons

Discussion on Losses

BibTeX

@article{zhang2024perceive,
  title={Perceive-IR: Learning to Perceive Degradation Better for All-in-One Image Restoration},
  author={Zhang, Xu and Ma, Jiaqi and Wang, Guoli and Zhang, Qian and Zhang, Huan and Zhang, Lefei},
  journal={arXiv preprint arXiv:2408.15994},
  year={2024}
}