
Spatial-Temporal Correlation Learning for Real-Time Video DeInterlacing

Yuqing Liu, Xinfeng Zhang, Shanshe Wang, Siwei Ma, and Wen Gao. Accepted by ICME 2021 (IEEE Xplore).

Abstract

Deinterlacing is a classical problem in video processing that aims to generate a progressive video from an interlaced one. Although numerous algorithms have been proposed over the past decades, their performance is still unsatisfactory in terms of both quality of experience and processing efficiency. This paper focuses on the spatial-temporal correlation within the given frame and designs a network for recovering the missing field. Intra-frame motion compensation between the given fields is employed for detail refinement. Furthermore, we exploit the inherent correlations among image features with channel attention. Extensive experimental results on different video sequences show that our method outperforms state-of-the-art methods in both objective and subjective evaluations while satisfying the real-time requirement.
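To make the two building blocks mentioned in the abstract concrete, below is a minimal PyTorch sketch of (a) splitting an interlaced frame into its two fields and (b) a squeeze-and-excitation style channel attention module. This is an illustrative assumption, not the code released in this repository; the names `split_fields`, `ChannelAttention`, and the reduction ratio are hypothetical.

```python
import torch
import torch.nn as nn

def split_fields(frame):
    """Split an interlaced frame of shape (N, C, H, W) into its two fields."""
    top_field = frame[:, :, 0::2, :]     # even rows (top field)
    bottom_field = frame[:, :, 1::2, :]  # odd rows (bottom field)
    return top_field, bottom_field

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention over feature maps."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # global spatial descriptor per channel
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # Re-weight each channel by a learned gate computed from the pooled descriptor.
        return x * self.fc(self.pool(x))

# Example usage with a dummy interlaced frame and feature map.
frame = torch.randn(1, 3, 480, 640)
top, bottom = split_fields(frame)          # each (1, 3, 240, 640)
attn = ChannelAttention(64)
features = torch.randn(1, 64, 240, 640)
weighted = attn(features)                  # same shape, channel-wise re-weighted
```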

News

2021/11/04: We won 5th place in the MSU Deinterlacer Benchmark (link).