Context-Aware Image Matting for Simultaneous Foreground and Alpha Estimation
Published In
2019 IEEE/CVF International Conference on Computer Vision (ICCV)
Document Type
Citation
Publication Date
11-1-2019
Abstract
Natural image matting is an important problem in computer vision and graphics. It is an ill-posed problem when only an input image is available without any external information. While recent deep learning approaches have shown promising results, they estimate only the alpha matte. This paper presents a context-aware natural image matting method for simultaneous foreground and alpha matte estimation. Our method employs two encoder networks to extract essential information for matting. In particular, we use a matting encoder to learn local features and a context encoder to obtain more global context information. We concatenate the outputs from these two encoders and feed them into decoder networks to simultaneously estimate the foreground and alpha matte. To train this whole deep neural network, we employ both the standard Laplacian loss and the feature loss: the former helps to achieve high numerical performance, while the latter leads to more perceptually plausible results. We also report several data augmentation strategies that greatly improve the network's generalization performance. Our qualitative and quantitative experiments show that our method enables high-quality matting for a single natural image.
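To make the dual-encoder design in the abstract concrete, the following is a minimal illustrative sketch in PyTorch. It is not the authors' released implementation: the layer sizes, the assumption of an RGB image plus trimap as the 4-channel input, and the module names (SimpleEncoder, ContextAwareMatting) are hypothetical. It only shows the overall data flow: two encoders, channel-wise concatenation, and two decoders that output the foreground and the alpha matte; the Laplacian and feature losses are omitted.

```python
# Illustrative sketch only; architecture details are assumptions, not the paper's exact network.
import torch
import torch.nn as nn

class SimpleEncoder(nn.Module):
    """Small convolutional encoder standing in for either the matting or context encoder."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, out_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.body(x)

class ContextAwareMatting(nn.Module):
    """Two encoders (local matting features + global context) whose outputs are
    concatenated and decoded into a foreground image and an alpha matte."""
    def __init__(self):
        super().__init__()
        self.matting_encoder = SimpleEncoder(4, 128)   # local matting features
        self.context_encoder = SimpleEncoder(4, 128)   # global context features
        self.fg_decoder = self._make_decoder(256, 3)   # 3-channel foreground
        self.alpha_decoder = self._make_decoder(256, 1)  # 1-channel alpha matte

    @staticmethod
    def _make_decoder(in_ch, out_ch):
        return nn.Sequential(
            nn.ConvTranspose2d(in_ch, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, out_ch, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, image_and_trimap):
        local_feat = self.matting_encoder(image_and_trimap)
        global_feat = self.context_encoder(image_and_trimap)
        feat = torch.cat([local_feat, global_feat], dim=1)  # concatenate the two encodings
        return self.fg_decoder(feat), self.alpha_decoder(feat)

# Usage example with a random 4-channel input (RGB image + trimap), assumed 128x128.
model = ContextAwareMatting()
x = torch.rand(1, 4, 128, 128)
foreground, alpha = model(x)
print(foreground.shape, alpha.shape)  # (1, 3, 128, 128), (1, 1, 128, 128)
```

Predicting the foreground jointly with the alpha matte, rather than the alpha alone, is what allows the method to composite the extracted subject directly onto new backgrounds.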
Locate the Document
DOI
10.1109/ICCV.2019.00423
Persistent Identifier
https://archives.pdx.edu/ds/psu/34763
Publisher
IEEE
Citation Details
Hou, Q., & Liu, F. (2019). Context-Aware Image Matting for Simultaneous Foreground and Alpha Estimation. Institute of Electrical and Electronics Engineers (IEEE). https://doi.org/10.1109/iccv.2019.00423