Abstract
Automatic image colorization has made tremendous progress in recent years. However, previous methods rarely account for the inherent color diversity of natural images, and existing diverse colorization approaches still suffer from two limitations: i) the diversity of the generated images is limited, and ii) unavoidable artifacts substantially degrade the quality of the colorization results. To address these problems, we propose a novel diverse image colorization network built upon a vanilla GAN, which first extracts deep color priors through an initial colorization network and then uses these priors to modulate the diverse generation process, producing high-fidelity colorization outputs. In addition, we propose a novel triplet latent regularization that constrains the correspondence between latent codes and generated images, effectively alleviating mode collapse and encouraging more diverse colorization results. Extensive experiments on three benchmark datasets demonstrate the superiority of our method over existing diverse image colorization models.