Mutual distillation is a knowledge distillation method that guides a cohort of neural networks to learn cooperatively by transferring knowledge among them, without the help of a teacher network. This paper aims to confirm whether mutual distillation is also applicable to super-resolution networks. To this end, we conduct experiments applying mutual distillation to the discriminators of SRGANs and analyze its effect on SRGAN performance. The experiments confirm that SRGANs whose discriminators share knowledge through mutual distillation produce super-resolution images that are improved in both quantitative metrics and perceptual quality.
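As a point of reference, mutual distillation in a two-network cohort is typically formulated in the deep mutual learning style (Zhang et al., 2018), where each network augments its own task loss with a KL-divergence term toward its peer's predictions. The sketch below transcribes that standard form for a pair of discriminators; the weighting coefficient \(\lambda\) and the per-discriminator adversarial losses \(\mathcal{L}_{D_k}\) are illustrative assumptions, not necessarily the paper's exact objective:

\[
\mathcal{L}_{\Theta_1} = \mathcal{L}_{D_1} + \lambda\, D_{\mathrm{KL}}\!\left(p_2 \,\|\, p_1\right),
\qquad
\mathcal{L}_{\Theta_2} = \mathcal{L}_{D_2} + \lambda\, D_{\mathrm{KL}}\!\left(p_1 \,\|\, p_2\right),
\]

where \(p_1\) and \(p_2\) are the real/fake probability distributions the two discriminators predict on the same inputs. Training alternates gradient steps on the two objectives, so each discriminator is supervised by its own adversarial loss while being softly regularized toward its peer's knowledge.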