MaskCLIP: Masked Self-Distillation Advances Contrastive Language-Image Pretraining