[Label Refinery](https://ingenjoy.notion.site/Label-Refinery-165308e26924815e9609d2953c3f675b)
[The State of Knowledge Distillation for Classification Tasks](https://ingenjoy.notion.site/The-State-of-Knowledge-Distillation-for-Classification-Tasks-165308e26924813c9e2cd0423d85093c)
[Contrastive Representation Distillation](https://ingenjoy.notion.site/Contrastive-Representation-Distillation-165308e2692481ea8302e37e25947bf3)
Self-training with Noisy Student improves ImageNet classification
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons
Relational Knowledge Distillation
DeiT: Training data-efficient image transformers & distillation through attention
Training a Binary Weight Object Detector by Knowledge Transfer for Autonomous Driving
Revisiting Knowledge Distillation for Object Detection
Knowledge Distillation: A Survey
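Most of the papers above start from, or compare against, the vanilla soft-target objective of Hinton et al.'s "Distilling the Knowledge in a Neural Network". Below is a minimal PyTorch sketch of that baseline loss, assuming a standard classification setting with logits from both networks; the temperature `T` and mixing weight `alpha` are illustrative defaults, not values taken from any paper in this list.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Soft-target knowledge distillation loss (Hinton et al., 2015).

    Mixes the usual cross-entropy on hard labels with a KL term that
    pulls the student's temperature-softened distribution toward the
    teacher's. T and alpha are illustrative hyperparameters.
    """
    # Hard-label cross-entropy on the ground-truth classes.
    hard_loss = F.cross_entropy(student_logits, targets)

    # KL divergence between temperature-softened distributions.
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.log_softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
        log_target=True,
    ) * (T * T)

    return alpha * hard_loss + (1.0 - alpha) * soft_loss

# Example: distill a 10-class teacher into a student on a random batch.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets)
loss.backward()
```

The methods in the list mainly differ in what they distill beyond these softened logits: intermediate features and activation boundaries, pairwise relations between examples (Relational KD), contrastive objectives (CRD), or attention-based distillation tokens (DeiT).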