Binary Classification


Multi-Label Knowledge Distillation

Existing knowledge distillation methods typically transfer knowledge by matching the output logits or intermediate feature maps of the teacher network in the student network, an approach that has proven very successful in multi-class single-label learning.
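The logit-matching that this description refers to can be sketched as a temperature-softened KL divergence, in the style of classic Hinton-et-al. distillation. This is a minimal NumPy illustration of the general technique, not the method of the listed paper; the function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # Logit distillation: KL(teacher || student) on temperature-softened
    # class distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperature settings.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return float(np.mean(kl)) * T * T
```

The loss is zero when student and teacher logits induce the same distribution, and positive otherwise, so minimizing it pushes the student's softened predictions toward the teacher's.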
