For KD, the student is trained on a combination of the teacher's soft targets and the ground-truth one-hot labels. From this view, KD is a learned LSR (label smoothing regularization): KD's smoothing distribution comes from a teacher model, while LSR's smoothing distribution is designed by hand. In short, KD is a learned LSR, and LSR is a special kind of KD.
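To make this correspondence concrete, here is a minimal PyTorch sketch (not from any paper; the hyperparameters `eps`, `alpha`, `T` are illustrative defaults) showing that the two losses share the same mixture form and differ only in where the smoothing distribution comes from: a fixed uniform distribution for LSR versus the teacher's temperature-softened output for KD.

```python
import torch
import torch.nn.functional as F

def lsr_loss(logits, target, eps=0.1):
    """LSR: cross-entropy against a mix of the one-hot label
    and a hand-designed (uniform) smoothing distribution."""
    n = logits.size(-1)
    log_p = F.log_softmax(logits, dim=-1)
    smoothed = (1.0 - eps) * F.one_hot(target, n).float() + eps / n
    return -(smoothed * log_p).sum(dim=-1).mean()

def kd_loss(student_logits, teacher_logits, target, alpha=0.9, T=4.0):
    """KD: same mixture form, but the smoothing distribution is
    learned -- the teacher's temperature-softened output."""
    ce = F.cross_entropy(student_logits, target)      # one-hot term
    log_p_s = F.log_softmax(student_logits / T, dim=-1)
    p_t = F.softmax(teacher_logits / T, dim=-1)       # "smoothing" term
    kl = F.kl_div(log_p_s, p_t, reduction="batchmean") * T * T
    return (1.0 - alpha) * ce + alpha * kl

# example usage with random logits (hypothetical shapes)
s = torch.randn(8, 10); t = torch.randn(8, 10); y = torch.randint(0, 10, (8,))
print(lsr_loss(s, y).item(), kd_loss(s, t, y).item())
```

Replacing the teacher's output with the uniform distribution (and setting T = 1) reduces `kd_loss` to `lsr_loss` up to an additive constant, which is the sense in which LSR is a special kind of KD.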
useful url:https://github.com/thuml/Transfer-Learning-Library 【Transfer Learning】
useful url:https://github.com/ZhenyuanLin/Awesome-datafree-KD 【Data-Free KD】
useful url:https://blog.csdn.net/weixin_44936889/article/details/119788818 【Zero-Shot Knowledge Distillation, blog by BIT 可达鸭】
Active Mixup for Data-Efficient Knowledge Distillation from a Blackbox Model【CVPR 2020】
Zero-Shot Knowledge Distillation from a Decision-Based Black-Box Model
A DIRT-T Approach to Unsupervised Domain Adaptation
DINE: Domain Adaptation from Single and Multiple Black-box Predictors 【CVPR 2022】
Unsupervised Domain Adaptation of Black-Box Source Models 【BMVC 2021】
On Universal Black-Box Domain Adaptation
Domain Adaptation for Semantic Segmentation with Maximum Squares Loss 【ICCV 2019】 Deng Cai
Unsupervised Multi-source Domain Adaptation Without Access to Source Data
Generalize Then Adapt: Source-Free Domain Adaptive Semantic Segmentation [ICCV2021] [Project]
Visualizing Adapted Knowledge in Domain Transfer [CVPR2021] [Pytorch]
Domain Impression: A Source Data Free Domain Adaptation Method [WACV2021] [Project]
Model Adaptation: Unsupervised Domain Adaptation Without Source Data [CVPR2020]
Universal Source-Free Domain Adaptation [CVPR2020] [Project]
Towards Inheritable Models for Open-Set Domain Adaptation [CVPR2020] [Project]
Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation [ICML2020] [Pytorch]
Learning Invariant Representation with Consistency and Diversity for Semi-supervised Source Hypothesis Transfer [7 Jul 2021] [Pytorch]
Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer [14 Dec 2020] [Pytorch]
Large-Scale Generative Data-Free Distillation