Kavli Affiliate: Zheng Zhu | First 5 Authors: Zekai Li, Xinhao Zhong, Samir Khaki, Zhiyuan Liang, Yuhao Zhou | Summary: In recent years, dataset distillation has provided a reliable solution for data compression, where models trained on the resulting smaller synthetic datasets achieve performance comparable to those trained on the original datasets. To further improve […]
Continue reading: DD-Ranking: Rethinking the Evaluation of Dataset Distillation