
[Official] Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation

This repository is the official implementation of the paper "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation", presented at IJCAI 2021. Thanks to the contributors. [IJCAI2021Poster]
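For reference, the two distillation objectives compared in the paper can be sketched as follows. This is a minimal PyTorch-style sketch, not the repository's exact code; the function names and the temperature handling are illustrative assumptions.

```python
# Minimal sketch of the two distillation objectives compared in the paper
# (illustrative only; see the repository code for the exact implementation).
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (temperature ** 2)

def kd_mse_loss(student_logits, teacher_logits):
    """Mean squared error taken directly between teacher and student logits."""
    return F.mse_loss(student_logits, teacher_logits)
```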

Results

You can reproduce all results in the paper, including the Appendix, with our code. The experiments are too numerous to post everything here, but by varying the hyperparameter values in the provided .sh files you can reproduce all of our analyses.

Contact

Feel free to contact us if you have any questions:)

Acknowledgements

This work was supported by Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) [No.2019-0-00075, Artificial Intelligence Graduate School Program (KAIST)] and [No. 2021-0-00907, Development of Adaptive and Lightweight Edge-Collaborative Analysis Technology for Enabling Proactively Immediate Response and Rapid Learning].
