     Convolutional Neural Networks for Mobile Devices. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2016. p. 4820-4828.
     doi: 10.1109/CVPR.2016.521

22.  Vanhoucke V, Senior A, Mao MZ, et al. Improving the Speed of Neural Networks on CPUs. In: Proceedings of the Deep Learning and Unsupervised Feature Learning NIPS Workshop. Vol. 1; 2011. p. 4.

23.  Ba J, Caruana R. Do Deep Nets Really Need to be Deep? In: Advances in Neural Information Processing Systems (NeurIPS); 2014.
     doi: 10.48550/arXiv.1312.6184

24.  Hinton G, Vinyals O, Dean J. Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531; 2015.
     doi: 10.48550/arXiv.1503.02531

25.  Romero A, Ballas N, Kahou SE, Chassang A, Gatta C, Bengio Y. FitNets: Hints for thin deep nets. arXiv preprint arXiv:1412.6550; 2014.
     doi: 10.48550/arXiv.1412.6550

26.  Gou J, Yu B, Maybank SJ, Tao D. Knowledge distillation: A survey. Int J Comput Vision. 2021;129:1789-1819.
     doi: 10.1007/s11263-021-01453-z

27.  Yap MH, Pons G, Martí J, et al. Automated breast ultrasound lesions detection using convolutional neural networks. IEEE J Biomed Health Inform. 2017;22:1218-1226.
     doi: 10.1109/JBHI.2017.2731873

28.  Xu K, Rui L, Li Y, Gu L. Feature Normalized Knowledge Distillation for Image Classification. In: European Conference on Computer Vision. Springer; 2020. p. 664-680.
     doi: 10.1007/978-3-030-58595-2_40

29.  Zagoruyko S, Komodakis N. Paying more attention to attention: Improving the performance of convolutional neural networks via attention transfer. arXiv preprint arXiv:1612.03928; 2016.
     doi: 10.48550/arXiv.1612.03928

30.  Wang H, Zhang D, Song Y, et al. Segmenting Neuronal Structure in 3D Optical Microscope Images via Knowledge Distillation with Teacher-Student Network. In: 2019 IEEE 16th International Symposium on Biomedical Imaging (ISBI 2019). IEEE; 2019. p. 228-231.
     doi: 10.1109/ISBI.2019.8759326

31.  He T, Shen C, Tian Z, Gong D, Sun C, Yan Y. Knowledge Adaptation for Efficient Semantic Segmentation. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition; 2019. p. 578-587.
     doi: 10.48550/arXiv.1903.04688

32.  Liu Y, Chen K, Liu C, Qin Z, Luo Z, Wang J. Structured Knowledge Distillation for Semantic Segmentation. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition; 2019. p. 2604-2613.
     doi: 10.1109/CVPR.2019.00271

33.  Tung F, Mori G. Similarity-Preserving Knowledge Distillation. In: Proceedings of the IEEE/CVF International Conference on Computer Vision; 2019. p. 1365-1374.
     doi: 10.1109/ICCV.2019.00145

34.  Gao Z, Chung J, Abdelrazek M, et al. Privileged modality distillation for vessel border detection in intracoronary imaging. IEEE Trans Med Imaging. 2019;39:1524-1534.
     doi: 10.1109/TMI.2019.2952939

35.  Dou Q, Liu Q, Heng PA, Glocker B. Unpaired multi-modal segmentation via knowledge distillation. IEEE Trans Med Imaging. 2020;39:2415-2425.
     doi: 10.1109/TMI.2019.2963882

36.  Chen Z, Guo X, Woo PY, Yuan Y. Super-resolution enhanced medical image diagnosis with sample affinity interaction. IEEE Trans Med Imaging. 2021;40:1377-1389.
     doi: 10.1109/TMI.2021.3055290

37.  Ho TKK, Gwak J. Utilizing knowledge distillation in deep learning for classification of chest X-ray abnormalities. IEEE Access. 2020;8:160749-160761.
     doi: 10.1109/ACCESS.2020.3020802

38.  Li K, Yu L, Wang S, Heng PA. Towards cross-modality medical image segmentation with online mutual knowledge distillation. Proc AAAI Conf Art Intell. 2020;34:775-783.
     doi: 10.1609/aaai.v34i01.5421

39.  Qin D, Bu JJ, Liu Z, et al. Efficient medical image segmentation based on knowledge distillation. IEEE Trans Med Imaging. 2021;40:3820-3831.
     doi: 10.1109/TMI.2021.3098703

40.  Mangalam K, Salzmann M. On compressing U-Net using knowledge distillation. arXiv preprint arXiv:1812.00249; 2018.
     doi: 10.48550/arXiv.1812.00249

41.  Owen JP, Blazes M, Manivannan N, et al. Student becomes teacher: Training faster deep learning lightweight networks for automated identification of optical coherence tomography B-scans of interest using a student-teacher framework. Biomed Opt Express. 2021;12:5387-5399.
     doi: 10.1364/BOE.433432

42.  Vaze S, Xie W, Namburete AI. Low-memory CNNs enabling real-time ultrasound segmentation towards mobile deployment. IEEE J Biomed Health Inform. 2020;24:1059-1069.
     doi: 10.1109/JBHI.2019.2961264

43.  Cao Z, Yang G, Chen Q, Chen X, Lv F. Breast tumor