Search results for "knowledge distillation": 10 results
1. Knowledge distillation algorithm based on spatial attention map
Accepted Paper
2024, Issue 6
doi:10.19734/j.issn.1001-3695.2023.10.0496
2. Question answering model based on self-distillation and self-ensemble
2024, Issue 1: 212-216
doi:10.19734/j.issn.1001-3695.2023.05.0281
3. Fine-grained visual classification method based on knowledge distillation and target region selection
2023, Issue 9: 2863-2868
doi:10.19734/j.issn.1001-3695.2022.12.0809
4. Multi-teacher learning graph neural network based on feature and graph structure information augmentation
2023, Issue 7: 2013-2018
doi:10.19734/j.issn.1001-3695.2022.11.0765
5. Distilling object detectors via knowledge review and decoupling
2023, Issue 5: 1542-1547
doi:10.19734/j.issn.1001-3695.2022.09.0430
6. Unsupervised distillation hashing image retrieval method based on equivalent constraint clustering
2023, Issue 2: 601-606, 627
doi:10.19734/j.issn.1001-3695.2022.06.0274
7. Experimental research on intelligent prospecting prediction based on multi-scale features and meta-learning
2022, Issue 6: 1772-1778
doi:10.19734/j.issn.1001-3695.2021.10.0625
8. OKDCR: online knowledge distillation via consistency regularization
2021, Issue 11: 3249-3253
doi:10.19734/j.issn.1001-3695.2021.05.0139
9. Knowledge distillation method for remote sensing satellite image classification based on network pruning
2021, Issue 8: 2469-2473
doi:10.19734/j.issn.1001-3695.2020.07.0387
10. Real-time object detection method based on improved attention transfer
2021, Issue 4: 1212-1215
doi:10.19734/j.issn.1001-3695.2020.02.0079