Data Distillation: Towards Omni-Supervised Learning
Related Articles
Lane detection paper reading: Learning Lightweight Lane Detection CNNs by Self Attention Distillation
yaoling-xumi13
2020-06-13 09:50:41
Stacked Hourglass Networks for Human Pose Estimation
yaoling-xumi13
2020-03-30 14:17:37
DeepLab v3+ paper notes
yaoling-xumi13
2020-03-03 00:01:12
Atrous (dilated) convolution and ASPP
yaoling-xumi13
2020-03-01 22:30:54
Paper reading: Joint Learning of Single-image and Cross-image Representations for Person Re-identification
丁香留心
2020-02-25 20:54:02
Paper reading: Multi-Scale Triplet CNN for Person Re-Identification
丁香留心
2020-02-25 20:54:02
Paper reading: Densely Connected Convolutional Networks
丁香留心
2020-02-25 20:54:02
Caffe initialization: Xavier
Iriving_shu
2020-02-24 21:37:43
CVPR 2017 paper roundup
Iriving_shu
2020-02-24 21:37:43
Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning paper notes
Iriving_shu
2020-02-24 21:37:43
ICCV 2019: notes on a few papers of personal interest
Hungryof
2020-02-22 08:17:47
Paper reading: Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour
yl13
2019-08-25 13:43:07
Deformable Conv and its variants
Hungryof
2018-12-29 00:12:27
Swapout: Learning an ensemble of deep architectures
wydbyxr
2018-11-30 17:23:05
Log-DenseNet: How to Sparsify a DenseNet
wydbyxr
2018-11-30 17:23:05