A regularization strategy combining Dropblock and Dropout
Abstract:
In order to comprehensively and efficiently accelerate the convergence and improve the stability of convolutional classification networks, a new regularization strategy is proposed. The Dropblock algorithm and the Dropout algorithm are combined to regularize the shallow, middle and deep layers of a convolutional classification network. Dropblock regularizes the convolution layers by hiding part of their feature maps, while Dropout regularizes the fully connected layers by hiding part of their weight parameters, so that the whole convolutional classification network is regularized. Training and testing experiments were conducted on the dataset provided by the Kaggle Dogs vs. Cats classification competition. The results show that the proposed regularization strategy can effectively accelerate the convergence and improve the stability of classification networks, and can also effectively improve the classification accuracy of deep convolutional classification networks.
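The abstract describes layer placement only and includes no code. Below is a minimal PyTorch-style sketch of the general idea it outlines: a Dropblock-style module applied after convolutional stages and standard Dropout applied before the final fully connected layer. The DropBlock2d module, the TinyCatDogNet network, and all hyperparameters (drop_prob, block_size, input size) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DropBlock2d(nn.Module):
    """Minimal DropBlock sketch: during training, zero out contiguous
    block_size x block_size regions of each feature map (illustrative,
    not the paper's code)."""

    def __init__(self, drop_prob=0.1, block_size=5):
        super().__init__()
        self.drop_prob = drop_prob
        self.block_size = block_size

    def forward(self, x):
        if not self.training or self.drop_prob == 0.0:
            return x
        n, c, h, w = x.shape
        # Bernoulli rate for block centres so roughly drop_prob of the
        # activations end up inside a dropped block.
        gamma = (self.drop_prob / (self.block_size ** 2)) * (h * w) / max(
            (h - self.block_size + 1) * (w - self.block_size + 1), 1)
        centres = (torch.rand(n, c, h, w, device=x.device) < gamma).float()
        # Grow each centre into a block with a same-size max pool.
        block_mask = F.max_pool2d(centres, kernel_size=self.block_size,
                                  stride=1, padding=self.block_size // 2)
        keep_mask = 1.0 - block_mask[:, :, :h, :w]
        # Rescale so the expected activation magnitude is unchanged.
        return x * keep_mask * keep_mask.numel() / keep_mask.sum().clamp(min=1.0)


class TinyCatDogNet(nn.Module):
    """Toy classifier (hypothetical): DropBlock regularizes the conv
    stages, Dropout regularizes the fully connected stage."""

    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            DropBlock2d(drop_prob=0.1, block_size=5),   # shallow conv stage
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            DropBlock2d(drop_prob=0.1, block_size=5),   # middle conv stage
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Dropout(p=0.5),                          # fully connected stage
            nn.Linear(128, num_classes),
        )

    def forward(self, x):                               # x: (N, 3, 64, 64)
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = TinyCatDogNet()
    logits = model(torch.randn(4, 3, 64, 64))           # (4, 2) class scores
    print(logits.shape)
```

The placement mirrors the strategy in the abstract: feature-map masking where representations are spatial (convolution layers) and element-wise Dropout where they are dense vectors (fully connected layers).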
Authors:
Hu Hui (胡辉); Si Fengyang (司凤洋); Zeng Chen (曾琛); Shu Wenlu (舒文璐) (School of Information Engineering, East China Jiaotong University, Nanchang 330013, China)
Affiliation:
School of Information Engineering, East China Jiaotong University
Source:
《…学报(自然科学版)》 (… Journal, Natural Science Edition), indexed by CAS and the Peking University Core Journals list, 2019, No. 6, pp. 51-56 (6 pages)
Funding:
National Natural Science Foundation of China (61761019); Natural Science Foundation of Jiangxi Province (20142BAB207001)
Keywords:
regularization; Dropout; Dropblock; convergence speed; stability
CLC number:
TN951 [Electronics and Telecommunications: Signal and Information Processing]