# DO-Conv

**Repository Path**: frontxiang/DO-Conv

## Basic Information

- **Project Name**: DO-Conv
- **Description**: Depthwise Over-parameterized Convolutional Layer
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2020-08-11
- **Last Updated**: 2020-12-19

## README

# DO-Conv: Depthwise Over-parameterized Convolutional Layer

Created by Jinming Cao, Yangyan Li, Mingchao Sun, Ying Chen, Dani Lischinski, Daniel Cohen-Or, Baoquan Chen, and Changhe Tu.

## Introduction

DO-Conv is a depthwise over-parameterized convolutional layer that can replace a conventional convolutional layer in CNNs during the training phase to achieve higher accuracy. In the inference phase, DO-Conv can be fused into a conventional convolutional layer, so its computation cost is exactly the same as that of a conventional convolutional layer. Please see our preprint on arXiv for more details, where we demonstrate the advantages of DO-Conv on various benchmark datasets and tasks.

## ImageNet Classification Performance

We take the model zoo of GluonCV as our baselines. The settings in these baselines have been tuned to favor them, and we do not touch them when switching to DO-Conv. In other words, DO-Conv is the one and only change over the baselines, and no hyper-parameter tuning is conducted to favor DO-Conv. We consider GluonCV highly reproducible; still, to exclude confounding factors as much as possible, we train the baselines ourselves and compare the DO-Conv versions against them, while reporting the performance provided by GluonCV as a reference. The results are summarized in the table below, where the "DO-Conv" column shows the performance gain over the baselines.
Network | Depth | Reference | Baseline | DO-Conv |
---|---|---|---|---|
Plain | 18 | - | 69.97 | +1.01 |
ResNet-v1 | 18 | 70.93 | 70.87 | +0.82 |
ResNet-v1 | 34 | 74.37 | 74.49 | +0.49 |
ResNet-v1 | 50 | 77.36 | 77.32 | +0.08 |
ResNet-v1 | 101 | 78.34 | 78.16 | +0.46 |
ResNet-v1 | 152 | 79.22 | 79.34 | +0.07 |
ResNet-v1b | 18 | 70.94 | 71.08 | +0.71 |
ResNet-v1b | 34 | 74.65 | 74.35 | +0.77 |
ResNet-v1b | 50 | 77.67 | 77.56 | +0.44 |
ResNet-v1b | 101 | 79.20 | 79.14 | +0.25 |
ResNet-v1b | 152 | 79.69 | 79.60 | +0.10 |
ResNet-v2 | 18 | 71.00 | 70.80 | +0.64 |
ResNet-v2 | 34 | 74.40 | 74.76 | +0.22 |
ResNet-v2 | 50 | 77.17 | 77.17 | +0.31 |
ResNet-v2 | 101 | 78.53 | 78.56 | +0.11 |
ResNet-v2 | 152 | 79.21 | 79.24 | +0.14 |
ResNext | 50_32x4d | 79.32 | 79.21 | +0.40 |
MobileNet-v1 | - | 73.28 | 73.30 | +0.03 |
MobileNet-v2 | - | 72.04 | 71.89 | +0.16 |
MobileNet-v3 | - | 75.32 | 75.16 | +0.14 |
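The inference-time fusion mentioned in the introduction can be sketched as follows. This is a minimal NumPy illustration of the general idea of composing a trainable depthwise kernel `D` with a conventional kernel `W` into a single conventional kernel, which is why the fused layer costs no more than a conventional convolution at inference time. All shapes and variable names here are chosen for illustration and are not taken from the repository's code; see the paper and source for the actual implementation.

```python
import numpy as np

# Illustrative dimensions (assumed, not from the repository):
# a 3x3 convolution with C_in=4 input and C_out=8 output channels,
# over-parameterized with an extra depth dimension D_mul = M * N.
C_in, C_out, M, N = 4, 8, 3, 3
D_mul = M * N

rng = np.random.default_rng(0)
W = rng.standard_normal((C_out, C_in, D_mul))   # conventional part
D = rng.standard_normal((C_in, D_mul, M * N))   # depthwise part

# Fuse the two trained kernels into one conventional kernel:
#   W_fused[o, c, s] = sum_d W[o, c, d] * D[c, d, s]
# After training, only W_fused is needed, so inference runs as a
# single ordinary convolution.
W_fused = np.einsum('ocd,cds->ocs', W, D).reshape(C_out, C_in, M, N)
print(W_fused.shape)  # (8, 4, 3, 3) -- a plain 3x3 conv kernel
```

The key point is that the extra parameters exist only during training; the einsum contraction collapses them, so the deployed kernel has the same shape and FLOP count as a conventional one.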