Research Article

Hybrid Fine-Tuning Strategy for Few-Shot Classification

Table 3

Comparison of results under different pretraining methods on tiered-ImageNet. “Pre-tra” and “Lay-fro” abbreviate the pretraining method and the layer-freezing policy, respectively. We report the mean accuracy over 600 episodes with 95% confidence intervals.

| Pre-tra    | Lay-fro   | 1-Shot       | 5-Shot       | 10-Shot      | 20-Shot      | 30-Shot      | Average gain |
|------------|-----------|--------------|--------------|--------------|--------------|--------------|--------------|
| R2D2       | FT-Last1  | 52.10 ± 0.70 | 68.99 ± 0.70 | 73.21 ± 0.30 | 76.82 ± 0.89 | 80.38 ± 1.24 | 2.66         |
|            | HFT-Last1 | 55.18 ± 0.72 | 72.26 ± 0.66 | 75.19 ± 0.62 | 80.35 ± 0.63 | 81.82 ± 0.62 |              |
|            | FT-all    | 52.90 ± 0.78 | 70.87 ± 0.69 | 75.02 ± 0.28 | 80.04 ± 0.72 | 84.69 ± 0.96 | 1.45         |
|            | HFT-all   | 55.18 ± 0.72 | 72.26 ± 0.66 | 76.57 ± 0.24 | 81.50 ± 0.91 | 85.24 ± 0.22 |              |
| SKD-GEN0   | FT-Last1  | 60.51 ± 0.75 | 76.28 ± 0.80 | 80.54 ± 0.71 | 83.84 ± 0.67 | 86.10 ± 0.61 | 3.58         |
|            | HFT-Last1 | 64.17 ± 0.82 | 79.42 ± 0.61 | 83.75 ± 0.53 | 87.60 ± 0.42 | 90.25 ± 0.34 |              |
|            | FT-all    | 61.05 ± 0.76 | 76.37 ± 0.79 | 83.46 ± 0.50 | 87.01 ± 1.35 | 90.45 ± 1.26 | 1.58         |
|            | HFT-all   | 64.17 ± 0.82 | 79.42 ± 0.61 | 83.79 ± 0.89 | 87.76 ± 0.98 | 91.09 ± 0.93 |              |
| RFS-simple | FT-Last1  | 60.45 ± 0.98 | 74.09 ± 0.79 | 78.86 ± 0.58 | 83.36 ± 0.61 | 83.51 ± 1.52 | 2.86         |
|            | HFT-Last1 | 63.76 ± 0.88 | 77.74 ± 0.57 | 81.27 ± 0.53 | 85.35 ± 0.50 | 86.45 ± 0.57 |              |
|            | FT-all    | 60.56 ± 0.97 | 75.39 ± 0.80 | 80.18 ± 0.43 | 83.90 ± 0.47 | 88.04 ± 1.35 | 1.77         |
|            | HFT-all   | 63.76 ± 0.88 | 77.74 ± 0.57 | 81.42 ± 0.81 | 84.98 ± 0.65 | 89.01 ± 0.81 |              |
| Average gain |         | 3.37         | 3.37         | 2.41         | 2.51         | 2.13         | 2.78         |
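As a sketch of how the reported statistics could be computed (the function names are illustrative, not from the paper): the mean accuracy and 95% confidence interval follow from the standard normal approximation over the 600 episode accuracies, and each "Average gain" entry is the mean per-shot improvement of an HFT row over its matching FT row.

```python
import math

def mean_and_ci95(episode_accs):
    """Mean accuracy and 95% CI half-width over a list of per-episode
    accuracies, using the normal approximation (1.96 * standard error)."""
    n = len(episode_accs)
    mean = sum(episode_accs) / n
    # Sample variance (Bessel's correction), then standard error of the mean.
    var = sum((a - mean) ** 2 for a in episode_accs) / (n - 1)
    half_width = 1.96 * math.sqrt(var / n)
    return mean, half_width

def average_gain(hft_row, ft_row):
    """Mean improvement of HFT over FT across the shot settings
    (1-, 5-, 10-, 20-, 30-shot)."""
    return sum(h - f for h, f in zip(hft_row, ft_row)) / len(hft_row)

# Example: the R2D2 / FT-Last1 vs HFT-Last1 pair from Table 3.
ft_last1 = [52.10, 68.99, 73.21, 76.82, 80.38]
hft_last1 = [55.18, 72.26, 75.19, 80.35, 81.82]
print(round(average_gain(hft_last1, ft_last1), 2))  # 2.66, matching the table
```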