| Layer | Algorithm structure | Parameters (Conv1d: in channels, out channels, kernel size, stride, padding) |
| --- | --- | --- |
| Branch 1 | Convolution, batch normalization, ReLU activation | Conv1d (133, 128, 3, 2, 1) |
| | Convolution, batch normalization, ReLU activation | Conv1d (128, 64, 3, 2, 1) |
| | Convolution, batch normalization, ReLU activation | Conv1d (64, 32, 3, 2, 1) |
| Branch 2 | Convolution, batch normalization, ReLU activation | Conv1d (133, 128, 5, 2, 2) |
| | Convolution, batch normalization, ReLU activation | Conv1d (128, 64, 5, 2, 2) |
| | Convolution, batch normalization, ReLU activation | Conv1d (64, 32, 5, 2, 2) |
| Branch 3 | Convolution, batch normalization, ReLU activation | Conv1d (133, 128, 7, 2, 3) |
| | Convolution, batch normalization, ReLU activation | Conv1d (128, 64, 7, 2, 3) |
| | Convolution, batch normalization, ReLU activation | Conv1d (64, 32, 7, 2, 3) |
| GAP layer | Global average pooling | C:32 |
| Feature fusion layer | Feature concatenation | C:96 |
| Soft thresholding activation layer | Soft thresholding activation module | C:96 |
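For reference, a minimal PyTorch sketch of this three-branch structure is given below, assuming the Conv1d parameters follow the PyTorch ordering (in channels, out channels, kernel size, stride, padding) and that the input therefore has 133 channels. The table does not specify the internal design of the soft thresholding activation module, so the `SoftThreshold` class here is only an assumed channel-wise implementation in which the threshold is learned from the fused 96-dimensional feature vector; the class and layer names are illustrative, not the authors' exact module.

```python
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch, kernel, stride, padding):
    """One table row: convolution, batch normalization, ReLU activation."""
    return nn.Sequential(
        nn.Conv1d(in_ch, out_ch, kernel, stride, padding),
        nn.BatchNorm1d(out_ch),
        nn.ReLU(inplace=True),
    )


def make_branch(kernel, padding):
    """One branch: three conv blocks, channels 133 -> 128 -> 64 -> 32, stride 2."""
    return nn.Sequential(
        conv_block(133, 128, kernel, 2, padding),
        conv_block(128, 64, kernel, 2, padding),
        conv_block(64, 32, kernel, 2, padding),
    )


class SoftThreshold(nn.Module):
    """Assumed soft thresholding activation: a per-channel threshold tau is
    learned from the fused features, then sign(x) * max(|x| - tau, 0) is applied.
    The original module's internals are not given in the table."""

    def __init__(self, channels=96):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(channels, channels), nn.Sigmoid())

    def forward(self, x):                               # x: (batch, 96)
        tau = self.fc(x.abs()) * x.abs().mean(dim=1, keepdim=True)
        return torch.sign(x) * torch.clamp(x.abs() - tau, min=0.0)


class MultiScaleNet(nn.Module):
    """Three parallel branches with kernel sizes 3, 5, 7, followed by GAP,
    feature concatenation (C: 96), and soft thresholding activation."""

    def __init__(self):
        super().__init__()
        self.branch1 = make_branch(kernel=3, padding=1)
        self.branch2 = make_branch(kernel=5, padding=2)
        self.branch3 = make_branch(kernel=7, padding=3)
        self.gap = nn.AdaptiveAvgPool1d(1)              # GAP layer, C: 32 per branch
        self.soft_threshold = SoftThreshold(channels=96)

    def forward(self, x):                               # x: (batch, 133, length)
        feats = [self.gap(b(x)).squeeze(-1)             # (batch, 32) per branch
                 for b in (self.branch1, self.branch2, self.branch3)]
        fused = torch.cat(feats, dim=1)                 # feature fusion layer, C: 96
        return self.soft_threshold(fused)               # soft thresholding, C: 96


if __name__ == "__main__":
    x = torch.randn(4, 133, 1024)                       # (batch, channels, sequence length)
    print(MultiScaleNet()(x).shape)                     # torch.Size([4, 96])
```

With an input of shape (batch, 133, L), each branch reduces the temporal length by a factor of 8 (three stride-2 convolutions) before global average pooling, so the three 32-channel descriptors concatenate to the 96-channel vector listed in the last two rows of the table.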