Research Article

[Retracted] Classroom Behaviour of Talent Cultivation in Colleges and Universities Supported by Deep Learning Technology

Algorithm 1

Convolutional neural network with an attention mechanism (CNN-AM).
Phase 1: traditional training phase
Initialize all of the CNN-AM's weights and biases to small values.
Set the learning rate.
repeat
  for each training pattern, do
    propagate the pattern through the network
    for each neuron in the output layer, do
      compute the output error
    end for
    for each layer, from the last layer down to layer 1, do
      for each feature map in the layer, do
        compute the back-propagated error factor
      end for
    end for
    for each layer, do
      for each feature map in the layer, do
        for all weights of the map, do
          compute the weight update
          update the weights and biases
        end for
      end for
    end for
  end for
  Calculate the mean square error (MSE1)
until MSE1 falls below the error threshold or the maximum number of epochs is reached
Phase 2: knowledge transfer phase
repeat
  for each new training sample, from 1 to PS (the number of new training samples), do
    propagate the pattern through the network
    for each neuron in the last convolutional layer, do
      find the output of the last convolutional layer, (f1, f2, ..., fn)
      find the corresponding target using the TSL framework (Section 3)
    end for
  end for
Phase 3: weight update for the transfer learning phase
  for each sample, from 1 to PR, do
    train the feed-forward layers (the layers after the last convolutional layer) using the outputs obtained in Phase 2; second-order gradient descent algorithms [14] are a viable option
  end for
  Calculate the mean square error (MSE2)
until MSE2 falls below the error threshold or the maximum number of iterations is reached
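As an illustration of Phase 1 above, the sketch below trains a toy CNN-AM with plain back-propagation until the mean square error (MSE1) drops below a threshold or the epoch budget is exhausted. It is a minimal PyTorch sketch under stated assumptions: the attention mechanism is reduced to a single channel-attention gate, the input is assumed to be a 28 x 28 grayscale image, and the layer sizes, learning rate, and stopping thresholds are illustrative rather than values fixed by Algorithm 1.

```python
import torch
import torch.nn as nn

class CNNAM(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # Convolutional feature extractor; the default initialization already
        # gives small weights and biases, as required by Phase 1.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Illustrative channel-attention gate standing in for the paper's AM.
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 32), nn.Sigmoid(),
        )
        # Feed-forward layers after the last convolutional layer
        # (assumes 28 x 28 grayscale inputs, so the maps are 7 x 7 here).
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 7 * 7, num_classes),
        )

    def conv_out(self, x):
        f = self.features(x)                           # last convolutional layer output
        a = self.attention(f).unsqueeze(-1).unsqueeze(-1)
        return f * a                                   # attention-weighted feature maps

    def forward(self, x):
        return self.classifier(self.conv_out(x))


def train_phase1(model, loader, lr=1e-3, mse_threshold=1e-3, max_epochs=100):
    """Phase 1: propagate each pattern, back-propagate the error,
    update weights and biases, and stop on MSE1 or the epoch budget."""
    criterion = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    mse1 = float("inf")
    for _ in range(max_epochs):
        running, count = 0.0, 0
        for x, target in loader:                       # target: one-hot (float) vector
            optimizer.zero_grad()
            loss = criterion(model(x), target)         # error at the output-layer neurons
            loss.backward()                            # back-propagated error factors
            optimizer.step()                           # update weights and biases
            running += loss.item() * x.size(0)
            count += x.size(0)
        mse1 = running / count                         # MSE1 for this epoch
        if mse1 < mse_threshold:                       # until MSE1 < threshold or max epochs
            break
    return mse1
```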
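A companion sketch for Phases 2 and 3 follows, reusing the CNNAM model above. The targets produced by the TSL framework (Section 3) are represented by a hypothetical tsl_targets callable, since the framework itself is not reproduced in this excerpt, and L-BFGS is used as a quasi-Newton stand-in for the second-order gradient descent algorithms cited in [14]; only the feed-forward layers after the last convolutional layer are re-trained.

```python
import torch
import torch.nn as nn

def transfer_phases(model, new_loader, tsl_targets, mse_threshold=1e-3, max_rounds=50):
    """Phases 2 and 3 for a trained CNNAM instance (see the Phase 1 sketch)."""
    # Phase 2: propagate the PS new samples and cache the attention-weighted
    # outputs of the last convolutional layer, (f1, f2, ..., fn).
    feats, targets = [], []
    model.eval()
    with torch.no_grad():
        for i, (x, _) in enumerate(new_loader):
            feats.append(model.conv_out(x).flatten(1))
            targets.append(tsl_targets(i, x))          # targets from the TSL framework (assumed callable)
    feats, targets = torch.cat(feats), torch.cat(targets)

    # Phase 3: re-train only the feed-forward layers on the cached features,
    # using L-BFGS as a second-order-style optimizer.
    head = model.classifier
    optimizer = torch.optim.LBFGS(head.parameters(), max_iter=20)
    criterion = nn.MSELoss()
    mse2 = float("inf")
    for _ in range(max_rounds):
        def closure():
            optimizer.zero_grad()
            loss = criterion(head(feats), targets)
            loss.backward()
            return loss
        mse2 = optimizer.step(closure).item()          # MSE2 for this round
        if mse2 < mse_threshold:                       # until MSE2 < threshold or round budget
            break
    return mse2
```

Keeping the convolutional feature extractor frozen and re-fitting only the head on cached features keeps the knowledge-transfer step inexpensive, which is the point of separating Phases 2 and 3.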