Transfer learning is a technique in which the pre-trained weights and biases of a neural network, usually a deep neural network, are reused on another dataset. There are various transfer learning models, which we will discuss in the following.
Four different cases where we can use TL models:
When we have a small dataset that is similar to the training data of the pre-trained model
When we have a small dataset that is different from the training data of the pre-trained model
When we have a large dataset that is similar to the training data of the pre-trained model
When we have a large dataset that is different from the training data of the pre-trained model
Case 1: Small dataset, similar to the training data of the pre-trained model
If the new dataset is small and similar to the original training data:
• Remove the last layer of the pre-trained network.
• Add a fully connected layer whose outputs match the number of classes to be classified.
• Freeze all the previous layers and randomly initialize the new fully connected layer.
• In the final step, update only the weights and biases of the newly created FC layer during training.
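The steps above can be sketched in PyTorch. A small stand-in network is used here in place of a real pre-trained model (hypothetical; in practice you would load, e.g., a torchvision model with pre-trained weights), so the snippet runs without downloading anything:

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained network (hypothetical architecture).
pretrained = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1000),          # original output layer (1000 classes)
)

num_new_classes = 5               # classes in the new, small dataset

# 1. Remove the last layer of the pre-trained network.
backbone = nn.Sequential(*list(pretrained.children())[:-1])

# 2. Freeze all remaining (pre-trained) layers.
for p in backbone.parameters():
    p.requires_grad = False

# 3. Add a randomly initialized fully connected layer for the new classes.
model = nn.Sequential(backbone, nn.Linear(64, num_new_classes))

# 4. Only the new FC layer's weights and bias are updated during training.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.01
)
```

Because the frozen parameters are excluded from the optimizer, a training loop over this model touches only the new head.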
Case 2: Small dataset, different from the training data of the pre-trained model
If the new dataset is small and different from the original training data:
We follow the same procedure as before.
• The first step is to remove the last layer of the pre-trained network.
• Add a fully connected layer whose outputs match the number of classes to be classified.
• Freeze all the previous layers and randomly initialize the new fully connected layer.
• In the final step, update only the weights and biases of the newly created FC layer during training.
Case 3: Large dataset, similar to the training data of the pre-trained model
If the new dataset is large and similar to the original training data:
• As in the previous two cases, remove the last output layer and add a fully connected layer that matches the number of classes.
• Randomly initialize the weights and biases of the new FC layer.
• Re-train the entire neural network.
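A minimal sketch of this case, again with a hypothetical stand-in for the pre-trained network: no layers are frozen, so the pre-trained weights serve only as a good initialization for re-training the whole network.

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained network (hypothetical architecture).
pretrained = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 1000),              # original 1000-class output layer
)

num_new_classes = 10                  # classes in the new, large dataset

# Remove the output layer and add a randomly initialized FC layer.
model = nn.Sequential(
    *list(pretrained.children())[:-1],
    nn.Linear(64, num_new_classes),
)

# Nothing is frozen: every parameter is passed to the optimizer,
# so the entire network is re-trained.
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)
```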
Case 4: Large dataset, different from the training data of the pre-trained model
If the new dataset is large and different from the original training data:
• The same first step applies, i.e. remove the last layer; but because the new data differ from the pre-trained model's training data, we reuse only the architecture of the model, not its weights.
• Randomly initialize the weights and biases for the whole network.
• Alternatively, you could just use the same strategy as in the "large and similar" data case.
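A sketch of reusing only the architecture, assuming a hypothetical stand-in model: every layer that supports it is re-initialized, discarding any pre-trained weights, and the whole network is then trained from scratch.

```python
import torch
import torch.nn as nn

def reset_all_parameters(model: nn.Module) -> None:
    """Re-initialize every layer that supports it, discarding weights."""
    for m in model.modules():
        if hasattr(m, "reset_parameters"):
            m.reset_parameters()

# Reuse only the *architecture* (stand-in here), with a new output
# layer sized for the new task's classes.
model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 8),                 # new output layer, 8 classes
)
reset_all_parameters(model)           # random weights and biases throughout

# Train the entire network from scratch on the large, different dataset.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
```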
Popular Transfer Learning Models:
AlexNet
ResNet
GoogLeNet
VGGNet
credit: TechTrunk