Hi,
YOLO was trained on the VOC dataset and comes with weights for 20 classes.
Can I change the weights behind one class's predictions in VOC, while the weights of the other classes stay the same and do not have to be retrained?
For example, suppose I separately train a classifier for a new category, such as a stapler. Can I keep the weights for the 20 VOC categories after training and add the weights for the stapler class I trained?
My previous approach was to remake the labels for all 20 categories; in order to add a new target, I had to re-label every category.
Hey @Fatcar2002,
I am assuming this is not related to any of the specializations in particular? If that's the case, could you please be a bit more specific about what exactly you want to achieve here?
From what I could gather, you have a YOLO model pre-trained on the PASCAL Visual Object Classes (VOC) dataset. The model is pre-trained to predict the 20 classes in that dataset, and you want it to keep predicting 19 of those classes as before, while also predicting samples from another class that was not included in the original training set.
So, to do that, you have samples belonging to the new class, and now you want to know how you can achieve this. Do let me know if I am missing anything.
Also, are you open to the model predicting samples from 21 classes (20 original + 1 new), or are you firm on replacing the predictions of one of the original classes with predictions of the new class? Do let me know if I am stating anything wrong, and then we will continue with the discussion.
Cheers,
Elemento
Thanks for answering my question.
The model is pre-trained to predict the 20 classes as per the dataset. I want the model to predict the 19 classes as before, and I also want it to predict samples from another class that was not included in the original training set.
I want to know how to do this.
It seems very troublesome to keep the 20 categories and add another category,
so I just want to do the first part.
Thanks!
Hey @Fatcar2002,
Apologies for the delayed response. I suppose the second part would be much easier. You would just need to take the pre-trained model with all of its layers, add perhaps 1-2 extra dense layers and a new softmax layer with 21 outputs, and then fine-tune this network to give the desired output.
You would just need to figure out 2 things. First, how many of the existing layers should be frozen and how many should be fine-tuned along with the newly added layers. Second, what the composition of the new dataset should be. In my opinion, it should include all the samples of the new class, along with a considerable proportion of the existing 20 classes; if you only take the new samples, the newly added layers won't learn to classify the existing 20 classes.
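To make that concrete, here is a minimal Keras sketch of the idea. The model file name, the layer indices used for slicing and freezing, and the dataset variables are all placeholders/assumptions you would need to adapt to your actual setup:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Hypothetical file name: load the pre-trained model however you obtained it.
base_model = tf.keras.models.load_model("pretrained_voc_model.h5")

# Keep the feature extractor; assumption: the last layer is the old 20-way head,
# so we branch off the layer just before it.
features = base_model.layers[-2].output

# Freeze most of the existing layers; how many to freeze is one of the
# things you will need to experiment with.
for layer in base_model.layers[:-4]:
    layer.trainable = False

# Add an extra dense layer and a new 21-way softmax (20 original + 1 new).
x = layers.Dense(256, activation="relu")(features)
outputs = layers.Dense(21, activation="softmax")(x)

model = models.Model(inputs=base_model.input, outputs=outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Fine-tune on a dataset containing all samples of the new class plus a
# considerable proportion of the existing 20 classes, e.g.:
# model.fit(train_ds, validation_data=val_ds, epochs=...)
```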
Now, let's come to the other approach, i.e., predicting 19 of the 20 original classes + 1 new class. This is in fact more troublesome, since we don't know exactly which weights in the neural network are responsible for the 1 class we want to remove, and these could be spread throughout the width and depth of the network.
One approach that comes to my mind, similar to the aforementioned one, is the following: instead of adding a softmax layer with 21 outputs, you can add a new softmax layer with only 20 outputs. And when you fine-tune the network, the dataset may contain all the samples of the new class, a considerable portion of the 19 existing classes, and no samples from the class that you want to remove.
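The head would be the same as in the previous sketch, just with 20 outputs instead of 21. The main extra work is in the data pipeline; here is a rough sketch of how you might filter out the removed class and remap the labels with tf.data (the class index and dataset names are assumptions):

```python
import tensorflow as tf

REMOVED_CLASS = 7   # assumption: the label index of the class you want to drop

def keep_sample(image, label):
    # Exclude every sample of the class the model should stop predicting.
    return tf.not_equal(label, REMOVED_CLASS)

def remap_label(image, label):
    # Shift labels above the removed index down by one, so the 19 kept
    # classes occupy indices 0..18 and the new class can take index 19.
    return image, tf.where(label > REMOVED_CLASS, label - 1, label)

# Hypothetical tf.data pipelines of (image, label) pairs:
# voc_subset = voc_dataset.filter(keep_sample).map(remap_label)
# train_ds = voc_subset.concatenate(new_class_dataset).shuffle(10_000)
```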
Both of the approaches seem promising to me; however, to figure out which one works best, you need to try both of them out and compare them quantitatively.
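For the quantitative comparison, something as simple as evaluating each fine-tuned model on its own held-out test split would do. The names below are assumed, and note that the two models have different label spaces, so each needs its own test labels:

```python
# Assumed names: model_21 / model_20 are the two fine-tuned models,
# test_ds_21 / test_ds_20 are held-out splits labelled in each model's
# own label space (21 vs. 20 classes).
loss_21, acc_21 = model_21.evaluate(test_ds_21)
loss_20, acc_20 = model_20.evaluate(test_ds_20)
print(f"21-class model: test accuracy = {acc_21:.3f}")
print(f"20-class model: test accuracy = {acc_20:.3f}")
```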
Cheers,
Elemento