@paulinpaloalto @Deepti_Prasad @ai_curious @Honza_Zbirovsky @TMosh @mrtckaya can you please help me as always? Thanks in advance.
Can I ask whether you have taken any TensorFlow specialisation course?
That "unbuilt" status is because you have defined your model layers without any specifications apart from the GlobalAveragePooling layer. Keras cannot magically add its own kernel, padding, or units!
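For illustration, here is a minimal sketch (assuming a typical Sequential model ending in a GlobalAveragePooling2D head; your actual layers and input shape may differ) of why summary() reports the model as unbuilt, and how giving it an input shape fixes that:

```python
import tensorflow as tf

# Hypothetical model, stand-in for the one being discussed.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# At this point the model has never seen an input shape, so Keras cannot
# create the layer weights yet -- summary() reports the model as unbuilt.
# Giving it the input shape (or passing real data through it) builds it:
model.build(input_shape=(None, 224, 224, 3))
model.summary()  # now shows output shapes and parameter counts
```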
I would sincerely advise you to first strengthen your basics. I hope you have completed the Deep Learning Specialization, the TensorFlow Developer Professional Certificate, and the TensorFlow: Advanced Techniques Specialization.
Otherwise you are only running in a vicious circle in an unknown land.
Regards
DP
Thanks a lot for the reply and suggestions, sir.
But this has been a practice I usually use in my projects, and it tends to work out well. I also checked the TensorFlow website and found the same thing. I don't know what I am getting wrong?
This is where you got that from. When I mentioned adding specifications, it doesn't always need to relate to the specified layers.
The major difference between your image above and the image in this comment is that the parameters are already divided into trainable and non-trainable.
Kindly check the link below to understand how transfer learning and fine-tuning a model are done:
Transfer learning and fine-tuning | TensorFlow Core ("by setting layer.trainable ... will freeze all of them")
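Roughly, what that guide describes looks like this (a minimal sketch; MobileNetV2 and the input shape here are placeholder assumptions, not your actual model):

```python
import tensorflow as tf

# Load a pretrained base (placeholder choice: MobileNetV2) without its top.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet")

# Freeze the base so its weights are reported as non-trainable in summary().
base_model.trainable = False

inputs = tf.keras.Input(shape=(160, 160, 3))
x = base_model(inputs, training=False)       # keep BatchNorm in inference mode
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)

model.summary()  # parameter count split into trainable vs non-trainable
```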
Regards
DP
The above doesn't work for me as-is, but when I tried it immediately after fitting the model, it worked successfully without changing any of my code. Is this a good practice?
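For context, this is a minimal sketch of what I think is happening (assuming it is the same unbuilt-summary situation; the tiny random data is just a placeholder): fit() pushes real data through the model, which builds it as a side effect, so summary() works afterwards.

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Placeholder data: 8 random 32x32 RGB images with integer labels 0-9.
x = np.random.rand(8, 32, 32, 3).astype("float32")
y = np.random.randint(0, 10, size=(8,))

# Before fit() the model has no input shape, so it is still unbuilt.
model.fit(x, y, epochs=1, verbose=0)

# After fit() the model has been built from the data it saw, so this works:
model.summary()
```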
I really don't want to assume anything from your statement. What do you mean by the above statement?