As noted in this post, training in the C2W2 lab is very slow and requires babysitting.
This blog post argues that 512 is a better batch size (albeit for a totally different application), but honestly I’ll try anything: the training looks like it will take 45 minutes, and I have to babysit it with frequent browser interactions or it cancels itself. Any ideas?
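In case it helps frame the question, here is a minimal, self-contained sketch of where batch size enters a Keras training run. The toy model, synthetic data, and image size are my own stand-ins, not the lab’s actual network or dataset.

```python
import numpy as np
import tensorflow as tf

BATCH_SIZE = 512  # the value the blog post suggests; the lab default is smaller

# Synthetic stand-in for the lab's training images (hypothetical 64x64 RGB).
x_train = np.random.rand(1024, 64, 64, 3).astype("float32")
y_train = np.random.randint(0, 2, size=(1024,)).astype("float32")

# Placeholder model; the lab's actual architecture will differ.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(64, 64, 3)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# A larger batch size means fewer gradient updates per epoch, which is
# often faster per epoch but can change convergence behavior.
model.fit(x_train, y_train, batch_size=BATCH_SIZE, epochs=2)
```

If the lab loads data through a generator (e.g. `flow_from_directory`), the `batch_size` argument would be set there instead, and the number of steps per epoch shrinks accordingly.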
Thanks for the reply. I believe this question stands on its own; the link to the related question was for the convenience of fellow learners. I tried to delete the earlier link to avoid confusion, but the UI would not allow it. I still welcome comments from mentors or course staff on a recommended batch size, independent of the extreme slowness of the C2W2 lab.