I had some fun tweaking the optional assignment this week to run the rock, paper, scissors classifier on a live camera feed from a Raspberry Pi.
If anyone wants to give it a try, you can download the files from GitHub at:
There are four files:
RPS_Camera.ipynb - Creates the rock, paper, scissors classification model in a Jupyter notebook. I used transfer learning, with mobilenet_v2 as the feature extractor and one dense layer added on top for classification (see the first sketch after this list). The notebook also saves the retrained model and converts it to TFLite format.
rps.tflite - The rock, paper, scissors model in TFLite format.
rps_main.py - Python script that runs rps.tflite on the Raspberry Pi from the command line (see the second sketch after this list).
README.md - Instructions for command line execution.
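For anyone curious about the model-building step before downloading the notebook, here is a minimal sketch of that kind of pipeline: a frozen MobileNetV2 feature extractor from TensorFlow Hub with a single dense softmax layer, followed by TFLite conversion. The Hub URL, image size, and training call are my assumptions, not necessarily what RPS_Camera.ipynb does.

```python
# Sketch only: MobileNetV2 feature extractor + one dense layer, then TFLite conversion.
# The Hub handle, image size, and dataset names are assumptions for illustration.
import tensorflow as tf
import tensorflow_hub as hub

IMG_SIZE = 224
NUM_CLASSES = 3  # rock, paper, scissors

feature_extractor = hub.KerasLayer(
    "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4",
    input_shape=(IMG_SIZE, IMG_SIZE, 3),
    trainable=False,  # transfer learning: keep the pretrained weights frozen
)

model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(train_ds, validation_data=val_ds, epochs=5)  # train on the RPS dataset

# Save the retrained model and convert it to TFLite.
model.save("rps_saved_model")
converter = tf.lite.TFLiteConverter.from_saved_model("rps_saved_model")
tflite_model = converter.convert()
with open("rps.tflite", "wb") as f:
    f.write(tflite_model)
```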
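And here is a rough sketch of what running rps.tflite on the Pi looks like for a single camera frame. This is not the actual rps_main.py; I am assuming OpenCV for frame capture and the tflite_runtime interpreter, and the label order is an assumption too.

```python
# Sketch only: classify one camera frame with rps.tflite on a Raspberry Pi.
# Assumes tflite_runtime (pip install tflite-runtime) and OpenCV for capture.
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter

LABELS = ["rock", "paper", "scissors"]  # assumed label order

interpreter = Interpreter(model_path="rps.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
height, width = input_details[0]["shape"][1:3]

# Grab one frame from the camera (device 0) and preprocess it.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("Could not read a frame from the camera")

img = cv2.resize(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB), (width, height))
img = np.expand_dims(img.astype(np.float32) / 255.0, axis=0)

# Run inference and print the predicted class with its probability.
interpreter.set_tensor(input_details[0]["index"], img)
interpreter.invoke()
probs = interpreter.get_tensor(output_details[0]["index"])[0]
print(LABELS[int(np.argmax(probs))], float(np.max(probs)))
```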
I hope you have fun with it, and please share your feedback.
Hi Sumeet_Mahesh!
That’s great effort and work! Thanks for sharing your work and knowledge with us; I will go through it and have some fun with it.
Regards,
Nithin