Is there a way to get YOLO input from a PC/Mac webcam?
As opposed to some other video source, maybe like this one ==> [YOLO Open Images in New York - YouTube] ?
This is a presentation by Mr My Little Resume himself, Joseph Redmon. See the 2:00 mark if you want to cut right to the chase. [You Only Look Once: Unified, Real-Time Object Detection - YouTube]
Here is what I am talking about. For OUR YOLO homework,
we are reading images from a file.
I would like a way to feed live data to the YOLO scripts from the
webcam on the Mac. It will not be trivial, since YOLO is running on the
remote server, but there may be some clever way to do it.
It would make the YOLO exercise more interesting and configurable.
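One conceivable approach (my own sketch, not anything from the course materials) would be to capture frames on the Mac, JPEG-encode them, and stream them to the remote server over TCP with a simple length-prefixed framing so the server can split the byte stream back into individual frames. The capture/encode step itself (e.g. something like OpenCV's VideoCapture) is omitted here; this only shows the hypothetical wire framing, using just the standard library:

```python
import struct

# Hypothetical length-prefixed framing for streaming JPEG-encoded
# webcam frames to a remote YOLO server over a TCP connection.
# The actual webcam capture and JPEG encoding are left out.

def pack_frame(jpeg_bytes: bytes) -> bytes:
    """Prefix a JPEG payload with its 4-byte big-endian length."""
    return struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes

def unpack_frames(stream: bytes) -> list[bytes]:
    """Split a received byte stream back into the original payloads."""
    frames, offset = [], 0
    while offset + 4 <= len(stream):
        (length,) = struct.unpack(">I", stream[offset:offset + 4])
        frames.append(stream[offset + 4:offset + 4 + length])
        offset += 4 + length
    return frames

if __name__ == "__main__":
    fake_frames = [b"frame-one", b"frame-two-longer"]
    wire = b"".join(pack_frame(f) for f in fake_frames)
    assert unpack_frames(wire) == fake_frames
```

On the server side, the YOLO script would decode each payload and run detection on it in place of reading from a file. Whether this keeps up in real time over the internet is a separate question.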
+1 for thinking about the use case. I’m not sure sending captured video frames across the internet to a remote YOLO engine is the way to achieve this, however. It may be better to move the computation closer to the data in this case, which is what both of the examples linked above are doing, rather than the other way around. Maybe take a look out on the googleverse at Tiny YOLO, and let us know what you think?
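A quick back-of-envelope check (my own illustrative numbers, not from the thread) of what streaming frames to a remote engine would cost, using YOLO's 448x448 input resolution from the paper:

```python
def required_mbps(width: int, height: int, fps: float,
                  bytes_per_pixel: int = 3, compression_ratio: float = 1.0) -> float:
    """Bandwidth (megabits/s) needed to stream frames at a given rate.

    compression_ratio is an assumed factor (1.0 = raw RGB;
    ~10:1 is a rough, hypothetical figure for JPEG).
    """
    bytes_per_sec = width * height * bytes_per_pixel * fps / compression_ratio
    return bytes_per_sec * 8 / 1e6

raw = required_mbps(448, 448, 30)                          # raw RGB, ~145 Mbps
jpeg = required_mbps(448, 448, 30, compression_ratio=10)   # assumed 10:1 JPEG

print(f"raw: {raw:.0f} Mbps, jpeg: {jpeg:.1f} Mbps")
```

Even with compression it is on the order of 10+ Mbps of sustained upload plus a round trip per frame, which is why running something like Tiny YOLO next to the camera looks attractive.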
Then there’s this from section 5 of the original paper…
"YOLO is a fast, accurate object detector, making it ideal for computer vision applications. We connect YOLO to a webcam and verify that it maintains real-time performance, …"
(my emphasis added)