In fact, I could have used a smaller model, but I picked a larger one because I wanted to see how much it could do. The model is 99 MB, so it takes some time to load. If you had trained it on a smaller model, such as a ~20 MB ResNet, it would load much faster.
Remember, once the model is loaded into your browser, you can run inference offline, since everything happens in your browser.
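Conceptually, the client-side flow looks something like the sketch below. This is a minimal sketch, assuming the model was exported to ONNX and loaded with onnxruntime-web; the post does not name the runtime, so the library, the `model.onnx` path, and the 224x224 input shape are all placeholder assumptions.

```typescript
import * as ort from "onnxruntime-web";

// Run one forward pass entirely in the browser.
// The 99 MB weight file is downloaded once when the session is created;
// after that, inference works offline.
async function runInference(input: Float32Array): Promise<ort.Tensor> {
  // Fetches and initializes the model (this is the slow, size-dependent step).
  const session = await ort.InferenceSession.create("model.onnx");

  // Wrap the preprocessed image data in a tensor; shape is an assumption.
  const tensor = new ort.Tensor("float32", input, [1, 3, 224, 224]);

  // Feed the tensor under the model's actual input name and return the output.
  const outputs = await session.run({ [session.inputNames[0]]: tensor });
  return outputs[session.outputNames[0]];
}
```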
Click here to see the Reactjs GitHub Pages example.
The model was trained on a dataset that I built.
- ImageNet example:
If you want to see the ImageNet Reactjs example, you can check it out here. It was built a different way, using TorchServe.
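For contrast with the fully in-browser approach, a TorchServe deployment keeps the model on a server, and the page just calls its REST API. Below is a minimal sketch of that call; the host, port, and the model name `imagenet` are assumptions, but `POST /predictions/{model_name}` is TorchServe's standard inference endpoint.

```typescript
// Send an image file to a TorchServe inference endpoint and get back
// class probabilities. Unlike the in-browser example, this requires a
// network connection to the server hosting the model.
async function classify(file: File): Promise<unknown> {
  const res = await fetch("http://localhost:8080/predictions/imagenet", {
    method: "POST",
    body: file, // raw image bytes; the server-side handler decodes them
  });
  if (!res.ok) {
    throw new Error(`TorchServe returned ${res.status}`);
  }
  // Typical response shape: { "golden retriever": 0.93, ... }
  return res.json();
}
```

The trade-off is the usual one: the TorchServe version loads instantly in the browser because the weights stay on the server, while the in-browser version pays the 99 MB download once and then works offline.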