For the model, I used the DistilBERT encoder.
If anyone is interested, I may upload the code later, though there are many better demos out there. In the meantime, if anyone needs help with a similar project, feel free to post your question here.
Because React Native is cross-platform, the same JavaScript code generally works on both Android and iOS.
I have not tested the RAM usage yet, but I will later. I have seen that standard BERT requires about 2 GB for a sequence length of 512, and DistilBERT should need less RAM than standard BERT.
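To give a rough sense of why DistilBERT should be lighter, here is a back-of-envelope sketch (not a measurement) comparing weight memory for the two models. The parameter counts used (~110M for BERT-base, ~66M for DistilBERT) are the commonly published figures; actual runtime RAM also includes activations, which grow with sequence length, so treat these numbers as a floor.

```python
# Back-of-envelope weight-memory comparison: BERT-base vs. DistilBERT.
# Assumes float32 weights and the published parameter counts; activation
# memory (which scales with sequence length) is NOT included here.

BYTES_PER_PARAM = 4  # float32

def weight_mb(num_params: int) -> float:
    """Memory needed just to hold the weights, in megabytes."""
    return num_params * BYTES_PER_PARAM / (1024 ** 2)

bert_base_mb = weight_mb(110_000_000)   # roughly 420 MB
distilbert_mb = weight_mb(66_000_000)   # roughly 252 MB

print(f"BERT-base weights:  ~{bert_base_mb:.0f} MB")
print(f"DistilBERT weights: ~{distilbert_mb:.0f} MB")
```

So even before counting activations, DistilBERT's weights alone take roughly 40% less memory, which is consistent with expecting a smaller footprint on a phone.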
Keep in mind that it is not really difficult to deploy a Trax model on Android and iOS with React Native if you come from the NLP course at DeepLearning.AI.