How to avoid loading large data

Hi, I have a question about a speech-to-text conversion application. A single data instance used for training the model is 16 MB, but a real-time data instance is 1600 MB, and this is creating a bottleneck for the system because we can load fewer data instances at inference time. If the size of the real-time data instances could be reduced, we could handle more users.

My question is: can we apply data normalization to the real-time data as well, like we do for the training data? Or is there another solution? Can anyone guide me? Thanks in advance.

Hi, @Shuja_Ur_Rehman_Toor!

Data format and data types may play an important role in this case. Are you using the lightest dtype when loading the data? Casting to a less memory-consuming format (like float64 to float16) can be a good solution if it doesn't hurt model performance.
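For illustration, here is a minimal NumPy sketch of that idea, assuming the audio arrives as a float64 array (the random array below is just a stand-in for a real buffer):

```python
import numpy as np

# Stand-in for a real audio buffer: ~1 minute of 16 kHz audio loaded as float64
# (8 bytes per sample).
audio = np.random.randn(16_000 * 60)

print(audio.dtype, audio.nbytes / 1e6, "MB")        # float64, ~7.7 MB

# Cast to a lighter dtype before feeding it to the model.
audio_fp16 = audio.astype(np.float16)

print(audio_fp16.dtype, audio_fp16.nbytes / 1e6, "MB")  # float16, ~1.9 MB (4x smaller)
```

The same trick applies to whatever framework you use downstream, as long as the model (or a preprocessing step) accepts the lower-precision input.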


Thanks for your reply @alvaroramajo. I'll try the different techniques and see if they help :+1: