First of all, in most cases, adding more layers to a neural network also increases the number of parameters.
A larger model has the potential to learn more complex relationships from the data; in that sense, extra capacity helps it make good use of additional data rather than being overwhelmed by it.
On the other hand, feeding a large amount of data to a model that is not "large enough" can lead to underfitting: the model lacks the capacity to capture the patterns in the data, so it performs poorly on both the training set and unseen data. (Overfitting is the opposite failure mode: a high-capacity model trained on too little data memorizes the training examples and generalizes poorly.)
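The capacity/data trade-off above can be sketched with a toy experiment. This is a minimal illustration, not about neural networks specifically: polynomial degree stands in for model capacity, and a noisy sine curve stands in for the data. A degree-1 (linear) model is "not large enough" for the curve and underfits, while a higher-degree model has enough capacity to track it, so its training error is much lower. The function name `train_mse` and the specific degrees are arbitrary choices for the example.

```python
import numpy as np

# Toy stand-in for capacity vs. data: fit polynomials of different
# degree to a nonlinear target and compare training error.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = np.sin(x) + rng.normal(scale=0.1, size=x.size)  # nonlinear target + noise

def train_mse(degree):
    # Fit a polynomial of the given degree and return its training MSE.
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    return float(np.mean((pred - y) ** 2))

small = train_mse(1)  # low capacity: a line cannot follow the sine curve
large = train_mse(9)  # higher capacity: tracks the curve closely
print(small > large)  # the low-capacity model has much higher error
```

The same intuition carries over to network depth: a model with too few parameters for the complexity of the data will plateau at a high error no matter how much data you add.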