Can someone explain why the RNN is depicted like this in the Cost Function for RNNs lecture? I thought that in a many-to-many RNN only one input is sent to the RNN cell at each timestep and mapped to one output. The picture reminds me more of a general DNN, or is that what is actually being shown here?
https://www.coursera.org/learn/sequence-models-in-nlp/supplement/KBmVE/cost-function-for-rnns
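To make my question concrete, here is a minimal NumPy sketch of what I mean by "one input per timestep" in a many-to-many RNN. The tanh/softmax cell, the variable names, and the averaging of per-timestep cross-entropy losses are my own assumptions for illustration, not taken from the lecture:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

def rnn_forward_cost(xs, ys, Wax, Waa, Wya, ba, by):
    """Vanilla many-to-many RNN: one input x_t in, one prediction y_hat_t out, per timestep."""
    a = np.zeros((Waa.shape[0], 1))                  # initial hidden state
    total_loss = 0.0
    for x_t, y_t in zip(xs, ys):                     # exactly one input vector per timestep
        a = np.tanh(Wax @ x_t + Waa @ a + ba)        # update hidden state from x_t and previous state
        y_hat = softmax(Wya @ a + by)                # prediction at this timestep
        total_loss += -np.sum(y_t * np.log(y_hat))   # cross-entropy loss at this timestep
    return total_loss / len(xs)                      # cost = average of the per-timestep losses
```

Is the picture in the lecture showing something different from this, or is it just the same cell unrolled over timesteps in a way that happens to look like a feed-forward DNN?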