Can someone explain why an RNN is depicted like this (Cost Function)?

Can someone explain why an RNN is depicted like this in the Cost Function for RNNs lecture? I thought at each timestep only one input is sent to the RNN cell it is mapped to (in many-to-many RNNs). This picture looks more like a general DNN, or is that what is being referred to here?

https://www.coursera.org/learn/sequence-models-in-nlp/supplement/KBmVE/cost-function-for-rnns

Hi @saidubbaka

The image is just a simple multi-layer perceptron used to illustrate how categorical cross-entropy works. As you mentioned, RNNs have different structures, but the RNN structure itself is not shown in that figure.
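To make the connection to the many-to-many case concrete, here is a minimal numpy sketch (with made-up sizes and probabilities, not taken from the lecture) of categorical cross-entropy computed per timestep and then averaged over the sequence:

```python
import numpy as np

# Hypothetical toy setup: T timesteps, C classes.
T, C = 4, 3

# One-hot true labels, one per timestep, shape (T, C).
y_true = np.eye(C)[[0, 2, 1, 2]]

# Softmax outputs the RNN might emit at each timestep
# (illustrative numbers only), shape (T, C).
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.2, 0.7],
                   [0.2, 0.6, 0.2],
                   [0.3, 0.3, 0.4]])

# Categorical cross-entropy at each timestep: -sum_c y_c * log(yhat_c).
per_step = -np.sum(y_true * np.log(y_pred), axis=1)

# The sequence loss averages the per-timestep losses.
loss = per_step.mean()
print(round(loss, 4))  # prints 0.5351
```

The same per-class formula applies whether the probabilities come from an MLP or from an RNN cell; for a many-to-many RNN you simply apply it at every timestep and average, which is why the lecture can explain the loss with a plain MLP diagram.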

Hope it helps! Feel free to ask if you need further assistance.