Softmax cross-entropy loss

Why does the softmax cross-entropy loss not include any reference to y[i] (the target)?

Your question is not clear to me. First, we use softmax to compute \hat{y}, and then we use cross-entropy to measure whether \hat{y} matches y.
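
A minimal sketch of that first step, assuming the model produces raw scores (logits); the values here are hypothetical:

```python
import numpy as np

def softmax(logits):
    # Shift by the max for numerical stability before exponentiating.
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

logits = np.array([2.0, 1.0, 0.1])  # hypothetical model outputs
y_hat = softmax(logits)             # predicted distribution \hat{y}
print(y_hat)                        # ~[0.659, 0.242, 0.099]
```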

Best,
Saif.

For softmax cross-entropy loss we do include a reference to y[i].
The formula is -\sum_i y[i] \log(p[i]), which reduces to -\log(p[i]) for the true class (where y[i] = 1).
Here p[i] is the predicted probability and y[i] is the truth value.
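
A minimal sketch showing how the one-hot target y selects the -\log(p[i]) term for the true class; the probabilities are the hypothetical softmax outputs from above:

```python
import numpy as np

p = np.array([0.659, 0.242, 0.099])  # predicted probabilities from softmax
y = np.array([1.0, 0.0, 0.0])        # one-hot truth: class 0 is correct

# Full sum -sum_i y[i] * log(p[i]); terms where y[i] = 0 vanish,
# so the loss equals -log(p[i]) for the true class.
loss = -np.sum(y * np.log(p))
print(loss)                          # equals -log(p[0]) ~ 0.417
```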