I am trying to implement the linear-activation forward step (ReLU):
{mentor edit: code removed}
What does this error mean, and how can I fix it?
Forget that, I made a mistake in the Z, linear_cache = … line.
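For readers landing here with the same question, below is a minimal generic sketch of what a linear-activation forward step with ReLU typically looks like in plain NumPy. The helper names, signatures, and cache layout here are assumptions for illustration only, not the course's graded solution:

```python
import numpy as np

def linear_forward(A_prev, W, b):
    # Linear step: Z = W . A_prev + b
    Z = np.dot(W, A_prev) + b
    linear_cache = (A_prev, W, b)  # saved for the backward pass
    return Z, linear_cache

def relu(Z):
    # Element-wise ReLU; return Z as the activation cache
    A = np.maximum(0, Z)
    return A, Z

def linear_activation_forward(A_prev, W, b, activation="relu"):
    # Linear step followed by the chosen activation
    Z, linear_cache = linear_forward(A_prev, W, b)
    if activation == "relu":
        A, activation_cache = relu(Z)
    else:
        raise ValueError("Only ReLU is shown in this sketch.")
    cache = (linear_cache, activation_cache)
    return A, cache
```

The common mistake hinted at above is unpacking the wrong number of return values from the linear step (it returns both Z and its cache), so double-check that each helper's outputs match what you assign on the left-hand side.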
Thanks for the update.
A reminder: Please do not share your code on the forum - that’s not allowed by the Code of Conduct.
Posting a description of the issue, and a screen capture image of any error messages, is usually sufficient for diagnosis.