Week 4, linear-activation forward

I am trying to implement the linear-activation forward (relu):
{mentor edit: code screenshot removed}
What does this error mean, and how can I fix it?
[screenshot of the error message]

Forget that, I made a mistake in the `Z, linear_cache = …` line.
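For readers hitting the same issue: the general pattern is that the linear step and the activation step each return a value plus a cache, and both results must be unpacked correctly. Below is a minimal sketch of that pattern, not the assignment's solution; the helper names `linear_forward` and `relu` and their return signatures are assumptions modeled on the thread's context, and the course notebook defines its own versions.

```python
import numpy as np

def linear_forward(A_prev, W, b):
    # Linear part: Z = W A_prev + b; cache inputs for backprop.
    Z = W @ A_prev + b
    linear_cache = (A_prev, W, b)
    return Z, linear_cache

def relu(Z):
    # Elementwise ReLU; cache Z for the backward pass.
    A = np.maximum(0, Z)
    activation_cache = Z
    return A, activation_cache

def linear_activation_forward(A_prev, W, b):
    # The unpacking line mentioned in the thread: linear_forward
    # returns a (Z, cache) pair, so both must be captured.
    Z, linear_cache = linear_forward(A_prev, W, b)
    A, activation_cache = relu(Z)
    cache = (linear_cache, activation_cache)
    return A, cache

# Tiny example: 2 input units, 1 output unit, 2 examples.
A_prev = np.array([[1.0, -2.0],
                   [3.0,  4.0]])
W = np.array([[0.5, 0.5]])
b = np.array([[1.0]])
A, cache = linear_activation_forward(A_prev, W, b)
print(A)  # [[3. 2.]]
```

A common source of the kind of error shown in the screenshot is dropping one element of the returned pair (e.g. writing `Z = linear_forward(...)`), which leaves `Z` holding a tuple instead of an array.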

Thanks for the update.

A reminder: Please do not share your code on the forum - that’s not allowed by the Code of Conduct.

Posting a description of the issue, and a screen capture image of any error messages, is usually sufficient for diagnosis.
