Week 1, Programming Assignment: Convolution Model Application

The first invocation of ReLU is incorrect. The second one is closer, but perhaps more complicated than you need. This is the Keras Functional API, so applying each layer happens in two steps:

First, you invoke the layer constructor with the configuration parameters you want. That gives you back an instantiated layer object, which is itself callable.

Second, you invoke that layer object with an explicit input tensor. That gives you back an output tensor (or whatever the defined outputs of that layer are).
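Here's a minimal sketch of the two-step pattern, assuming `tfl` is the usual alias for `tensorflow.keras.layers` as in the notebook (the filter sizes and input shape here are just placeholders, not the assignment's actual values):

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

# Step 1: instantiate each layer, passing only its configuration.
# Each call returns a callable layer object, not a tensor.
conv = tfl.Conv2D(filters=8, kernel_size=4, padding='same')
relu = tfl.ReLU()

# Step 2: call each layer object on an input tensor to get an output tensor.
inputs = tf.keras.Input(shape=(64, 64, 3))
Z1 = conv(inputs)   # output tensor of the conv layer
A1 = relu(Z1)       # output tensor of the ReLU activation
```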

So just saying

A1 = tfl.ReLU(Z1)

is not what you want. That passes Z1 as a constructor argument (ReLU's first positional parameter is `max_value`), so it either throws an error or leaves A1 as a layer object rather than a tensor.
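What you want instead is to instantiate ReLU first (with no arguments, since you don't need to change its defaults) and then call the resulting object on Z1. Both steps can be written on one line:

```python
A1 = tfl.ReLU()(Z1)   # step 1 instantiates the layer, step 2 applies it to Z1
```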

Here’s a thread with a good introduction to the Sequential and Functional APIs.
