You can see the bug in the code that @priyamthakkar8 posted above. Notice that the first call to ReLU earlier in that code is different. The syntax is pretty confusing and the course materials don’t do a good job of explaining it, but the key point is that everything happens in two steps:
- You call a TFL layer constructor (e.g. ReLU()) to “instantiate” the layer with a given set of parameters.
- Then you “invoke” that layer object with an actual tensor as input, and it outputs a tensor.
That’s why you see two sets of parens in the earlier correct expression: the ReLU() part is the “instantiation” step, which gives you back a callable layer object, and then you invoke that object with the input Z1.
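Here is a minimal sketch of that two-step pattern, assuming tfl is the usual alias for tensorflow.keras.layers used in the assignment (the Z1 tensor below is just made up for illustration):

```python
import tensorflow as tf

tfl = tf.keras.layers                 # alias for tensorflow.keras.layers

Z1 = tf.constant([[-1.0, 0.5, 2.0]])  # placeholder pre-activation tensor

# Step 1: "instantiate" the layer -- ReLU() returns a callable layer object.
relu = tfl.ReLU()

# Step 2: "invoke" that object on an actual tensor; it returns a tensor.
A1 = relu(Z1)          # same as the one-liner: A1 = tfl.ReLU()(Z1)

# The buggy version passes the tensor to the constructor instead:
# A1 = tfl.ReLU(Z1)    # wrong -- Z1 gets treated as a constructor argument

print(A1)              # -> [[0.  0.5 2. ]]
```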
Here is a great Discourse post from Ai_Curious which explains the Keras Sequential and Functional APIs. Way more useful than the minimal explanations in the course materials.
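For reference, here is a rough sketch of the same kind of tiny model written both ways (the layer sizes and names are placeholders, not the assignment’s actual architecture):

```python
import tensorflow as tf

tfl = tf.keras.layers

# Sequential API: you just list the instantiated layers in order.
seq_model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tfl.Dense(4),
    tfl.ReLU(),
    tfl.Dense(1),
])

# Functional API: each instantiated layer is immediately invoked on a tensor,
# which is exactly the "two sets of parens" pattern discussed above.
inputs = tf.keras.Input(shape=(3,))
Z1 = tfl.Dense(4)(inputs)
A1 = tfl.ReLU()(Z1)
outputs = tfl.Dense(1)(A1)
func_model = tf.keras.Model(inputs=inputs, outputs=outputs)
```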