Wrong Answer in Course 4 Week 1 Convolutional Application

Your first Conv2D layer looks correct, but the ReLU and MaxPool layers after it are wrong: you never pass them an input tensor. In this section we are using the Keras Functional API, right? It requires two calls on each line: the first call instantiates the layer with its parameters, and then you call that layer object on an input tensor. As your code stands, the variables A1 and P1 hold layer objects (functions), not tensors.
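To illustrate the pattern (the input shape and layer parameters here are just placeholders, not your assignment's actual values), each line instantiates a layer and then immediately calls it on the previous tensor:

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

# Hypothetical input shape for illustration: 64x64 RGB images
inputs = tf.keras.Input(shape=(64, 64, 3))

# Two calls per line: the first builds the layer, the second
# (the trailing parenthesized argument) applies it to a tensor.
Z1 = tfl.Conv2D(filters=8, kernel_size=4, padding="same")(inputs)
A1 = tfl.ReLU()(Z1)        # note the trailing (Z1) -- without it, A1 is a layer, not a tensor
P1 = tfl.MaxPool2D(pool_size=8, strides=8, padding="same")(A1)

model = tf.keras.Model(inputs=inputs, outputs=P1)
```

If you drop the trailing `(Z1)` and write `A1 = tfl.ReLU()`, then `A1` is a ReLU layer object, and the next line fails because MaxPool2D is being called on a layer instead of a tensor.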

Here’s a thread from a fellow student with a more detailed explanation of how the Sequential and Functional APIs work.