I am getting an error when implementing the ReLU function. Please tell me where I am making the mistake.
Your problem is that you are not using “layer” functions correctly in some cases. They don’t give us as much instruction on how all this works as one might wish, but here’s a thread that gives some help with the Keras Sequential and Functional APIs. Please have a look at that first.
The specific problem with the ReLU layer function is that you are invoking it with the wrong arguments. What is supposed to happen is that you first invoke the “layer” function with the appropriate configuration parameters, and that gives you back an instance of the actual function. Then you invoke that instance with the input tensor. You’ve skipped the first step, which is why the error is thrown: the message sounds pretty cryptic, but that is because ReLU is expecting configuration parameters and you’ve handed it an input tensor instead.
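To make that concrete, here is a minimal sketch of the wrong vs. right pattern. The input shape, filter count, and kernel size here are just placeholders for illustration, not the values from your assignment:

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, ReLU

# Placeholder input and a conv layer, just to have a tensor to work with
X = tf.keras.Input(shape=(64, 64, 3))
Z1 = Conv2D(filters=8, kernel_size=4, padding='same')(X)

# Wrong: the tensor is passed where ReLU expects its configuration arguments
# A1 = ReLU(Z1)

# Right: the first call constructs the layer, the second call applies it to the tensor
A1 = ReLU()(Z1)
```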
Note that you then have a different problem with the next MaxPool2D layer: you only give it parameters and no actual input tensor.
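Continuing the same sort of sketch (again with placeholder pool size and stride), the MaxPool2D case looks like this:

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, ReLU, MaxPool2D

X = tf.keras.Input(shape=(64, 64, 3))            # placeholder input shape
A1 = ReLU()(Conv2D(8, 4, padding='same')(X))     # output tensor from the previous step

# Wrong: this only configures the layer, so P1 is a layer object, not an output tensor
# P1 = MaxPool2D(pool_size=8, strides=8, padding='same')

# Right: configure the layer, then call it on the previous output tensor
P1 = MaxPool2D(pool_size=8, strides=8, padding='same')(A1)
```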
The simplistic way to say it is that there need to be two separate sets of parentheses anytime you are using a Keras Layer function with the Functional API. In the case of the Sequential API, you only have the first set of parens, because the invocation with the input tensor is “implicit”.
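Here is a rough side-by-side of the two APIs, with the layer choices and shapes purely illustrative rather than the ones from your notebook:

```python
import tensorflow as tf
from tensorflow.keras import Input, Model, Sequential
from tensorflow.keras.layers import Conv2D, ReLU, MaxPool2D, Flatten, Dense

# Sequential API: one set of parens per layer; Keras wires the tensors through implicitly
seq_model = Sequential([
    Input(shape=(64, 64, 3)),
    Conv2D(8, 4, padding='same'),
    ReLU(),
    MaxPool2D(pool_size=8, strides=8, padding='same'),
    Flatten(),
    Dense(6, activation='softmax'),
])

# Functional API: two sets of parens; configure the layer, then call it on a tensor
inputs = Input(shape=(64, 64, 3))
x = Conv2D(8, 4, padding='same')(inputs)
x = ReLU()(x)
x = MaxPool2D(pool_size=8, strides=8, padding='same')(x)
x = Flatten()(x)
outputs = Dense(6, activation='softmax')(x)
func_model = Model(inputs=inputs, outputs=outputs)
```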