Syntax - A1 = tf.keras.layers.ReLU()(Z1)


I am a little confused about the syntax below. I understand what it does from a DL perspective (we pass the layer's input in the second set of parentheses), but I am really confused about it from a Python syntax perspective.

I have used Python for a while and I have not seen this syntax before. Is it TF specific?

A1 = tf.keras.layers.ReLU() ->(Z1)<-


In the first set of parentheses you instantiate the object; in the second set you pass the variable that you want to perform the operation on, i.e. object(Z1).


Just a few additions to @gent.spah 's comment.

A1 = tf.keras.layers.ReLU()(Z1)
is equivalent to
relu = tf.keras.layers.ReLU()
A1 = relu(Z1)

As described, the first set of parentheses instantiates the object, and the second passes Z1 to it. Of course, you can write it in two lines, but we usually want to keep the code concise.


Sorry if this is “gilding the lily”, but I can’t help myself :nerd_face::

Note that all the TF/Keras "layer" constructors return a callable object as their return value. That's what Gent and Nobu mean when they say "instantiate the object". First you create the layer object and then you call (invoke) it with the input tensor. That's why you need two sets of parentheses there. So, yes, this is TF specific: it depends on the definition of the Layer class in TF/Keras.
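If you don't have TF handy, the pattern can be sketched in plain Python. Here is a minimal stand-in for a ReLU-like layer; the class name `MyReLU` is my own illustration, not a real Keras class:

```python
class MyReLU:
    """A toy stand-in for tf.keras.layers.ReLU (illustration only)."""

    def __call__(self, z):
        # Apply max(0, x) element-wise, like ReLU
        return [max(0.0, x) for x in z]


Z1 = [-2.0, 0.5, 3.0]
A1 = MyReLU()(Z1)   # first parens instantiate, second parens call the instance
print(A1)           # [0.0, 0.5, 3.0]
```

The first pair of parentheses builds the object; the second invokes its `__call__` method on `Z1`, which is exactly the shape of `tf.keras.layers.ReLU()(Z1)`.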

They don’t give us that much explanation of the Sequential and Functional APIs in the course material. There is plenty of documentation on the TF site, and here’s a really nice thread here on Discourse that gives a nice walkthrough.

The Functional API is, of course, TF specific, but under the hood it uses a basic Python capability: defining `__call__` makes a class instance callable. Here is an example.

I created a “test” class and made its instances callable by adding a `__call__()` method (1st cell).
The 2nd cell instantiates the object; the 3rd cell passes “10” to it. These two steps can be written in a single line, just like in the 4th cell.
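In case the screenshots of those cells don't render, here is a rough reconstruction (the class name and the doubling operation inside `__call__` are my own placeholders, not the original code):

```python
# 1st cell: a "test" class made callable by defining __call__
class Test:
    def __call__(self, x):
        return x * 2   # any simple operation will do for the demo

# 2nd cell: instantiate the object
t = Test()

# 3rd cell: pass 10 to the instance, which invokes __call__
print(t(10))        # 20

# 4th cell: the same two steps written in a single line
print(Test()(10))   # 20
```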

This is the same behavior as
A1 = tf.keras.layers.ReLU()(Z1)
