C2_W3_Assignment PSA on regularizers parameter

Exercise 5 of Part 6 of the assignment includes the following instructions:

  • Dense layer with 120 units, relu activation, kernel_regularizer=tf.keras.regularizers.l2(0.1)
  • Dense layer with 40 units, relu activation, kernel_regularizer=tf.keras.regularizers.l2(0.1)

I foolishly misread the regularizer name “l2” (lowercase L, then two) as “12” (twelve) and got an “invalid syntax” error.
So if you happened to type “tf.keras.regularizers.12(0.1)”, simply correct it to l2.
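To see why this fails before TensorFlow is even involved, here is a minimal plain-Python sketch (no TensorFlow needed): Python’s parser rejects an attribute name that starts with a digit, which is exactly the “invalid syntax” error above.

```python
import ast

# The typo: "12" (twelve) where "l2" (ell-two) was intended.
# The parser rejects this before any Keras code runs, because an
# attribute name cannot begin with a digit.
typo = "tf.keras.regularizers.12(0.1)"
try:
    ast.parse(typo)
except SyntaxError as err:
    print("SyntaxError:", err.msg)

# The corrected expression parses cleanly (actually calling it
# would of course still require TensorFlow to be installed).
ast.parse("tf.keras.regularizers.l2(0.1)")
```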


Thanks

Twelve and “ell 2” look a lot alike.


I am highly curious why the Keras authors didn’t use “L2” to remove ambiguity.


Both l2 and L2 are available and point to the same thing :wink:

[Screenshot from 2022-09-07 12-50-49]


Thanks.
I don’t really see why “l2” was considered a good idea.
