Week 2, Assignment 1, Exercise 3

Is there a missing step in the notebook?



According to the figure, Stage 1 includes a ReLU activation, but the notebook's instructions don't mention it.
The provided code, however, does include the ReLU activation block:

# UNQ_C3
# GRADED FUNCTION: ResNet50

def ResNet50(input_shape = (64, 64, 3), classes = 6):
...
    
    # Define the input as a tensor with shape input_shape
    X_input = Input(input_shape)

    
    # Zero-Padding
    X = ZeroPadding2D((3, 3))(X_input)
    
    # Stage 1
    X = Conv2D(64, (7, 7), strides = (2, 2), kernel_initializer = glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis = 3)(X)
    X = Activation('relu')(X)  # The notebook's instructions don't mention this ReLU
    X = MaxPooling2D((3, 3), strides=(2, 2))(X)

...

Actually, it’s not much of an error: Stage 1 is already coded for you, and it does have a ReLU layer. It’s just not mentioned in the text.
I’ll add a support ticket to have “ReLU” added to the details text for Stage 1.
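If you want to double-check, here is a minimal sketch (separate from the graded function, assuming the notebook's usual tensorflow.keras imports) that builds Stage 1 on its own and lists its layers, so the ReLU Activation shows up in the stack:

import tensorflow as tf
from tensorflow.keras.layers import (Input, ZeroPadding2D, Conv2D,
                                     BatchNormalization, Activation, MaxPooling2D)
from tensorflow.keras.initializers import glorot_uniform
from tensorflow.keras.models import Model

# Same Stage 1 pattern as the notebook: Conv -> BatchNorm -> ReLU -> MaxPool
X_input = Input((64, 64, 3))
X = ZeroPadding2D((3, 3))(X_input)
X = Conv2D(64, (7, 7), strides=(2, 2), kernel_initializer=glorot_uniform(seed=0))(X)
X = BatchNormalization(axis=3)(X)
X = Activation('relu')(X)  # the ReLU the figure shows
X = MaxPooling2D((3, 3), strides=(2, 2))(X)

stage1 = Model(inputs=X_input, outputs=X)
for layer in stage1.layers:
    print(layer.__class__.__name__)
# One layer class per line: InputLayer, ZeroPadding2D, Conv2D,
# BatchNormalization, Activation, MaxPooling2D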

Thanks for the clarification, although I have already completed the assignment.