Alpaca_Model Missing Layer

Good day,


I am missing a layer in my model. I appended a Dense layer with 1 neuron to the model as the classification layer. I am a bit confused: since it is a binary classification problem, do we need a ReLU activation or a linear activation? Also, what is the difference between keras.layers.ReLU and Dense(1, activation='relu')(x)?

If it's binary classification, then you probably need a sigmoid-activated output (a probability between 0 and 1). A linear or ReLU activation gives an unbounded, continuous output…
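For illustration, here is a minimal sketch of a sigmoid output head in Keras. The input shape and pooling layer are just placeholder assumptions, not taken from the assignment:

```python
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(160, 160, 3))     # hypothetical input shape
x = layers.GlobalAveragePooling2D()(inputs)
# Sigmoid squashes the logit into (0, 1), read as P(class = 1)
outputs = layers.Dense(1, activation='sigmoid')(x)
model = tf.keras.Model(inputs, outputs)
```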

As for the number of layers, you should be able to run model.summary() and sort out that issue easily.
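For example, assuming your model object is named model, you can print the summary or walk the layers directly to spot what is missing:

```python
model.summary()   # layer names, output shapes, parameter counts

# Or list each layer and its type to compare against the expected architecture:
for layer in model.layers:
    print(layer.name, type(layer).__name__)
```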


Sigmoid of course!

Ah, but I am still missing the Sequential layer, and I am unsure why. Why do we even need it if we already have the base model and the newly defined input layer? Also, what is tf_op_layer? I don't recall adding those; are they the data augmentation layers?

I am not familiar with this assignment, but logically what I wrote initially should be helpful. Hopefully a mentor from DLS can have a look at this as well.

The test of your implementation is very “literal-minded” here: the grader just does a string comparison against the expected output of the model summary. So there can be equivalent ways to express things that end up looking different in the summary. Using the explicit ReLU layer as opposed to the generic Activation layer with “relu” as the argument is one such example (see the sketch below), although that's not your problem here.
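To make that concrete, here is a small sketch (not from the assignment) of mathematically equivalent ways to write a ReLU that show up differently in model.summary():

```python
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(32,))
x = layers.Dense(64)(inputs)

# Dedicated ReLU layer: appears as its own "re_lu (ReLU)" row in the summary
x = layers.ReLU()(x)

# Equivalent alternatives that produce different summary rows:
#   layers.Activation('relu')(x)                 -> "activation (Activation)"
#   layers.Dense(64, activation='relu')(inputs)  -> no separate activation row

model = tf.keras.Model(inputs, x)
model.summary()
```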

It looks like the way you have expressed what happens right after the input layer is different from what they are expecting. It's supposed to be the data augmentation function, which is a Keras Sequential object, right? That was passed to you as an argument, and you can just use it “as is”. Are you sure you didn't use data_augmenter instead? That's the factory function you invoke to get the Sequential you actually need, and they gave you that logic in the template code.
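In other words, the pattern looks something like the sketch below. The body of data_augmenter and the input shape are assumptions about the template, and depending on your TensorFlow version these augmentation layers may live under tf.keras.layers.experimental.preprocessing instead:

```python
import tensorflow as tf
from tensorflow.keras import layers

def data_augmenter():
    """Factory returning a Sequential of augmentation layers."""
    return tf.keras.Sequential([
        layers.RandomFlip('horizontal'),
        layers.RandomRotation(0.2),
    ])

data_augmentation = data_augmenter()   # invoke once to get the Sequential

inputs = tf.keras.Input(shape=(160, 160, 3))
x = data_augmentation(inputs)          # correct: apply the Sequential object
# x = data_augmenter(inputs)           # wrong: calls the factory on the tensor
```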

As to sigmoid, you don’t need to include that. Prof Ng always uses “from_logits = True” mode for the loss functions here, so we just output the “logits” (linear output) at the output layer and then let the cost function handle the sigmoid or softmax internally.
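Concretely, that pairing looks like this (a minimal sketch, not the assignment's exact code; the 1280-dimensional input is just an example, e.g. pooled MobileNetV2 features):

```python
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(1280,))   # e.g. pooled feature vector
logits = layers.Dense(1)(inputs)         # linear output: raw logits, no sigmoid
model = tf.keras.Model(inputs, logits)

# from_logits=True tells the loss to apply the sigmoid internally,
# which is more numerically stable than a separate sigmoid layer.
model.compile(optimizer='adam',
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=['accuracy'])
```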


Thank you so much. I managed to solve my problem.