Question on activation of Top Layer for Alpaca Classifier

Why does the assignment want a linear activation for the top layer instead of sigmoid? Wouldn't it make more sense to use a sigmoid activation for a binary classifier?

The sigmoid isn't missing, it's just moved into the loss. The top Dense layer uses a linear activation so it outputs raw logits, and the model is compiled with binary cross-entropy using from_logits=True, which applies the sigmoid inside the loss computation. Doing it that way is more numerically stable than putting a sigmoid on the output layer.
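A minimal sketch of this pattern (not the assignment's exact code; the input shape and layer choices here are illustrative assumptions):

```python
import tensorflow as tf

# Hypothetical top of the network: a 1-unit Dense layer with no
# activation, so the model outputs raw logits rather than probabilities.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(160, 160, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1)  # linear activation -> logits
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
    # from_logits=True tells the loss to apply the sigmoid internally,
    # which is more numerically stable than a sigmoid output layer
    # followed by a loss expecting probabilities.
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=['accuracy'],
)

# At prediction time, convert logits to probabilities explicitly:
# probs = tf.math.sigmoid(model.predict(images))
```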