C4_W4_A1 assignment: UNQ_C4 produces the wrong model

Hello,

The error for UNQ_C4 is as follows:

Wrong model.
Expected: Serial[
  Serial[
    Serial[
      ShiftRight(1)
    ]
    Embedding_33000_512
    Dropout
    Serial[
      PositionalEncoding
    ]
    Dup_out2
    ReversibleSerial_in2_out2[
      ReversibleHalfResidualDecoderAttn_in2_out2[
        Serial[
          LayerNorm
        ]
        SelfAttention
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderFF_in2_out2[
        Serial[
          LayerNorm
          Dense_2048
          Dropout
          Serial[
            FastGelu
          ]
          Dense_512
          Dropout
        ]
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderAttn_in2_out2[
        Serial[
          LayerNorm
        ]
        SelfAttention
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderFF_in2_out2[
        Serial[
          LayerNorm
          Dense_2048
          Dropout
          Serial[
            FastGelu
          ]
          Dense_512
          Dropout
        ]
      ]
      ReversibleSwap_in2_out2
    ]
    Concatenate_in2
    LayerNorm
    Dropout
    Serial[
      Dense_33000
    ]
  ]
  LogSoftmax
].
Got: Serial[
  Serial[
    ShiftRight(1)
  ]
  Embedding_33000_512
  Dropout
  Serial[
    PositionalEncoding
  ]
  Dup_out2
  ReversibleSerial_in2_out2[
    ReversibleHalfResidualDecoderAttn_in2_out2[
      Serial[
        LayerNorm
      ]
      SelfAttention
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderFF_in2_out2[
      Serial[
        LayerNorm
        Dense_2048
        Dropout
        Serial[
          FastGelu
        ]
        Dense_512
        Dropout
      ]
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderAttn_in2_out2[
      Serial[
        LayerNorm
      ]
      SelfAttention
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderFF_in2_out2[
      Serial[
        LayerNorm
        Dense_2048
        Dropout
        Serial[
          FastGelu
        ]
        Dense_512
        Dropout
      ]
    ]
    ReversibleSwap_in2_out2
  ]
  Concatenate_in2
  LayerNorm
  Dropout
  Serial[
    Dense_33000
  ]
]
Wrong model.
Expected: Serial[
  Serial[
    Serial[
      ShiftRight(1)
    ]
    Embedding_100_512
    Dropout
    Serial[
      PositionalEncoding
    ]
    Dup_out2
    ReversibleSerial_in2_out2[
      ReversibleHalfResidualDecoderAttn_in2_out2[
        Serial[
          LayerNorm
        ]
        SelfAttention
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderFF_in2_out2[
        Serial[
          LayerNorm
          Dense_2048
          Dropout
          Serial[
            FastGelu
          ]
          Dense_512
          Dropout
        ]
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderAttn_in2_out2[
        Serial[
          LayerNorm
        ]
        SelfAttention
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderFF_in2_out2[
        Serial[
          LayerNorm
          Dense_2048
          Dropout
          Serial[
            FastGelu
          ]
          Dense_512
          Dropout
        ]
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderAttn_in2_out2[
        Serial[
          LayerNorm
        ]
        SelfAttention
      ]
      ReversibleSwap_in2_out2
      ReversibleHalfResidualDecoderFF_in2_out2[
        Serial[
          LayerNorm
          Dense_2048
          Dropout
          Serial[
            FastGelu
          ]
          Dense_512
          Dropout
        ]
      ]
      ReversibleSwap_in2_out2
    ]
    Concatenate_in2
    LayerNorm
    Dropout
    Serial[
      Dense_100
    ]
  ]
  LogSoftmax
].
Got: Serial[
  Serial[
    ShiftRight(1)
  ]
  Embedding_100_512
  Dropout
  Serial[
    PositionalEncoding
  ]
  Dup_out2
  ReversibleSerial_in2_out2[
    ReversibleHalfResidualDecoderAttn_in2_out2[
      Serial[
        LayerNorm
      ]
      SelfAttention
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderFF_in2_out2[
      Serial[
        LayerNorm
        Dense_2048
        Dropout
        Serial[
          FastGelu
        ]
        Dense_512
        Dropout
      ]
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderAttn_in2_out2[
      Serial[
        LayerNorm
      ]
      SelfAttention
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderFF_in2_out2[
      Serial[
        LayerNorm
        Dense_2048
        Dropout
        Serial[
          FastGelu
        ]
        Dense_512
        Dropout
      ]
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderAttn_in2_out2[
      Serial[
        LayerNorm
      ]
      SelfAttention
    ]
    ReversibleSwap_in2_out2
    ReversibleHalfResidualDecoderFF_in2_out2[
      Serial[
        LayerNorm
        Dense_2048
        Dropout
        Serial[
          FastGelu
        ]
        Dense_512
        Dropout
      ]
    ]
    ReversibleSwap_in2_out2
  ]
  Concatenate_in2
  LayerNorm
  Dropout
  Serial[
    Dense_100
  ]
]
0 Tests passed
2 Tests failed

However, the generated model does not match the expected one. I checked the Trax documentation but could not find a solution. It seems to me I only need to assign the four inputs, but that didn't work. Could you please help me out? Thanks

Hello @Haijie_Zhang
Posting code in this forum is strictly against community guidelines. Kindly remove your code. Instead, you can post the complete error trace here so that the mentors can help you.

Hello,
I edited the post. It seems to me the issue is the structure of the model, but from the Trax documentation I cannot figure out what else I can do.

Hi @Haijie_Zhang

My guess is that you accidentally removed the , tl.LogSoftmax() line that follows the ReformerLM(...) call inside tl.Serial(). Note that the Got model in your trace matches the Expected model except for the missing trailing LogSoftmax layer.
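With printouts this long, a quick way to spot the mismatch is to diff the two traces line by line. A minimal sketch using Python's standard difflib (the abbreviated tail lines here are copied from the grader output above):

```python
import difflib

# The last few lines of the "Expected" and "Got" model printouts from
# the grader output, abbreviated to the tail where they diverge.
expected_tail = ["Dense_33000", "]", "]", "LogSoftmax", "]"]
got_tail = ["Dense_33000", "]", "]"]

# unified_diff prefixes lines present only in expected_tail with '-'.
diff = list(difflib.unified_diff(expected_tail, got_tail, lineterm=""))
print("\n".join(diff))  # the missing layer appears as "-LogSoftmax"
```

In practice you would paste each full trace into a list of lines (or a file) and diff those; the only '-' lines here are the missing LogSoftmax and its enclosing bracket.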

The assignment scaffold for UNQ_C4:

# UNQ_C4
# GRADED FUNCTION
def ReformerLM(vocab_size=33000, n_layers=2, mode='train', attention_type=tl.SelfAttention):
    
    ### START CODE HERE ###
    # initialize an instance of Trax's ReformerLM class
    model = tl.Serial( 
                trax.models.reformer.ReformerLM( 
                # set vocab size
                None,
                # set number of layers
                None,
                # set mode
                None,
                # set attention type
                None
            )
            , tl.LogSoftmax() 
        )        
    ### END CODE HERE ###
    return model # tl.Serial(model, tl.LogSoftmax(),)


In trax.models.reformer.ReformerLM(), when you replace each None with your code, make sure to write the parameter name and then assign its value.
For example, instead of
vocab_size,
write
vocab_size=vocab_size,
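To illustrate why this matters, here is a hypothetical stand-in for the constructor (a toy function, not the real Trax class): with four parameters, positional values are easy to misplace, while keyword arguments are explicit and order-independent.

```python
# Hypothetical stub (NOT trax.models.reformer.ReformerLM): it only
# records its configuration, to show positional vs keyword arguments.
def reformer_lm_stub(vocab_size=33000, n_layers=2, mode='train',
                     attention_type='SelfAttention'):
    return {'vocab_size': vocab_size, 'n_layers': n_layers,
            'mode': mode, 'attention_type': attention_type}

# Keyword form, as the hint suggests: each value lands where intended.
good = reformer_lm_stub(vocab_size=100, n_layers=3, mode='train',
                        attention_type='SelfAttention')

# Positional values in the wrong order silently misconfigure the model:
bad = reformer_lm_stub(100, 'train', 3)  # n_layers becomes 'train'!
```

The same pattern applies inside the scaffold: replace each None with parameter_name=value rather than a bare value.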

3 Likes

@Haijie_Zhang I had the same issue and fixed it after following @Aayush_Jariwala's suggestion.

2 Likes