Course: Deep Learning Specialization
Week: Course 5 (Sequence Models), Week 4
Last programming assignment: C5_W4_A1_Transformer_Subclass_v1.ipynb
All my code runs and every test passes in the notebook, but after submission the grader shows that the UNQ_C4 call function failed, and every cell block after it failed too. Hence I got 37 marks, and passing is 80.
There is an error in your EncoderLayer() code. The grader tests your code with a different set of conditions and data than are used in the notebook’s tests.
This will make all of the other parts of the Transformer model fail as well.
Are you saying that there is a function that expects some input parameters, but you want to call that function without providing them?
If that is the case, it will not work.
By the way, you need to be more specific, because when you say “calling the function”, it is not clear which function you are talking about. Your UNQ_C4 failed the autograder, but that exercise uses a few functions and every one of them requires some input arguments and we cannot just remove them.
Did you send your code to @Deepti_Prasad via PM as asked? I believe you will need to send the code for UNQ_C4.
OMG, it's you @rmwkwok Raymond! I couldn't recognise you with your new DP. Looking great!
As for the post query: I recently came across another learner whose encoder layer code was correct, but because get_angles and scaled_dot_product_attention contained an incorrect tf function call, the learner kept getting errors. Hence I wanted to see Preeti's code first before responding.
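For context, here is a minimal sketch of how scaled_dot_product_attention typically looks in this assignment, assuming the usual (q, k, v, mask) signature; the mask convention shown (1 = keep, 0 = mask out) varies between notebook versions, so treat this as illustrative, not the graded solution:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """Sketch of softmax(Q K^T / sqrt(d_k)) V with optional masking."""
    matmul_qk = tf.matmul(q, k, transpose_b=True)    # (..., seq_len_q, seq_len_k)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)        # key dimension d_k
    scaled_logits = matmul_qk / tf.math.sqrt(dk)     # scale before the softmax
    if mask is not None:
        # Assumed convention: mask == 1 keeps a position, 0 masks it out.
        scaled_logits += (1.0 - mask) * -1e9
    attention_weights = tf.nn.softmax(scaled_logits, axis=-1)  # over the keys
    return tf.matmul(attention_weights, v), attention_weights
```

A single wrong tf call in a helper like this (for example, dropping the scaling or the transpose) makes every downstream test that depends on it fail, which is why an encoder layer can look correct and still not pass the grader.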
Also, on a side note, I want to ask: if I am not able to pass, will I still get the Deep Learning Specialization completion certificate, since I passed everything else? Being a bureaucrat, I am facing time issues.
Thanks
You will only get the DLS certificate when you pass all the assignment tests; as the submission page shows, you need a minimum of 80% to pass the grader.
As for the time issue, Coursera's biggest plus point is that you can always take your time and push the deadlines, for example by working on weekends, as long as you stay within your subscription period.
Will respond in some time; I am working on another learner's issue.
I saw your Exercise 4 code, and the major difference is that your code statements differ from what I recall; it seems you are either using an old copy or have completely edited the code cells.
I am sharing a screenshot of my Exercise 4 code cell so you can compare the differences.
Your code cell includes multi_attn_output and mine doesn't.
Also, based on my code screenshot, for the self-attention step you do not need to pass the training parameter, as dropout is added by Keras automatically. The notebook's own hints say:

# calculate self-attention using mha (~1 line)
# Dropout is added by Keras automatically if the dropout parameter is non-zero during training
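To make the structure concrete, here is a minimal sketch of an EncoderLayer along the lines this assignment expects, assuming the attribute names used in the notebook (mha, ffn, layernorm1, layernorm2, dropout_ffn); it is illustrative, not the graded solution:

```python
import tensorflow as tf

class EncoderLayer(tf.keras.layers.Layer):
    """Illustrative sketch only -- not the graded solution."""
    def __init__(self, embedding_dim, num_heads, fully_connected_dim, dropout_rate=0.1):
        super().__init__()
        self.mha = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=embedding_dim, dropout=dropout_rate)
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(fully_connected_dim, activation='relu'),
            tf.keras.layers.Dense(embedding_dim)])
        self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)

    def call(self, x, training, mask):
        # Self-attention: Q, K and V are all x. No explicit training flag is
        # needed here, because MultiHeadAttention applies its own dropout
        # internally when its dropout rate is non-zero.
        self_mha_output = self.mha(x, x, x, mask)
        # Skip (residual) connection + layer normalization around attention.
        skip_x_attention = self.layernorm1(x + self_mha_output)
        # Feed-forward network, then explicit dropout; this standalone
        # Dropout layer does need the training flag.
        ffn_output = self.ffn(skip_x_attention)
        ffn_output = self.dropout_ffn(ffn_output, training=training)
        # Second skip connection + layer normalization.
        return self.layernorm2(skip_x_attention + ffn_output)
```

Note how the mha call takes no training argument while dropout_ffn does; that is exactly the point above about Keras handling the attention dropout for you.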
I sincerely suggest getting a fresh copy and redoing the assignment, in case edits were made to your old code that are beyond detection.
Please don't share code on public threads, as it is against community guidelines. Read my previous comment, which shows that your code doesn't match what I recall the code should be.
My code shows:
- self_mha_output
- skip_x_attention
- ffn_output for the multi-head attention
- ffn_output for the dropout application
- encoder_layer_out

Your code shows:
- self_attn_output
- self_attn_output with dropout application
- mult_attn_output
- ffn_output
- ffn_output with dropout application
- encoder_layer_out
Posting code, links to your assignments, or assignment notebooks on a public thread is against community guidelines. By doing so you are not following the Code of Conduct, and it can affect your continued participation in the community. Kindly remove the assignment notebook from here.
You can send your assignment notebook via personal DM instead.