DLS C2W1 Initialization Grad Check

Course 2 Week 1 - Initialization exercise. All tests pass, but the grader score comes back as 67%.
The grader output is less than helpful:
[ValidateApp | INFO] Validating '/home/jovyan/work/submitted/courseraLearner/W1A1/Initialization.ipynb'
[ValidateApp | INFO] Executing notebook with kernel: python3
Tests failed on 1 cell(s)! These tests could be hidden. Please check your submission.

Hi, @vanbrackin.

Did you fix the error?

In case it helps others, be careful not to confuse Xavier initialization with He initialization.

Ramon,

Thank you for the guidance.

I understand the difference between Xavier & He initialization. Xavier uses sqrt(1./layers_dims[l-1]), while He uses sqrt(2./layers_dims[l-1]).

The problem here is that when I use [l-1], I get a division by zero error (when l = 1).

When I use np.random.randn(layers_dims[l], layers_dims[l-1]) * np.sqrt(2/l), the output fails the test.

Error: Wrong output for variable W1.

[[-0.03098412 -0.33744411 -0.92904268 0.62552248]] [[-0.04381817 -0.47721803 -1.31386475 0.88462238]]

Error: Wrong output for variable W2.

2 Tests passed

1 Tests failed

It's only when I use *np.sqrt(1/l) that I get "all tests pass".


The error has already been fixed by @vanbrackin.

It is indeed possible to pass the public tests with an incorrect implementation by chance. I’ll open an issue about this.

Good luck with the rest of the course! :slight_smile:

Hi @vanbrackin,
You may try this:
np.sqrt(2./l) → np.sqrt(2./layers_dims[l-1])
The denominator should be the number of units in layer l-1, not the index of the current layer l.
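To make the fix concrete, here is a minimal sketch of He initialization over all layers. The function name, the seed, and the example layer sizes are my own assumptions for illustration, not the exact course notebook code:

```python
import numpy as np

def initialize_parameters_he(layers_dims):
    # layers_dims is a list of layer sizes, e.g. [4, 10, 1]
    # (input size followed by each layer's unit count).
    np.random.seed(3)  # assumed seed for reproducibility; the notebook may differ
    parameters = {}
    L = len(layers_dims)  # counts the input layer too
    for l in range(1, L):
        # He initialization: scale by sqrt(2 / n_prev), where n_prev is
        # layers_dims[l-1], the number of units in the PREVIOUS layer --
        # not the layer index l.
        parameters["W" + str(l)] = (
            np.random.randn(layers_dims[l], layers_dims[l - 1])
            * np.sqrt(2.0 / layers_dims[l - 1])
        )
        parameters["b" + str(l)] = np.zeros((layers_dims[l], 1))
    return parameters
```

Note that because the loop starts at l = 1, the index layers_dims[l-1] is always valid (it reads the input size at l = 1), so no division-by-zero or indexing error occurs.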