DLS Course 5 Week 1: RNN forward

Of course there’s a “meta” point here as well: when something doesn’t work, it’s worth investing the additional effort to figure out why it didn’t work. There’s a lesson to be learned for future such endeavors.

Paul,

I agree there is a lesson here, and it’s one I try my best to figure out each and every time. I don’t just come on here to “get the assignment to work so I can move on.” I’m not being asked to do this for school or work; I’m doing these courses for my own knowledge.

I think there is a bit of survivorship bias in the questions and problems you see from me. 90% of the problems I have, I figure out on my own: working through them, rewatching the lectures, or simple trial and error followed by figuring out why the working approach works.

Another 5% of my problems I solve by going through this site and reading everything I can, searching for my problem in multiple ways. That often works, and often it’s things YOU have said to other people that help me figure it out, and I’m appreciative of you and others for that.

However, there is the remaining 5% of problems, the ones that survive all of that and end up as posts. And I’m guessing many of them come down to my not understanding how the question in the assignment is posed, or my brain thinking one way because of some previous part of the problem. I’m not coming on here and posting “here’s my code, fix it,” or “tell me a way to make the code work so I can get through the assignment.”

I’ve been on here asking questions about problems and quizzes AFTER I’ve already passed them. This is for understanding, not to just pass. I’ve done additional “ungraded” parts of assignments to deepen my understanding.

I don’t have a computer science background; I work full time as a biomedical scientist. I don’t quite have all of the intuitions yet that a seasoned programmer might have. All the Python and C++, data structures, and algorithms I know come from self-learning online. So I may have what would appear to be silly gaps, or common-sense gaps.

So if I’m having a problem with initializing an array of a certain size to all zeros, trust that I’ve read the documentation, I’ve read ten other people working through similar problems, and I’ve worked through it with you several times. Maybe the intuition just isn’t quite there yet.

I think these types of problems may well be what leads many people to give up on learning programming, or to drop these courses. I, on the other hand, am very stubborn. But feeling like they can’t figure out something that seems simple to everyone else, like slicing into an array, might make them feel inadequate, like impostors.

Just my point on “well, have you considered figuring out why it works rather than just getting the right answer!?!?”

Thank you for the detailed explanation of your process. I hope this means that you’ve found the solution in this particular case, based on my last reply about printing the shape.

Paul,
When I used the print statement from your last reply, it showed up twice in my output: at first the a0 shape was (5, 10), then at the end of the output it showed up again with the a0 shape as (7, 8).

Worth noting that when I changed my ‘a’ and ‘y_pred’ initialization to include dtype=float, the whole thing ran perfectly and passed all tests.
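
In case it helps anyone else who hits the same wall, here’s roughly what that working initialization looked like. This is just a minimal sketch: the names n_a, n_y, m, and T_x follow the notebook’s conventions, and the sizes below are example values, not the test case’s.

```python
import numpy as np

# Example sizes only -- nothing should be hardcoded in the real function.
n_a, n_y, m, T_x = 5, 2, 10, 4

# Shape passed as one tuple, with the dtype given explicitly as described above.
a = np.zeros((n_a, m, T_x), dtype=float)
y_pred = np.zeros((n_y, m, T_x), dtype=float)
print(a.shape, y_pred.shape)  # (5, 10, 4) (2, 10, 4)
```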

But to your earlier comment, it is important to understand why. I think it has to do with my earlier function rnn_cell_forward perhaps requiring floats rather than tuples or ints. So that might have been the problem: a simple data type error.
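
One quick way to check that theory (this is just plain NumPy behavior, not assignment code):

```python
import numpy as np

# np.zeros produces float64 by default, so the explicit dtype=float
# is redundant rather than a behavior change:
print(np.zeros((2, 3)).dtype)               # float64
print(np.zeros((2, 3), dtype=float).dtype)  # float64
```

Both come out float64 either way, so I’m honestly not 100% sure the dtype alone was the real fix.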

Secondly, I realize that the a0 shape should be (5, 10) if n_a and m are 5 and 10, respectively. However, I do not see why they would become 7 and 8 later on, since I am not adding any hidden units or removing any training samples from the mini-batch.

So I’ve gotten everything to pass, but these are two things I’m still curious about, if you have any advice.

Thanks

There are two separate test cases with differing dimensions, to make sure that you write your code in a general way, meaning without “hardcoding” any of the dimensions.
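
As a sketch of what “general” means here (this assumes the notebook’s input conventions, x with shape (n_x, m, T_x) and a0 with shape (n_a, m); the helper name and values below are hypothetical, not the assignment’s solution code):

```python
import numpy as np

def init_outputs(x, a0, n_y):
    """Initialize forward-pass output arrays, deriving every dimension
    from the inputs so both test cases pass without edits."""
    n_x, m, T_x = x.shape   # input size, batch size, number of time steps
    n_a = a0.shape[0]       # hidden size comes from a0, not from a constant
    a = np.zeros((n_a, m, T_x))
    y_pred = np.zeros((n_y, m, T_x))
    return a, y_pred

# The same code handles both test shapes without edits:
a, y = init_outputs(np.zeros((3, 10, 4)), np.zeros((5, 10)), n_y=2)
print(a.shape, y.shape)  # (5, 10, 4) (2, 10, 4)
a, y = init_outputs(np.zeros((3, 8, 6)), np.zeros((7, 8)), n_y=2)
print(a.shape, y.shape)  # (7, 8, 6) (2, 8, 6)
```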


That explains it! Thank you!