DLS. W2_Assignment_1. Python Basics with Numpy. Ex.7 - Softmax mistake

Hello!

I think my softmax() function doesn’t work correctly.

The error image:

Thanks

You are correct, it doesn’t work correctly.

The “maximum recursion depth” error means that you have written a function that calls itself. That’s generally bad, unless the instructions told you to do that, which in this assignment they did not.

Could it be caused by the fact that I took the wrong argument when defining the softmax() function? I took ‘x’, but I think it should be implemented as a matrix with shape m*n. What is my mistake?

{mentor edit: reply removed - it was the answer to a question that wasn’t asked}

No, that’s not the problem. Tom already explained it: you are calling softmax within the body of the softmax function, and that is the mistake. There are cases in which having a function call itself is useful, and the technical term for that is “recursion”, but it is not something I’ve ever seen used in a Machine Learning context. As Tom mentioned, you never do that unless the instructions very specifically tell you to, which they did not do here.

You need to use lower-level functions and numpy code to implement softmax within the softmax function.
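To illustrate the kind of mistake being described (a hypothetical sketch, not the poster’s actual code), a softmax that calls itself has no base case and eventually hits Python’s recursion limit:

```python
import numpy as np

def softmax(x):
    # Hypothetical sketch of the error pattern: the numerator and denominator
    # are computed, but then softmax() is called again inside softmax(), so the
    # function never returns and Python raises
    # "RecursionError: maximum recursion depth exceeded".
    x_exp = np.exp(x)
    x_sum = np.sum(x_exp, axis=1, keepdims=True)
    return softmax(x_exp / x_sum)   # <-- the accidental self-call
```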


Sorry, my response was a bit off, because I focused on sigmoid(), not softmax().

From your error message, it looks like your code is doing fine where it calculates x_exp and x_sum. Those are the numerator and denominator terms in this calculation:
$\mathrm{softmax}(x)_i = \dfrac{e^{x_i}}{\sum_j e^{x_j}}$

Then we get to line 29 - all that’s left to do is the element-wise division. You don’t need softmax() there again. That’s what’s causing the recursion. Just a division operator is sufficient.
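For reference, here is a minimal sketch of that pattern, assuming x is an (m, n) matrix and softmax is applied row-wise as in the exercise; the names x_exp and x_sum mirror the ones mentioned above, and the test values are only an illustration:

```python
import numpy as np

def softmax(x):
    """Row-wise softmax for an (m, n) matrix x (minimal sketch)."""
    x_exp = np.exp(x)                             # numerator: element-wise exponentials
    x_sum = np.sum(x_exp, axis=1, keepdims=True)  # denominator: one sum per row, shape (m, 1)
    s = x_exp / x_sum                             # element-wise division via broadcasting
    return s

# Quick check: each row of the result should sum to 1.
x = np.array([[9, 2, 5, 0, 0],
              [7, 5, 0, 0, 0]])
print(softmax(x).sum(axis=1))   # -> [1. 1.]
```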


Thank you @TMosh, @paulinpaloalto. I’ve got it.
