Notes on some problems with the exercise page "Python Basics with Numpy"

I don’t know whether anyone is interested in notes like these; I have noticed that Coursera pages seem to decay and become obsolete without ever getting fixed (looking at you, IBM). Maybe Coursera charges for changing things?

The following may be noted in the page “Python Basics with Numpy”:


We read:

x = [1, 2, 3] # x becomes a python list object

basic_sigmoid(x) # you will see this give an error when you run it, because x is a vector.

Let’s be clear and keep using “list” instead of “vector”, or else note that a Python list is informally also called a vector or array (though, if I understand correctly, it is in fact neither: a list is just an untyped sequence, which is not great).
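To back up that distinction, a minimal sketch (variable names are mine): a plain Python list happily mixes types, while np.array produces a homogeneous, typed ndarray.

```python
import numpy as np

# A Python list is an untyped sequence: it can mix types freely.
x = [1, "two", 3.0]
print(type(x))  # <class 'list'>

# np.array builds an ndarray with a single dtype for all elements.
a = np.array([1, 2, 3])
print(type(a), a.dtype)  # <class 'numpy.ndarray'> and an integer dtype (platform-dependent)
```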


We read:

In fact, if 𝑥=(𝑥1,𝑥2,…,𝑥𝑛) is a row vector then np.exp(x) will apply the exponential function to every element of x. The output will thus be: np.exp(x) = (e^{x_1}, e^{x_2}, ..., e^{x_n})

Should be

In fact, if 𝑥=(𝑥1,𝑥2,…,𝑥𝑛) is a list then np.exp(x) will apply the exponential function to every element of x. The output will thus be: np.exp(x) = (np.exp(x[0]), np.exp(x[1]), …, np.exp(x[n-1])). Note the 0-based indexing in code.

The exercise coming directly after could then be extended:

import math
import numpy as np

# example of np.exp
t_x = np.array([1, 2, 3])
print(np.exp(t_x))  # result is (exp(1), exp(2), exp(3))

# t_x is indexed on 0, 1, 2:
print(t_x.shape)
print([np.exp(t_x[0]), np.exp(t_x[1]), np.exp(t_x[2])])
print([math.exp(t_x[0]), math.exp(t_x[1]), math.exp(t_x[2])])

# the same with an explicit 1x3 matrix (note the double brackets)
t_m = np.array([[1, 2, 3]])
print(t_m.shape)
print(np.exp(t_m))
print(np.exp(t_m[0, 0]), np.exp(t_m[0, 1]), np.exp(t_m[0, 2]))

Directly above “Exercise 3 - sigmoid” there is a URL to the official Numpy documentation. That URL is, however, out of date.

It is for version 1.10.1, while we are using Numpy 1.18.4, as the statement “print(np.__version__)” shows.

But note that the notebook’s “Help” menu also links to Numpy documentation, and that one goes too far the other way: Release 2.2.

Maybe the notebook text should just say “look under Help in the menu”.
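For reference, a quick way to check the installed version (the 1.18.4 figure above is what I see on the course notebook; yours may differ):

```python
import numpy as np

# The dunder attribute gives the version string directly.
print(np.__version__)

# Note: np.version is a submodule; printing it shows module info,
# not the version. The string itself lives in np.version.version.
print(np.version.version)
```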


Exercise 4 - sigmoid_derivative - is confused about naming.

Is it “sigmoid_derivative” or “sigmoid_grad” ?

Should one note that one does not really compute the “gradient” (which is a function and which depends on the parameter x by definition) but the “value of the gradient” at x?
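Whatever the function ends up being called, it returns the value of the derivative at x. A minimal sketch using the exercise’s formula σ'(x) = σ(x)(1 − σ(x)) (function names are my choice, matching one of the two used on the page):

```python
import numpy as np

def sigmoid(x):
    """Sigmoid applied elementwise to an ndarray."""
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    """Value of the derivative sigma'(x) = sigma(x) * (1 - sigma(x)) at x."""
    s = sigmoid(x)
    return s * (1 - s)

# sigma(0) = 0.5, so sigma'(0) = 0.5 * 0.5 = 0.25
print(sigmoid_derivative(np.array([0.0, 1.0, 2.0])))
```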


The speed comparison in “Vectorization” is not fully convincing; perhaps the differences get swamped by the ancillary work the Jupyter page must do. On my machine:

dot

Loopy : 0.10762800000008177ms
Vectorized : 0.11183700000017005ms

outer

Loopy : 0.20198999999987421ms
Vectorized : 0.10581700000011907ms

elementwise multiplication

Loopy : 0.1160769999999367ms
Vectorized : 0.10042299999990512ms

gdot

Loopy : 0.22494499999980988ms
Vectorized : 0.10102900000008574ms
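The gap becomes unmistakable once the arrays are large enough that the loop body dominates the per-call overhead. A sketch along the lines of the notebook’s own timing code (the array size is my choice, not the course’s):

```python
import time
import numpy as np

n = 1_000_000  # large enough that the loop cost dominates overhead
x1 = np.random.rand(n)
x2 = np.random.rand(n)

# Loop version of the dot product.
tic = time.process_time()
dot = 0.0
for i in range(n):
    dot += x1[i] * x2[i]
toc = time.process_time()
print(f"Loopy dot:      {1000 * (toc - tic):.3f} ms")

# Vectorized version.
tic = time.process_time()
dot_vec = np.dot(x1, x2)
toc = time.process_time()
print(f"Vectorized dot: {1000 * (toc - tic):.3f} ms")
```

At this size the vectorized call is typically orders of magnitude faster, while both give the same result up to floating-point accumulation error.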

Thanks for your suggestions.
