# Misspellings in the C1W4 assignment notebook

I’m just looking at the M4ML C1W4 assignment for the first time. There are a number of simple misspellings. I’m not a mentor for this course, so I don’t have access to the GitHub repo, but I’ll list them here and hope someone can forward this to the appropriate course folks.

Section 1:

`As you might recall, a **discrete dynamical system** descibes a system where,`

“descibes” should be “describes”.

`Each discrete dynamical system can me represented`

“can me” should be “can be”

Exercise 2:

`is also an eigenvector to the eigenvaue 1, so yo can simply scale`

“eigenvaue 1” should be “eigenvalue 1”

“so yo” should be “so you”

`Long-run probabiltites of being at each webpage:`

“probabiltites” should be “probabilities”
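As an aside, the eigenvalue-1 trick the notebook describes there can be sketched in a few lines of NumPy. The transition matrix below is a made-up 3-page example, not the assignment’s actual P: find the eigenvector for eigenvalue 1, then scale it so its entries sum to 1 to get the long-run probabilities.

```python
import numpy as np

# Hypothetical 3-page transition matrix (each column sums to 1);
# this is NOT the P matrix from the assignment.
P = np.array([[0.1, 0.4, 0.5],
              [0.3, 0.2, 0.3],
              [0.6, 0.4, 0.2]])

eigenvalues, eigenvectors = np.linalg.eig(P)

# Pick the eigenvector whose eigenvalue is (numerically) closest to 1.
idx = np.argmin(np.abs(eigenvalues - 1.0))
v = np.real(eigenvectors[:, idx])

# Scale so the entries sum to 1, turning it into a probability vector.
steady_state = v / v.sum()
print(steady_state)          # long-run probability of being at each page
print(P @ steady_state)      # applying P leaves it unchanged
```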

`Here is a fun fact, this type of a model`

“fun fact,” should be “fun fact:”. I guess that’s not technically a misspelling, but a grammar error.

Section 2:

`To perfrom the dimensionality reduction`

“perfrom” should be “perform”

Section 2.1:

`transforming them to black and withe`

“withe” should be “white”

Section 2.3:

`Now you are all set compute the eigenvalues`

“all set compute” should be “all set to compute”, I think, anyway.

`Each of the eigenvectors you found will represent one principal components.`

Shouldn’t it be “one principal component.”? Either that or “one of the principal components.”

Maybe this sounds a bit petty, but I thought the very first graded section was kind of insulting. How exactly is it educational for me to hand-copy all the elements of the P matrix? I lose two or three minutes of my life that I will never get back, and for what educational value? The point is just that I know how to set the X0 vector and write the dot product, right? What else is there to learn here?

In the `center_data` function, they have us compute the mean vector and then use repeat and reshape operations to get it to equal the shape of the original matrix. Of course you don’t really need to do that: broadcasting works just fine. But maybe as a philosophical matter, mathematicians have a problem with that and prefer to manually do the repeat operation to implement the equivalent of broadcasting. I haven’t actually listened to the lectures yet, so I don’t know whether they discuss this point.
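To illustrate the point: the repeat-and-reshape pattern and plain broadcasting produce the same centered matrix. This is a minimal sketch with made-up data, not the assignment’s actual `center_data` code:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))  # 5 samples, 3 features

# Assignment-style approach: tile the mean vector to match X's shape.
mu = X.mean(axis=0)                                     # shape (3,)
mu_tiled = np.repeat(mu.reshape(1, -1), X.shape[0], axis=0)  # shape (5, 3)
centered_repeat = X - mu_tiled

# Broadcasting: the (3,) mean is stretched across rows automatically.
centered_broadcast = X - X.mean(axis=0)

print(np.allclose(centered_repeat, centered_broadcast))  # True
```

Either way, the column means of the result are (numerically) zero.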


But wait, there’s more!

Section 2.5:

`### 2.5 Analizing the dimensionality reduction in 2 dimensions`

“analizing” should be “analyzing”

Section 2.6:

`A natural question arrises`

“arrises” should be “arises”

Section 2.7:

`As you can see, the explaied variance falls pretty fast,`

“explaied” should be “explained”

`Most of this reconstructions look pretty good`

“this” should be “these”

`you can plot the cummulative explained variance`

“cummulative” should be “cumulative”

`look after the recostruction`

should be “reconstruction”

`you can play around with different amount of explaied variance`

“explaied” again

`You can elso explore how the reconstruction for different images looks like.`

“elso” should be “also”. I think “looks like” should just be “looks” there.

`As you can see, PCA is a really usefull tool for dimensionality reduction. In this assignment you saw how it works on images, but you can aplly the same principle to any tabular dataset.`

“usefull” should be “useful”.

“aplly” should be “apply”.
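For what it’s worth, the cumulative-explained-variance plot the notebook mentions boils down to a cumulative sum of the sorted covariance eigenvalues. This is a sketch on synthetic data, not the assignment’s image dataset:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data with correlated features (100 samples, 10 features).
X = rng.normal(size=(100, 10)) @ rng.normal(size=(10, 10))
Xc = X - X.mean(axis=0)

# Eigen-decomposition of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals = eigvals[::-1]  # eigh returns ascending order; flip to descending

# Cumulative explained variance ratio for k = 1, 2, ..., 10 components.
cum_var = np.cumsum(eigvals) / eigvals.sum()

# Smallest number of components reaching 95% explained variance.
k = np.searchsorted(cum_var, 0.95) + 1
print(k, cum_var)
```

Plotting `cum_var` against the component index gives the curve the notebook describes; reconstructing with the first `k` eigenvectors keeps that fraction of the variance.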


Thank you so much for your feedback. I will open a Git issue for it and we will fix these as soon as possible.

Regards,
Lucas