C2_W4 subtitle errata

I noticed some mistranscriptions in the transcript (subtitles, captions) for the course videos. Below are the errata for week 4; I'll post the errata for the other weeks in their respective groups.

The format for the errata is as follows: the approximate time is in brackets, followed by the mistranscribed word or phrase, a hyphen, and then the correct word or phrase. Like this:

[time] mistranscribed word or phrase - correct word or phrase

Video 1 - decision tree model
[0:07] One to the learning algorithm is this very powerful why we use many applications - One of the learning algorithms that is very powerful and widely used in many applications

Video 2 - learning process
[0:49] Tangent - ten
[2:15] No other - node
[2:45] One captain for dogs - one cat and four dogs
[6:26] The salon, and - to split on
[7:25] It gets herself - to get to itself
[8:15] priority - purity
[9:06] sense - set
[9:15] maybe dogs to other three adults - mainly dogs, 2 out of 3 are dogs
[9:34] Errands - algorithms
[9:57] sweating - splitting
[10:12] G - tree
[10:21] Average - algorithm
[10:44] Two complicates - too complicated
[10:52] Atoms - algorithms

Video 4 - choosing a split
[6:27] In purity - impurity
[6:34] Way to entropy - weighted entropy

Video 5 - putting it together
[1:16] Clause - class
[8:03] slit - split
[8:16] Start splitting - stop splitting
[8:39] Leaf note - leaf node

Video 6 - one hot encoding
[3:37] Just in the side - just as an aside
[5:08] New network - neural networks

Video 7 - continuous valued features
[5:02] Note - node
[5:36] Recursive, li - recursively

Video 8 - regression trees
[0:21] The this three feature - these three valued features
[3:36] spit - split
[8:46] Year shaped feature to spot on - ear shape feature to split on

Video 9 - using multiple decision trees
[0:13] Arrow - algorithm

Video 11 - random forest algorithm
[0:04] something - sampling
[0:22] On sample - ensemble
[3:01] back decision tree - bag decision tree
[3:41] Notes near the root note - nodes near the root node
[4:03] Note - node
[4:05] Slit - split
[5:18] Something - sampling
[5:30] Something - sampling
[5:52] a view - with you
[6:07] our room - algorithm

Video 12 - XGBoost
[0:07] On samples - ensemble
[0:41] album - algorithm
[0:46] size them - size m
[0:49] Something - sampling
[2:44] something - sampling with
[3:17] Something - sampling
[3:37] album - algorithm
[4:27] russet - boosted
[5:17] Something - sampling
[5:21] Ways - weights

Video 13 - when to use decision trees
[2:11] Notes - nodes
[2:43] cut - cat

Video 14 - Chris Manning interview
[5:49] on the thesis - honors thesis
[5:50] past tents - past tense
[7:08] language images - languages
[8:10] [Inaudible] - a day
[8:27] waiting - weighting
[11:20] On this - not an
[14:55] New networks - neural networks
[15:48] [Inaudible] - BERT
[17:02] several names are mentioned but do not appear in the transcript
[18:09] is the - is to
[18:10] paintbrush is too - paintbrush is to
[18:32] Burton GPT - BERT and GPT
[18:50] hit - hid
[21:56] Knicks word - next word
[23:56] LOP, the last language model - NLP, the large language model
[27:43] M R P - NLP
[28:08] problem engineering - prompt engineering
[31:04] New network - neural network
[33:44] automation magnitude - order of magnitude
[34:24] Our rooms - algorithms
[39:48] your network - neural network
[41:42] cow course - calc course
[42:50] of P torch - or PyTorch
[43:01] Neuro network - neural network
[46:57] NFP - NLP
[46:59] NFP - NLP

Thank you, @Thomas_A_W! I will file a report with the course team about this.

Raymond